Rise against the machine: Amazon and automated recruitment

July 21, 2022

With the use of automation growing across industries, recruitment hasn’t been an exception. Yet while elsewhere automation may lead to more efficient services or greater security, it is hardly a treat to have your job application rejected by an algorithm. When this happens, which of the data subject’s rights are affected? NOYB’s complaint against Amazon, submitted to the Luxembourg National Commission for Data Protection (CNPD), provides answers.

Background 

In this case, NOYB represents a job applicant rejected by Amazon Mechanical Turk (MTurk). MTurk is a crowdsourcing platform where a distributed workforce performs tasks such as data validation, content moderation, and survey participation. The applicant attempted to create a worker account on MTurk and shortly afterwards received an automated email stating that their application was rejected. The email referred to ‘account review criteria’ without naming any of them or providing further insight into the decision-making. The applicant then made several attempts to contact Amazon with an access request, to no avail.

Rights at stake 

The situation clearly concerns the right not to be subject to automated decision-making, while the rights to information and access are also involved.

The GDPR prohibits solely automated decisions which produce legal effects concerning data subjects or similarly significantly affect them [1]. According to the Article 29 Working Party Guidelines on automated decision-making and profiling, decisions ‘significantly affect’ individuals where they ‘have the potential to:

  • significantly affect the circumstances, behaviour or choices of the individuals concerned; 
  • have a prolonged or permanent impact on the data subject; or 
  • at its most extreme, lead to the exclusion or discrimination of individuals’.  

The Guidelines mention denied employment as an example of a significant effect. Indeed, such a rejection may seriously influence the applicant’s financial and professional circumstances and, as a result, virtually all the choices they make, from consumer choices to social ones. Depending on the situation surrounding the rejection, a long-lasting or permanent psychological impact cannot be ruled out. And, in the absence of transparency about the decision-making criteria, one cannot be sure that, for instance, special categories of data are not involved.

The GDPR also provides exceptions to this general prohibition [2]. Firstly, solely automated decision-making is permitted where it is necessary for entering into, or performing, a contract between the data subject and the data controller [3]. In the given case, as the complaint notes, it is not even clear which entity acts as controller. Yet, on a general note, certain automated recruitment operations may be allowed on this ground. This may be the case for the first-stage selection of candidates by popular (international) businesses, where applications come in by the thousands. In that scenario, as the Guidelines provide, ‘[r]outine human involvement can sometimes be impractical or impossible due to the sheer quantity of data being processed’. The controller should demonstrate, however, that no less intrusive option was available and that recruitment would be ineffective otherwise.

Another exception is the explicit consent of the data subject [4]. In the case at hand, consent was not asked for. Yet this ground cannot be relied on anyway, due to the imbalance of power between the employer and the applicants. Indeed, as the GDPR provides, consent should be, first and foremost, freely given [5]. According to the Article 29 Working Party Guidelines on consent, ‘consent can only be valid if the data subject is able to exercise a real choice […]. Consent will not be free in cases where there is any element of […] inability to exercise free will’. In recruitment, the asymmetry of power is so great that consent can hardly be freely given. In the given case, if applicants wish to be considered, they have no choice but to consent to automated decision-making, which goes against the very essence of consent under the GDPR.

Next, the rights to information and access are involved. Whether data is obtained directly from individuals or indirectly about them [6], they must be informed about automated decision-making and given meaningful information about its logic. This may be especially important in recruitment, to ensure that the logic of the algorithm excludes special categories of data, such as trade union membership. In the given case, no information on automated decision-making and its criteria was provided whatsoever. As an example, the generic Amazon privacy notice, which also applies to MTurk, does not mention automated decision-making at all. And, as the applicant’s access requests remained unanswered, the breach of the right of access and the transparency principle is clear [7].

Conclusion 

With the ever-expanding use of automation, algorithmic decision-making in recruitment will only become more common. While we await the decision of the Luxembourg CNPD, it is clear that the principles of lawfulness, fairness and transparency [8] will be key to similar complaints in the future. If companies are to avoid such complaints, what steps should they take?

First of all, a data protection impact assessment (DPIA) should be performed. According to the EDPS Decision on DPIAs carried out by EU institutions and bodies, which we can use here by analogy, a DPIA is triggered where at least two aspects of the data processing entail a high risk for data subjects. The Decision provides that, in automated recruitment, the following such aspects are in play: ‘systematic and extensive evaluation of personal aspects […], including profiling and predicting’, ‘automated decision-making with legal or similar significant effect’, and ‘technological or organisational solutions that can involve novel forms of data collection and usage’. A DPIA should include a detailed description of the envisaged automated decision-making, assess and document compliance, for instance regarding data subject rights, and determine further mitigating measures.

Further, the right to information is key. Besides the usual attributes of a privacy statement [9], the company should at least provide applicants with information about the existence of automated decision-making and a meaningful idea of the criteria used [10]. This can be done, for instance, as part of said privacy statement or via an email the applicant receives when they create a candidate account on the company website. Similarly, the right of access should be complied with, with access provided not only to the data and information commonly required [11], but also to the logic and criteria of the automated decision-making [12]. The company should designate a person or persons responsible for access requests and establish a clear, step-by-step procedure for addressing them.

Finally, candidates should be able to obtain human intervention on the part of the hiring company [13]. After being rejected in first-stage automated shortlisting, applicants should be able to contest that decision. For these cases, the company may, for example, develop a procedure for review of the application by HR or an HR panel. That review should be documented and sent to the applicant upon request, if necessary with the assessments aggregated and/or in redacted form. The right to contest automated decision-making further includes an opportunity for the applicant to ‘express his or her point of view’. This may entail a possibility to send a motivation letter, if not previously required, or to present arguments in one’s favor in a conversation with HR.

The importance of observing the rules the GDPR establishes for automated decision-making cannot be overstated. After all, being provided with clear and transparent information, contesting an automated decision and obtaining its human review give applicants a voice and, to an extent, correct the power imbalance of the hiring process. In this way, a cornerstone of all fundamental rights is protected: human dignity. If we uphold its value consistently, there will be no need for applicants to rise against the algorithmic machine.

[1] Article 22(1) GDPR.

[2] Article 22(2) GDPR.

[3] Article 22(2)(a) GDPR.

[4] Article 22(2)(c) GDPR.

[5] Article 4(11) GDPR.

[6] Articles 13(2)(f) and 14(2)(g) GDPR, respectively.

[7] Articles 15(1) and 12(3) GDPR, respectively.

[8] Article 5(1)(a) GDPR.

[9] Articles 13(1) and 14(1) GDPR.

[10] Articles 13(2)(f) and 14(2)(g) GDPR.

[11] Article 15(1)(a)–(g) GDPR.

[12] Article 15(1)(h) GDPR.

[13] Article 22(3) GDPR.
