The EEOC has been focused on AI discrimination in the workplace for some time. It has been a particular focus, though not an exclusive one, of outgoing EEOC Commissioner Keith Sonderling, who wrote a law review article on the topic (see the first item in the thoughts/takeaways section below). During the last academic year, I was part of an Ohio Northern University Law Review symposium on artificial intelligence and spoke on AI in employment and what that means for people with disabilities. That presentation will become a law review article coming out in the fall.
The case of the day, Mobley v. Workday, Inc., here, deals with AI used in employment decisions. As usual, the blog entry is divided into categories and they are: facts; court’s reasoning that Workday is an employer; court’s reasoning that Workday is not an employment agency; court’s reasoning that plaintiff’s disparate impact claim can proceed; court’s reasoning that plaintiff’s disparate treatment claim cannot proceed; court’s reasoning that plaintiff’s aiding and abetting claim under the California antidiscrimination law cannot proceed; and thoughts/takeaways. Of course, the reader is free to focus on any or all of the categories.
I
Facts
Workday provides its customers with a platform on the customer’s website to collect, process, and screen job applications. Workday’s website states that it can “reduce time to hire by automatically dispositioning or moving candidates forward in the recruiting process.” Workday allegedly “embeds artificial intelligence (‘AI’) and machine learning (‘ML’) into its algorithmic decision-making tools, enabling these applications to make hiring decisions.” In addition, Workday’s applicant screening tools allegedly integrate “pymetrics” that “use neuroscience data and AI,” in combination with existing employee referrals and recommendations. According to Mobley, these tools “determine whether an employer should accept or reject an application” and are designed in a manner that reflects employer biases and relies on biased training data. An applicant can advance in the hiring process only if they get past Workday’s screening algorithms.
Mobley is an African American male over the age of forty with a bachelor’s degree in finance from Morehouse College—an all-male Historically Black College and University (“HBCU”). He is also an honors graduate of ITT Technical Institute and Server+ certified. Mobley suffers from anxiety and depression. Since 2010, he has worked in various financial, IT help-desk, and customer-service oriented jobs. For example, Mobley has experience as an Advanced Solutions Engineer with Hewlett Packard Enterprise, a Customer Service Representative with the Internal Revenue Service, and a Support Specialist, Level 1A Manager with AT&T Digital Life.
Mobley has allegedly applied to over 100 positions with companies that use Workday’s screening tools for talent acquisition and hiring since 2017. Numerous positions also required him to take a Workday-branded assessment and/or personality test. Mobley alleges that these assessments and personality tests are likely to reveal mental health disorders or cognitive impairments, and that those like Mobley who suffer from depression and anxiety are likely to perform worse on these assessments and be screened out. Workday’s screening tools then allegedly use the information from those tests and assessments to evaluate an applicant’s qualifications and recommend whether the applicant should be accepted or rejected.
Despite his qualifications, Mobley was allegedly denied employment for every one of the 100-plus applications that he submitted to companies using Workday’s platform. For example, when Mobley was working for Hewlett Packard on a contract basis, he applied via hpe@myWorkday.com for a Service Solutions Technical Consultant position, the qualifications for which allegedly mirrored those for the role he was already in. His application was rejected the next month. On another occasion, Mobley applied for a Customer Services Specialist position with Unum via unum@myWorkday.com at 12:55 a.m., but his application was rejected less than an hour later. Other applications for customer service roles submitted through Workday were also rejected. For the positions to which he applied, Mobley alleges that he met their experiential and educational requirements.
Mobley alleges that Workday’s algorithmic decision-making tools discriminate against job applicants who are African American, over the age of 40, and/or disabled.
II
Court’s Reasoning That Workday Is an Employer And May Be Held Liable As an Agent of the Employer
- The antidiscrimination laws under which Mobley sued all prohibit discrimination not just by employers themselves but also by their agents.
- Employers cannot escape liability for discrimination by delegating their traditional functions, such as hiring, to a third party.
- Federal appellate courts outside of the Ninth Circuit have held that an employer’s agent may be independently liable when the employer has delegated to the agent functions traditionally exercised by an employer.
- Where the employer has delegated control of some of the employer’s traditional rights, such as hiring or firing, to a third party, the third party has been found to be an employer by virtue of the agency relationship.
- The antidiscrimination statutes that Mobley sued under define the term "employer" as a person engaged in an industry affecting commerce who has at least 15 employees (twenty for the age discrimination statute, the ADEA) for each working day in each of 20 or more calendar weeks in the current or preceding calendar year, and any agent of such a person.
- All the relevant statutes prohibit employers (ADA refers to covered entities, which includes employers), from engaging in certain acts of discrimination.
- An employer and an employment agency are not at all the same thing.
- Employment agencies procure employees for an employer, which means they find candidates for an employer's positions. They do not actually employ those candidates.
- Employment agencies under the applicable laws face a different set of restrictions from employers. They are liable when they fail or refuse to refer individuals for consideration by employers on prohibited bases, but they are not subject to the prohibitions applicable to employers in carrying out their traditional functions, such as hiring, discharging, compensating, or promoting employees.
- An entity liable as an employment agency is not necessarily liable as an agent of an employer.
- Agent of an employer and employment agency have very distinct meanings.
- It makes no sense that companies should be allowed to escape liability for hiring decisions by saying that the function has been handed over to someone else, in this case artificial intelligence. Congress anticipated such a problem and crafted a solution by including the term "agent" in the definition of employer and by making that a separate term from "employment agency."
- Workday's software, according to the complaint, participates in the decision-making process by recommending some candidates to move forward and rejecting others. It allegedly incorporates artificial intelligence and machine learning into its algorithmic decision-making tools to make hiring decisions, and its software can automatically reject candidates or move them forward in the recruiting process. This is illustrated by the rejection emails that Mobley allegedly received in the middle of the night.
- The applicable statutes all provide that an employer may not refuse to hire employees based upon prohibited characteristics, such as race, disability, or age. The ADA in particular prohibits employers from discriminating against a qualified individual on the basis of disability with regard to job application procedures and the hiring of employees.
- Given Workday’s allegedly crucial role in deciding which applicants get their foot in the door for an interview, its tools are engaged in conduct that is at the heart of equal access to employment opportunities.
- Nothing in the language of the federal antidiscrimination statutes, or in the case law interpreting those statutes, distinguishes between delegating functions to an automated agent versus a live human one. In fact, courts applying the agency exception have uniformly focused on the function the employer delegated to the agent and not the manner in which the agent carries out the delegated function.
- Drawing an artificial distinction between software decision-makers and human decision-makers would potentially gut antidiscrimination laws in the modern era. Such a distinction would allow employers to delegate hiring, firing, promotion, compensation, benefits, and a myriad of other employment decisions to third-party algorithmic decision-making tools. Although outside human decision-makers would be required to comply with antidiscrimination laws under the agency liability doctrine, outside software tools created by those same humans would not. Such a distinction would leave job applicants and employees with little recourse to challenge discrimination from those tools, which makes no sense.
- Workday qualifies as an agent because its tools are alleged to perform a traditional hiring function of rejecting candidates at the screening stage and recommending whom to advance to subsequent stages, through the use of artificial intelligence and machine learning.
- Software vendors would not qualify as agents if they have not been delegated responsibility over traditional employment functions. For example, if they are not participating in the decision over whom to hire or whom to reject, they would not be an agent of the employer.
III
Court’s Reasoning That Workday Is Not an Employment Agency
- The applicable statutes define employment agency in pretty much the same way. More specifically, any person regularly undertaking with or without compensation to procure employees for an employer or to procure for employees opportunities to work for an employer. The wording among the statutes is not precisely the same but the meaning is.
- There are no allegations in the complaint that Workday brings job listings to the attention of those looking for employment. In fact, it is just the opposite: job applicants have to find positions on their own, and then Workday takes it from there once they apply.
- Screening applicants using discriminatory algorithmic tools is not the same thing as finding candidates for employers.
IV
Court’s Reasoning That Plaintiff’s Disparate Impact Claim Can Proceed
- To make a prima facie case of disparate impact, a plaintiff has to: 1) show a significant disparate impact on a protected class or group; 2) identify the specific employment practices or selection criteria at issue; and 3) show a causal relationship between the challenged practices or criteria and the disparate impact.
- Plaintiff has sufficiently alleged the specific employment practice, i.e., the discriminatory use of algorithmic decision-making tools that screen out applicants. In particular, the amended complaint alleges that these tools rely on biased training data and on information obtained from pymetrics and personality tests on which applicants with mental health and cognitive disorders perform more poorly.
- The complaint alleges that there is a common component discriminating against applicants based on a protected trait, which is supported by allegations that Mobley was rejected from over 100 jobs that he was allegedly qualified for, across many different industries and employers.
- Mobley applied to and was rejected from over 100 jobs for which he was allegedly qualified. The common denominator in those rejections was Workday, which provided the hiring companies with a platform for application intake and screening. In a traditional employment discrimination case, this kind of data would be analogous to having over 100 qualified applicants like Mobley (African-American, over 40, and suffering from depression and anxiety), all strike out for jobs with one employer.
- Mobley's situation is even more compelling because he struck out with a whole range of employers across multiple industries using Workday's platform, including for a job with the company where he was already working as a contractor. The 0% success rate at passing Workday's initial screening, combined with the complaint's allegations regarding bias in Workday's training data and its tools' reliance on information from pymetrics and personality tests, plausibly supports an inference that the algorithmic tools disproportionately reject applicants based on factors other than qualifications.
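A back-of-envelope calculation (not from the opinion, and using a hypothetical baseline pass rate) illustrates why a 0% success rate across 100-plus screenings supports this inference: if a qualified applicant had even a modest chance of passing any one neutral screening, the odds of failing all 100 independent screenings would be vanishingly small.

```python
# Hypothetical illustration only: assume a qualified applicant passes any
# single neutral screening with probability p. The probability of being
# rejected by all n independent screenings is (1 - p) ** n.

def prob_all_rejected(p: float, n: int) -> float:
    """Probability that an applicant with per-screening pass rate p fails all n screenings."""
    return (1 - p) ** n

for p in (0.05, 0.10, 0.20):
    print(f"pass rate {p:.0%}: P(0 of 100 pass) = {prob_all_rejected(p, 100):.2e}")
```

Even at a hypothetical 5% pass rate per application, the chance of 100 straight rejections is well under one percent, which is the intuition behind treating the pattern as plausibly more than chance at the pleading stage.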
- Causation is present in light of the sheer number of rejections and the timing of those decisions, combined with the complaint's allegations that Workday's AI systems rely on biased training data. The causation element is also supported by the complaint's citation to academic and other literature about bias in data models and algorithms, as well as Amazon's since-abandoned attempt at using a facially neutral hiring algorithm that had a disparate impact on female candidates.
V
Court’s Reasoning That Plaintiff’s Disparate Treatment Claim Cannot Proceed
- To state a claim for disparate treatment, a plaintiff has to show: 1) he is a member of a protected class; 2) he was qualified for the position; 3) he experienced an adverse employment action; and 4) similarly situated individuals outside his protected class were treated more favorably, or other circumstances surrounding the adverse employment action give rise to an inference of discrimination.
- The complaint certainly made sufficient allegations showing that Mobley was qualified for the positions from which he was rejected.
- Mobley sufficiently alleged that he disclosed his protected traits when applying for the positions, claiming that Workday can discern an applicant's demographic information from other inputs correlated with race or another protected classification. For example, he disclosed his degree from a Historically Black College and University as well as information revealing his age.
- Mobley simply cannot show that Workday intended its tools to be discriminatory. However, if discovery should reveal otherwise, Mobley is free to amend his complaint at that time.
VI
Court’s Reasoning That Plaintiff’s Aiding And Abetting Claim Under The California Antidiscrimination Law Cannot Proceed
- Mobley sued claiming that Workday aided and abetted the discrimination. To prove up such a claim, a plaintiff has to allege: 1) his employer subjected him to discrimination; 2) the alleged aider and abettor knew that the employer's conduct violated the California antidiscrimination law; and 3) the alleged aider and abettor gave the employer substantial assistance or encouragement to violate that law. Those claims don't work because Mobley does not allege that any of the specific companies he applied to discriminated against him, nor that Workday knew that the conduct of those employers was discriminatory.
VII
Thoughts/Takeaways
- What kind of preventive steps can be taken by AI companies to minimize these kinds of lawsuits? For that, you should read Keith E. Sonderling (outgoing EEOC Commissioner), Bradford J. Kelley, and Lance Casimir, The Promise and The Peril: Artificial Intelligence and Employment Discrimination, 77 U. MIA L. Rev. 1 (2022), available at https://repository.law.miami.edu/umlr/vol77/iss1/3 . On pages 75-80 of that article, Commissioner Sonderling and his co-authors set forth several steps that are well worth keeping in mind: 1) know your data. That is, be vigilant about developing, applying, and modifying the data used to train and run the recruiting programs and algorithms that screen and evaluate potential candidates and applicants. The data should be as complete as possible, with no missing or unreliable factors, should answer the questions that need answering, and should be transparent enough to provide statistically relevant results. Also, if using AI for employment decision-making, avoid potentially biased data from sources such as social media and data brokers, as those can be error-prone; 2) be transparent and explain everything. Transparency promotes the visibility of processes, the accessibility of systems, and the reporting of meaningful information, and explainability fosters trust in the process; 3) monitor and audit AI uses. That is, monitor both qualitatively and quantitatively, continually and/or at least once a year, and memorialize the findings; 4) supervise the process. That is, charge a person or team of people with overseeing the processes and results of AI tools in order to ensure the tools are not only performing legitimate objectives but also avoiding improper outcomes; 5) understand vendor liability. Employers need to carefully review and negotiate any contracts they have with vendors providing these services. It is particularly important for companies purchasing AI hiring tools to ensure that vendors attest to the fairness and integrity of the product while negotiating proper indemnification clauses that anticipate potential government investigation. Employers need to be aware that they could be held liable if the vendors discriminate against candidates based on protected characteristics while using AI tools; and 6) be aware of the emerging patchwork of federal, state, and local laws, rules, and regulations governing AI use.
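On the quantitative-monitoring step, one well-known benchmark is the "four-fifths rule" from the EEOC's Uniform Guidelines on Employee Selection Procedures: a selection rate for a protected group that is less than 80% of the highest group's rate is generally treated as evidence of adverse impact. The sketch below shows what such an audit check might look like; the group names and counts are hypothetical, and a real audit would use the employer's actual applicant-flow data and appropriate statistical tests.

```python
# Hedged sketch of a quantitative audit using the four-fifths rule.
# All group names and counts are hypothetical illustrations.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (selected, applied); returns each group's selection rate."""
    return {group: selected / applied for group, (selected, applied) in outcomes.items()}

def four_fifths_check(outcomes: dict[str, tuple[int, int]]) -> dict[str, bool]:
    """True if a group's selection rate is at least 80% of the highest group's rate."""
    rates = selection_rates(outcomes)
    top_rate = max(rates.values())
    return {group: (rate / top_rate) >= 0.8 for group, rate in rates.items()}

# Hypothetical screening results: (advanced past screening, total applicants)
audit = {"group_a": (50, 100), "group_b": (20, 100)}
print(four_fifths_check(audit))
```

Here group_b's 20% selection rate is only 40% of group_a's 50% rate, so it would be flagged for further review. The four-fifths rule is a rule of thumb, not a legal safe harbor; memorializing these findings, as the article recommends, matters either way.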
- This case holds that if a software company takes on traditional employment decisions (such as hiring, promotion, or termination of employees), it may have independent liability as an employer for discrimination based upon a protected characteristic.
- Also, keep in mind that duties under the ADA are nondelegable, as we discussed here. In other words, an employer cannot avoid duties it is responsible for by delegating them to others. If it does, the entity receiving the delegation may also be liable as an employer.
- The ADA, as well as the other nondiscrimination statutes discussed in this blog entry, allows for both disparate impact and disparate treatment claims. While disparate treatment is far more common in the ADA world than disparate impact, disparate impact is still a viable claim, especially in a situation such as the one involved here.
- While the state law claim did not work out here, attorneys on the plaintiff's side should always keep in mind the possibility of bringing state law claims as well.
- Starting in my early editions of Understanding the ADA, I raised the issue of personality tests as being violative of the ADA. The seminal case on that is Karraker v. Rent-a-Center Inc., 411 F.3d 831 (7th Cir. 2005), which can be found here.