
Artificial Intelligence in Employment Decisions

By Kellie Hand

Artificial Intelligence (AI) has revolutionized the way many businesses operate, and the realm of employment decisions is no exception. AI tools are increasingly being used to streamline and automate aspects of the hiring process. For instance, in February 2022, the Society for Human Resource Management found that 79% of employers use AI and/or automation for recruiting and hiring. Although AI and automation can significantly reduce bias when implemented correctly, AI algorithms are only as good as the data they are trained on. As a result, if an AI tool is trained on biased data or data that reflects historical disparities, it may inadvertently perpetuate those biases, leading to workplace discrimination.

In October 2021, U.S. Equal Employment Opportunity Commission (EEOC) Chair Charlotte A. Burrows announced the Artificial Intelligence and Algorithmic Fairness Initiative, an agency-wide initiative to ensure that the use of software, including AI, machine learning, and other emerging technologies in hiring and other employment decisions complies with the federal civil rights laws that the EEOC enforces. On May 18, 2023, the EEOC released a technical assistance document, “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964,” in which it affirmed that an employer who administers a selection procedure may be responsible under Title VII if the procedure discriminates on a basis prohibited by Title VII, even if the test was developed by an outside vendor. Additionally, employers may be held responsible for the actions of their agents, such as software vendors, if the employer has given them authority to act on the employer’s behalf. In practice, however, it is often difficult or impossible for an employee or job candidate to know whether an employer is using a discriminatory AI selection procedure.

In 2021, New York City sought to increase transparency by enacting new legislation regarding automated employment decision tools (“AEDT”), which the Department of Consumer and Worker Protection (“DCWP”) enforces. Local Law 144 of 2021, which will go into effect on July 5, 2023, “prohibits employers and employment agencies from using an automated employment decision tool unless the tool has been subject to a bias audit within one year of the use of the tool, information about the bias audit is publicly available, and certain notices have been provided to employees or job candidates” who reside in New York City. The law defines an “employment decision” as the act of screening “candidates for employment or employees for promotion within [New York City].”

A bias audit of an AEDT must calculate the selection rate for each race/ethnicity and sex category (i.e., how often individuals in each category are selected by the tool) and divide each category’s selection rate by the selection rate of the most selected category to determine an impact ratio. The impact ratio shows whether there is a significant difference in selection rates between groups; a large difference may indicate that the tool is biased.
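To illustrate the arithmetic, the short Python sketch below computes selection rates and impact ratios for a set of hypothetical applicant counts. The category labels and numbers are invented for illustration only and are not drawn from any actual audit or from the DCWP's rules.

# A minimal sketch of the selection-rate and impact-ratio arithmetic described above.
# The categories and counts are hypothetical, for illustration only.

# Number of applicants scored by the AEDT and number selected, per category
applicants = {"Category A": 200, "Category B": 150, "Category C": 50}
selected = {"Category A": 80, "Category B": 45, "Category C": 10}

# Selection rate: fraction of each category's applicants that the tool selects
selection_rates = {c: selected[c] / applicants[c] for c in applicants}

# Impact ratio: each category's selection rate divided by the rate of the
# most selected category (the category with the highest selection rate)
highest_rate = max(selection_rates.values())
impact_ratios = {c: rate / highest_rate for c, rate in selection_rates.items()}

for category in applicants:
    print(f"{category}: selection rate {selection_rates[category]:.2f}, "
          f"impact ratio {impact_ratios[category]:.2f}")

In this hypothetical example, Category A is selected at a rate of 0.40, Category B at 0.30 (impact ratio 0.75), and Category C at 0.20 (impact ratio 0.50). An impact ratio well below 1.0, for example below the four-fifths (0.80) benchmark commonly used as a rule of thumb in adverse-impact analysis, may indicate that the tool selects one group substantially less often than another.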

To comply with the law, an employer or employment agency may provide notice to a candidate for employment or promotion who resides in New York City by doing any of the following:

(1) Provide notice on the employment section of its website in a clear and conspicuous manner at least 10 business days before the use of an AEDT; 

(2) Provide notice in a job posting at least 10 business days before use of an AEDT; or 

(3) Provide notice to candidates for employment via U.S. mail or e-mail at least 10 business days before the use of an AEDT. 

Additionally, Local Law 144 requires that employers provide instructions for how an individual can request an alternative selection process or a reasonable accommodation under other laws, if available. 

It is important to note that while the law covers bias regarding race, ethnicity, and sex, it does not address bias against older or disabled workers.
