California Seeks to Regulate Employer Use of AI

On February 15, 2024, California lawmakers introduced Assembly Bill 2930 ("AB 2930"), which seeks to regulate the use of artificial intelligence ("AI") in various industries to combat "algorithmic discrimination."  The proposed bill defines "algorithmic discrimination" as a "condition in which an automated decision tool contributes to unjustified differential treatment or impacts disfavoring people" based on various protected characteristics, including actual or perceived race, color, ethnicity, sex, national origin, disability, and veteran status.

Specifically, AB 2930 seeks to regulate "automated decision tools" that make "consequential decisions."  An "automated decision tool" is any system that uses AI and has been developed to make, or be a controlling factor in making, "consequential decisions."  A "consequential decision," in turn, is defined as a decision or judgment that has a legal, material, or similarly significant effect on an individual's life relating to the impact of, access to, or the cost, terms, or availability of, any of the following:  1) employment, including any decisions regarding pay or promotion, hiring or termination, and automated task allocation; 2) education; 3) housing or lodging; 4) essential utilities; 5) family planning; 6) adoption services, reproductive services, or assessments related to child protective services; 7) health care or health insurance; 8) financial services; 9) the criminal justice system; 10) legal services; 11) private arbitration; 12) mediation; and 13) voting.

AB 2930 aims to prevent algorithmic discrimination through impact assessments, notice requirements, governance programs, policy disclosure requirements, and providing for civil liability. 

Impact Assessments

By January 1, 2026, any employers or developers using or developing automated decision tools will be required to perform annual impact assessments.  The annual impact assessment requirements are largely the same for both employers and developers and include, among other things, a statement of purpose for the automated decision tool; descriptions of the automated decision tool's outputs and how they are used in making a consequential decision; and an analysis of potential adverse impacts.  Employers, but not developers, are also required to:  1) describe the safeguards in place to address reasonably foreseeable risks of algorithmic discrimination and 2) provide a statement of the extent to which the employer's use of the automated decision tool is consistent with or varies from the developer's statement of the intended use of the automated decision tool (which developers are required to provide under Section 22756.3 of the proposed bill).  Employers with fewer than 25 employees will not be required to perform this assessment unless the automated decision tool impacted more than 999 people in the calendar year.

Notice Requirements

Employers using automated decision tools are required to notify any person subject to a consequential decision that an automated decision tool is being used to make that decision.  The notice must include:  1) a statement of the purpose of the automated decision tool; 2) contact information for the employer; and 3) a plain-language description of the automated decision tool.  In addition, if the consequential decision is made solely based on the output of the automated decision tool, the employer is required to, if technically feasible, accommodate a person's request to be subject to an alternative selection process.

Governance Programs

Employers using automated decision tools are required to establish a governance program to address any reasonably foreseeable risks of algorithmic discrimination associated with the use of an automated decision tool.  The governance program must, among other things, designate at least one employee responsible for overseeing and maintaining the governance program and compliance with AB 2930; implement safeguards to address reasonably foreseeable risks of algorithmic discrimination; conduct an annual, comprehensive review of policies, practices, and procedures to ensure compliance with AB 2930; and maintain the results of impact assessments for at least two years.  Employers with fewer than 25 employees will not be required to form a governance program unless the automated decision tool impacted more than 999 people in the calendar year.

Policy Disclosure Requirements

Any employers or developers using or developing automated decision tools are also required to make publicly available a clear policy summarizing both of the following:  1) the types of automated decision tools currently in use and 2) how the employer or developer manages the reasonably foreseeable risks of algorithmic discrimination that may arise from the use of those tools.

Civil Liability

A person may bring a civil action against an employer for violating AB 2930 if the person can demonstrate that the automated decision tool caused the person actual harm.  A prevailing plaintiff may recover compensatory damages, declaratory relief, and reasonable attorney's fees.  Public attorneys, including district attorneys and city prosecutors, may also bring civil actions against employers for violating AB 2930.

****

Although AB 2930 has not yet been passed by the legislature, Hunton will continue monitoring future developments related to the proposed bill.

