Final-Text-regulations-automated-employment-decision-systems
CA · State · USA
● Pending
Proposed Effective Date
2025-07-01
California Code of Regulations, Title 2, Division 4.1, Chapter 5, Subchapter 2 — Final Text of Proposed Employment Regulations Regarding Automated-Decision Systems (Civil Rights Council)
Summary

These proposed California regulations, promulgated by the Civil Rights Council under the Fair Employment and Housing Act, extend existing employment anti-discrimination law to cover automated-decision systems used in hiring, promotion, termination, and other employment decisions. Employers and other covered entities are prohibited from using automated-decision systems or selection criteria (including qualification standards, employment tests, or proxies) that discriminate against applicants or employees on any basis protected by FEHA. The regulations clarify that anti-bias testing efforts (or lack thereof) are relevant evidence in discrimination claims. Recordkeeping requirements are extended to include automated-decision system data, with a four-year retention period. Enforcement follows the existing FEHA framework — administrative complaints to the CRD or private civil actions in Superior Court.

Enforcement & Penalties
Enforcement Authority
The California Civil Rights Department (CRD, formerly DFEH) enforces these regulations under the Fair Employment and Housing Act (FEHA). The Department investigates upon complaint or on its own initiative. Individuals may file administrative complaints with the CRD or obtain an immediate right-to-sue notice and pursue a civil action in Superior Court; a private right of action is available under the FEHA (Government Code § 12965).
Penalties
Remedies available under the FEHA include: hiring, reinstatement, back pay, front pay, promotion, restoration of membership in a labor organization, compensatory damages (including emotional distress), punitive damages, injunctive relief, reasonable attorney's fees and costs, and any other appropriate relief. No statutory minimum per violation is specified in these regulations; remedies are governed by the FEHA's existing remedial framework. Actual monetary harm is not required for all forms of relief (e.g., injunctive relief, emotional distress damages).
Who Is Covered
"Employer or Other Covered Entity." Any employer, employment agency, labor organization or apprenticeship training program as defined herein and subject to the provisions of the Act.
"Employer." Any person or individual engaged in any business or enterprise regularly employing five or more individuals, including individuals performing any service under any appointment, contract of hire or apprenticeship, express or implied, oral or written.
"Employment Agency." An employment agency includes any person undertaking, for compensation, the procurement of job applicants, employees or opportunities to work, including persons undertaking these services through the use of an automated-decision system.
What Is Covered
"Automated-Decision System." A computational process that makes a decision or facilitates human decision making regarding an employment benefit, as defined in section 11008(i) of these regulations. An Automated-Decision System may be derived from and/or use artificial intelligence, machine-learning algorithms, statistics, and/or other data processing techniques.
Compliance Obligations (23 obligations)
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11009(f)
Plain Language
Employers and other covered entities may not use an automated-decision system, qualification standard, employment test, or proxy that discriminates against applicants or employees on any FEHA-protected basis. In any discrimination claim or defense, evidence of anti-bias testing (or the absence of such testing) is relevant — including the quality, efficacy, recency, scope, results, and the entity's response to those results. This means that failure to conduct anti-bias testing on an automated-decision system can itself be used as evidence of discrimination, and conversely, robust testing may support a defense.
Statutory Text
(f) It is unlawful for an employer or other covered entity to use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on a basis protected by the Act, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.
G-01 AI Governance Program & Documentation · G-01.3, G-01.4 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11013(c)
Plain Language
Employers and other covered entities must preserve all employment records — now explicitly including automated-decision system data, selection criteria, and all application and personnel records — for at least four years from the date the record was made or the personnel action occurred, whichever is later. This is a significant expansion: the retention period has been increased from two years to four years, and the scope of records now expressly includes data used in or generated by automated-decision systems. Records must be available to CRD investigators and to support any administrative or judicial proceeding.
Statutory Text
(c) Preservation of Records. Any personnel or other employment records created or received by any employer or other covered entity dealing with any employment practice and affecting any employment benefit of any applicant or employee shall be preserved by the employer or other covered entity for a period of four years from the date of the making of the record or the date of the personnel action involved, whichever occurs later. This includes all applications, personnel records, membership records, employment referral records, selection criteria, automated-decision system data, and other records created or received by the employer or other covered entity dealing with any employment practice and affecting any employment benefit of any applicant or employee.
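The "whichever occurs later" rule in the preservation provision lends itself to a direct computation. A minimal sketch, for illustration only (the regulations prescribe no particular implementation, and the leap-day handling here is a simplifying assumption):

```python
from datetime import date

def retention_deadline(record_made: date, personnel_action: date) -> date:
    """Four-year preservation period under 2 CCR § 11013(c): the clock
    runs from the LATER of the record date and the personnel-action date."""
    start = max(record_made, personnel_action)
    try:
        return start.replace(year=start.year + 4)
    except ValueError:  # Feb 29 start, non-leap target year: fall back to Feb 28
        return start.replace(year=start.year + 4, day=28)

# A screening score recorded 2025-03-01 for a rejection decided 2025-04-15
# must be preserved at least until 2029-04-15.
deadline = retention_deadline(date(2025, 3, 1), date(2025, 4, 15))
```

In practice many employers simply retain all ADS data for four years from the later event rather than computing per-record deadlines; the sketch only makes the rule concrete.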
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.3 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11016(a)(2)
Plain Language
Employers may not use automated-decision systems (or any other method) in recruitment that restricts, excludes, classifies, or expresses preference for candidates on a FEHA-protected basis, or that uses advertising methods to communicate employment availability in a discriminatory manner. This extends to AI-driven job ad targeting, resume screening, and any other automated recruitment tool. The only exception is a permissible defense such as a bona fide occupational qualification.
Statutory Text
(2) Prohibited Recruitment Practices. An employer or other covered entity shall not, unless pursuant to a permissible defense, engage in any recruitment activity, including but not limited to practices accomplished through the use of an automated-decision system, that: (A) Restricts, excludes, or classifies individuals on a basis enumerated in the Act; (B) Expresses a preference for individuals on a basis enumerated in the Act; or (C) Communicates or uses advertising methods to communicate the availability of employment benefits in a manner intended to discriminate on a basis enumerated in the Act.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11016(b)(1)
Plain Language
Pre-employment inquiries, including those conducted through automated-decision systems, must not directly or indirectly identify individuals on a FEHA-protected basis unless a permissible defense applies. This means that automated screening questions, AI-driven assessments, and chatbot-based pre-employment inquiries must be designed to avoid eliciting or inferring protected-class information. Employers bear the burden of ensuring their automated systems do not function as proxy identifiers for protected characteristics.
Statutory Text
(1) Limited Permissible Inquiries. An employer or other covered entity may make any pre-employment inquiries that do not discriminate on a basis enumerated in the Act. Inquiries, including but not limited to inquiries made through the use of an automated-decision system, that directly or indirectly identify an individual on a basis enumerated in the Act are unlawful unless made pursuant to a permissible defense.
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.3 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11016(c)(3)(A), (c)(5)
Plain Language
Employers using online application technology or automated-decision systems that screen, rank, or prioritize candidates based on scheduling availability, skills, dexterity, reaction time, or other characteristics must ensure these systems do not discriminate on the basis of disability, religious creed, or medical condition. An ADS with an adverse impact on protected groups is unlawful unless job-related and consistent with business necessity. Employers may also need to provide reasonable accommodations, for example by including a mechanism in online applications for applicants to request accommodation, or by adjusting ADS-administered assessments for applicants with disabilities.
Statutory Text
(3)(A) The use of online application technology that limits, screens out, ranks, or prioritizes applicants based on their schedule may discriminate against applicants based on their religious creed, disability, or medical condition. Such a practice having an adverse impact is unlawful unless job-related and consistent with business necessity and the online application technology includes a mechanism for the applicant to request an accommodation.

(5) Automated-Decision Systems. The use of an automated-decision system that, for example, measures an applicant's skill, dexterity, reaction time, and/or other abilities or characteristics may discriminate against individuals with certain disabilities or other characteristics protected under the Act. To avoid unlawful discrimination, an employer or other covered entity may need to provide reasonable accommodation to an applicant as required by Article 8 (religious creed) or Article 9 (disability) of these regulations.
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.3 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11017(a), (d)(1), (e)
Plain Language
Any employment selection policy, practice, or automated-decision system that has an adverse impact on applicants or employees based on FEHA-protected characteristics is unlawful unless the employer can demonstrate it is job-related and consistent with business necessity. The regulations incorporate the federal Uniform Guidelines on Employee Selection Procedures (29 C.F.R. 1607). ADS that analyze tone of voice, facial expressions, or other physical characteristics may discriminate based on race, national origin, gender, or disability. Employers must provide reasonable accommodations during testing and may need to modify ADS-administered assessments. Facially neutral ADS with adverse impact are only permissible upon a showing of job-relatedness and business necessity.
Statutory Text
(a) Selection and Testing. Any policy or practice of an employer or other covered entity that has an adverse impact on employment opportunities of individuals on a basis enumerated in the Act is unlawful unless the policy or practice is job-related and consistent with business necessity (business necessity is defined in section 11010(b)). The Council herein adopts the Uniform Guidelines on Employee Selection Procedures promulgated by various federal agencies, including the EEOC and Department of Labor. [29 C.F.R. 1607 (1978)].

(d)(1) Automated-Decision Systems. An automated-decision system that, for example, analyzes an applicant's tone of voice, facial expressions or other physical characteristics or behavior may discriminate against individuals based on race, national origin, gender, disability, or other characteristics protected under the Act. To avoid unlawful discrimination, an employer or other covered entity may need to provide reasonable accommodation to an applicant as required by Article 8 (religious creed) or Article 9 (disability) of these regulations.

(e) Permissible Selection Devices. A testing device, automated-decision system, or other means of selection that is facially neutral, but that has an adverse impact (as defined in the Uniform Guidelines on Employee Selection Procedures (29 C.F.R. 1607 (1978))) upon persons on a basis enumerated in the Act, is permissible only upon a showing that the selection practice is job-related and consistent with business necessity (business necessity is defined in section 11010(b)).
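The Uniform Guidelines adopted in subsection (a) measure adverse impact operationally, most commonly via the "four-fifths rule" (29 C.F.R. 1607.4(D)): a group's selection rate below 80% of the highest group's rate is generally regarded as evidence of adverse impact. A minimal sketch of that computation (the group names and figures below are hypothetical, not drawn from the regulations):

```python
def selection_rates(outcomes):
    """outcomes: {group: (selected, total_applicants)} -> {group: rate}"""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below `threshold` (the
    four-fifths rule) times the highest group's selection rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: {"rate": r,
                "impact_ratio": r / top,
                "flag": (r / top) < threshold}
            for g, r in rates.items()}

# Hypothetical pass-through counts from an automated resume screener:
# group_a: rate 0.600, impact ratio 1.000 (reference group)
# group_b: rate 0.375, impact ratio 0.625 -> below 0.8, flagged
results = adverse_impact({"group_a": (48, 80), "group_b": (30, 80)})
```

The Guidelines also contemplate statistical-significance analysis for small samples; the ratio above is only the threshold screen, not a complete anti-bias testing program of the kind § 11009(f) treats as relevant evidence.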
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11017.1(a)(1)
Plain Language
Automated-decision systems may not be used to inquire into an applicant's criminal history prior to making a conditional offer of employment. This extends the Fair Chance Act's prohibition on pre-offer criminal history inquiries to cover ADS-conducted background checks and automated screening. Employers must ensure that any ADS used in pre-employment screening does not access or consider criminal history information before a conditional offer has been extended.
Statutory Text
(1) Prohibited consideration under this subsection includes, but is not limited to, inquiring about criminal history through an employment application, background check, or internet searches, or the use of an automated-decision system.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Developer · Employment · Automated Decisionmaking
2 CCR § 11020(b)
Plain Language
All aiding and abetting prohibitions — including assisting in unlawful discrimination, inciting or soliciting violations, coercing discriminatory conduct, concealing evidence, and advertising on a prohibited basis — apply equally when the prohibited practice is conducted through an automated-decision system. This means that vendors who develop or deploy ADS tools that facilitate discriminatory employment practices may also be liable for aiding and abetting discrimination.
Statutory Text
(b) The prohibited practices set forth in subsection (a) include any such practice conducted in whole or in part through the use of an automated-decision system.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11028(b), (c), (m)
Plain Language
Automated-decision systems that discriminate based on accent, English proficiency, or national origin (or a proxy for it) are unlawful. Screening ADS that penalize accents, non-native English proficiency, or other linguistic characteristics must be justified by business necessity. Anti-bias testing (or its absence) is relevant evidence. This covers voice-analysis AI, language-proficiency screening tools, and any ADS that uses linguistic characteristics as selection criteria.
Statutory Text
(b) Discrimination based on an applicant's or employee's accent is unlawful unless the employer proves that the individual's accent interferes materially with the applicant's or employee's ability to perform the job in question. This prohibition also applies where such discrimination resulted, in whole or in part, from an employer's or other covered entity's use of an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy).

(c) Discrimination based on an applicant's or employee's English proficiency is unlawful unless the English proficiency requirement is justified by business necessity (i.e., the proficiency is necessary to effectively fulfill the job duties of the position). In determining business necessity in this context, relevant factors include, but are not limited to, the type of proficiency required (e.g., spoken, written, aural, and/or reading comprehension), the degree of proficiency required, and the nature and job duties of the position. This prohibition also applies where such discrimination resulted, in whole or in part, from an employer's or other covered entity's use of an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy).

(m) It is unlawful for an employer or other covered entity to use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on the basis of national origin or a proxy of national origin, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11032(b)(4), (f)
Plain Language
Automated-decision systems and selection criteria (including qualification standards, employment tests, or proxies) that discriminate on the basis of sex are unlawful. This covers sex-based discrimination in pre-employment inquiries, applications, and employee selection. Anti-bias testing (or its absence) is relevant evidence in any such claim or defense. Employers using ADS for resume screening, interview analysis, or candidate scoring must ensure their systems do not produce discriminatory outcomes based on sex.
Statutory Text
(4) It is unlawful for an employer or other covered entity to use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on the basis of sex, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.

(f) It is unlawful for an employer or other covered entity to use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on the basis of sex or any basis prohibited in subsections in (a) through (e) of this section, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11038(b)
Plain Language
Automated-decision systems that discriminate against applicants or employees on the basis of pregnancy or perceived pregnancy are unlawful. Anti-bias testing evidence (or lack thereof) is relevant to any claim or defense. Employers must ensure ADS used in employment decisions do not disadvantage pregnant individuals or those perceived as pregnant.
Statutory Text
(b) It is unlawful for an employer or other covered entity to use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on the basis of pregnancy or perceived pregnancy, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11039(a)(1)(J)
Plain Language
Employers may not use automated-decision systems or selection criteria that discriminate on the basis of pregnancy or perceived pregnancy in any employment decision including hiring, training, promotion, discharge, or terms and conditions of employment. Anti-bias testing evidence is relevant to claims and defenses. This is the employer-specific parallel to the broader covered-entity provision in § 11038(b).
Statutory Text
(J) use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on the basis of pregnancy or perceived pregnancy, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results; or
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11056(a)
Plain Language
Automated-decision systems used in pre-employment inquiries must not ask applicants to disclose their marital status. This extends the existing prohibition on marital-status inquiries to ADS-mediated screening. The only exception is a permissible defense.
Statutory Text
(a) Impermissible Inquiries. It is unlawful to ask an applicant to disclose their marital status as part of a pre-employment inquiry, including an inquiry made through the use of an automated-decision system, unless pursuant to a permissible defense.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11063(b)
Plain Language
Automated-decision systems that discriminate on the basis of religious creed are unlawful. Anti-bias testing evidence is relevant to claims and defenses. Employers using ADS for scheduling, screening, or selection must ensure systems do not disadvantage individuals based on religious observance, practice, or belief.
Statutory Text
(b) It is unlawful for an employer or other covered entity to use an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy) that discriminates against an applicant or employee or a class of applicants or employees on the basis of religion, subject to any available defense. Relevant to any such claim or available defense is evidence, or the lack of evidence, of anti-bias testing or similar proactive efforts to avoid unlawful discrimination, including the quality, efficacy, recency, and scope of such effort, the results of such testing or other effort, and the response to the results.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11070(a)(2), (b)(2)
Plain Language
Employers may not use automated-decision systems to advertise employment in ways that discourage applicants with disabilities. ADS-mediated pre-employment screening, application forms, and questionnaires must not ask questions that elicit disability information before a job offer is made. This includes questions about medical history, workers' compensation, hospitalization, medical leave, and physical or mental limitations — whether asked by a human interviewer or an automated system.
Statutory Text
(a)(2) It is unlawful to advertise or publicize, including but not limited to through the use of an automated-decision system, an employment benefit in any way that discourages or is designed to discourage applicants with disabilities from applying to a greater extent than individuals without disabilities.

(b)(2) Prohibited Inquiries. It is unlawful to ask general questions on disability or questions likely to elicit information about a disability in an application form, automated-decision system, or pre-employment questionnaire or at any time before a job offer is made. Examples of prohibited inquiries are: [list of examples]
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11071(e)
Plain Language
Any medical or psychological examination or disability-related inquiry conducted through an automated-decision system is subject to the same restrictions as those conducted by humans. This means ADS-administered tests, games, puzzles, or challenges that are likely to elicit information about a disability are treated as medical or psychological examinations and are subject to pre-offer prohibition, post-offer conditions, and confidentiality requirements under the FEHA disability discrimination framework.
Statutory Text
(e) As used in this article, "medical or psychological examination" (a term that is defined in section 11065 of these regulations) or a disability-related inquiry includes any such examination or inquiry administered through the use of an automated-decision system. Such examination or inquiry may include a test, question, puzzle, game, or other challenge that is likely to elicit information about a disability.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11072(b)(1)-(3)
Plain Language
Employers may not use qualification standards, employment tests, proxies, or other selection criteria — including those administered through automated-decision systems — that screen out or have an adverse impact on individuals with disabilities. This covers ADS that use uncorrected vision or hearing assessments, skill tests, and any other automated selection mechanism. Such criteria are only permissible if job-related and no less discriminatory alternative serves the employer's goals equally effectively. Employers bear the burden of demonstrating both job-relatedness and the unavailability of less discriminatory alternatives.
Statutory Text
(1) In general. It is unlawful for an employer or other covered entity to use qualification standards, employment tests, proxies, or other selection criteria — including but not limited to those administered through the use of an automated-decision system — that screen out, tend to screen out, or otherwise have an adverse impact on an applicant or employee with a disability or a class of applicants or employees with disabilities, on the basis of disability. However, such standards, tests, or other selection criteria, as used by the employer or other covered entity, are not unlawful under this subsection when job-related for the position in question, and there is no less discriminatory standard, test, or other selection criteria that serves the employer's goals as effectively as the challenged standard, test, or other selection criteria.

(2) Qualification Standards and Tests Related to Uncorrected Vision or Uncorrected Hearing. An employer or other covered entity shall not use qualification standards, employment tests, proxies, or other selection criteria — including but not limited to those administered through the use of an automated-decision system — that discriminate against an applicant or employee based on uncorrected vision or uncorrected hearing. However, such standards, tests, or other selection criteria, as used by the employer or other covered entity, are not unlawful under this subsection when job-related for the position in question, and there is no less discriminatory standard, test, or other selection criteria that serves the employer's goals as effectively as the challenged standard, test, or other selection criteria.

(3) An employer or other covered entity shall not make use of any testing criterion, including but not limited to through the use of an automated-decision system, that discriminates against applicants or employees with disabilities, unless: (A) the test score or other selection criterion used is shown to be job-related for the position in question; and (B) an alternative job-related test or criterion that does not discriminate against applicants or employees with disabilities is unavailable or would impose an undue hardship on the employer.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11076(a)
Plain Language
A presumption of age discrimination arises whenever a facially neutral practice — including the use of an automated-decision system — has an adverse impact on applicants or employees age 40 or older. Employers must demonstrate the practice is job-related and consistent with business necessity. Even if that showing is made, the practice may still be unlawful if a less discriminatory alternative exists. In layoff or salary reduction contexts, preferring lower-paid workers alone does not overcome the presumption. Employers using ADS that screen based on experience levels, graduation dates, or other age-correlated factors should ensure these do not produce adverse impact.
Statutory Text
(a) Employers. Discrimination on the basis of age may be established by showing that a job applicant's or employee's age of 40 or older was considered in the denial of employment or an employment benefit. There is a presumption of discrimination whenever a facially neutral practice, including but not limited to the use of an automated-decision system, has an adverse impact on an applicant(s) or employee(s) age 40 or older, unless the practice is job-related and consistent with business necessity as defined in section 11010(b). In the context of layoffs or salary reduction efforts that have an adverse impact on an employee(s) age 40 or older, an employer's preference to retain a lower paid worker(s), alone, is insufficient to negate the presumption. The practice may still be impermissible, even where it is job-related and consistent with business necessity, where it is shown that an alternative practice could accomplish the business purpose equally well with a lesser discriminatory impact.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11079(b), (c)(1)
Plain Language
Pre-employment inquiries through automated-decision systems that directly or indirectly identify applicants by age are unlawful unless age is a bona fide occupational qualification. Online job applications may not require age entry, use drop-down menus with age-based cutoffs, or employ automated selection criteria that screen out applicants age 40 and older. This covers ADS that use graduation dates, years of experience caps, or other age-correlated fields as screening criteria.
Statutory Text
(b) Pre-employment Inquiries. Unless age is a bona fide occupational qualification for the position at issue, pre-employment inquiries that would result in the direct or indirect identification of persons on the basis of age, including, but not limited to, inquiries made through the use of an automated-decision system, are unlawful. Examples of prohibited inquiries are requests for age, date of birth, or graduation dates, except where age is a bona fide occupational qualification. This provision applies to oral and written inquiries and interviews. (c)(1) Subsection (c) prohibits the use of online job applications that require entry of age in order to access or complete an application, or the use of drop-down menus that contain age-based cut-off dates or utilize automated selection criteria or algorithms that have the effect of screening out applicants age 40 and older. Use of online application technology or an automated-decision system that limits or screens out older applicants is discriminatory unless age is a bona fide occupational qualification. (See section 11010(a).)
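One practical compliance step suggested by the prohibition above is auditing an online application's field schema for inputs that directly or indirectly identify age. The sketch below is a hypothetical illustration: the keyword list, field names, and schema shape are assumptions, not requirements of the regulation.

```python
# Illustrative sketch: scan an online-application field schema for
# inputs of the kind section 11079(b)-(c) flags (age, date of birth,
# graduation dates). Keyword list and schema format are assumptions.

AGE_CORRELATED_KEYWORDS = (
    "age", "date_of_birth", "dob", "graduation", "grad_year",
)

def flag_age_fields(schema):
    """Return the names of schema fields whose key or label suggests
    an age-identifying inquiry."""
    flagged = []
    for field in schema:
        haystack = (field["name"] + " " + field.get("label", "")).lower()
        if any(keyword in haystack for keyword in AGE_CORRELATED_KEYWORDS):
            flagged.append(field["name"])
    return flagged

application_schema = [
    {"name": "full_name", "label": "Full name"},
    {"name": "grad_year", "label": "Year of graduation"},
    {"name": "dob", "label": "Date of birth"},
]
print(flag_age_fields(application_schema))  # ['grad_year', 'dob']
```

A flagged field is not automatically unlawful (age may be a bona fide occupational qualification for the position), but each hit warrants review before the application goes live.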
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11028(g)
Plain Language
Employers may not use automated-decision systems to discriminate against applicants or employees because they hold or present an AB 60 driver's license (issued under Vehicle Code section 12801.9 to applicants who cannot prove lawful presence in the United States). This extends the existing prohibition to ADS-mediated screening: automated background-check or document-verification systems must not flag or penalize AB 60 licenses.
Statutory Text
(g) It is unlawful for an employer or other covered entity to discriminate against an applicant or employee because they hold or present a driver's license issued under section 12801.9 of the Vehicle Code. This prohibition also applies where such discrimination resulted, in whole or in part, from an employer's or other covered entity's use of an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy).
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11028(h)
Plain Language
Automated-decision systems that enforce citizenship requirements in a way that discriminates based on national origin or ancestry are unlawful unless a permissible defense applies. Employers using ADS for eligibility screening must ensure that citizenship criteria are neither a pretext for discrimination nor discriminatory in effect against applicants or employees on the basis of national origin or ancestry.
Statutory Text
(h) Citizenship requirements. Citizenship requirements that are a pretext for discrimination or have the purpose or effect of discriminating against applicants or employees on the basis of national origin or ancestry are unlawful, unless pursuant to a permissible defense. This prohibition also applies where such discrimination resulted, in whole or in part, from an employer's or other covered entity's use of an automated-decision system or selection criteria (including a qualification standard, employment test, or proxy).
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11072(b)(5)
Plain Language
Employers must ensure that employment tests — including those administered through automated-decision systems — accurately measure the skills, aptitude, or criteria they purport to measure, rather than reflecting an applicant's or employee's disability. Reasonable accommodations must be made in testing conditions. For ADS-administered assessments, this means gamified tests, puzzle-based assessments, and timed evaluations must not inadvertently measure disability rather than job-relevant competencies. Accommodations include accessible test sites, Braille or digital formats, screen readers, voice recognition, additional time, interpreters, and other modifications.
Statutory Text
(5) An employer or other covered entity shall select and administer tests concerning employment so as to ensure that, when administered to any applicant or employee, including an applicant or employee with a disability, the test results accurately reflect the applicant's or employee's job skills, aptitude, or whatever other criteria the test purports to measure, rather than reflecting the applicant's or employee's disability, except those skills affected by disability are the criteria that the tests purport to measure. Tests concerning employment include, but are not limited to, those administered through the use of an automated-decision system. To accomplish this end, reasonable accommodation shall be made in testing conditions.
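For ADS-administered timed assessments, one concrete form the reasonable-accommodation duty can take is extending the time limit so the test measures the intended skill rather than the disability. The sketch below is purely illustrative: the 1.5x multiplier is an assumption, and the appropriate accommodation is determined case by case through the interactive process.

```python
# Illustrative sketch: applying an extended-time accommodation to a
# timed, ADS-administered assessment. The multiplier is an assumed
# example value, not a figure from the regulation.

def accommodated_time_limit(base_seconds, multiplier=1.5):
    """Return the adjusted time limit, in whole seconds, for a
    candidate granted an extended-time accommodation."""
    return int(base_seconds * multiplier)

print(accommodated_time_limit(1200))  # 1800 seconds (30 minutes)
```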
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
2 CCR § 11072(b)(5)(F)
Plain Language
When modifying an ADS-administered test is inappropriate, employers may need to use alternate tests or individualized assessments. Importantly, running a candidate through an automated-decision system, in the absence of additional process or actions, does not by itself constitute an individualized assessment. Employers therefore cannot rely solely on ADS output as a substitute for the individualized assessment required when disability accommodation is at issue.
Statutory Text
(F) Alternate tests or individualized assessments may be necessary where test modification is inappropriate. Competent expert advice may be sought before attempting such modification since the validity of the test may be affected. The use of an automated-decision system, in the absence of additional process or actions, does not constitute an individualized assessment.