N.J.A.C. 13:16 (Disparate Impact Discrimination)
NJ · State · USA
● Enacted
Effective Date
2025-12-15
New Jersey Administrative Code Title 13, Chapter 16 — Rules Pertaining to Disparate Impact Discrimination (R.2025 d.150)
Summary

These administrative rules, adopted by the New Jersey Division on Civil Rights, implement the disparate impact liability provisions of the New Jersey Law Against Discrimination (LAD). They apply to all covered entities — employers, housing providers, real estate brokers, lending institutions, places of public accommodation, and contractors — and prohibit facially neutral practices or policies that actually or predictably result in a disproportionately negative effect on members of a protected class, unless the practice is necessary to achieve a substantial, legitimate, nondiscriminatory interest and there is no less discriminatory alternative. The rules specifically address automated employment decision tools, establishing that their use may create disparate impact liability and requiring adequate pre-deployment testing. Enforcement is through the Division on Civil Rights or Superior Court, with standing for aggrieved individuals, the Attorney General, and other designated officials. The rules also impose a vendor oversight obligation: covered entities relying on outside vendors' products or systems must take reasonable steps to ensure those systems comply with the LAD.

Enforcement & Penalties
Enforcement Authority
New Jersey Division on Civil Rights (DCR), within the Office of the Attorney General. Enforcement may be agency-initiated by the Director of the Division, the Attorney General, the Commissioner of the Department of Labor and Workforce Development, or the Commissioner of the Department of Education. Complaints may also be filed by any person claiming to be aggrieved by an unlawful employment practice or unlawful discrimination, or by any other person or organization authorized by the Division's Rules of Practice and Procedure. Complainants may alternatively initiate suit in Superior Court.
Penalties
Remedies are governed by the underlying New Jersey Law Against Discrimination (N.J.S.A. 10:5-1 et seq.), which provides for compensatory damages, equitable relief including injunctive relief, back pay, front pay, emotional distress damages, punitive damages in appropriate cases, and reasonable attorney's fees and costs. The rules themselves do not specify separate penalty amounts; they implement the LAD's existing remedial framework.
Who Is Covered
"Covered entity" means an employer; labor organization; employment agency; housing provider; real estate broker, agent, or salesperson; lending institution; place of public accommodation; or person who is required to comply with N.J.S.A. 10:5-12.
What Is Covered
"Automated employment decision tools" are any software, system, or process that aims to automate, aid, or replace human decision-making relevant to employment. "Automated employment decision tools" include tools that analyze datasets to generate scores, rankings, predictions, classifications, or some recommended action(s) that are used by employers to make decisions regarding employees, contractors, and job candidates, or potential job candidates, including decisions related to advertising, recruiting, screening, interviewing, hiring, placement, promotion, and compensation, or any other term, condition, or privilege of employment.
Compliance Obligations · 8 obligations
H-02 Non-Discrimination & Bias Assessment · H-02.1 · H-02.2 · H-02.3 · Deployer · Automated Decisionmaking · Employment
N.J.A.C. 13:16-2.1(a)-(b)
Plain Language
All covered entities in New Jersey must ensure that their practices and policies — even facially neutral ones adopted without discriminatory intent — do not actually or predictably result in a disproportionately negative effect on members of any protected class. A practice causing such disparate impact violates the LAD unless the entity can show it is necessary to achieve a substantial, legitimate, nondiscriminatory interest and no less discriminatory alternative exists. Notably, liability can attach before a policy is implemented if it has been approved, announced, or finalized and there is evidence of predictable disparate impact. Policies still in internal deliberation cannot be challenged.
Statutory Text
(a) Practices and policies that have a disparate impact, as defined at (b) below, on members of a protected class, even if these practices and policies are not discriminatory on their face (that is, facially neutral) and are not motivated by discriminatory intent, will be considered discriminatory and a violation of the Act, unless it is shown that such practices and policies are necessary to achieve a substantial, legitimate, nondiscriminatory interest and there is no less discriminatory alternative that would achieve the same interest. (b) A practice or policy has a disparate impact where it actually or predictably results in a disproportionately negative effect on members of a protected class. A practice or policy predictably can have a disparate impact when there is evidence that the practice or policy will have a disparate impact even though the practice or policy has not yet been implemented, if the practice or policy has been approved, announced, or otherwise finalized. However, a practice or policy that is simply being debated or deliberated internally by a covered entity cannot be challenged pursuant to this chapter before it is implemented, approved, announced, or otherwise finalized.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
N.J.A.C. 13:16-3.2(c)(1)-(3)
Plain Language
Employers using automated employment decision tools — including AI resume screeners, facial analysis technology for interviews, and scheduling filters — must ensure these tools do not create disparate impact on protected classes. The rules establish that tools trained on non-representative data (e.g., a mostly white, cisgender male workforce) may produce biased outputs. Critically, subsection (c)(3) effectively requires pre-deployment testing: an employer's use of an automated tool that has not been adequately tested and shown to not adversely affect members of a protected class before use may itself constitute a disparate impact violation. Scheduling-based automated tools must also include a reasonable accommodation request mechanism. Facial analysis technology is flagged as particularly high-risk for bias against people with darker skin, disabilities, or religious headwear/facial hair.
Statutory Text
(c) Automated employment decision technology practices are as follows: 1. The use of automated employment decision tools to make employment decisions, including, but not limited to, decisions related to advertising, recruiting, screening, interviewing, hiring, and compensation, or any other terms, conditions, or privileges of employment, may have a disparate impact on applicants and employees based on their race, national origin, gender, disability, religion, and other protected characteristics. By way of example, but not limitation, an automated employment decision tool that uses data on a company's current employees to inform a search for candidates may have a disparate impact on members of protected classes that are not well represented in that company or industry. If most current employees at a computer science company are white, cisgender men, an automated employment decision tool that assesses applicants based on that pool may score women applicants lower because their resumes list "women's field hockey" rather than "football," or score Black applicants lower because their resumes list "Black Student Alliance," an organization in which the company's current employees are less likely to have been involved; 2. The use of an automated employment decision tool that limits or screens out applicants based on their schedule may have a disparate impact on applicants based on their religion, disability, or medical condition and must include a mechanism for applicants to request a reasonable accommodation. By way of example, but not limitation, an application asking if an applicant is available to work a proposed schedule of Monday through Saturday may screen out applicants who answer the question in the negative due to religious practices they engage in on Saturdays; and 3. An employer's use of an automated employment decision tool that has not been adequately tested and shown to not adversely affect people in a protected class before its use may have a disparate impact on members of that protected class. By way of example, but not limitation, an employer's use of facial analysis technology to detect personality traits during virtual interviews is likely to result in lower scores for interviewees whose facial expressions the tools have not been tested on and designed to read. If the technology was tested exclusively or predominantly on white people with no disabilities, then use of the technology may disproportionately impact interviewees with darker skin or interviewees with disabilities because the technology cannot match their facial expressions to those programmed into the tool and may not account for interviewees who cannot make certain facial expressions. i. The use of facial analysis technology may disproportionately impact interviewees wearing religious headwear or maintaining religiously mandated facial hair if the technology has not been tested on people with similar religious practices.
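Subsection (c)(3) states the pre-deployment testing obligation qualitatively and does not prescribe any particular statistical method. As a minimal illustrative sketch only, assuming hypothetical pilot data and using the four-fifths ratio from the federal UGESP (incorporated by reference at N.J.A.C. 13:16-3.3, below) as a screening threshold, an adverse impact check on a tool's selection outcomes might look like this:

```python
# Illustrative only: a minimal pre-deployment adverse impact check on an
# automated employment decision tool's pilot outputs. The 0.80 threshold is
# the four-fifths rule of thumb from the UGESP (29 CFR 1607.4(D)); nothing in
# N.J.A.C. 13:16 mandates this exact computation or these group labels.

def selection_rates(outcomes):
    """outcomes maps group label -> (selected_count, applicant_count)."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Each group's selection rate divided by the highest group's rate."""
    rates = selection_rates(outcomes)
    highest = max(rates.values())
    return {group: rate / highest for group, rate in rates.items()}

if __name__ == "__main__":
    # Hypothetical pilot data (assumed numbers, for illustration only).
    pilot = {"group_a": (48, 120), "group_b": (30, 115)}
    for group, ratio in adverse_impact_ratios(pilot).items():
        status = "flag for review" if ratio < 0.80 else "ok"
        print(f"{group}: impact ratio {ratio:.2f} ({status})")
```

An impact ratio below 0.80 for a group is only a starting point for further validation, documentation, and consideration of less discriminatory alternatives; it is not, by itself, a legal conclusion under the rules.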
H-02 Non-Discrimination & Bias Assessment · Deployer · Automated Decisionmaking · Employment
N.J.A.C. 13:16-2.4(e)
Plain Language
If a covered entity uses an outside vendor's products, systems, or procedures — including third-party AI tools, scoring algorithms, or screening products — and those products cause a disparate impact, the entity cannot disclaim liability by pointing to the vendor. The covered entity must take reasonable steps to ensure that the vendor's tools comply with the LAD and these rules. This creates a vendor due diligence obligation: employers, housing providers, and other covered entities must affirmatively evaluate whether third-party tools they adopt produce discriminatory outcomes before and during use.
Statutory Text
(e) If a respondent's practice or policy that results in a disparate impact based on a protected characteristic relies on conduct, standards, products, procedures, or systems of an outside person or vendor, the respondent must take reasonable steps to ensure that the outside person or vendor's conduct, standards, products, procedures, or systems are consistent with the Act and this chapter.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · H-02.2 · Deployer · Automated Decisionmaking · Employment
N.J.A.C. 13:16-2.2(a)-(f)
Plain Language
In the employment, public accommodations, and contracting contexts, a three-step burden-shifting framework applies to disparate impact claims. First, the complainant must present empirical (not speculative) evidence that the challenged practice has a disparate impact. Second, the respondent must demonstrate the practice is necessary to achieve a substantial, legitimate, nondiscriminatory interest — in the employment context, this means the practice is job related and consistent with a legitimate business necessity. Third, even if the practice is justified, it remains unlawful if the complainant can show a less discriminatory alternative that would achieve the same interest. For product counsel, this means any AI or automated system deployed in employment, public accommodations, or contracting must be defensible at all three steps: empirical evidence that the tool does not cause disparate impact, a documented business necessity justification, and an analysis showing no less discriminatory alternative was available.
Statutory Text
(a) A complainant challenging a practice or policy of a covered entity must show the practice or policy challenged has a disparate impact on members of a protected class. (b) In the employment, public accommodations, and contracting contexts, if the complainant meets the burden of proof at (a) above, the respondent has the burden of showing that the challenged practice or policy is necessary to achieve a substantial, legitimate, nondiscriminatory interest. In the employment context, whether a practice or policy is necessary to achieve a substantial, legitimate, nondiscriminatory interest is equivalent to whether the practice or policy is job related and consistent with a legitimate business necessity. A practice or policy is job related when it bears a demonstrable relationship to successful performance of the job and measures the person's fitness for the specific job. (c) In the employment, public accommodations, and contracting contexts, if the respondent meets the burden at (b) above, the complainant has the burden of showing that there is a less discriminatory alternative means of achieving the substantial, legitimate, nondiscriminatory interest. (d) To meet its burden of proof at (a), (b), or (c) above, a party must provide empirical evidence, meaning evidence that is not hypothetical or speculative, to support its allegations. For example, a complainant would not meet its burden to show an employment policy has a disparate impact on job applicants based on gender by speculating that the policy harms women more than men, but could meet its burden by providing empirical evidence, which could include applicant files or data or applicant selection rates by gender. Anecdotal evidence, while not sufficient on its own, may be introduced along with empirical evidence. For example, a complainant would not meet its burden to show an employment policy has a disparate impact on job applicants based on gender by solely providing that they know women who applied and did not receive a position but men who did. However, a complainant could introduce anecdotal evidence along with empirical evidence, such as applicant selection rates by gender. (e) The opposing party may rebut whether the party with the burden of proof at (a), (b), or (c) above has met its burden. (f) Additional proof may be required when challenging or defending particular practices or policies. Such requirements are noted in this chapter, where relevant.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Automated Decisionmaking · Employment
N.J.A.C. 13:16-2.4(b)(1)-(2), (c)
Plain Language
When defending a practice that has been shown to cause disparate impact, the covered entity must prove two things: (1) the practice serves a core interest directly related to the entity's function that is genuine and non-pretextual and does not itself discriminate, and (2) the practice actually carries out that interest effectively. This is a case-specific, fact-based inquiry — generic justifications will not suffice. Notably, pursuing diversity or increasing access for underrepresented groups can itself constitute a legitimate justification. For AI tool deployers, this means you must be prepared to demonstrate with evidence that each automated tool serves a genuine business function and actually achieves its stated purpose.
Statutory Text
(b) To establish that a challenged practice or policy is necessary to achieve a substantial, legitimate, nondiscriminatory interest, a respondent must establish that: 1. The practice or policy is necessary to achieve one or more substantial, legitimate, nondiscriminatory interests, where "substantial interest" means a core interest of the entity that has a direct relationship to the function of that entity, "legitimate" means that a justification is genuine and not false or pretextual, and "nondiscriminatory" means that the justification for a challenged practice or policy does not itself discriminate based on a protected characteristic; and 2. The practice or policy effectively carries out the identified interest. (c) The determination of whether an interest is substantial, legitimate, and nondiscriminatory requires a case-specific, fact-based inquiry. An interest in achieving diversity or increasing access for underrepresented or underserved members of a protected class may constitute a substantial, legitimate, nondiscriminatory interest.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
N.J.A.C. 13:16-3.1(a)-(c)
Plain Language
All employment practices — from hiring and screening to compensation and termination — are subject to disparate impact analysis. Employers, labor organizations, and employment agencies must ensure their practices are job-related and consistent with business necessity if challenged, and must be prepared to show no less discriminatory alternative exists. Affirmative recruitment efforts to attract underrepresented groups are expressly permitted and will not create liability under this chapter. For AI tool deployers in the employment context, every automated screening, scoring, or decision tool must be defensible as job-related and necessary.
Statutory Text
(a) Employment practices and policies may be unlawful if they have a disparate impact on members of a protected class. An employment practice or policy that has a disparate impact is prohibited unless, in accordance with N.J.A.C. 13:16-2.2, a respondent shows it is necessary to achieve a substantial, legitimate, nondiscriminatory interest. Whether an employment practice or policy is necessary to achieve a substantial, legitimate, nondiscriminatory interest is equivalent to whether the practice or policy is job related and consistent with a legitimate business necessity. An employment practice or policy may still be prohibited if necessary to achieve a substantial, legitimate, nondiscriminatory interest if a complainant shows there is a less discriminatory alternative that would achieve the same interest. (b) Nothing in this subchapter shall preclude affirmative efforts to utilize recruitment practices to attract an individual who is a member of an underrepresented or underserved member of a protected class covered by the Act. (c) This subchapter applies to the practices and policies of employers, labor organizations, employment agencies, and other covered entities.
Other · Employment · Automated Decisionmaking
N.J.A.C. 13:16-3.3
Plain Language
The federal Uniform Guidelines on Employee Selection Procedures (the "four-fifths rule" framework for adverse impact analysis) are incorporated by reference and extended to all NJ LAD protected characteristics — a broader set than federal Title VII covers. Under the four-fifths rule, a selection rate for any group that is less than 80 percent (four-fifths) of the rate for the group with the highest rate is generally regarded as evidence of adverse impact. Where the NJ rules conflict with the federal guidelines, the NJ rules control. This is significant because it means automated employment decision tools in NJ are evaluated under the UGESP statistical framework for all NJ protected classes, including those not covered by federal law (e.g., gender identity, marital status, domestic partnership status).
Statutory Text
The guidelines set forth in the Uniform Guidelines on Employee Selection Procedures, 29 CFR 1607 (1978), which are incorporated herein by reference, are applied to all protected characteristics listed in the Act. Where there is a conflict between such guidelines and this chapter, the rules in this chapter shall control. Upon request, the Division will make the guidelines available for public inspection and make available a printed copy of the guidelines.
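The UGESP framework compares selection rates across groups; the four-fifths ratio is a rule of thumb, and the guidelines note that small samples or modest differences may call for statistical as well as practical significance analysis. As a hedged companion sketch only, and not anything the NJ rules themselves require, a pooled two-proportion z-test on the same kind of hypothetical screening data could be computed as follows:

```python
# Illustrative only: a pooled two-proportion z-test, sometimes used alongside
# the four-fifths ratio to gauge whether a gap in selection rates is
# statistically significant. The NJ rules do not prescribe any specific test.
from math import sqrt, erf

def two_proportion_z(selected_a, total_a, selected_b, total_b):
    """Return (z statistic, two-sided p-value) for the gap in selection rates."""
    p_a, p_b = selected_a / total_a, selected_b / total_b
    pooled = (selected_a + selected_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
    return z, p_value

if __name__ == "__main__":
    # Hypothetical screening data (assumed numbers, for illustration only).
    z, p = two_proportion_z(48, 120, 30, 115)
    print(f"z = {z:.2f}, two-sided p = {p:.3f}")
```

Whether any such statistic matters in a given matter depends on the empirical-evidence standard at N.J.A.C. 13:16-2.2(d) and the facts of the case, not on the test itself.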
Other · Automated Decisionmaking · Employment
N.J.A.C. 13:16-2.1(c)
Plain Language
Disparate impact claims may be brought by aggrieved individuals, the Attorney General, the Director of the Division on Civil Rights, the Commissioners of Labor and Education, or other authorized persons. Complaints may be filed with the Division or pursued as a lawsuit in Superior Court. This provision confirms the enforcement channels but does not create a new compliance obligation for covered entities.
Statutory Text
(c) Any person claiming to be aggrieved by an unlawful employment practice or an unlawful discrimination, the Attorney General, the Director of the Division, the Commissioner of the Department of Labor and Workforce Development, or the Commissioner of the Department of Education, or any other person or organization authorized by the Division's Rules of Practice and Procedure, N.J.A.C. 13:4, or the LAD, may bring a complaint of discrimination based on disparate impact liability pursuant to the Act to the Division or initiate suit in Superior Court.