A-09654
NY · State · USA
Status: Pending
Proposed Effective Date
2027-01-01
New York Assembly Bill 9654 — An Act to amend the civil rights law, in relation to enacting the "New York Artificial Intelligence Civil Rights Act"
Summary

The NY AI Civil Rights Act imposes comprehensive obligations on developers and deployers of 'covered algorithms' — AI and ML systems used in 'consequential actions' spanning employment, education, housing, healthcare, credit, criminal justice, government services, elections, and public accommodations. The bill prohibits algorithmic discrimination and disparate impact based on an extensive list of protected characteristics, and requires independently audited pre-deployment evaluations and annual post-deployment impact assessments. Developers and deployers must publish detailed public disclosures, provide short-form notices to individuals, maintain records for 10 years, and submit evaluations and assessments to the Division of Consumer Protection. Enforcement is through both AG parens patriae actions (with civil penalties of $15,000 per violation or 4% of average gross annual revenue) and a private right of action (treble damages or $15,000 per violation, plus punitive damages), with pre-dispute arbitration agreements rendered unenforceable.

Enforcement & Penalties
Enforcement Authority
The Attorney General may bring a parens patriae action on behalf of state residents upon reason to believe that residents' interests have been harmed or are threatened by a violation. The Division of Consumer Protection has rulemaking and regulatory authority, receives pre-deployment evaluations and impact assessments, maintains a public repository, and may intervene in private actions. A private right of action is available to any individual or class of individuals alleging a violation; before filing, the individual must notify the Division and the AG in writing with a description of the allegations, after which the Division and AG have 60 days to determine whether to intervene. Pre-dispute arbitration agreements and pre-dispute joint-action waivers are unenforceable with respect to disputes arising under this article.
Penalties
AG enforcement: civil penalties of $15,000 per violation or 4% of the defendant's average gross annual revenue over the preceding three years, whichever is greater; injunctive relief (permanent, temporary, or preliminary); damages, restitution, or other compensation on behalf of residents; reasonable attorneys' fees and litigation costs. Private right of action: treble damages or $15,000 per violation, whichever is greater; nominal damages; punitive damages; reasonable attorneys' fees and litigation costs; equitable or declaratory relief. Statutory damages do not require proof of actual monetary harm.
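Both enforcement tracks use a "whichever is greater" formula, which the short sketch below makes concrete. This is illustrative only, not statutory text: the function names are hypothetical, and it reads the 4% revenue prong as a single aggregate figure (the bill does not say whether that prong scales per violation).

```python
# Illustrative sketch of the penalty arithmetic (hypothetical names; a
# reading of the bill's formulas, not legal advice or statutory text).

def ag_civil_penalty(violations: int, annual_revenues: list[float]) -> float:
    """AG enforcement: $15,000 per violation or 4% of average gross annual
    revenue over the preceding three years, whichever is greater."""
    per_violation = 15_000 * violations
    revenue_based = 0.04 * (sum(annual_revenues[-3:]) / 3)
    return max(per_violation, revenue_based)

def private_action_statutory_damages(actual_damages: float, violations: int) -> float:
    """Private right of action: treble damages or $15,000 per violation,
    whichever is greater; nominal and punitive damages, fees, and equitable
    relief are available separately and are not modeled here."""
    return max(3 * actual_damages, 15_000 * violations)

# Example: 10 violations by a firm averaging $200M in gross annual revenue;
# the revenue-based prong ($8M) exceeds the per-violation prong ($150k).
print(ag_civil_penalty(10, [180e6, 200e6, 220e6]))  # 8000000.0
```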
Who Is Covered
"Deployer" means any person that uses a covered algorithm for a commercial act. The terms "deployer" and "developer" shall not be interpreted to be mutually exclusive.
(a) "Developer" means any person that designs, codes, customizes, produces, or substantially modifies an algorithm that is intended or reasonably likely to be used as a covered algorithm for such person's own use, or use by a third party, in connection with a commercial act, or for use by a government entity. (b) In the event that a deployer uses an algorithm as a covered algorithm, and no person is considered the developer of the algorithm for purposes of paragraph (a) of this subdivision, the deployer shall be considered the developer of the covered algorithm for the purposes of this article. (c) The terms "deployer" and "developer" shall not be interpreted to be mutually exclusive.
What Is Covered
"Covered algorithm" means: (a) a computational process derived from machine learning, natural language processing, artificial intelligence techniques, or other computational processing techniques of similar or greater complexity, that, with respect to a consequential action: (i) creates or facilitates the creation of a product or information that is used as an integral part of the consequential action; (ii) promotes, recommends, ranks, or otherwise affects the display or delivery of information that is used as an integral part of the consequential action; (iii) makes a decision; or (iv) facilitates human decision making; or (b) any other computational process deemed appropriate by the division through rules.
Compliance Obligations · 20 obligations
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.2, H-02.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 102(1)-(2)
Plain Language
Developers and deployers are prohibited from offering, licensing, or using a covered algorithm in any way that causes disparate impact or discrimination based on protected characteristics in connection with consequential actions. The prohibition covers both intentional discrimination and unjustified differential effects. The disparate impact standard requires the developer or deployer to prove a substantial, legitimate, nondiscriminatory interest, and even if proven, a less discriminatory alternative defeats the defense. The algorithm is presumed to be analyzed holistically (not component by component) unless the developer or deployer proves separability by preponderance of the evidence. Exemptions exist for self-testing to identify or mitigate discrimination, diversity pool expansion, good-faith non-commercial research, and private clubs.
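The burden-shifting structure described above reduces to a small decision procedure. The sketch below is a simplified illustration (hypothetical names, each evidentiary burden collapsed to a boolean), not an implementation of the statute:

```python
def disparate_impact_liability(impact_shown: bool,
                               substantial_interest_proven: bool,
                               less_discriminatory_alternative_exists: bool) -> bool:
    # No disparate impact established: no liability under this theory.
    if not impact_shown:
        return False
    # The developer/deployer bears the burden of proving a substantial,
    # legitimate, nondiscriminatory interest.
    if not substantial_interest_proven:
        return True
    # Even a proven interest is defeated by a less discriminatory alternative.
    return less_discriminatory_alternative_exists
```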
Statutory Text
1. A developer or deployer shall not offer, license, promote, sell, or use a covered algorithm in a manner that: (a) causes or contributes to a disparate impact in a manner that prevents; (b) otherwise discriminates in a manner that prevents; or (c) otherwise makes unavailable, the equal enjoyment of goods, services, or other activities or opportunities, related to a consequential action, on the basis of a protected characteristic. 2. This section shall not apply to: (a) the offer, licensing, or use of a covered algorithm for the sole purpose of: (i) a developer's or deployer's self-testing (or auditing by an independent auditor at a developer's or deployer's request) to identify, prevent, or mitigate discrimination, or otherwise to ensure compliance with obligations, under federal or state law; (ii) expanding an applicant, participant, or customer pool to raise the likelihood of increasing diversity or redressing historic discrimination; or (iii) conducting good faith security research, or other research, if conducting the research is not part or all of a commercial act; or (b) any private club or other establishment not in fact open to the public, as described in section 201(e) of the Civil Rights Act of 1964 (42 U.S.C. 2000a(e)).
H-02 Non-Discrimination & Bias Assessment · H-02.3, H-02.6 · Developer · Automated Decisionmaking
Civil Rights Law § 103(1)-(3)
Plain Language
Before deploying, licensing, or offering a covered algorithm for any consequential action — including material changes to previously-deployed algorithms — both developers and deployers must conduct a preliminary evaluation of whether harm is plausible. If no harm is plausible, they must record and submit that finding to the Division. If harm is plausible, they must engage a qualified independent auditor (who cannot have any employment, financial, or development relationship with the developer or deployer) to conduct a comprehensive pre-deployment evaluation. The developer's evaluation covers design methodology, training and testing data, performance metrics, demographic representation, stakeholder consultation, and harm potential. The deployer's evaluation (§ 103(4), mapped separately) covers deployment context, necessity, proportionality, and deployment-specific harm potential. For material changes to existing algorithms, the evaluation scope may be limited to the changes.
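The two-stage structure of § 103 (a preliminary plausibility screen, then either a recorded no-harm finding or a full independently-audited evaluation) is essentially a decision procedure. Below is a minimal sketch under simplifying assumptions; all names are hypothetical and the stub functions merely stand in for the statutory steps.

```python
from dataclasses import dataclass

@dataclass
class PreliminaryEvaluation:
    harm_plausible: bool
    description_of_use: str   # expected (developer) or intended (deployer) use
    how_conducted: str
    explanation: str

def submit_to_division(record: PreliminaryEvaluation) -> None:
    # Stand-in for the § 103(2)(c)(i) submission to the Division.
    print("submitted no-plausible-harm record:", record.description_of_use)

def engage_independent_auditor(prelim: PreliminaryEvaluation) -> str:
    # Stand-in for the § 103(3)/(4) full evaluation; the auditor may hold no
    # employment, financial, or development relationship with the party.
    return "auditor report with findings and recommendations"

def pre_deployment_step(prelim: PreliminaryEvaluation) -> str:
    if not prelim.harm_plausible:              # § 103(2)(c)(i)
        submit_to_division(prelim)
        return "finding of no plausible harm recorded"
    return engage_independent_auditor(prelim)  # § 103(2)(c)(ii)
```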
Statutory Text
1. Prior to deploying, licensing, or offering a covered algorithm (including deploying a material change to a previously-deployed covered algorithm or a material change made prior to deployment) for a consequential action, a developer or deployer shall conduct a pre-deployment evaluation in accordance with this section. 2. (a) The developer shall conduct a preliminary evaluation of the plausibility that any expected use of the covered algorithm may result in a harm. (b) The deployer shall conduct a preliminary evaluation of the plausibility that any intended use of the covered algorithm may result in a harm. (c) Based on the results of the preliminary evaluation, the developer or deployer shall: (i) in the event that a harm is not plausible, record a finding of no plausible harm, including a description of the developer's expected use or the deployer's intended use of the covered algorithm, how the preliminary evaluation was conducted, and an explanation for the finding, and submit such record to the division; and (ii) in the event that a harm is plausible, conduct a full pre-deployment evaluation as described in subdivision three or subdivision four of this section, as applicable. (d) When conducting a preliminary evaluation of a material change to, or new use of, a previously-deployed covered algorithm, the developer or deployer may limit the scope of the evaluation to whether use of the covered algorithm may result in a harm as a result of the material change or new use. 3. (a) If a developer determines a harm is plausible during the preliminary evaluation described in subdivision two of this section, the developer shall engage an independent auditor to conduct a pre-deployment evaluation. The evaluation required by this subdivision shall include a detailed review and description, sufficient for an individual having ordinary skill in the art to understand the functioning, risks, uses, benefits, limitations, and other pertinent attributes of the covered algorithm, including: (i) the covered algorithm's design and methodology, including the inputs the covered algorithm is designed to use to produce an output and the outputs the covered algorithm is designed to produce; (ii) how the covered algorithm was created, trained, and tested, including: (A) any metric used to test the performance of the covered algorithm; (B) defined benchmarks and goals that correspond to such metrics, including whether there was sufficient representation of demographic groups that are reasonably likely to use or be affected by the covered algorithm in the data used to create or train the algorithm, and whether there was reasonable testing, if any, across such demographic groups; (C) the outputs the covered algorithm actually produces in testing; (D) a description of any consultation with relevant stakeholders, including any communities that will be impacted by the covered algorithm, regarding the development of the covered algorithm, or a disclosure that no such consultation occurred; (E) a description of which protected characteristics, if any, were used for testing and evaluation, and how and why such characteristics were used, including: (1) whether the testing occurred in comparable contextual conditions to the conditions in which the covered algorithm is expected to be used; and (2) if protected characteristics were not available to conduct such testing, a description of alternative methods the developer used to conduct the required assessment; (F) any other computational algorithm incorporated into the development of the covered algorithm, regardless of whether such precursor computational algorithm involves a consequential action; (G) a description of the data and information used to develop, test, maintain, or update the covered algorithm, including: (1) each type of personal data used, each source from which the personal data was collected, and how each type of personal data was inferred and processed; (2) the legal authorization for collecting and processing the personal data; and (3) an explanation of how the data (including personal data) used is representative, proportional, and appropriate to the development and intended uses of the covered algorithm; and (H) a description of the training process for the covered algorithm which includes the training, validation, and test data utilized to confirm the intended outputs; (iii) the potential for the covered algorithm to produce a harm or to have a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, and a description of such potential harm or disparate impact; (iv) alternative practices and recommendations to prevent or mitigate harm and recommendations for how the developer could monitor for harm after offering, licensing, or deploying the covered algorithm; and (v) any other information the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, as prescribed by rules promulgated by the division. (b) The independent auditor shall submit to the developer a report on the evaluation conducted under this subdivision, including the findings and recommendations of such independent auditor.
H-02 Non-Discrimination & Bias Assessment · H-02.3, H-02.6 · Deployer · Automated Decisionmaking
Civil Rights Law § 103(4)
Plain Language
When a deployer's preliminary evaluation identifies plausible harm, the deployer must engage an independent auditor to conduct a full pre-deployment evaluation covering deployment-specific factors: the algorithm's role in the consequential action, necessity and proportionality relative to the baseline process being replaced, data inputs and their representativeness, testing results in the deployment context, stakeholder consultation, potential for harm and disparate impact, and mitigation recommendations. The independent auditor must submit a report with findings and recommendations. This is the deployer's parallel obligation to the developer's pre-deployment evaluation — each party must independently satisfy its own evaluation requirements.
Statutory Text
4. (a) If a deployer determines a harm is plausible during the preliminary evaluation described in subdivision two of this section, the deployer shall engage an independent auditor to conduct a pre-deployment evaluation. The evaluation required by this subdivision shall include a detailed review and description, sufficient for an individual having ordinary skill in the art to understand the functioning, risks, uses, benefits, limitations, and other pertinent attributes of the covered algorithm, including: (i) the manner in which the covered algorithm makes or contributes to a consequential action and the purpose for which the covered algorithm will be deployed; (ii) the necessity and proportionality of the covered algorithm in relation to its planned use, including the intended benefits and limitations of the covered algorithm and a description of the baseline process being enhanced or replaced by the covered algorithm, if applicable; (iii) the inputs that the deployer plans to use to produce an output, including: (A) the type of personal data and information used and how the personal data and information will be collected, inferred, and processed; (B) the legal authorization for collecting and processing the personal data; and (C) an explanation of how the data used is representative, proportional, and appropriate to the deployment of the covered algorithm; (iv) the outputs the covered algorithm is expected to produce and the outputs the covered algorithm actually produces in testing; (v) a description of any additional testing or training completed by the deployer for the context in which the covered algorithm will be deployed; (vi) a description of any consultation with relevant stakeholders, including any communities that will be impacted by the covered algorithm, regarding the deployment of the covered algorithm; (vii) the potential for the covered algorithm to produce a harm or to have a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities in the context in which the covered algorithm will be deployed and a description of such potential harm or disparate impact; (viii) alternative practices and recommendations to prevent or mitigate harm in the context in which the covered algorithm will be deployed and recommendations for how the deployer could monitor for harm after offering, licensing, or deploying the covered algorithm; and (ix) any other information the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities as prescribed by rules promulgated by the division. (b) The independent auditor shall submit to the deployer a report on the evaluation conducted under this subdivision, including the findings and recommendations of such independent auditor.
H-02 Non-Discrimination & Bias Assessment · H-02.6, H-02.7, H-02.8, H-02.10 · Deployer · Automated Decisionmaking
Civil Rights Law § 104(1)-(3)
Plain Language
Deployers must conduct annual post-deployment impact assessments of each covered algorithm. A preliminary assessment identifies whether harm occurred during the reporting period. If no harm is identified, the deployer records and submits that finding to the Division. If harm is identified, the deployer must engage an independent auditor for a full impact assessment covering: actual harms and disparate impact with methodology, data inputs, expected vs. actual outputs, how the algorithm was used in consequential actions, and mitigation steps including staff training. The auditor's report goes to the deployer, who must then share a summary with the developer within 30 days (subject to trade secret and privacy protections). This creates a continuous annual cycle of post-deployment monitoring with independent oversight when harm is found.
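The § 104 cycle pairs an annual assessment clock with a 30-day deadline for sharing the auditor's report summary with the developer. A small scheduling sketch follows; the names are hypothetical, and note that the bill does not itself define "annual" in days.

```python
from datetime import date, timedelta

def developer_summary_due(auditor_report_date: date) -> date:
    """§ 104(3)(b): the deployer must submit a summary of the auditor's
    impact-assessment report to the developer within thirty days."""
    return auditor_report_date + timedelta(days=30)

def annual_assessment_overdue(last_assessment: date, today: date) -> bool:
    # § 104(1) requires assessments "on an annual basis"; this sketch
    # reads that as 365 days, an assumption rather than the bill's text.
    return (today - last_assessment).days > 365

print(developer_summary_due(date(2027, 3, 1)))  # 2027-03-31
```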
Statutory Text
1. After the deployment of a covered algorithm, a deployer shall, on an annual basis, conduct an impact assessment in accordance with this section. The deployer shall conduct a preliminary impact assessment of the covered algorithm to identify any harm that resulted from the covered algorithm during the reporting period and: (a) if no resulting harm is identified by such assessment, shall record a finding of no harm, including a description of the developer's expected use or the deployer's intended use of the covered algorithm, how the preliminary evaluation was conducted, and an explanation for such finding, and submit such finding to the division; and (b) if a resulting harm is identified by such assessment, shall conduct a full impact assessment as described in subdivision two of this section. 2. In the event that the covered algorithm resulted in a harm during the reporting period, the deployer shall engage an independent auditor to conduct a full impact assessment with respect to the reporting period, including: (a) an assessment of the harm that resulted or was reasonably likely to have been produced during the reporting period; (b) a description of the extent to which the covered algorithm produced a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, including the methodology for such evaluation, of how the covered algorithm produced or likely produced such disparity; (c) a description of the types of data input into the covered algorithm during the reporting period to produce an output, including: (i) documentation of how data input into the covered algorithm to produce an output is represented and complete descriptions of each field of data; and (ii) whether and to what extent the data input into the covered algorithm to produce an output was used to train or otherwise modify the covered algorithm; (d) whether and to what extent the covered algorithm produced the outputs it was expected to produce; (e) a detailed description of how the covered algorithm was used to make a consequential action; (f) any action taken to prevent or mitigate harms, including how relevant staff are informed of, trained about, and implement harm mitigation policies and practices, and recommendations for how the deployer could monitor for and prevent harm after offering, licensing, or deploying the covered algorithm; and (g) any other information the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities as prescribed by rules promulgated by the division. 3. (a) After the engagement of the independent auditor, the independent auditor shall submit to the deployer a report on the impact assessment conducted under subdivision two of this section, including the findings and recommendations of such independent auditor. (b) Not later than thirty days after the submission of a report on an impact assessment under this section, a deployer shall submit to the developer of the covered algorithm a summary of such report, subject to the trade secret and privacy protections described in subdivision six of this section.
H-02 Non-Discrimination & Bias Assessment · H-02.8 · Developer · Automated Decisionmaking
Civil Rights Law § 104(4)
Plain Language
Developers must annually review all impact assessment summaries submitted by deployers of their covered algorithms. The review must cover how deployers are using the algorithm, the data being inputted, whether deployers are complying with contractual terms, real-world performance versus pre-deployment testing, whether the algorithm is causing or is likely causing harm or disparate impact, and whether the algorithm needs modification. This creates a feedback loop requiring developers to remain actively engaged in monitoring the downstream use of their algorithms and to take corrective action when warranted.
Statutory Text
4. A developer shall, on an annual basis, review each impact assessment summary submitted by a deployer of its covered algorithm under subdivision three of this section for the following purposes: (a) to assess how the deployer is using the covered algorithm, including the methodology for assessing such use; (b) to assess the type of data the deployer is inputting into the covered algorithm to produce an output and the types of outputs the covered algorithm is producing; (c) to assess whether the deployer is complying with any relevant contractual agreement with the developer and whether any remedial action is necessary; (d) to compare the covered algorithm's performance in real-world conditions versus pre-deployment testing, including the methodology used to evaluate such performance; (e) to assess whether the covered algorithm is causing harm or is reasonably likely to be causing harm; (f) to assess whether the covered algorithm is causing, or is reasonably likely to be causing, a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, and, if so, how and with respect to which protected characteristic; (g) to determine whether the covered algorithm needs modification; (h) to determine whether any other action is appropriate to ensure that the covered algorithm remains safe and effective; and (i) to undertake any other assessment or responsive action the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, as prescribed by rules promulgated by the division.
R-02 Regulatory Disclosure & Submissions · R-02.1 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 104(6)
Plain Language
Within 30 days of completing any full pre-deployment evaluation, full impact assessment, or developer annual review, the developer or deployer must: (1) submit the complete evaluation, assessment, or review to the Division of Consumer Protection; (2) publish a summary on their website in an easily accessible manner; and (3) submit that summary to the Division. Upon legislative request, the full evaluation, assessment, or review must be made available. All evaluations, assessments, and reviews must be retained for at least 10 years. Trade secrets may be redacted from public disclosures; personal data must be.
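The submission, publication, retention, and redaction duties in § 104(6) lend themselves to a compliance checklist. The sketch below is illustrative (hypothetical names; ten years is simplified to 10 × 365 days, and redaction is shown as naive string replacement):

```python
from datetime import date, timedelta

def submission_deadline(completed: date) -> date:
    # § 104(6)(a): submit to the Division, and publish and submit a summary,
    # not later than thirty days after completion.
    return completed + timedelta(days=30)

def retention_expiry(completed: date) -> date:
    # § 104(6)(b): retain for not fewer than ten years (simplified here).
    return completed + timedelta(days=10 * 365)

def prepare_public_summary(summary: str, trade_secrets: list[str],
                           personal_data: list[str]) -> str:
    # § 104(6)(c): personal data MUST be redacted; trade secrets MAY be.
    for item in personal_data:
        summary = summary.replace(item, "[REDACTED]")
    for item in trade_secrets:
        summary = summary.replace(item, "[TRADE SECRET]")
    return summary
```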
Statutory Text
6. (a) A developer or deployer that conducts a full pre-deployment evaluation, full impact assessment, or developer annual review of assessments shall: (i) not later than thirty days after completion, submit the evaluation, assessment, or review to the division; (ii) upon request, make the evaluation, assessment, or review available to the legislature; and (iii) not later than thirty days after completion: (A) publish a summary of the evaluation, assessment, or review on the website of the developer or deployer in a manner that is easily accessible to individuals; and (B) submit such summary to the division. (b) A developer or deployer shall retain all evaluations, assessments, and reviews described in this section for a period of not fewer than ten years. (c) A developer or deployer: (i) may redact and segregate any trade secret (as defined in section 1839 of title 18, United States Code) from public disclosure under this subdivision; and (ii) shall redact and segregate personal data from public disclosure under this section.
G-02 Public Transparency & Documentation · G-02.4 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 104(6)(a)(iii)
Plain Language
Developers and deployers must publish a summary of each full pre-deployment evaluation, full impact assessment, and developer annual review on their website within 30 days of completion, in a manner easily accessible to individuals. This is the public-facing disclosure component of the evaluation/assessment submission process — ensuring that the public has access to information about how covered algorithms have been evaluated for harm and disparate impact.
Statutory Text
(iii) not later than thirty days after completion: (A) publish a summary of the evaluation, assessment, or review on the website of the developer or deployer in a manner that is easily accessible to individuals; and (B) submit such summary to the division.
S-01 AI System Safety Program · S-01.5, S-01.7 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(1)
Plain Language
Developers and deployers must take affirmative steps to maintain covered algorithm safety and performance. Specifically, they must: (1) take reasonable measures to prevent and mitigate harms identified in pre-deployment evaluations and impact assessments; (2) ensure independent auditors have all necessary information for accurate evaluations; (3) consult impacted stakeholders and communities before deploying; (4) certify that the algorithm is not likely to cause harm, disparate impact, or deceptive practices, and that benefits outweigh harms; (5) ensure the algorithm performs at a reasonable standard consistent with its publicly-advertised purpose; (6) ensure data used is relevant and appropriate to the deployment context; and (7) ensure the algorithm's intended use is not likely to violate the article. The certification requirement is a meaningful compliance gate — developers and deployers must affirmatively attest based on evaluation results.
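The § 106(1)(d) certification is a conjunctive gate: every prong must hold before the attestation can be made. A minimal sketch with the prongs collapsed to booleans (hypothetical names):

```python
from dataclasses import dataclass

@dataclass
class EvaluationFindings:
    harm_likely: bool                 # § 106(1)(d)(i)
    disparate_impact_likely: bool     # § 106(1)(d)(i)
    benefits_outweigh_harms: bool     # § 106(1)(d)(ii)
    deceptive_practice_likely: bool   # § 106(1)(d)(iii)

def can_certify(f: EvaluationFindings) -> bool:
    """All prongs must be satisfied, based on the results of the
    pre-deployment evaluation or impact assessment."""
    return (not f.harm_likely
            and not f.disparate_impact_likely
            and f.benefits_outweigh_harms
            and not f.deceptive_practice_likely)
```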
Statutory Text
1. A developer or deployer shall do the following: (a) take reasonable measures to prevent and mitigate any harm identified by a pre-deployment evaluation described in section one hundred three or an impact assessment described in section one hundred four of this article; (b) take reasonable measures to ensure that an independent auditor has all necessary information to complete an accurate and effective pre-deployment evaluation described in section one hundred three or an impact assessment described in section one hundred four of this article; (c) with respect to a covered algorithm, consult stakeholders, including any communities that will be impacted by the covered algorithm, regarding the development or deployment of the covered algorithm prior to the deploying, licensing, or offering the covered algorithm; (d) with respect to a covered algorithm, certify that, based on the results of a pre-deployment evaluation described in section one hundred three or an impact assessment described in section one hundred four of this article: (i) use of the covered algorithm is not likely to result in harm or disparate impact in the equal enjoyment of goods, services, or other activities or opportunities; (ii) the benefits from the use of the covered algorithm to individuals affected by the covered algorithm likely outweigh the harms from the use of the covered algorithm to such individuals; and (iii) use of the covered algorithm is not likely to result in a deceptive act or practice; (e) ensure that any covered algorithm of the developer or deployer functions at a level that would be considered reasonable performance by an individual with ordinary skill in the art; and in a manner that is consistent with its expected and publicly-advertised performance, purpose, or use; (f) ensure any data used in the design, development, deployment, or use of the covered algorithm is relevant and appropriate to the deployment context and the publicly-advertised purpose or use; and (g) ensure use of the covered algorithm as intended is not likely to result in a violation of this article.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(2)(a)
Plain Language
Developers and deployers are prohibited from engaging in any false, deceptive, or misleading advertising, marketing, or publicizing of their covered algorithms. This is a standalone prohibition that goes beyond the general performance certification in § 106(1)(d)(iii) — it specifically targets marketing representations and creates an independent basis for liability if a covered algorithm is advertised in a way that does not accurately represent its capabilities, limitations, or effects.
Statutory Text
2. (a) It shall be unlawful for a developer or deployer to engage in false, deceptive, or misleading advertising, marketing, or publicizing of a covered algorithm of the developer or deployer.
S-01 AI System Safety Program · S-01.1 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(2)(b)-(c)
Plain Language
Developers may not knowingly offer or license a covered algorithm for any consequential action that was not evaluated in the pre-deployment evaluation. Deployers may not knowingly use a covered algorithm for any unevaluated consequential action unless the deployer assumes full developer responsibilities under the article. This effectively gates each covered algorithm's permissible uses to those specifically evaluated for harm — if a new use case arises, the algorithm cannot be deployed for that use until a new evaluation is completed.
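The use-gating rule in § 106(2)(b)-(c) can be read as a membership check against the set of evaluated consequential actions, with one escape hatch for deployers. A sketch with hypothetical names:

```python
def use_permitted(requested_action: str, evaluated_actions: set[str],
                  deployer_assumes_developer_duties: bool = False) -> bool:
    """A covered algorithm may be offered, licensed, or used only for
    consequential actions covered by the § 103 pre-deployment evaluation;
    a deployer may go beyond that set only by agreeing to assume the
    developer's responsibilities under the article."""
    if requested_action in evaluated_actions:
        return True
    return deployer_assumes_developer_duties

# Example: a hiring-screen algorithm evaluated only for employment use.
print(use_permitted("tenant screening", {"employment screening"}))  # False
```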
Statutory Text
(b) It shall be unlawful for a developer to knowingly offer or license a covered algorithm for any consequential action other than those evaluated in the pre-deployment evaluation described in section one hundred three of this article. (c) It shall be unlawful for a deployer to knowingly use a covered algorithm for any consequential action other than a use evaluated in the pre-deployment evaluation described in section one hundred three of this article, unless the deployer agrees to assume the responsibilities of a developer required by this article.
G-01 AI Governance Program & Documentation · G-01.3, G-01.4 · Developer · Automated Decisionmaking
Civil Rights Law § 107(1)-(3)
Plain Language
Developers must support deployer compliance by: (1) providing pre-deployment evaluation reports and information necessary for deployers to conduct their own evaluations upon reasonable request; and (2) either cooperating with deployer-initiated assessments or arranging independent auditor assessments of the developer's policies and practices. When developers license covered algorithms to deployers, the written contract must specify data processing procedures, deployment instructions, data types, processing duration, mutual rights and obligations, and material change notification methods. Contracts must prohibit data combination across parties, may not relieve either party of statutory liability, and may not prohibit either party from reporting concerns to enforcement agencies. Developers must retain all deployer contracts for at least 10 years. The government entity provision (§ 107(4)) extends developer obligations to apply equally when the downstream user is a government entity.
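The required contract terms in § 107(2) amount to a checklist that must be fully satisfied. A sketch that collapses each clause to a boolean (hypothetical names; several sub-items are omitted for brevity):

```python
from dataclasses import dataclass

@dataclass
class DeployerContract:
    data_processing_procedures: bool      # § 107(2)(a)
    deployment_instructions: bool         # § 107(2)(b)(i)-(ii)
    data_nature_types_duration: bool      # § 107(2)(b)(iii)-(v)
    change_notification_method: bool      # § 107(2)(b)(vi)
    preserves_statutory_liability: bool   # § 107(2)(c)
    prohibits_data_combination: bool      # § 107(2)(d)
    allows_enforcement_reporting: bool    # § 107(2)(e)

def contract_compliant(c: DeployerContract) -> bool:
    # Every required term must be present; developers must also retain
    # each deployer contract for ten years (§ 107(3), not modeled here).
    return all(vars(c).values())
```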
Statutory Text
1. A developer shall do the following: (a) upon the reasonable request of the deployer, make available to the deployer information necessary to demonstrate the compliance of the deployer with the requirements of this article, including: (i) making available a report of the pre-deployment evaluation described in section one hundred three of this article or the annual review of assessments conducted by the developer under section one hundred four of this article; and (ii) providing information necessary to enable the deployer to conduct and document a pre-deployment evaluation under section one hundred three or an impact assessment described in section one hundred four of this article; and (b) either: (i) allow and cooperate with reasonable assessments conducted by the deployer or the deployer's designated independent auditor; or (ii) arrange for an independent auditor to conduct an assessment of the developer's policies and practices in support of the obligations under this article using an appropriate and accepted control standard or framework and assessment procedure for such assessments and provide a report of such assessment to the deployer upon request. 2. A developer may offer or license a covered algorithm to a deployer pursuant to a written contract between the developer and deployer, provided that the contract: (a) clearly sets forth the data processing procedures of the developer with respect to any collection, processing, or transfer of data performed on behalf of the deployer; (b) clearly sets forth: (i) instructions for collecting, processing, transferring, or disposing of data by the developer or deployer in the context of the use of the covered algorithm; (ii) instructions for deploying the covered algorithm as intended; (iii) the nature and purpose of any collection, processing, or transferring of data; (iv) the type of data subject to such collection, processing, or transferring; (v) the duration of such processing of data; and (vi) the rights and obligations of both parties, including a method by which the developer shall notify the deployer of material changes to its covered algorithm; (c) shall not relieve a developer or deployer of any requirement or liability imposed on such developer or deployer under this article; (d) prohibits both the developer and deployer from combining data received from or collected on behalf of the other party with data the developer or deployer received from or collected on behalf of another party; and (e) shall not prohibit a developer or deployer from raising concerns to any relevant enforcement agency with respect to the other party. 3. Each developer shall retain for a period of ten years a copy of each contract entered into with a deployer to which it provides requested products or services.
H-01 Human Oversight of Automated Decisions · H-01.4 · Deployer, Developer · Automated Decisionmaking
Civil Rights Law § 108(1)-(2)
Plain Language
The Division must promulgate regulations within two years of the effective date specifying when and how deployers must provide individuals a right to opt out of algorithmic decision-making and elect a human-only alternative for consequential actions. The regulations must consider notice design, which consequential actions warrant a human alternative, feasibility, and the public interest. Separately, developers and deployers are immediately prohibited from using deceptive statements, dark patterns, or manipulative interface design to discourage individuals from exercising any right under the article. The opt-out/human alternative right itself will not take effect until implementing regulations are promulgated, but the prohibition on conditioning rights exercise through deception or manipulative design is operative from the effective date.
Statutory Text
1. Not later than two years after the effective date of this article, the division shall promulgate regulations specifying the circumstances and manner in which a deployer shall provide to an individual a means to opt-out of the use of a covered algorithm for a consequential action and to elect to have the consequential action concerning the individual undertaken by a human without the use of a covered algorithm. In promulgating the regulations under this subdivision, the division shall consider the following: (a) how to ensure that any notice or request from a deployer regarding the right to a human alternative is clear and conspicuous, in plain language, easy to execute, and at no cost to an individual; (b) how to ensure that any such notice to individuals is effective, timely, and useful; (c) the specific types of consequential actions for which a human alternative is appropriate, considering the magnitude of the action and risk of harm; (d) the extent to which a human alternative would be beneficial to individuals and the public interest; (e) the extent to which a human alternative can prevent or mitigate harm; (f) the risk of harm to individuals beyond the requestor if a human alternative is available or not available; (g) the feasibility of providing a human alternative in different circumstances; and (h) any other considerations the division deems appropriate to balance the need to give an individual control over a consequential action related to such individual with the practical feasibility and effectiveness of granting such control. 2. A developer or deployer may not condition, effectively condition, attempt to condition, or attempt to effectively condition the exercise of any individual right under this article or individual choice through: (a) the use of any false, fictitious, fraudulent, or materially misleading statement or representation; or (b) the design, modification, or manipulation of any user interface with the purpose or substantial effect of obscuring, subverting, or impairing a reasonable individual's autonomy, decision making, or choice to exercise any such right.
H-01 Human Oversight of Automated Decisions · H-01.5 · Deployer · Automated Decisionmaking
Civil Rights Law § 108(3)
Plain Language
The Division must promulgate regulations within two years specifying when and how deployers must provide individuals a mechanism to appeal algorithmic consequential actions to a human reviewer. The regulations must ensure the appeal mechanism is free, accessible (including to individuals with disabilities), proportionate, non-discriminatory, and effective. Where appropriate, individuals must be able to identify and correct personal data used by the algorithm. Human reviewers must be trained. Like the opt-out right, this appeal right's specific parameters will be defined by regulation, but the legislative mandate to create an appeal mechanism is unambiguous.
Statutory Text
3. Not later than two years after the effective date of this article, the division shall promulgate regulations specifying the circumstances and manner in which a deployer shall provide to an individual a mechanism to appeal to a human a consequential action resulting from the deployer's use of a covered algorithm. In promulgating the regulations under this subdivision, the division shall do the following: (a) ensure that the appeal mechanism is clear and conspicuous, in plain language, easy-to-execute, and at no cost to individuals; (b) ensure that the appeal mechanism is proportionate to the consequential action; (c) ensure that the appeal mechanism is reasonably accessible to individuals with disabilities, timely, usable, effective, and non-discriminatory; (d) require, where appropriate, a mechanism for individuals to identify and correct any personal data used by the covered algorithm; (e) specify training requirements for human reviewers with respect to a consequential action; and (f) consider any other circumstances, procedures, or matters the division deems appropriate to balance the need to give an individual a right to appeal a consequential action related to such individual with the practical feasibility and effectiveness of granting such right.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 109(1)-(2)
Plain Language
Developers and deployers are prohibited from retaliating against any individual for exercising rights under this article, refusing to waive rights, raising concerns about consequential actions, reporting or attempting to report violations, or cooperating in investigations or proceedings. Retaliation includes both service-level retaliation (denying goods, services, or opportunities) and employment-level retaliation (discharge, demotion, suspension, threats, harassment). The prohibition covers both consumers/affected individuals and employee whistleblowers. Developers and deployers may still offer differential pricing or services if the difference is necessary and directly related to the value provided by the algorithm, and may operate loyalty/rewards programs consistent with the article.
Statutory Text
1. A developer or deployer may not: (a) discriminate or retaliate against an individual (including by denying or threatening to deny the equal enjoyment of goods, services, or other activities or opportunities in relation to a consequential action) because the individual exercised any right, refused to waive any such right, raised a concern about a consequential action under this article, or assisted in any investigation or proceeding under this article; or (b) directly or indirectly, discharge, demote, suspend, threaten, harass, or otherwise discriminate or retaliate against an individual for raising a concern, reporting or attempting to report a violation of this article, or cooperating in any investigation or proceeding under this article. 2. Nothing in this article shall prohibit a developer or deployer from: (a) denying service to an individual, charging an individual a different price or rate, or providing a different level or quality of goods or services to an individual if the differential in service is necessary and directly related to the value provided to the developer or deployer by the covered algorithm; or (b) offering loyalty, rewards, premium features, discounts, or club card programs that provide benefits or rewards based on frequency of patronizing, or the amount of money spent at, a business consistent with this article.
G-02 Public Transparency & Documentation · G-02.4 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 110(1)-(5)
Plain Language
Every developer and deployer must publish a comprehensive public disclosure covering: identity and contact information (including corporate affiliates receiving personal data), links to evaluation/assessment summaries, categories of personal data collected and processing purposes, third-party data transfers, individual rights descriptions, compliance practices, a mandated audit disclaimer, and the disclosure's effective date. Disclosures must be in plain language, accessible to individuals with disabilities, and available in the top 10 languages spoken in New York. Material changes require prior electronic notification to affected individuals in each covered language. All previous disclosure versions must be retained for at least 10 years and published online with a public change log describing each material change. This is one of the most detailed public disclosure requirements in any U.S. AI bill.
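The version-retention and change-log duties in § 110(5)(c) describe, in effect, an append-only disclosure history. A minimal sketch follows (hypothetical names; notification to affected individuals before a material change takes effect is noted but not modeled):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DisclosureHistory:
    versions: list[tuple[date, str]] = field(default_factory=list)
    change_log: list[tuple[date, str]] = field(default_factory=list)

    def publish_material_change(self, effective: date, new_text: str,
                                change_description: str) -> None:
        # § 110(5)(a)-(b): affected individuals must be notified before the
        # change is implemented (omitted here). Prior versions are retained
        # for at least ten years and published, with a public log describing
        # the date and nature of each material change (§ 110(5)(c)).
        self.versions.append((effective, new_text))
        self.change_log.append((effective, change_description))
```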
Statutory Text
1. Each developer or deployer shall make publicly available, in plain language and in a clear, conspicuous, not misleading, easy-to-read, and readily accessible manner, a disclosure that provides a detailed and accurate representation of the developer or deployer's practices regarding the requirements under this article. 2. The disclosure required under subdivision one of this section shall include, at a minimum, the following: (a) the identity and the contact information of: (i) the developer or deployer to which the disclosure applies (including the developer or deployer's point of contact and electronic and physical mail address, as applicable for any inquiry concerning a covered algorithm or individual rights under this article); and (ii) any other entity within the same corporate structure as the developer or deployer to which personal data is transferred by the developer or deployer. (b) a link to the website containing the developer or deployer's summaries of pre-deployment evaluations, impact assessments, and annual review of assessments, as applicable; (c) the categories of personal data the developer or deployer collects or processes in the development or deployment of a covered algorithm and the processing purpose for each such category; (d) whether the developer or deployer transfers personal data, and, if so, each third party to which the developer or deployer transfers such data and the purpose for which such data is transferred, except with respect to a transfer to a governmental entity pursuant to a court order or law that prohibits the developer or deployer from disclosing such transfer; (e) a prominent description of how an individual can exercise the rights described in this article; (f) a general description of the developer or deployer's practices for compliance with the requirements described in sections one hundred three and one hundred six of this article; (g) the following disclosure: "The audit of this algorithm was conducted to comply with the New York Artificial Intelligence Civil Rights Act, which seeks to avoid the use of any algorithm that has a disparate impact on certain protected classes of individuals. The audit does not guarantee that this algorithm is safe or in compliance with all applicable laws."; and (h) the effective date of the disclosure. 3. The disclosure required under this section shall be made available in each covered language in which the developer or deployer operates or provides a good or service. 4. Any disclosure provided under this section shall be made available in a manner that is reasonably accessible to and usable by individuals with disabilities. 5. (a) If a developer or deployer makes a material change to the disclosure required under this section, the developer or deployer shall notify each individual affected by such material change prior to implementing the material change. (b) Each developer or deployer shall take all reasonable measures to provide to each affected individual a direct electronic notification regarding any material change to the disclosure, in each covered language in which the disclosure is made available and taking into account available technology and the nature of the relationship with such individual. 
(c) (i) Beginning after the effective date of this article, each developer or deployer shall retain a copy of each previous version of the disclosure required under this section for a period of at least ten years after the last day on which such version was effective and publish each such version on its website. Each developer or deployer shall make publicly available, in a clear, conspicuous, and readily accessible manner, a log describing the date and nature of each material change to its disclosure during the retention period, and such descriptions shall be sufficient for a reasonable individual to understand the material effect of each material change. (ii) The obligations described in this paragraph shall not apply to any previous version of a developer or deployer's disclosure of practices regarding the collection, processing, and transfer of personal data, or any material change to such disclosure, that precedes the effective date of this article.
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Automated Decisionmaking
Civil Rights Law § 110(6)-(7)
Plain Language
Deployers must create a short-form notice (max 500 words) for each covered algorithm that summarizes individual rights, highlights unexpected practices or consequential actions, and is written in plain language accessible to individuals with disabilities. For individuals with whom the deployer has an existing relationship, the notice must be delivered electronically at the first interaction with the algorithm. For individuals without a pre-existing relationship, the notice must be posted on the deployer's website. This is distinct from the comprehensive disclosure in § 110(1) — it is a concise, user-facing summary designed to provide meaningful notice at the point of algorithmic interaction.
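Two mechanical requirements of the short-form notice, the 500-word ceiling and the relationship-dependent delivery channel, can be sketched directly. Illustrative only; whitespace word-splitting is this sketch's simplification, not the bill's:

```python
def short_form_notice_length_ok(text: str) -> bool:
    # § 110(6)(e): the notice may not exceed five hundred words.
    return len(text.split()) <= 500

def delivery_channel(has_relationship_with_individual: bool) -> str:
    # § 110(7): direct electronic delivery at the individual's first
    # interaction with the algorithm when a relationship exists; otherwise
    # clear, conspicuous posting on the deployer's website.
    return ("direct electronic notice at first interaction"
            if has_relationship_with_individual
            else "website posting")
```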
Statutory Text
6. A deployer shall provide a short-form notice regarding a covered algorithm it develops, offers, licenses, or uses in a manner that: (a) is concise, clear, conspicuous, in plain language, and not misleading; (b) is readily accessible to individuals with disabilities; (c) is based on what is reasonably anticipated within the context of the relationship between the individual and the deployer; (d) includes an overview of each applicable individual right and disclosure in a manner that draws attention to any practice that may be unexpected to a reasonable individual or that involves a consequential action; (e) is not more than five hundred words in length; and (f) is available to the public at no cost. 7. (a) If a deployer has a relationship with an individual, the deployer shall provide an electronic version of the short-form notice directly to the individual upon the individual's first interaction with the covered algorithm. (b) If a deployer does not have a relationship with an individual, the deployer shall provide the short-form notice in a clear, conspicuous, accessible, and not misleading manner on their website.
Other · Automated Decisionmaking
Civil Rights Law § 110(9)
Plain Language
Every developer and deployer must provide a publicly accessible mechanism through which individuals impacted by a covered algorithm can report potential violations of the article. This is a public-facing complaint intake channel — it must be clear, conspicuous, and readily accessible. Unlike internal whistleblower mechanisms, this is directed at affected individuals (consumers, applicants, etc.) rather than employees.
Statutory Text
9. Each developer or deployer shall make publicly available, in a clear, conspicuous, and readily accessible manner, a mechanism for an individual impacted by a covered algorithm to report to the developer or deployer potential violations of this article.
Other · Automated Decisionmaking
Civil Rights Law § 111
Plain Language
The Division must conduct a public study on the feasibility of requiring deployers to provide individuals with explanations of how covered algorithms affected them — including what factors drove the output, how explanations can be made accessible and accurate, and what developers must share with deployers to enable explanations. The study report, including legislative recommendations, must be submitted to the Governor and relevant legislative committees within 18 months of the effective date. This imposes no obligation on developers or deployers — it is a legislative research mandate that may lead to future rulemaking or legislation on algorithmic explainability.
Statutory Text
1. The division shall conduct a study, with notice and public comment, on the feasibility of requiring deployers to provide a clear, conspicuous, easy-to-use, no-cost mechanism that is accessible for individuals with disabilities and allows an individual to receive an explanation as to whether and how a covered algorithm used by the deployer affects or affected an individual. 2. The study required under subdivision one of this section shall include the following: (a) an overview of the purposes for which an explanation would be provided to an individual and the extent to which an explanation would feasibly serve such purposes. (b) how explanations can be provided in a manner that is clear, conspicuous, easy-to-use, no-cost, accessible to individuals with disabilities, effective for individuals with limited English language proficiency, and calibrated to the level of risk based on the covered algorithm; (c) an assessment of the feasibility of a requirement for deployers to provide a mechanism for individuals who may be affected or were affected by a covered algorithm to request an explanation that: (i) includes information regarding why the covered algorithm produced the result it produced with respect to the individual making the request, and that is truthful, accurate, and scientifically valid; (ii) identifies at least the most significant factors used to inform the covered algorithm's outputs; and (iii) includes any other information deemed relevant by the division to provide an explanation for an individual who may be affected or was affected by a covered algorithm; (d) an assessment of what information a developer must provide a deployer in order to ensure explanations can be provided to individuals upon request; (e) the extent to which current technical capabilities of covered algorithms impacts the feasibility of providing explanations; (f) how a deployer can take reasonable measures to verify the identity of an individual making a request for an explanation to ensure that the deployer provides an explanation only to the affected individual, including steps a deployer should take to ensure the safe and secure storage, collection, and deletion of personal information; and (g) recommendations for the legislature on how to implement regulations around mechanisms for explanations. 3. In conducting the study required under this subsection, the division shall consult with the office of information technology services, and any other agency, office, commission or department deemed relevant by the division. 4. Not later than eighteen months after the effective date of this article, the division shall submit to the governor, the majority and minority leaders of the senate and the assembly, the senate Internet and Technology Committee, and the assembly Science and Technology Committee a report that includes the findings of the study conducted under subdivision one of this section, together with recommendations for such legislation and administrative action as the division determines appropriate.
Other · Automated Decisionmaking
Civil Rights Law § 112
Plain Language
The Division must: (1) within 90 days, publish a multilingual consumer awareness web page describing all provisions, rights, and remedies under the article; (2) beginning two years after the effective date and annually thereafter, publish machine-readable reports summarizing all submitted evaluations and assessments with aggregated trends; and (3) within 180 days of the first annual report, launch a publicly searchable repository of all submitted pre-deployment evaluations, impact assessments, and developer reviews. The repository must be sortable, downloadable, and accessible, with trade secrets and personal data redacted. This creates a public transparency infrastructure operated by the Division — developers and deployers are not directly obligated by this section (their submission obligations arise under § 104(6)).
Statutory Text
1. (a) Not later than ninety days after the effective date of this article, the division shall publish, on the internet website of the division, a web page that describes each provision, right, obligation, and requirement of this article (categorized with respect to individuals, deployers, and developers) and the remedies, exemptions, and protections associated with this article, in plain and concise language, in each covered language, and in an easy-to-understand, accessible manner. (b) The division shall update the information published under paragraph (a) of this subdivision as necessitated by any change in law, regulation, guidance, or judicial decision. Any such update shall be published in plain and concise language, in each covered language, and in an easy-to-understand, accessible manner. 2. Not later than two years after the effective date of this article, and annually thereafter, the division shall publish on the internet website of the division a report that: (a) describes and summarizes the information contained in any pre-deployment evaluation, impact assessment, and developer review submitted to the division in accordance with this article; (b) describes broad trends, aggregated statistics, and anonymized information about performing impact assessments of covered algorithms, for the purposes of updating guidance related to impact assessments and summary reporting, oversight, and making recommendations to other regulatory agencies; and (c) is accessible and machine readable. 3. (a) Not later than one hundred eighty days after the division publishes the first annual report under subdivision two of this section, the division shall develop a publicly accessible repository to publish each pre-deployment evaluation, impact assessment, and developer review submitted to the division in accordance with section one hundred three and one hundred four of this article. (b) The division shall design the repository established under paragraph (a) of this section to: (i) be publicly available and easily discoverable on the internet website of the division; (ii) allow users to sort and search the repository by multiple characteristics (such as by developer or deployer and date reported) simultaneously; (iii) allow users to make a copy of or download the information obtained from the repository, including any subsets of information obtained by sorting or searching as described in subparagraph (ii) of this paragraph; (iv) be in accordance with user experience and accessibility best practices; and (v) include information about the design, use, and maintenance of the repository, including any other information determined appropriate by the division. (c) The division shall publish in the repository any pre-deployment evaluation, impact assessment, and developer review not later than thirty days after receiving such evaluation, assessment, or review, except if the division has good cause to delay such publication. (d) The division: (i) may redact and segregate any trade secret (as defined in section 1839 of title 18, United States Code) from public disclosure under this subsection; (ii) shall redact and segregate personal data from public disclosure under this subdivision; and (iii) may withhold information as permitted under section 552 of title 5, United States Code.
Other · Automated Decisionmaking
Civil Rights Law § 105
Plain Language
The Division must promulgate rules within two years specifying: what factors must be considered in preliminary evaluations and impact assessments, what must be included in published summaries, and the process by which developers may request additional information from deployers. This is a rulemaking directive to the Division — while it will eventually shape the specific content requirements for developer and deployer evaluations, it does not itself create an obligation on regulated entities beyond what §§ 103-104 already require.
Statutory Text
Not later than two years after the effective date of this article, the division shall: (a) promulgate rules specifying: (i) what information and factors a developer or deployer shall consider in making the preliminary evaluation or preliminary impact assessment described in sections one hundred three and one hundred four of this article, respectively; (ii) what information a developer or deployer shall include in a summary of an evaluation, assessment, or developer review described in section one hundred four of this article; and (iii) the extent to and process by which a developer may request additional information from a deployer, including the purposes for which a developer is permitted to use such additional information; and (b) in promulgating such rules, consider the need to protect the privacy of personal data, as well as the need for information sharing by developers and deployers to comply with this section and inform the public.