A-09654
NY · State · USA
● Pending
Proposed Effective Date
2027-01-01
New York A 9654 — An Act to amend the civil rights law, in relation to enacting the "New York Artificial Intelligence Civil Rights Act"
Summary

Comprehensive AI civil rights bill applying to developers and deployers of 'covered algorithms' — AI systems used in consequential decisions affecting employment, education, housing, healthcare, credit, criminal justice, government services, and other high-stakes domains. Prohibits algorithmic discrimination and disparate impact based on a broad list of protected characteristics. Requires independently audited pre-deployment evaluations and annual post-deployment impact assessments, with results submitted to the Division of Consumer Protection and publicly summarized. Imposes detailed notice and disclosure obligations, including short-form notices to individuals and public reporting mechanisms. Mandates harm mitigation, stakeholder consultation, and whistleblower protections. Enforcement is through both Attorney General civil actions (with civil penalties of $15,000 per violation or 4% of gross annual revenue) and a private right of action (treble damages or $15,000 per violation, plus punitive damages and attorneys' fees). Pre-dispute arbitration agreements are unenforceable.

Enforcement & Penalties
Enforcement Authority
Dual enforcement: the Attorney General may bring a civil action as parens patriae on behalf of state residents in federal district court when there is reason to believe a violation has occurred or residents' interests are threatened. The Division of Consumer Protection has rulemaking and oversight authority, maintains a public repository of evaluations and assessments, and has the right to intervene in private actions. A private right of action is available to any individual or class of individuals alleging a violation. Before filing a private action, the individual must provide written notice to the Division and the Attorney General, who have 60 days to determine whether to intervene. Pre-dispute arbitration agreements and pre-dispute joint-action waivers are unenforceable with respect to disputes arising under the act.
Penalties
AG enforcement: civil penalties of $15,000 per violation or 4% of defendant's average gross annual revenue over the preceding three years, whichever is greater; injunctive relief (permanent, temporary, or preliminary); damages, restitution, or other compensation on behalf of residents; reasonable attorneys' fees and litigation costs. Private right of action: treble damages or $15,000 per violation, whichever is greater; nominal damages; punitive damages; reasonable attorneys' fees and litigation costs; equitable or declaratory relief. Statutory damages do not require proof of actual monetary harm.
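The "whichever is greater" structure of both penalty provisions is easy to misread, so the arithmetic is sketched below in Python. This is a minimal illustration under one plausible reading (comparing aggregate figures); the function names and inputs are hypothetical, and nothing here is drawn from the bill beyond the $15,000 and 4% figures.

```python
# Illustrative only: one plausible reading of the damages math, comparing
# aggregate figures. Function names and inputs are hypothetical.

def ag_civil_penalty(violations: int, avg_gross_annual_revenue: float) -> float:
    # AG action: greater of $15,000 per violation or 4% of the defendant's
    # average gross annual revenue over the preceding three years.
    return max(15_000 * violations, 0.04 * avg_gross_annual_revenue)

def private_action_statutory_damages(violations: int, actual_damages: float) -> float:
    # Private action: greater of treble damages or $15,000 per violation.
    # Nominal and punitive damages and attorneys' fees are additional.
    return max(3 * actual_damages, 15_000 * violations)

# Example: 200 violations by a firm averaging $50M in annual revenue.
print(ag_civil_penalty(200, 50_000_000))             # 3000000
print(private_action_statutory_damages(10, 20_000))  # 150000
```

Because statutory damages do not require proof of actual monetary harm, the per-violation floor can dominate even where actual damages are modest, as in the second example.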
Who Is Covered
"Deployer" means any person that uses a covered algorithm for a commercial act. The terms "deployer" and "developer" shall not be interpreted to be mutually exclusive.
(a) "Developer" means any person that designs, codes, customizes, produces, or substantially modifies an algorithm that is intended or reasonably likely to be used as a covered algorithm for such person's own use, or use by a third party, in connection with a commercial act, or for use by a government entity. (b) In the event that a deployer uses an algorithm as a covered algorithm, and no person is considered the developer of the algorithm for purposes of paragraph (a) of this subdivision, the deployer shall be considered the developer of the covered algorithm for the purposes of this article. (c) The terms "deployer" and "developer" shall not be interpreted to be mutually exclusive.
What Is Covered
"Covered algorithm" means: (a) a computational process derived from machine learning, natural language processing, artificial intelligence techniques, or other computational processing techniques of similar or greater complexity, that, with respect to a consequential action: (i) creates or facilitates the creation of a product or information that is used as an integral part of the consequential action; (ii) promotes, recommends, ranks, or otherwise affects the display or delivery of information that is used as an integral part of the consequential action; (iii) makes a decision; or (iv) facilitates human decision making; or (b) any other computational process deemed appropriate by the division through rules.
Compliance Obligations · 26 obligations
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.2, H-02.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 102(1)-(2)
Plain Language
Developers and deployers are prohibited from offering, licensing, promoting, selling, or using a covered algorithm for consequential actions in a manner that discriminates or causes disparate impact on the basis of any protected characteristic. The list of protected characteristics is exceptionally broad, including race, sex, disability, income level, immigration status, limited English proficiency, biometric information, and any other classification protected by federal or New York law. An action causing a differential effect is unjustified unless the developer or deployer proves it is necessary for a substantial, legitimate, nondiscriminatory interest and that no less-discriminatory alternative exists. The algorithm is presumed to be analyzed holistically unless the developer or deployer proves separability by a preponderance of the evidence. Carve-outs exist for self-testing to prevent discrimination, diversity expansion, good-faith non-commercial research, and private clubs.
Statutory Text
1. A developer or deployer shall not offer, license, promote, sell, or use a covered algorithm in a manner that: (a) causes or contributes to a disparate impact in a manner that prevents; (b) otherwise discriminates in a manner that prevents; or (c) otherwise makes unavailable, the equal enjoyment of goods, services, or other activities or opportunities, related to a consequential action, on the basis of a protected characteristic. 2. This section shall not apply to: (a) the offer, licensing, or use of a covered algorithm for the sole purpose of: (i) a developer's or deployer's self-testing (or auditing by an independent auditor at a developer's or deployer's request) to identify, prevent, or mitigate discrimination, or otherwise to ensure compliance with obligations, under federal or state law; (ii) expanding an applicant, participant, or customer pool to raise the likelihood of increasing diversity or redressing historic discrimination; or (iii) conducting good faith security research, or other research, if conducting the research is not part or all of a commercial act; or (b) any private club or other establishment not in fact open to the public, as described in section 201(e) of the Civil Rights Act of 1964 (42 U.S.C. 2000a(e)).
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.2, H-02.3, H-02.6 · Developer · Automated Decisionmaking
Civil Rights Law § 103(1)-(3)
Plain Language
Before deploying, licensing, or offering any covered algorithm for a consequential action — including material changes to previously deployed algorithms — developers and deployers must conduct a two-stage pre-deployment evaluation. First, a preliminary evaluation assesses whether harm is plausible. If harm is not plausible, the developer or deployer must document a finding of no plausible harm and submit it to the Division. If harm is plausible, the developer must engage a qualified independent auditor to conduct a full pre-deployment evaluation. The full evaluation must cover algorithm design and methodology, training and testing data and methods (including demographic representation and protected characteristic testing), potential for harm and disparate impact, and recommendations for mitigation. The independent auditor must have no financial or employment relationship with the developer or deployer beyond the auditing engagement. The auditor submits a report with findings and recommendations to the developer. For material changes to existing algorithms, the scope may be limited to harms arising from the change.
Statutory Text
1. Prior to deploying, licensing, or offering a covered algorithm (including deploying a material change to a previously-deployed covered algorithm or a material change made prior to deployment) for a consequential action, a developer or deployer shall conduct a pre-deployment evaluation in accordance with this section. 2. (a) The developer shall conduct a preliminary evaluation of the plausibility that any expected use of the covered algorithm may result in a harm. (b) The deployer shall conduct a preliminary evaluation of the plausibility that any intended use of the covered algorithm may result in a harm. (c) Based on the results of the preliminary evaluation, the developer or deployer shall: (i) in the event that a harm is not plausible, record a finding of no plausible harm, including a description of the developer's expected use or the deployer's intended use of the covered algorithm, how the preliminary evaluation was conducted, and an explanation for the finding, and submit such record to the division; and (ii) in the event that a harm is plausible, conduct a full pre-deployment evaluation as described in subdivision three or subdivision four of this section, as applicable. (d) When conducting a preliminary evaluation of a material change to, or new use of, a previously-deployed covered algorithm, the developer or deployer may limit the scope of the evaluation to whether use of the covered algorithm may result in a harm as a result of the material change or new use. 3. (a) If a developer determines a harm is plausible during the preliminary evaluation described in subdivision two of this section, the developer shall engage an independent auditor to conduct a pre-deployment evaluation. The evaluation required by this subdivision shall include a detailed review and description, sufficient for an individual having ordinary skill in the art to understand the functioning, risks, uses, benefits, limitations, and other pertinent attributes of the covered algorithm, including: (i) the covered algorithm's design and methodology, including the inputs the covered algorithm is designed to use to produce an output and the outputs the covered algorithm is designed to produce; (ii) how the covered algorithm was created, trained, and tested, including: (A) any metric used to test the performance of the covered algorithm; (B) defined benchmarks and goals that correspond to such metrics, including whether there was sufficient representation of demographic groups that are reasonably likely to use or be affected by the covered algorithm in the data used to create or train the algorithm, and whether there was reasonable testing, if any, across such demographic groups; (C) the outputs the covered algorithm actually produces in testing; (D) a description of any consultation with relevant stakeholders, including any communities that will be impacted by the covered algorithm, regarding the development of the covered algorithm, or a disclosure that no such consultation occurred; (E) a description of which protected characteristics, if any, were used for testing and evaluation, and how and why such characteristics were used, including: (1) whether the testing occurred in comparable contextual conditions to the conditions in which the covered algorithm is expected to be used; and (2) if protected characteristics were not available to conduct such testing, a description of alternative methods the developer used to conduct the required assessment; (F) any other computational algorithm incorporated into the 
development of the covered algorithm, regardless of whether such precursor computational algorithm involves a consequential action; (G) a description of the data and information used to develop, test, maintain, or update the covered algorithm, including: (1) each type of personal data used, each source from which the personal data was collected, and how each type of personal data was inferred and processed; (2) the legal authorization for collecting and processing the personal data; and (3) an explanation of how the data (including personal data) used is representative, proportional, and appropriate to the development and intended uses of the covered algorithm; and (H) a description of the training process for the covered algorithm which includes the training, validation, and test data utilized to confirm the intended outputs; (iii) the potential for the covered algorithm to produce a harm or to have a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, and a description of such potential harm or disparate impact; (iv) alternative practices and recommendations to prevent or mitigate harm and recommendations for how the developer could monitor for harm after offering, licensing, or deploying the covered algorithm; and (v) any other information the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, as prescribed by rules promulgated by the division. (b) The independent auditor shall submit to the developer a report on the evaluation conducted under this subdivision, including the findings and recommendations of such independent auditor.
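The two-stage structure of § 103 (a preliminary plausibility screen, then either a documented no-plausible-harm finding or a full audited evaluation) is essentially a decision gate. The Python sketch below models that gate; every class, field, and function here is a hypothetical illustration, not terminology from the bill.

```python
# Illustrative decision gate for the § 103 two-stage pre-deployment
# evaluation. All names here are hypothetical, not statutory terms.
from dataclasses import dataclass, field

@dataclass
class PreliminaryEvaluation:
    # § 103(2): developers screen expected uses; deployers screen intended uses.
    use_description: str
    methodology: str
    harm_plausible: bool
    explanation: str

@dataclass
class AuditReport:
    findings: list = field(default_factory=list)
    recommendations: list = field(default_factory=list)

def submit_to_division(record: dict) -> None:
    # Stand-in for the required submission to the Division of Consumer Protection.
    print(f"submitted to division: {record['finding']}")

def engage_independent_auditor(prelim: PreliminaryEvaluation) -> AuditReport:
    # Stub for the full evaluation under § 103(3) (developer) or § 103(4)
    # (deployer): design, training data, demographic representation,
    # harm/disparate-impact potential, and mitigation recommendations.
    return AuditReport(findings=["..."], recommendations=["..."])

def pre_deployment_gate(prelim: PreliminaryEvaluation) -> None:
    if not prelim.harm_plausible:
        # § 103(2)(c)(i): record a no-plausible-harm finding and submit it.
        submit_to_division({
            "finding": "no plausible harm",
            "use": prelim.use_description,
            "methodology": prelim.methodology,
            "explanation": prelim.explanation,
        })
    else:
        # § 103(2)(c)(ii): full evaluation by an independent auditor, whose
        # report goes back to the engaging developer or deployer.
        report = engage_independent_auditor(prelim)
        print(f"auditor report: {len(report.recommendations)} recommendation(s)")

pre_deployment_gate(PreliminaryEvaluation(
    use_description="resume screening for hiring decisions",
    methodology="structured internal review",
    harm_plausible=True,
    explanation="hiring is a consequential action with known bias risks",
))
```

The same gate applies to deployers under § 103(4), with the preliminary evaluation scoped to intended rather than expected uses.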
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.2, H-02.3, H-02.6 · Deployer · Automated Decisionmaking
Civil Rights Law § 103(4)
Plain Language
Deployers face a parallel pre-deployment evaluation obligation when their preliminary assessment finds harm is plausible. The deployer must engage an independent auditor to conduct a full evaluation covering the deployment context: how the algorithm contributes to consequential actions, necessity and proportionality relative to the baseline process being replaced, data inputs and their representativeness, expected versus actual outputs in testing, stakeholder consultation, and potential for harm or disparate impact. The deployer's evaluation is context-specific — it focuses on how the algorithm will be used in the deployer's particular environment, as opposed to the developer's evaluation which focuses on the algorithm's general design and training. The independent auditor submits a report with findings and recommendations to the deployer.
Statutory Text
4. (a) If a deployer determines a harm is plausible during the preliminary evaluation described in subdivision two of this section, the deployer shall engage an independent auditor to conduct a pre-deployment evaluation. The evaluation required by this subdivision shall include a detailed review and description, sufficient for an individual having ordinary skill in the art to understand the functioning, risks, uses, benefits, limitations, and other pertinent attributes of the covered algorithm, including: (i) the manner in which the covered algorithm makes or contributes to a consequential action and the purpose for which the covered algorithm will be deployed; (ii) the necessity and proportionality of the covered algorithm in relation to its planned use, including the intended benefits and limitations of the covered algorithm and a description of the baseline process being enhanced or replaced by the covered algorithm, if applicable; (iii) the inputs that the deployer plans to use to produce an output, including: (A) the type of personal data and information used and how the personal data and information will be collected, inferred, and processed; (B) the legal authorization for collecting and processing the personal data; and (C) an explanation of how the data used is representative, proportional, and appropriate to the deployment of the covered algorithm; (iv) the outputs the covered algorithm is expected to produce and the outputs the covered algorithm actually produces in testing; (v) a description of any additional testing or training completed by the deployer for the context in which the covered algorithm will be deployed; (vi) a description of any consultation with relevant stakeholders, including any communities that will be impacted by the covered algorithm, regarding the deployment of the covered algorithm; (vii) the potential for the covered algorithm to produce a harm or to have a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities in the context in which the covered algorithm will be deployed and a description of such potential harm or disparate impact; (viii) alternative practices and recommendations to prevent or mitigate harm in the context in which the covered algorithm will be deployed and recommendations for how the deployer could monitor for harm after offering, licensing, or deploying the covered algorithm; and (ix) any other information the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities as prescribed by rules promulgated by the division. (b) The independent auditor shall submit to the deployer a report on the evaluation conducted under this subdivision, including the findings and recommendations of such independent auditor.
H-02 Non-Discrimination & Bias Assessment · H-02.6, H-02.8 · Deployer · Automated Decisionmaking
Civil Rights Law § 104(1)-(3)
Plain Language
Deployers must conduct annual post-deployment impact assessments for each covered algorithm. The process follows the same two-stage structure as the pre-deployment evaluation: a preliminary assessment identifies whether harm occurred during the reporting period. If no harm is identified, the deployer documents a no-harm finding and submits it to the Division. If harm occurred, the deployer must engage an independent auditor for a full impact assessment covering: the nature and extent of harm, disparate impact analysis with methodology, data inputs and their use for retraining, whether outputs matched expectations, how the algorithm was used in consequential actions, and mitigation actions taken. The auditor's report goes to the deployer, and within 30 days the deployer must share a summary with the developer. This is a continuing annual obligation for the entire life of the deployment.
Statutory Text
1. After the deployment of a covered algorithm, a deployer shall, on an annual basis, conduct an impact assessment in accordance with this section. The deployer shall conduct a preliminary impact assessment of the covered algorithm to identify any harm that resulted from the covered algorithm during the reporting period and: (a) if no resulting harm is identified by such assessment, shall record a finding of no harm, including a description of the developer's expected use or the deployer's intended use of the covered algorithm, how the preliminary evaluation was conducted, and an explanation for such finding, and submit such finding to the division; and (b) if a resulting harm is identified by such assessment, shall conduct a full impact assessment as described in subdivision two of this section. 2. In the event that the covered algorithm resulted in a harm during the reporting period, the deployer shall engage an independent auditor to conduct a full impact assessment with respect to the reporting period, including: (a) an assessment of the harm that resulted or was reasonably likely to have been produced during the reporting period; (b) a description of the extent to which the covered algorithm produced a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, including the methodology for such evaluation, of how the covered algorithm produced or likely produced such disparity; (c) a description of the types of data input into the covered algorithm during the reporting period to produce an output, including: (i) documentation of how data input into the covered algorithm to produce an output is represented and complete descriptions of each field of data; and (ii) whether and to what extent the data input into the covered algorithm to produce an output was used to train or otherwise modify the covered algorithm; (d) whether and to what extent the covered algorithm produced the outputs it was expected to produce; (e) a detailed description of how the covered algorithm was used to make a consequential action; (f) any action taken to prevent or mitigate harms, including how relevant staff are informed of, trained about, and implement harm mitigation policies and practices, and recommendations for how the deployer could monitor for and prevent harm after offering, licensing, or deploying the covered algorithm; and (g) any other information the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities as prescribed by rules promulgated by the division. 3. (a) After the engagement of the independent auditor, the independent auditor shall submit to the deployer a report on the impact assessment conducted under subdivision two of this section, including the findings and recommendations of such independent auditor. (b) Not later than thirty days after the submission of a report on an impact assessment under this section, a deployer shall submit to the developer of the covered algorithm a summary of such report, subject to the trade secret and privacy protections described in subdivision six of this section.
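Two dates in § 104 lend themselves to a quick worked example: the annual assessment cadence and the thirty-day window for sharing the auditor's summary with the developer. The sketch below, with hypothetical function names, shows the arithmetic (leap-day edge cases ignored).

```python
# Worked example of two § 104 dates: the annual assessment cadence and the
# thirty-day developer-summary window. Function names are hypothetical;
# leap-day edge cases are ignored.
from datetime import date, timedelta

def next_assessment_due(deployment_date: date, today: date) -> date:
    # § 104(1): impact assessments recur annually after deployment.
    due = deployment_date.replace(year=deployment_date.year + 1)
    while due < today:
        due = due.replace(year=due.year + 1)
    return due

def developer_summary_deadline(auditor_report_date: date) -> date:
    # § 104(3)(b): summary to the developer within thirty days of the report.
    return auditor_report_date + timedelta(days=30)

print(next_assessment_due(date(2027, 3, 1), date(2028, 6, 15)))  # 2029-03-01
print(developer_summary_deadline(date(2028, 2, 1)))              # 2028-03-02
```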
H-02 Non-Discrimination & Bias Assessment · H-02.8 · Developer · Automated Decisionmaking
Civil Rights Law § 104(4)
Plain Language
Developers must annually review every impact assessment summary received from deployers of their covered algorithms. The review must assess deployer usage patterns, data inputs and outputs, contractual compliance, real-world versus pre-deployment performance, ongoing harm potential, disparate impact by protected characteristic, need for algorithm modification, and any other Division-prescribed responsive actions. This creates a feedback loop: deployers conduct annual impact assessments and share summaries with developers, who must then affirmatively review those summaries and determine whether corrective action is needed. This obligation runs parallel to the deployer's annual assessment — developers cannot passively receive deployer summaries without acting on them.
Statutory Text
4. A developer shall, on an annual basis, review each impact assessment summary submitted by a deployer of its covered algorithm under subdivision three of this section for the following purposes: (a) to assess how the deployer is using the covered algorithm, including the methodology for assessing such use; (b) to assess the type of data the deployer is inputting into the covered algorithm to produce an output and the types of outputs the covered algorithm is producing; (c) to assess whether the deployer is complying with any relevant contractual agreement with the developer and whether any remedial action is necessary; (d) to compare the covered algorithm's performance in real-world conditions versus pre-deployment testing, including the methodology used to evaluate such performance; (e) to assess whether the covered algorithm is causing harm or is reasonably likely to be causing harm; (f) to assess whether the covered algorithm is causing, or is reasonably likely to be causing, a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, and, if so, how and with respect to which protected characteristic; (g) to determine whether the covered algorithm needs modification; (h) to determine whether any other action is appropriate to ensure that the covered algorithm remains safe and effective; and (i) to undertake any other assessment or responsive action the division deems pertinent to prevent the covered algorithm from causing harm or having a disparate impact in the equal enjoyment of goods, services, or other activities or opportunities, as prescribed by rules promulgated by the division.
R-02 Regulatory Disclosure & Submissions · R-02.1 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 104(6)(a)-(b)
Plain Language
Within 30 days of completing any full pre-deployment evaluation, full impact assessment, or developer annual review, the developer or deployer must: (1) submit the complete evaluation, assessment, or review to the Division of Consumer Protection; (2) publish a public summary on their website; and (3) submit the summary to the Division. All evaluations, assessments, and reviews must be retained for at least 10 years. Upon legislative request, the documents must also be made available to the legislature. Trade secrets may be redacted from public disclosures, and personal data must be redacted.
Statutory Text
6. (a) A developer or deployer that conducts a full pre-deployment evaluation, full impact assessment, or developer annual review of assessments shall: (i) not later than thirty days after completion, submit the evaluation, assessment, or review to the division; (ii) upon request, make the evaluation, assessment, or review available to the legislature; and (iii) not later than thirty days after completion: (A) publish a summary of the evaluation, assessment, or review on the website of the developer or deployer in a manner that is easily accessible to individuals; and (B) submit such summary to the division. (b) A developer or deployer shall retain all evaluations, assessments, and reviews described in this section for a period of not fewer than ten years.
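Because § 104(6) attaches three different clocks to the same completion date, a small compliance-calendar sketch may help; the dictionary keys below are paraphrases of the statutory items, not official terms.

```python
# Compliance-calendar sketch for § 104(6); the keys paraphrase the statute,
# and the trigger is the completion date of the evaluation, assessment,
# or review.
from datetime import date, timedelta

def section_104_6_deadlines(completion: date) -> dict:
    return {
        # § 104(6)(a)(i) and (a)(iii): division submission and the public
        # website summary both fall thirty days after completion.
        "submit_full_document_to_division": completion + timedelta(days=30),
        "publish_and_submit_summary": completion + timedelta(days=30),
        # § 104(6)(b): retain for not fewer than ten years.
        "retain_until_at_least": completion.replace(year=completion.year + 10),
    }

for task, due in section_104_6_deadlines(date(2027, 2, 15)).items():
    print(task, due)  # e.g. submit_full_document_to_division 2027-03-17
```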
G-02 Public Transparency & Documentation · G-02.4 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 104(6)(a)(iii)
Plain Language
Developers and deployers must publish on their website a public summary of each full pre-deployment evaluation, impact assessment, or developer annual review within 30 days of completion. The summary must be easily accessible to individuals. Trade secrets may be redacted; personal data must be redacted. This is the public transparency component of the broader submission obligation — the full document goes to the Division, while the summary goes to the public.
Statutory Text
(iii) not later than thirty days after completion: (A) publish a summary of the evaluation, assessment, or review on the website of the developer or deployer in a manner that is easily accessible to individuals; and (B) submit such summary to the division.
S-01 AI System Safety Program · S-01.7 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(1)(a)-(b)
Plain Language
Developers and deployers must take reasonable measures to prevent and mitigate any harms identified by pre-deployment evaluations or post-deployment impact assessments. This is not merely an obligation to evaluate — it requires affirmative remediation of identified harms. Additionally, developers and deployers must ensure that independent auditors have all information necessary to conduct accurate evaluations and assessments. This creates a duty of cooperation with auditors that cannot be evaded by withholding information.
Statutory Text
(a) take reasonable measures to prevent and mitigate any harm identified by a pre-deployment evaluation described in section one hundred three or an impact assessment described in section one hundred four of this article; (b) take reasonable measures to ensure that an independent auditor has all necessary information to complete an accurate and effective pre-deployment evaluation described in section one hundred three or an impact assessment described in section one hundred four of this article;
CP-01 Deceptive & Manipulative AI Conduct · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(1)(d)-(e)
Plain Language
Developers and deployers must certify — based on evaluation or assessment results — that their covered algorithm is not likely to result in harm or disparate impact, that benefits to affected individuals likely outweigh harms, and that the algorithm will not result in deceptive acts or practices. They must also ensure the algorithm performs at a level that an individual with ordinary skill in the art would consider reasonable, and consistently with its publicly advertised purpose. This creates a substantive performance warranty and anti-deception certification that goes beyond procedural assessment obligations.
Statutory Text
(d) with respect to a covered algorithm, certify that, based on the results of a pre-deployment evaluation described in section one hundred three or an impact assessment described in section one hundred four of this article: (i) use of the covered algorithm is not likely to result in harm or disparate impact in the equal enjoyment of goods, services, or other activities or opportunities; (ii) the benefits from the use of the covered algorithm to individuals affected by the covered algorithm likely outweigh the harms from the use of the covered algorithm to such individuals; and (iii) use of the covered algorithm is not likely to result in a deceptive act or practice; (e) ensure that any covered algorithm of the developer or deployer functions at a level that would be considered reasonable performance by an individual with ordinary skill in the art; and in a manner that is consistent with its expected and publicly-advertised performance, purpose, or use;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(2)(a)
Plain Language
Developers and deployers are prohibited from making false, deceptive, or misleading claims in their advertising, marketing, or public representations about their covered algorithms. This is a straightforward prohibition on deceptive commercial speech about AI systems, covering claims about capabilities, performance, accuracy, and any other attributes of the algorithm.
Statutory Text
2. (a) It shall be unlawful for a developer or deployer to engage in false, deceptive, or misleading advertising, marketing, or publicizing of a covered algorithm of the developer or deployer.
S-01 AI System Safety Program · S-01.1 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(2)(b)-(c)
Plain Language
Developers may not knowingly offer or license a covered algorithm for any consequential action that was not covered by the pre-deployment evaluation. Deployers may not knowingly use a covered algorithm for unevaluated consequential actions, unless the deployer assumes full developer responsibilities under the act. This effectively gates deployment to evaluated use cases — any expansion into new consequential action domains requires a new or supplemental evaluation. The deployer assumption-of-developer-responsibilities pathway provides a safety valve but at a significant compliance cost.
Statutory Text
(b) It shall be unlawful for a developer to knowingly offer or license a covered algorithm for any consequential action other than those evaluated in the pre-deployment evaluation described in section one hundred three of this article. (c) It shall be unlawful for a deployer to knowingly use a covered algorithm for any consequential action other than a use evaluated in the pre-deployment evaluation described in section one hundred three of this article, unless the deployer agrees to assume the responsibilities of a developer required by this article.
Other · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(1)(c)
Plain Language
Developers and deployers must consult with relevant stakeholders, including any communities that will be impacted by a covered algorithm, prior to deploying, licensing, or offering the algorithm. This is a standalone procedural requirement for community engagement that goes beyond the stakeholder consultation elements already documented in the pre-deployment evaluation (§ 103). The obligation applies to every covered algorithm, not only those that pass the plausibility-of-harm threshold.
Statutory Text
(c) with respect to a covered algorithm, consult stakeholders, including any communities that will be impacted by the covered algorithm, regarding the development or deployment of the covered algorithm prior to the deploying, licensing, or offering the covered algorithm;
Other · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 106(1)(f)
Plain Language
Developers and deployers must ensure that all data used throughout the lifecycle of a covered algorithm — design, development, deployment, and use — is relevant and appropriate to the deployment context and the publicly advertised purpose. This creates a continuing data fitness obligation that applies not just at training time but through the entire operational life of the algorithm.
Statutory Text
(f) ensure any data used in the design, development, deployment, or use of the covered algorithm is relevant and appropriate to the deployment context and the publicly-advertised purpose or use;
Other · Developer · Automated Decisionmaking
Civil Rights Law § 107(1)-(4)
Plain Language
Developers must make compliance-related information available to deployers upon reasonable request, including pre-deployment evaluation reports and information needed for deployer-side evaluations. Developers must either cooperate with deployer audits or arrange their own independent audit and share results. When algorithms are licensed, the written contract must specify data processing procedures, deployment instructions, data types and purposes, mutual obligations, material change notification, and a prohibition on combining data across clients. Contracts may not waive statutory obligations or restrict either party from reporting concerns to enforcement agencies. Developers must retain copies of all deployer contracts for 10 years. All developer obligations toward deployers extend equally to government entities.
Statutory Text
1. A developer shall do the following: (a) upon the reasonable request of the deployer, make available to the deployer information necessary to demonstrate the compliance of the deployer with the requirements of this article, including: (i) making available a report of the pre-deployment evaluation described in section one hundred three of this article or the annual review of assessments conducted by the developer under section one hundred four of this article; and (ii) providing information necessary to enable the deployer to conduct and document a pre-deployment evaluation under section one hundred three or an impact assessment described in section one hundred four of this article; and (b) either: (i) allow and cooperate with reasonable assessments conducted by the deployer or the deployer's designated independent auditor; or (ii) arrange for an independent auditor to conduct an assessment of the developer's policies and practices in support of the obligations under this article using an appropriate and accepted control standard or framework and assessment procedure for such assessments and provide a report of such assessment to the deployer upon request. 2. A developer may offer or license a covered algorithm to a deployer pursuant to a written contract between the developer and deployer, provided that the contract: (a) clearly sets forth the data processing procedures of the developer with respect to any collection, processing, or transfer of data performed on behalf of the deployer; (b) clearly sets forth: (i) instructions for collecting, processing, transferring, or disposing of data by the developer or deployer in the context of the use of the covered algorithm; (ii) instructions for deploying the covered algorithm as intended; (iii) the nature and purpose of any collection, processing, or transferring of data; (iv) the type of data subject to such collection, processing, or transferring; (v) the duration of such processing of data; and (vi) the rights and obligations of both parties, including a method by which the developer shall notify the deployer of material changes to its covered algorithm; (c) shall not relieve a developer or deployer of any requirement or liability imposed on such developer or deployer under this article; (d) prohibits both the developer and deployer from combining data received from or collected on behalf of the other party with data the developer or deployer received from or collected on behalf of another party; and (e) shall not prohibit a developer or deployer from raising concerns to any relevant enforcement agency with respect to the other party. 3. Each developer shall retain for a period of ten years a copy of each contract entered into with a deployer to which it provides requested products or services. 4. For purposes of this section, any requirement for a developer to contract with, assist, and follow the instructions of a deployer shall be read to include a requirement to contract with, assist, and follow the instructions of a government entity if the developer is providing a service to a government entity.
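The § 107(2) contract requirements read as a checklist, so one natural compliance aid is a required-terms diff like the hedged sketch below; the set members paraphrase the statutory items and are not official names.

```python
# Required-terms diff for § 107(2) contracts plus the § 107(3) retention
# date; set members paraphrase the statute and are not official names.
from datetime import date

REQUIRED_CONTRACT_TERMS = {
    "developer_data_processing_procedures",        # § 107(2)(a)
    "data_handling_and_deployment_instructions",   # § 107(2)(b)(i)-(ii)
    "nature_purpose_type_duration_of_processing",  # § 107(2)(b)(iii)-(v)
    "mutual_rights_and_material_change_notice",    # § 107(2)(b)(vi)
    "no_waiver_of_statutory_obligations",          # § 107(2)(c)
    "cross_client_data_combination_prohibited",    # § 107(2)(d)
    "no_gag_on_reports_to_enforcement_agencies",   # § 107(2)(e)
}

def missing_contract_terms(contract_terms: set) -> set:
    return REQUIRED_CONTRACT_TERMS - contract_terms

def contract_retention_until(executed: date) -> date:
    # § 107(3): ten-year retention; the bill does not state the trigger
    # date, so contract execution is assumed here.
    return executed.replace(year=executed.year + 10)

print(sorted(missing_contract_terms({"developer_data_processing_procedures"})))
print(contract_retention_until(date(2027, 5, 1)))  # 2037-05-01
```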
H-01 Human Oversight of Automated Decisions · H-01.4 · Deployer · Automated Decisionmaking
Civil Rights Law § 108(1)
Plain Language
The Division must promulgate regulations within two years specifying when and how deployers must give individuals the right to opt out of algorithmic decision-making and request a human alternative. The regulations must address notice clarity, timeliness, which types of consequential actions require a human alternative, feasibility, and the balance between individual control and practical effectiveness. Until regulations are promulgated, the specific scope and mechanics of the opt-out right are undefined — this is a directive to the Division, not a self-executing individual right. However, once regulations are finalized, deployers will need to implement opt-out mechanisms and human alternative pathways.
Statutory Text
1. Not later than two years after the effective date of this article, the division shall promulgate regulations specifying the circumstances and manner in which a deployer shall provide to an individual a means to opt-out of the use of a covered algorithm for a consequential action and to elect to have the consequential action concerning the individual undertaken by a human without the use of a covered algorithm. In promulgating the regulations under this subdivision, the division shall consider the following: (a) how to ensure that any notice or request from a deployer regarding the right to a human alternative is clear and conspicuous, in plain language, easy to execute, and at no cost to an individual; (b) how to ensure that any such notice to individuals is effective, timely, and useful; (c) the specific types of consequential actions for which a human alternative is appropriate, considering the magnitude of the action and risk of harm; (d) the extent to which a human alternative would be beneficial to individuals and the public interest; (e) the extent to which a human alternative can prevent or mitigate harm; (f) the risk of harm to individuals beyond the requestor if a human alternative is available or not available; (g) the feasibility of providing a human alternative in different circumstances; and (h) any other considerations the division deems appropriate to balance the need to give an individual control over a consequential action related to such individual with the practical feasibility and effectiveness of granting such control.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 108(2)
Plain Language
Developers and deployers are prohibited from using deceptive statements or dark pattern interface designs to discourage, obstruct, or manipulate individuals' exercise of their rights under the act. This includes both outright fraud and more subtle UI manipulation designed to obscure or subvert an individual's autonomy in making choices about algorithmic processing. The prohibition covers both intentional design ('purpose') and designs that have a 'substantial effect' of impairing individual choice, even if not intentionally deceptive.
Statutory Text
2. A developer or deployer may not condition, effectively condition, attempt to condition, or attempt to effectively condition the exercise of any individual right under this article or individual choice through: (a) the use of any false, fictitious, fraudulent, or materially misleading statement or representation; or (b) the design, modification, or manipulation of any user interface with the purpose or substantial effect of obscuring, subverting, or impairing a reasonable individual's autonomy, decision making, or choice to exercise any such right.
H-01 Human Oversight of Automated Decisions · H-01.4, H-01.5 · Deployer · Automated Decisionmaking
Civil Rights Law § 108(3)
Plain Language
The Division must promulgate regulations within two years specifying when and how deployers must provide individuals a mechanism to appeal algorithmic consequential actions to a human reviewer. The regulations must ensure the appeal is free, accessible (including to individuals with disabilities), proportionate to the action, non-discriminatory, and timely. The regulations must also address data correction rights and training requirements for human reviewers. Like the opt-out provision in § 108(1), this is a rulemaking directive — the specific appeal right does not become operative until the Division issues regulations. Once in effect, deployers will need human review infrastructure with trained reviewers capable of overriding algorithmic decisions.
Statutory Text
3. Not later than two years after the effective date of this article, the division shall promulgate regulations specifying the circumstances and manner in which a deployer shall provide to an individual a mechanism to appeal to a human a consequential action resulting from the deployer's use of a covered algorithm. In promulgating the regulations under this subdivision, the division shall do the following: (a) ensure that the appeal mechanism is clear and conspicuous, in plain language, easy-to-execute, and at no cost to individuals; (b) ensure that the appeal mechanism is proportionate to the consequential action; (c) ensure that the appeal mechanism is reasonably accessible to individuals with disabilities, timely, usable, effective, and non-discriminatory; (d) require, where appropriate, a mechanism for individuals to identify and correct any personal data used by the covered algorithm; (e) specify training requirements for human reviewers with respect to a consequential action; and (f) consider any other circumstances, procedures, or matters the division deems appropriate to balance the need to give an individual a right to appeal a consequential action related to such individual with the practical feasibility and effectiveness of granting such right.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 109(1)
Plain Language
Developers and deployers are prohibited from retaliating against any individual — whether a consumer, employee, or other person — for exercising rights under the act, refusing to waive those rights, raising concerns about algorithmic consequential actions, reporting violations, or cooperating with investigations. Retaliation includes both service-level retaliation (denying or threatening to deny equal access to goods and services) and employment-level retaliation (discharge, demotion, suspension, threats, harassment). The anti-retaliation protection is unusually broad: it covers not just employees (as in typical whistleblower statutes) but any individual who interacts with a covered algorithm, including consumers who complain about algorithmic decisions.
Statutory Text
1. A developer or deployer may not: (a) discriminate or retaliate against an individual (including by denying or threatening to deny the equal enjoyment of goods, services, or other activities or opportunities in relation to a consequential action) because the individual exercised any right, refused to waive any such right, raised a concern about a consequential action under this article, or assisted in any investigation or proceeding under this article; or (b) directly or indirectly, discharge, demote, suspend, threaten, harass, or otherwise discriminate or retaliate against an individual for raising a concern, reporting or attempting to report a violation of this article, or cooperating in any investigation or proceeding under this article.
G-02 Public Transparency & Documentation · G-02.4 · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 110(1)-(5)
Plain Language
Every developer and deployer must publish a comprehensive public disclosure covering their AI practices, including: entity identity and contact information, links to evaluation and assessment summaries, categories of personal data collected and processing purposes, third-party data transfers, individual rights exercise instructions, compliance practices, a mandatory disclaimer about the limitations of the audit, and the disclosure's effective date. The disclosure must be in plain language, accessible to individuals with disabilities, and available in the top 10 languages spoken in New York. Material changes require advance notification to affected individuals via direct electronic communication. All previous disclosure versions must be retained for 10 years and published on the website, along with a public change log describing the date and nature of each material change.
Statutory Text
1. Each developer or deployer shall make publicly available, in plain language and in a clear, conspicuous, not misleading, easy-to-read, and readily accessible manner, a disclosure that provides a detailed and accurate representation of the developer or deployer's practices regarding the requirements under this article. 2. The disclosure required under subdivision one of this section shall include, at a minimum, the following: (a) the identity and the contact information of: (i) the developer or deployer to which the disclosure applies (including the developer or deployer's point of contact and electronic and physical mail address, as applicable for any inquiry concerning a covered algorithm or individual rights under this article); and (ii) any other entity within the same corporate structure as the developer or deployer to which personal data is transferred by the developer or deployer. (b) a link to the website containing the developer or deployer's summaries of pre-deployment evaluations, impact assessments, and annual review of assessments, as applicable; (c) the categories of personal data the developer or deployer collects or processes in the development or deployment of a covered algorithm and the processing purpose for each such category; (d) whether the developer or deployer transfers personal data, and, if so, each third party to which the developer or deployer transfers such data and the purpose for which such data is transferred, except with respect to a transfer to a governmental entity pursuant to a court order or law that prohibits the developer or deployer from disclosing such transfer; (e) a prominent description of how an individual can exercise the rights described in this article; (f) a general description of the developer or deployer's practices for compliance with the requirements described in sections one hundred three and one hundred six of this article; (g) the following disclosure: "The audit of this algorithm was conducted to comply with the New York Artificial Intelligence Civil Rights Act, which seeks to avoid the use of any algorithm that has a disparate impact on certain protected classes of individuals. The audit does not guarantee that this algorithm is safe or in compliance with all applicable laws."; and (h) the effective date of the disclosure. 3. The disclosure required under this section shall be made available in each covered language in which the developer or deployer operates or provides a good or service. 4. Any disclosure provided under this section shall be made available in a manner that is reasonably accessible to and usable by individuals with disabilities. 5. (a) If a developer or deployer makes a material change to the disclosure required under this section, the developer or deployer shall notify each individual affected by such material change prior to implementing the material change. (b) Each developer or deployer shall take all reasonable measures to provide to each affected individual a direct electronic notification regarding any material change to the disclosure, in each covered language in which the disclosure is made available and taking into account available technology and the nature of the relationship with such individual. 
(c) (i) Beginning after the effective date of this article, each developer or deployer shall retain a copy of each previous version of the disclosure required under this section for a period of at least ten years after the last day on which such version was effective and publish each such version on its website. Each developer or deployer shall make publicly available, in a clear, conspicuous, and readily accessible manner, a log describing the date and nature of each material change to its disclosure during the retention period, and such descriptions shall be sufficient for a reasonable individual to understand the material effect of each material change. (ii) The obligations described in this paragraph shall not apply to any previous version of a developer or deployer's disclosure of practices regarding the collection, processing, and transfer of personal data, or any material change to such disclosure, that precedes the effective date of this article.
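Since § 110(2) enumerates eight mandatory disclosure elements, a simple presence check is one way to audit a draft; the sketch below uses paraphrased keys that are in no way official field names.

```python
# Presence check for the eight mandatory § 110(2) disclosure elements;
# the keys below are paraphrases, not official field names.
REQUIRED_DISCLOSURE_ITEMS = {
    "identity_and_contact_information",             # § 110(2)(a)
    "link_to_evaluation_and_assessment_summaries",  # § 110(2)(b)
    "personal_data_categories_and_purposes",        # § 110(2)(c)
    "third_party_data_transfers",                   # § 110(2)(d)
    "how_to_exercise_individual_rights",            # § 110(2)(e)
    "compliance_practices_description",             # § 110(2)(f)
    "mandatory_audit_disclaimer",                   # § 110(2)(g)
    "effective_date",                               # § 110(2)(h)
}

def missing_disclosure_items(disclosure: dict) -> set:
    """Return the statutory items absent from a draft disclosure."""
    return REQUIRED_DISCLOSURE_ITEMS - disclosure.keys()

draft = {"identity_and_contact_information": "...", "effective_date": "2027-01-01"}
print(sorted(missing_disclosure_items(draft)))  # six items still missing
```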
T-01 AI Identity Disclosure · T-01.1 · Deployer · Automated Decisionmaking
Civil Rights Law § 110(6)-(8)
Plain Language
Deployers must provide a short-form notice (500 words maximum) to individuals about their covered algorithms. The notice must be concise, plain-language, disability-accessible, and highlight any practices that may be unexpected or that involve consequential actions, including an overview of individual rights. For individuals with whom the deployer has a relationship, the notice must be delivered electronically at the individual's first interaction with the algorithm. For individuals without a direct relationship, the notice must be posted conspicuously on the deployer's website. The Division will promulgate regulations specifying minimum content requirements and a template. This is a point-of-interaction disclosure, distinct from the comprehensive public disclosure in § 110(1).
Statutory Text
6. A deployer shall provide a short-form notice regarding a covered algorithm it develops, offers, licenses, or uses in a manner that: (a) is concise, clear, conspicuous, in plain language, and not misleading; (b) is readily accessible to individuals with disabilities; (c) is based on what is reasonably anticipated within the context of the relationship between the individual and the deployer; (d) includes an overview of each applicable individual right and disclosure in a manner that draws attention to any practice that may be unexpected to a reasonable individual or that involves a consequential action; (e) is not more than five hundred words in length; and (f) is available to the public at no cost. 7. (a) If a deployer has a relationship with an individual, the deployer shall provide an electronic version of the short-form notice directly to the individual upon the individual's first interaction with the covered algorithm. (b) If a deployer does not have a relationship with an individual, the deployer shall provide the short-form notice in a clear, conspicuous, accessible, and not misleading manner on their website. 8. The division shall promulgate regulations specifying the minimum content required to be included in the short-form notice described in subdivision six of this section, which shall not exceed the content requirements described in subdivision six of this section and shall include a template or model for the short-form notice described in subdivision seven of this section.
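Two pieces of § 110(6)-(7) are mechanical enough for a short sketch: the five-hundred-word cap and the relationship-dependent delivery channel. All names below are hypothetical.

```python
# Sketch of the mechanical parts of § 110(6)-(7): the five-hundred-word
# cap and the relationship-dependent delivery channel. Hypothetical names.
def notice_word_count_ok(text: str) -> bool:
    # § 110(6)(e): not more than five hundred words.
    return len(text.split()) <= 500

def delivery_channel(has_relationship: bool) -> str:
    if has_relationship:
        # § 110(7)(a): electronic notice at the first interaction.
        return "electronic notice upon first interaction with the algorithm"
    # § 110(7)(b): clear, conspicuous posting on the deployer's website.
    return "conspicuous posting on the deployer's website"

notice = "We use an automated tool to help rank rental applications. " * 8
print(notice_word_count_ok(notice))  # True (80 words)
print(delivery_channel(True))
```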
Other · Developer, Deployer · Automated Decisionmaking
Civil Rights Law § 110(9)
Plain Language
Every developer and deployer must maintain a publicly available, easily accessible mechanism through which individuals affected by a covered algorithm can report potential violations of the act. This is a public-facing complaint intake obligation — it does not replace enforcement channels but creates a direct reporting pathway between affected individuals and the entity operating the algorithm.
Statutory Text
9. Each developer or deployer shall make publicly available, in a clear, conspicuous, and readily accessible manner, a mechanism for an individual impacted by a covered algorithm to report to the developer or deployer potential violations of this article.
Other · Government · Automated Decisionmaking
Civil Rights Law § 111(1)-(4)
Plain Language
The Division must conduct a public study on the feasibility of requiring deployers to provide individuals with explanations of how covered algorithms affected them — including the most significant factors driving algorithmic outputs. The study must address technical feasibility, accessibility, identity verification, developer-to-deployer information flows, and legislative recommendations. The Division must report findings to the governor and legislature within 18 months of the act's effective date. This is a government study mandate, not a compliance obligation on developers or deployers. It signals a likely future explainability requirement but creates no current obligation for the private sector.
Statutory Text
1. The division shall conduct a study, with notice and public comment, on the feasibility of requiring deployers to provide a clear, conspicuous, easy-to-use, no-cost mechanism that is accessible for individuals with disabilities and allows an individual to receive an explanation as to whether and how a covered algorithm used by the deployer affects or affected an individual. 2. The study required under subdivision one of this section shall include the following: (a) an overview of the purposes for which an explanation would be provided to an individual and the extent to which an explanation would feasibly serve such purposes. (b) how explanations can be provided in a manner that is clear, conspicuous, easy-to-use, no-cost, accessible to individuals with disabilities, effective for individuals with limited English language proficiency, and calibrated to the level of risk based on the covered algorithm; (c) an assessment of the feasibility of a requirement for deployers to provide a mechanism for individuals who may be affected or were affected by a covered algorithm to request an explanation that: (i) includes information regarding why the covered algorithm produced the result it produced with respect to the individual making the request, and that is truthful, accurate, and scientifically valid; (ii) identifies at least the most significant factors used to inform the covered algorithm's outputs; and (iii) includes any other information deemed relevant by the division to provide an explanation for an individual who may be affected or was affected by a covered algorithm; (d) an assessment of what information a developer must provide a deployer in order to ensure explanations can be provided to individuals upon request; (e) the extent to which current technical capabilities of covered algorithms impacts the feasibility of providing explanations; (f) how a deployer can take reasonable measures to verify the identity of an individual making a request for an explanation to ensure that the deployer provides an explanation only to the affected individual, including steps a deployer should take to ensure the safe and secure storage, collection, and deletion of personal information; and (g) recommendations for the legislature on how to implement regulations around mechanisms for explanations. 3. In conducting the study required under this subsection, the division shall consult with the office of information technology services, and any other agency, office, commission or department deemed relevant by the division. 4. Not later than eighteen months after the effective date of this article, the division shall submit to the governor, the majority and minority leaders of the senate and the assembly, the senate Internet and Technology Committee, and the assembly Science and Technology Committee a report that includes the findings of the study conducted under subdivision one of this section, together with recommendations for such legislation and administrative action as the division determines appropriate.
Other · Government · Automated Decisionmaking
Civil Rights Law § 112(1)-(3)
Plain Language
The Division of Consumer Protection must: (1) within 90 days, publish a multilingual consumer-facing webpage describing all provisions and individual rights under the act; (2) beginning two years after the effective date and annually thereafter, publish a machine-readable report summarizing all pre-deployment evaluations, impact assessments, and developer reviews submitted and describing broad trends; and (3) within 180 days of the first annual report, develop a searchable, downloadable public repository of all pre-deployment evaluations, impact assessments, and developer reviews. These are government agency infrastructure obligations, not private sector compliance obligations. However, once the repository exists, developers' and deployers' submitted evaluations, assessments, and reviews will be publicly searchable and downloadable, subject to trade secret and personal data redactions.
Statutory Text
1. (a) Not later than ninety days after the effective date of this article, the division shall publish, on the internet website of the division, a web page that describes each provision, right, obligation, and requirement of this article (categorized with respect to individuals, deployers, and developers) and the remedies, exemptions, and protections associated with this article, in plain and concise language, in each covered language, and in an easy-to-understand, accessible manner.
(b) The division shall update the information published under paragraph (a) of this subdivision as necessitated by any change in law, regulation, guidance, or judicial decision. Any such update shall be published in plain and concise language, in each covered language, and in an easy-to-understand, accessible manner.
2. Not later than two years after the effective date of this article, and annually thereafter, the division shall publish on the internet website of the division a report that:
(a) describes and summarizes the information contained in any pre-deployment evaluation, impact assessment, and developer review submitted to the division in accordance with this article;
(b) describes broad trends, aggregated statistics, and anonymized information about performing impact assessments of covered algorithms, for the purposes of updating guidance related to impact assessments and summary reporting, oversight, and making recommendations to other regulatory agencies; and
(c) is accessible and machine readable.
3. (a) Not later than one hundred eighty days after the division publishes the first annual report under subdivision two of this section, the division shall develop a publicly accessible repository to publish each pre-deployment evaluation, impact assessment, and developer review submitted to the division in accordance with sections one hundred three and one hundred four of this article.
(b) The division shall design the repository established under paragraph (a) of this section to: (i) be publicly available and easily discoverable on the internet website of the division; (ii) allow users to sort and search the repository by multiple characteristics (such as by developer or deployer and date reported) simultaneously; (iii) allow users to make a copy of or download the information obtained from the repository, including any subsets of information obtained by sorting or searching as described in subparagraph (ii) of this paragraph; (iv) be in accordance with user experience and accessibility best practices; and (v) include information about the design, use, and maintenance of the repository, including any other information determined appropriate by the division.
(c) The division shall publish in the repository any pre-deployment evaluation, impact assessment, and developer review not later than thirty days after receiving such evaluation, assessment, or review, except if the division has good cause to delay such publication.
(d) The division: (i) may redact and segregate any trade secret (as defined in section 1839 of title 18, United States Code) from public disclosure under this subsection; (ii) shall redact and segregate personal data from public disclosure under this subdivision; and (iii) may withhold information as permitted under section 552 of title 5, United States Code.
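Subdivision 3(b) amounts to a functional specification for a public dataset: machine-readable records, multi-criteria sort and search, and downloadable subsets. The Python sketch below illustrates those three behaviors; the record schema, field names, and entries are invented for illustration, since the statute leaves the repository's actual design to the Division.

import csv
import io
from datetime import date

# Hypothetical record schema; the statute does not define one.
RECORDS = [
    {"entity": "Acme HR Tools", "role": "developer",
     "doc_type": "pre-deployment evaluation", "date_reported": date(2027, 3, 1)},
    {"entity": "Metro Housing Co", "role": "deployer",
     "doc_type": "impact assessment", "date_reported": date(2027, 6, 15)},
]

def search(records, **criteria):
    """Apply all criteria simultaneously, per subparagraph (b)(ii)."""
    return [r for r in records
            if all(r.get(key) == value for key, value in criteria.items())]

def to_csv(records):
    """Produce a downloadable copy of any search subset, per (b)(iii)."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

subset = search(RECORDS, role="deployer", doc_type="impact assessment")
print(to_csv(subset))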
Other · Government · Automated Decisionmaking
Civil Rights Law § 105
Plain Language
The Division must promulgate rules within two years specifying: what factors must be considered in preliminary evaluations and impact assessments, what must be included in public summaries of evaluations and reviews, and the process for developers to request additional information from deployers. The rulemaking must balance privacy protection with the need for information sharing. This is a government rulemaking directive that will flesh out the procedural details of §§ 103-104. Until rules are promulgated, developers and deployers should follow the statutory requirements directly.
Statutory Text
Not later than two years after the effective date of this article, the division shall:
(a) promulgate rules specifying: (i) what information and factors a developer or deployer shall consider in making the preliminary evaluation or preliminary impact assessment described in sections one hundred three and one hundred four of this article, respectively; (ii) what information a developer or deployer shall include in a summary of an evaluation, assessment, or developer review described in section one hundred four of this article; and (iii) the extent to and process by which a developer may request additional information from a deployer, including the purposes for which a developer is permitted to use such additional information; and
(b) in promulgating such rules, consider the need to protect the privacy of personal data, as well as the need for information sharing by developers and deployers to comply with this section and inform the public.
Other · Automated Decisionmaking
Civil Rights Law § 113
Plain Language
The Attorney General may bring civil actions as parens patriae in federal district court when a person's practices violate the act. Available remedies include injunctive relief, civil penalties ($15,000 per violation or 4% of average gross annual revenue over the preceding three years, whichever is greater), damages, restitution, and attorneys' fees. This provision creates the enforcement mechanism but imposes no new compliance obligation on developers or deployers.
Statutory Text
In any case in which the attorney general has reason to believe that an interest of the residents of the state has been or is threatened or adversely affected by the engagement of a person in a practice that violates this article, or a regulation promulgated thereunder, the attorney general may, as parens patriae, bring a civil action on behalf of the residents of the state in an appropriate Federal district court of the United States that meets applicable requirements relating to venue under section 1391 of title 28, United States Code, to:
1. enjoin any such violation by the person;
2. enforce compliance with the requirements of this article;
3. obtain a permanent, temporary, or preliminary injunction or other appropriate equitable relief;
4. obtain civil penalties in the amount of fifteen thousand dollars per violation, or four percent of the defendant's average gross annual revenue over the preceding three years, whichever is greater;
5. obtain damages, restitution, or other compensation on behalf of the residents of the state;
6. obtain reasonable attorneys' fees and litigation costs; and
7. obtain such other relief as the court may consider to be appropriate.
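The penalty clause in item 4 is a greater-of formula. The text leaves open whether the 4% revenue prong is weighed against the aggregate of the per-violation penalties or applied violation by violation; the sketch below, using entirely hypothetical figures, adopts the aggregate reading.

# Worked example of the item 4 formula: the greater of $15,000 per
# violation or 4% of average gross annual revenue over the preceding
# three years. Aggregate reading; all figures are hypothetical.

def ag_civil_penalty(violations, revenues_last_three_years):
    per_violation_total = 15_000 * violations
    revenue_based = 0.04 * (sum(revenues_last_three_years) / 3)
    return max(per_violation_total, revenue_based)

# A firm averaging $50M in revenue: the 4% prong ($2M) controls until
# the violation count exceeds roughly 133 (i.e., $2M / $15,000).
print(ag_civil_penalty(10, [45e6, 50e6, 55e6]))   # 2000000.0
print(ag_civil_penalty(200, [45e6, 50e6, 55e6]))  # 3000000.0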
Other · Automated Decisionmaking
Civil Rights Law § 114(1)-(4)
Plain Language
Any individual or class of individuals may bring a private civil action for violations of the act. Prevailing plaintiffs may recover treble damages or $15,000 per violation (whichever is greater), nominal damages, punitive damages, attorneys' fees, and equitable or declaratory relief. Before filing suit, the individual must provide written notice to the Division and the AG, who then have 60 days to decide whether to intervene; both retain the right to intervene later even if they initially decline. Pre-dispute arbitration agreements and pre-dispute joint-action waivers (covering class, collective, and other joint actions) are unenforceable for disputes arising under the act, and courts, not arbitrators, decide whether that prohibition applies to a given dispute. This is an enforcement mechanism provision; it creates no independent compliance obligation.
Statutory Text
1. Any individual or class of individuals alleging a violation of this article, or a regulation promulgated hereunder, may bring a civil action in any court of competent jurisdiction.
2. In a civil action brought under this section in which the plaintiff prevails, the court may award: (a) treble damages or fifteen thousand dollars per violation, whichever is greater; (b) nominal damages; (c) punitive damages; (d) reasonable attorneys' fees and litigation costs; and (e) any other relief, including equitable or declaratory relief, that the court determines appropriate.
3. (a) Prior to an individual bringing a civil action under this section, such individual shall notify the division and the attorney general, in writing and including a description of the allegations included in the civil action, that such individual intends to bring a civil action under such paragraph. Not later than sixty days after receiving such notice, the division and the attorney general shall each or jointly make a determination and respond to such individual as to whether they will intervene in such action. The division and the attorney general shall have a right to intervene in any civil action under this section, and upon intervening, to be heard on all matters arising in such action and file petitions for appeal of a decision in such action.
(b) Paragraph (a) of this subdivision shall not be construed to limit the authority of the division or the attorney general to, at a later date, commence a civil action or intervene by motion if the division or the attorney general does not commence a proceeding or civil action within the sixty-day period described in paragraph (a) of this subdivision.
4. (a) Notwithstanding any other provision of law, no pre-dispute arbitration agreement or pre-dispute joint action waiver shall be valid or enforceable with regard to a dispute arising under this article.
(b) Any determination as to whether or how this subdivision applies to any dispute shall be made by a court, rather than an arbitrator, without regard to whether such agreement purports to delegate such determination to an arbitrator.
(c) For purposes of this subdivision: (i) "pre-dispute arbitration agreement" means any agreement to arbitrate a dispute that has not arisen at the time of the making of the agreement; and (ii) "pre-dispute joint-action waiver" means an agreement, whether or not part of a pre-dispute arbitration agreement, that would prohibit or waive the right of one of the parties to the agreement to participate in a joint, class, or collective action in a judicial, arbitral, administrative, or other related forum, concerning a dispute that has not yet arisen at the time of the making of the agreement.
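Subdivision 2(a) is likewise a greater-of formula, and because statutory damages under the act do not require proof of actual monetary harm, the $15,000-per-violation floor can control even when proven actual damages are zero. A short worked example with hypothetical figures:

# Worked example of the subdivision 2(a) floor: the greater of treble
# (3x) actual damages or $15,000 per violation. Figures hypothetical;
# nominal and punitive damages under 2(b)-(c) are omitted.

def private_action_damages(actual_damages, violations):
    return max(3 * actual_damages, 15_000 * violations)

print(private_action_damages(0, 2))       # 30000: floor controls, no proven harm
print(private_action_damages(20_000, 1))  # 60000: treble damages control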