Civ. Rights Law § 87(1)-(9)
Plain Language
Both developers and deployers of high-risk AI systems must engage independent third-party auditors to evaluate their systems on a recurring schedule. Developers must complete a first audit within six months of completing development and first offering (or deploying) the system, then annually thereafter. Deployers must complete a first audit within six months of initial deployment, a second audit within one year of the first, then one every two years. Audits must evaluate: (1) whether the entity has taken reasonable care to prevent algorithmic discrimination, (2) conformity of the risk management program with § 89, and, for deployers additionally, (3) system accuracy and reliability against intended and actual use cases. Strict auditor independence requirements apply: no entity that provided any auditing or non-auditing service to the commissioning company in the past twelve months, and no entity that competes or plans to compete commercially with the audited system in the five years following the audit. Audit fees cannot be contingent on results. Audits may use AI as a tool (e.g., controlled testing) but may not be completed entirely by AI, and a separate high-risk AI system may not be used to complete the audit. An audit completed for compliance with another law satisfies this section if it meets all of this section's requirements. For systems already deployed on the effective date, an 18-month transition period applies (per § 88(6)).
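The audit cadence described above can be sketched as a small scheduling helper. This is an illustrative compliance-tracking sketch only, not part of the statute: the `add_months` helper and the simplifying assumption that each audit is submitted on its due date (the statute measures later intervals from submission of the prior audit) are the author's, and day-of-month clamping is a rough convenience.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Advance a date by whole months, clamping the day to avoid
    invalid dates (e.g., Jan 31 + 1 month)."""
    y, m = divmod(d.month - 1 + months, 12)
    return date(d.year + y, m + 1, min(d.day, 28))

def developer_audits(trigger: date, count: int = 3) -> list[date]:
    """First audit due six months after the trigger (completion of
    development and initial offering/deployment), then annually.
    Assumes each audit is submitted on its due date."""
    dates = [add_months(trigger, 6)]
    while len(dates) < count:
        dates.append(add_months(dates[-1], 12))
    return dates

def deployer_audits(deployment: date, count: int = 4) -> list[date]:
    """First audit six months after initial deployment, a second one
    year after the first, then one every two years."""
    dates = [add_months(deployment, 6)]
    if count > 1:
        dates.append(add_months(dates[0], 12))
    while len(dates) < count:
        dates.append(add_months(dates[-1], 24))
    return dates
```

For a deployment on January 1, 2026, `deployer_audits` yields due dates in July 2026, July 2027, July 2029, and so on, while a developer on the same trigger date audits every July starting in 2026.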
Statutory Text
1. Developers of high-risk AI systems shall cause to be conducted third-party audits in accordance with this section.
(a) A developer of a high-risk AI system shall complete at least:
(i) a first audit within six months after completion of development of the high-risk AI system and the initial offering of the high-risk AI system to a deployer for deployment or, if the developer is the first deployer to deploy the high-risk AI system, after initial deployment; and
(ii) one audit every one year following the submission of the first audit.
(b) A developer audit under this section shall include:
(i) an evaluation and determination of whether the developer has taken reasonable care to prevent foreseeable risk of algorithmic discrimination with respect to such high-risk AI system; and
(ii) an evaluation of the developer's documented risk management policy and program required under section eighty-nine of this article for conformity with subdivision one of such section eighty-nine.
2. Deployers of high-risk AI systems shall cause to be conducted third-party audits in accordance with this section.
(a) A deployer of a high-risk AI system shall complete at least:
(i) a first audit within six months after initial deployment;
(ii) a second audit within one year following the submission of the first audit; and
(iii) one audit every two years following the submission of the second audit.
(b) A deployer audit under this section shall include:
(i) an evaluation and determination of whether the deployer has taken reasonable care to prevent foreseeable risk of algorithmic discrimination with respect to such high-risk AI system;
(ii) an evaluation of system accuracy and reliability with respect to such high-risk AI system's deployer-intended and actual use cases; and
(iii) an evaluation of the deployer's documented risk management policy and program required under section eighty-nine of this article for conformity with subdivision one of such section eighty-nine.
3. A deployer or developer may hire more than one auditor to fulfill the requirements of this section.
4. At the attorney general's discretion, the attorney general may:
(a) promulgate further rules as necessary to ensure that audits under this section assess whether or not AI systems produce algorithmic discrimination and otherwise comply with the provisions of this article; and
(b) recommend an updated AI system auditing framework to the legislature, where such recommendations are based on a standard or framework (i) that is designed to evaluate the risks of AI systems, and (ii) that is nationally or internationally recognized and consensus-driven, including but not limited to a relevant framework or standard created by the International Organization for Standardization.
5. The independent auditor shall have access to complete and unredacted copies of all reports previously filed by the deployer or developer under section eighty-eight of this article.
6. An audit conducted under this section may be completed in part, but shall not be completed entirely, with the assistance of an AI system.
(a) Acceptable auditor uses of an AI system include, but are not limited to:
(i) use of an audited high-risk AI system in a controlled environment without impacts on end users for system testing purposes; or
(ii) detecting patterns in the behavior of an audited AI system.
(b) An auditor shall not:
(i) use a different high-risk AI system that is not the subject of an audit to complete an audit; or
(ii) use an AI system to draft an audit under this section without meaningful human review and oversight.
7. (a) An auditor shall be an independent entity including but not limited to an individual, non-profit, firm, corporation, partnership, cooperative, or association.
(b) For the purposes of this article, no auditor may be commissioned by a developer or deployer of a high-risk AI system if such entity:
(i) has already been commissioned to provide any auditing or non-auditing service, including but not limited to financial auditing, cybersecurity auditing, or consulting services of any type, to the commissioning company in the past twelve months; or
(ii) is, will be, or plans to be engaged in the business of developing or deploying an AI system that can compete commercially with such developer's or deployer's high-risk AI system in the five years following an audit.
(c) Fees paid to auditors may not be contingent on the result of the audit and the commissioning company shall not provide any incentives or bonuses for a positive audit result.
8. The attorney general may promulgate further rules to ensure (a) the independence of auditors under this section, and (b) that teams conducting audits incorporate feedback from communities that may foreseeably be the subject of algorithmic discrimination with respect to the AI system being audited.
9. If a developer or deployer has an audit completed for the purpose of complying with another applicable federal, state, or local law or regulation, and the audit otherwise satisfies all other requirements of this section, such audit shall be deemed to satisfy the requirements of this section.