R-03
Reporting & Regulatory Submissions
Operational Performance Reporting
Operators of certain AI systems must submit periodic scheduled reports to designated regulatory authorities on system performance in live deployment. Unlike incident reporting, this reporting is routine and not triggered by a specific event. Unlike regulatory submissions, the focus is on operational metrics from production rather than pre-deployment assessments.
Applies to: Deployer · Government Sector · Chatbot
Bills — Enacted: 2 unique bills
Bills — Proposed: 8
Last Updated: 2026-03-29
Sub-Obligations: 2
Bills That Map This Requirement (10 bills)
Bill · Status · Sub-Obligations · Section
Pending 2027-01-01
R-03.1, R-03.2
Bus. & Prof. Code § 22587.3(b)
Plain Language
Beginning January 1, 2028, operators must submit annual reports to the Office of Suicide Prevention covering the prior calendar year's crisis-related data: the existence of the graduated response system, all credible crisis expressions detected, and the duration and conditions of all crisis interruption pauses. Because the report covers the preceding calendar year, operators need to begin collecting and retaining the underlying data from January 1, 2027 — the law's primary effective date — not from the first reporting date. The Office of Suicide Prevention receives these reports but is not granted enforcement authority under this bill.
(b) Beginning January 1, 2028, an operator shall annually report to the Office of Suicide Prevention the items set forth in subdivision (a) with respect to the previous calendar year.
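For illustration only, a minimal sketch of the kind of year-scoped retention record an operator might keep from January 1, 2027 so the first report (due in 2028) can cover the full prior calendar year. The statute does not prescribe a data model; every class, field, and method name here is hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class CrisisPause:
    """One crisis interruption pause; duration and conditions are reportable."""
    started_at: datetime
    duration: timedelta
    trigger_condition: str  # e.g. "credible expression of suicidal ideation"

@dataclass
class AnnualCrisisLog:
    """Running log for one calendar year; feeds the annual report."""
    year: int
    graduated_response_system: bool = True
    credible_crisis_expressions: int = 0
    pauses: list = field(default_factory=list)

    def record_expression(self) -> None:
        self.credible_crisis_expressions += 1

    def record_pause(self, pause: CrisisPause) -> None:
        self.pauses.append(pause)

    def report(self) -> dict:
        # Aggregates only; no user identifiers are stored or emitted.
        return {
            "year": self.year,
            "graduated_response_system": self.graduated_response_system,
            "credible_crisis_expressions": self.credible_crisis_expressions,
            "pause_count": len(self.pauses),
            "total_pause_seconds": sum(
                p.duration.total_seconds() for p in self.pauses
            ),
        }

log = AnnualCrisisLog(year=2027)
log.record_expression()
log.record_pause(CrisisPause(datetime(2027, 3, 4, 10, 0),
                             timedelta(minutes=15),
                             "credible crisis expression"))
print(log.report())
```

The point of the sketch is the retention window: the log is keyed by calendar year, so a report submitted in 2028 simply serializes the closed 2027 log.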
Enacted 2026-01-01
R-03.1
Bus. & Prof. Code § 22757.13(g)(1)-(3)
Plain Language
Beginning January 1, 2027, the Office of Emergency Services must produce an annual public report with anonymized and aggregated information about critical safety incidents it has reviewed. The report goes to the Legislature and Governor. Information that would compromise trade secrets, cybersecurity, public safety, or national security is excluded. While this is a government agency obligation rather than a direct developer obligation, frontier developers should be aware that their incident reports will be aggregated into public reporting — creating an indirect incentive for thorough and accurate initial submissions.
(g) (1) Beginning January 1, 2027, and annually thereafter, the Office of Emergency Services shall produce a report with anonymized and aggregated information about critical safety incidents that have been reviewed by the Office of Emergency Services since the preceding report. (2) The Office of Emergency Services shall not include information in a report pursuant to this subdivision that would compromise the trade secrets or cybersecurity of a frontier developer, public safety, or the national security of the United States or that would be prohibited by any federal or state law. (3) The Office of Emergency Services shall transmit a report pursuant to this subdivision to the Legislature, pursuant to Section 9795, and to the Governor.
Enacted 2026-01-01
R-03.1
Bus. & Prof. Code § 22757.14(d)(1)-(3)
Plain Language
Beginning January 1, 2027, the Attorney General must produce an annual report with anonymized and aggregated information about whistleblower reports from covered employees that the AG has reviewed. The report goes to the Legislature and Governor. Information that would compromise trade secrets, cybersecurity, covered employee confidentiality, public safety, or national security is excluded. This is a government reporting obligation — frontier developers should be aware that employee whistleblower reports may appear in aggregated public reporting.
(d) (1) Beginning January 1, 2027, and annually thereafter, the Attorney General shall produce a report with anonymized and aggregated information about reports from covered employees made pursuant to Chapter 5.1 (commencing with Section 1107) of Part 3 of Division 2 of the Labor Code that have been reviewed by the Attorney General since the preceding report. (2) The Attorney General shall not include information in a report pursuant to this subdivision that would compromise the trade secrets or cybersecurity of a frontier developer, confidentiality of a covered employee, public safety, or the national security of the United States or that would be prohibited by any federal or state law. (3) The Attorney General shall transmit a report pursuant to this subdivision to the Legislature, pursuant to Section 9795 of the Government Code, and to the Governor.
Pending 2027-01-01
R-03.1, R-03.2
C.R.S. § 6-1-1708(5)(a)-(d)
Plain Language
Beginning July 1, 2027, operators must submit an annual report to the Colorado Attorney General's office covering: (1) the number of crisis referral notifications issued in the preceding calendar year; (2) protocols for detecting, removing, and responding to suicidal ideation or self-harm; and (3) protocols for preventing AI responses about suicidal ideation or self-harm actions. Reports must not contain user identifiers or personal information. Operators must use evidence-based measurement methods for tracking suicidal ideation and self-harm. The Attorney General's office will publish the report data publicly on its website. Because the report covers the preceding calendar year, operators should begin tracking crisis referral counts from at least January 1, 2027.
(a) On and after July 1, 2027, an operator shall annually report to the attorney general's office: (I) The number of times the operator has issued a crisis service provider referral notification in the preceding calendar year; (II) Any protocols the operator implemented to detect, remove, and respond to instances of suicidal ideation or self-harm by a user of a conversational artificial intelligence service; and (III) Any protocols the operator implemented to prevent a conversational artificial intelligence service response about suicidal ideation or self-harm actions. (b) The report required by subsection (5)(a) of this section must not include any identifiers or personal information about a user of a conversational artificial intelligence service. (c) The attorney general's office shall post on its public website data from reports submitted pursuant to subsection (5)(a) of this section. (d) For the purpose of creating a report as required by subsection (5)(a) of this section, an operator shall use evidence-based methods for measuring suicidal ideation or self-harm.
Pending 2027-01-01
R-03.1, R-03.2
Section 20(b)
Plain Language
Operators must submit an annual report to the Illinois Attorney General covering: (1) the total number of times the crisis intervention protocol was triggered in the preceding calendar year, and (2) a summary of the most recent biennial third-party compliance audit results. Because the report covers the preceding calendar year, operators should begin tracking crisis protocol activation counts from the Act's effective date of January 1, 2027. This is a routine periodic reporting obligation, not triggered by a specific incident.
(b) On an annual basis, an operator shall submit a report to the Attorney General containing the following metrics for the preceding calendar year: (1) the total number of times the crisis intervention protocol was triggered; and (2) a summary of the results of the most recent compliance audit required by subsection (a).
Pending 2026-10-01
R-03.1, R-03.2
Commercial Law § 14–1330(H)(1)–(2)
Plain Language
Beginning in 2027, operators must submit an annual report to the Office of Suicide Prevention by March 1 covering: protocol descriptions for self-harm/suicidal ideation prevention and minor sexually explicit content prevention, the number of crisis referral notifications issued, details on evidence-based detection methods used, and all user complaints filed along with review results and follow-up actions. Reports must not contain any personal identifying information about users. Because reporting begins March 1, 2027 and covers the preceding period, operators need to begin tracking metrics from the law's effective date of October 1, 2026.
(H) (1) On or before March 1 each year, beginning in 2027, an operator shall report to the Office: (I) Information on the protocols required under subsections (B) and (C) of this section; (II) The number of times the operator has issued a notification under subsection (B)(2) of this section; (III) Details about the methods used under subsection (B)(3) of this section; and (IV) All complaints filed under subsection (G) of this section, including the results of the review of each complaint and any follow–up actions taken. (2) The report required under paragraph (1) of this subsection may not contain any personal identifying information about a user.
Pending 2026-10-01
R-03.1
Commercial Law § 14–1330(H)(3)
Plain Language
Beginning July 1, 2027, the Office of Suicide Prevention must annually compile and publish on its website data from the operator reports submitted under subsection (H)(1). This obligation falls on the government agency, not on operators — operators' obligation is to submit the reports by March 1. The Office's publication creates public transparency around companion chatbot safety metrics across the industry.
(3) On or before July 1 each year, beginning in 2027, the Office shall: (I) Compile data from the reports submitted under paragraph (1) of this subsection for the immediately preceding calendar year; and (II) Publish the data on the Office's website.
Pending 2026-07-15
R-03.1
Section 4(1)-(3)
Plain Language
The Commissioner of Labor must maintain a database of all AI/automation disclosures submitted under the amended WARN Act and publish quarterly summaries analyzing AI/automation-related workforce reductions by number, sector, and location. These quarterly reports must be posted publicly on the Department of Labor's website and shared with the Department of Economic Development for workforce planning purposes. This is an obligation on the Commissioner (a government actor), not on employers — but employer compliance with the disclosure requirement in § 860-b(1)(f) feeds the database.
1. The commissioner of labor shall maintain a database of reports submitted under paragraph (f) of subdivision 1 of section 860-b of the labor law. 2. Such commissioner shall prepare and publish quarterly summaries analyzing the number, sector, and location of workforce reductions identified as resulting from AI or automation. 3. Such reports shall be made publicly available on the department of labor's website and shared with the department of economic development for use in workforce innovation planning and retraining programs.
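The Commissioner's quarterly summary is an aggregation over employer filings, grouped by sector and location. The sketch below shows one plausible shape for that aggregation; the record fields and function names are hypothetical, not drawn from the bill.

```python
from collections import defaultdict

# Hypothetical disclosure records fed by employers' filings under
# Labor Law § 860-b(1)(f).
disclosures = [
    {"quarter": "2027-Q1", "sector": "logistics", "location": "Buffalo", "workers": 120},
    {"quarter": "2027-Q1", "sector": "logistics", "location": "Albany",  "workers": 40},
    {"quarter": "2027-Q1", "sector": "finance",   "location": "NYC",     "workers": 75},
]

def quarterly_summary(records: list, quarter: str) -> dict:
    """Aggregate AI/automation-driven reductions by sector for one quarter."""
    by_sector = defaultdict(lambda: {"events": 0, "workers": 0})
    for r in records:
        if r["quarter"] == quarter:
            by_sector[r["sector"]]["events"] += 1
            by_sector[r["sector"]]["workers"] += r["workers"]
    return dict(by_sector)

print(quarterly_summary(disclosures, "2027-Q1"))
```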
Pending 2027-01-01
R-03.1
§ 59.1-619(C)
Plain Language
Operators must publish a semiannual public report containing two categories of quantitative metrics: (1) the number of times the chatbot provided information about suicide, self-harm, suicidal ideation, harming others, or illegal activity, and (2) the number of times a mental health redirect (crisis referral) was provided to users. This is a public reporting obligation — unlike CA SB 243, which requires reporting to a government office, Virginia requires the report to be made publicly available. The statute does not specify the format, publication location, or the first reporting period.
C. Operators shall publish a semiannual report available to the public on the number of times (i) the chatbot provided information about suicide, self-harm, suicidal ideation, harming others, or illegal activity and (ii) a mental health redirect has been provided to users.
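Because Virginia's report is count-based and semiannual, the underlying mechanism is event tallying bucketed by half-year. A minimal sketch under that assumption; the event category names are hypothetical labels for the statute's two metric groups, not statutory terms.

```python
from collections import Counter
from datetime import date

# Hypothetical event categories tracked for the semiannual public report:
# clause (i) information categories plus the clause (ii) redirect.
REPORTABLE = {"suicide_info", "self_harm_info", "harm_others_info",
              "illegal_activity_info", "mental_health_redirect"}

def half_year(d: date) -> str:
    """Bucket a date into H1 (Jan-Jun) or H2 (Jul-Dec) of its year."""
    return f"{d.year}-H{1 if d.month <= 6 else 2}"

def tally(events) -> dict:
    """events: iterable of (date, category) pairs -> per-half-year counts."""
    buckets = {}
    for d, category in events:
        if category in REPORTABLE:
            buckets.setdefault(half_year(d), Counter())[category] += 1
    return buckets

events = [
    (date(2027, 2, 1), "suicide_info"),
    (date(2027, 3, 5), "mental_health_redirect"),
    (date(2027, 8, 9), "mental_health_redirect"),
]
print(tally(events))
```

Since the statute leaves the first reporting period unspecified, the half-year bucketing here is one reasonable reading, not a requirement.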
Pre-filed 2025-07-01
R-03.1
9 V.S.A. § 4193e(c)
Plain Language
In the first year after deploying a high-risk AI system, deployers must submit testing results at the one-month, six-month, and twelve-month marks to the Division of Artificial Intelligence. These results must demonstrate the reliability of the system's outputs, document any variance over time, and describe mitigation strategies for those variances. This is distinct from the broader Safety and Impact Assessment — it is a performance monitoring submission specific to high-risk systems in their first operational year.
(c) Each deployer of a high-risk artificial intelligence system shall submit a one-, six-, and 12-month testing result to the Division of Artificial Intelligence showing the reliability of the results generated by the system, any variance in those results over the testing periods, and any mitigation strategies for variances, in the first year of deployment.
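The Vermont submission asks for three things: a reliability figure at each checkpoint, the variance across checkpoints, and mitigation for that variance. A sketch of that computation, assuming reliability is captured as a score on a fixed evaluation set; the scores, threshold, and field names are illustrative.

```python
from statistics import mean, pvariance

# Hypothetical reliability scores (e.g., accuracy on a fixed test set)
# captured at the one-, six-, and twelve-month checkpoints.
checkpoints = {"month_1": 0.94, "month_6": 0.91, "month_12": 0.89}

def variance_summary(scores: dict, drift_threshold: float = 0.03) -> dict:
    """Summarize checkpoint reliability, its variance, and whether the
    first-to-last decline is large enough to warrant mitigation."""
    values = list(scores.values())          # insertion order: month 1 -> 12
    drift = values[0] - values[-1]          # decline from first to last
    return {
        "mean_reliability": round(mean(values), 4),
        "variance": round(pvariance(values), 6),
        "drift": round(drift, 4),
        "mitigation_needed": drift > drift_threshold,
    }

print(variance_summary(checkpoints))
```

The `mitigation_needed` flag stands in for the statute's "mitigation strategies for variances": when drift exceeds the (assumed) threshold, the submission would describe what the deployer did about it.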
Pre-filed 2026-07-01
R-03.1, R-03.2
9 V.S.A. § 4193c(a)-(c)
Plain Language
Beginning July 1, 2027 (one year after the act's effective date), operators must submit an annual report to the Office of the Attorney General covering: (1) the number of crisis service provider referral notifications issued in the prior calendar year, and (2) the protocols in place to detect and respond to user expressions of suicidal ideation or self-harm and to prohibit the chatbot from producing such content. Reports must contain no user identifiers or personal information. The Attorney General's office will publish the reported data on its website. Because the report covers the preceding calendar year, operators should begin tracking crisis referral counts from July 1, 2026.
(a) Beginning one year after the effective date of this act, an operator shall annually report to the Office of the Attorney General all of the following: (1) the number of times in the preceding calendar year the operator has issued a crisis service provider referral notification pursuant to subdivision 4193b(b)(2)(A) of this subchapter; and (2) the protocols put in place by the operator to: (A) detect and respond to expressions of suicidal ideation or self-harm by users; and (B) prohibit the companion chatbot from producing content about suicidal ideation, suicide, or self-harm with the user. (b) The reporting required by this section shall include only the information listed in subsection (a) of this section and shall not include any identifiers or personal information about users. (c) The Office of the Attorney General shall post on its website the data from a report received pursuant to this section.
Enacted 2026-01-01
R-03.1, R-03.2
Bus. & Prof. Code § 22603(a)-(d)
Plain Language
Beginning July 1, 2027, operators must submit an annual report to the Office of Suicide Prevention covering: how many crisis referral notifications were sent in the prior year, protocols for detecting and responding to suicidal ideation, and protocols for blocking chatbot responses about suicide. Reports must contain no user personal information or identifiers, and operators must use evidence-based measurement methods for measuring suicidal ideation. The Office will publish data from these reports on its website. Because the report covers the preceding calendar year, operators need to begin tracking crisis referral counts and maintaining measurement infrastructure by no later than July 1, 2026 to ensure they have data for the first reporting period.
(a) Beginning July 1, 2027, an operator shall annually report to the office all of the following: (1) The number of times the operator has issued a crisis service provider referral notification pursuant to Section 22602 in the preceding calendar year. (2) Protocols put in place to detect, remove, and respond to instances of suicidal ideation by users. (3) Protocols put in place to prohibit a companion chatbot response about suicidal ideation or actions with the user. (b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users. (c) The office shall post data from a report required by this section on its internet website. (d) An operator shall use evidence-based methods for measuring suicidal ideation.
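Section 22603 pairs a closed list of reportable items (subdivision (a)) with a prohibition on identifiers (subdivision (b)), which suggests a validation step before submission: keep only listed fields and reject anything identifier-like. A sketch under that reading; the field names and banned-key list are hypothetical, not statutory.

```python
# Hypothetical field names standing in for the § 22603(a) items.
ALLOWED_FIELDS = {
    "crisis_referral_notifications",   # (a)(1) notification count
    "detection_protocols",             # (a)(2) detection/response protocols
    "response_prohibition_protocols",  # (a)(3) prohibition protocols
}

def build_annual_report(data: dict) -> dict:
    """Keep only fields on the (a) list; reject anything that looks like a
    user identifier, per subdivision (b). Banned keys are illustrative."""
    banned = {"user_id", "email", "ip_address", "name"}
    if banned & set(data):
        raise ValueError("report must not include user identifiers")
    return {k: v for k, v in data.items() if k in ALLOWED_FIELDS}

report = build_annual_report({
    "crisis_referral_notifications": 42,
    "detection_protocols": "classifier + human review",
    "response_prohibition_protocols": "refusal policy v3",
    "internal_note": "drop me",   # not listed in (a), so filtered out
})
print(report)
```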