H-0804
VT · State · USA
● Pre-filed
Proposed Effective Date
2026-07-01
Vermont H.804 — An act relating to companion chatbots
Summary

Imposes safety and disclosure obligations on operators of companion chatbot platforms accessible to Vermont residents. Requires operators to disclose to users that they are interacting with AI when a reasonable person could be misled, with stricter unconditional and periodic disclosure for users known to be minors. Operators must implement and publish a protocol to prevent suicidal ideation, suicide, and self-harm content, including referral to crisis services, and must separately institute a protocol to prevent sexually explicit content for minors. Requires annual reporting to the Office of the Attorney General on crisis referral counts and suicide prevention protocols beginning one year after the effective date. Enforcement is through the Attorney General under Vermont's Consumer Protection Act; violations are classified as unfair and deceptive acts in commerce.

Enforcement & Penalties
Enforcement Authority
Attorney General enforcement. Violations are deemed unfair and deceptive acts in commerce under 9 V.S.A. § 2453. The Attorney General has authority to make rules, conduct civil investigations, bring civil actions, and enter into assurances of discontinuance under the Consumer Protection Act (9 V.S.A. chapter 63). The bill does not itself create a private right of action; however, because violations are classified as unfair and deceptive acts, consumers injured by them may be able to invoke the Consumer Protection Act's existing private cause of action (9 V.S.A. § 2461).
Penalties
Violations constitute unfair and deceptive acts in commerce under 9 V.S.A. § 2453. Remedies available to the Attorney General under Vermont's Consumer Protection Act (9 V.S.A. chapter 63) include civil penalties, injunctive relief, restitution, and assurances of discontinuance. The Consumer Protection Act separately provides consumers a private cause of action for the greater of actual damages or a $100 statutory minimum, plus attorney's fees and costs (9 V.S.A. § 2461), though this bill does not itself create that remedy.
Who Is Covered
"Operator" means a person who makes a companion chatbot platform available to a user.
What Is Covered
(A) "Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, humanlike responses to user inputs and is capable of meeting a user's social needs, including by exhibiting humanlike features and being able to sustain a relationship across multiple interactions.
(B) "Companion chatbot" does not include any of the following:
(i) a chatbot that is used solely: (I) for customer service; (II) for the operational purposes of a business; (III) to conduct internal research; or (IV) to provide technical assistance;
(ii) a chatbot that is a feature of a video game and is limited to replies related to the video game that cannot discuss topics related to mental health, self-harm, or sexually explicit conduct or maintain a dialogue on other topics unrelated to the video game; or
(iii) a stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice-activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs that are likely to elicit emotional responses in the user.
"Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.
Compliance Obligations (5 obligations)
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
9 V.S.A. § 4193b(a)
Plain Language
When a user could reasonably mistake the companion chatbot for a human, the operator must display a clear, conspicuous notification that the chatbot is AI-generated and not human. The notification must match the language of the interaction and be sized for easy readability. This is a conditional trigger — if the chatbot is clearly identifiable as AI from the outset, no disclosure is required. Compare to the minor-specific provision in § 4193b(c)(1), which imposes an unconditional immediate disclosure.
Statutory Text
If a user interacting with a companion chatbot could be reasonably misled to believe that the user is interacting with a human, an operator shall issue a clear and conspicuous notification to the individual indicating that the companion chatbot is artificially generated and not human. The text of the notification shall appear in the same language and in a size easily readable by the average viewer.
S-02 Prohibited Conduct & Output Restrictions · S-02.7–S-02.9 · Deployer · Chatbot
9 V.S.A. § 4193b(b)(1)-(2)
Plain Language
Operators may not run a companion chatbot unless they implement and maintain a protocol that (1) prevents the chatbot from producing suicide or self-harm content, (2) ensures the chatbot does not ignore users expressing suicidal ideation or self-harm, and (3) at minimum refers users expressing such thoughts to crisis service providers. The protocol must be developed using commercially reasonable and technically feasible methods and must be published on the operator's website. This is a continuous operating prerequisite — the protocol must remain active as a condition of operation. The 'commercially reasonable and technically feasible' standard provides a practical safe harbor for the protocol's design.
Statutory Text
(1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with a user unless the operator implements and maintains a protocol for preventing the companion chatbot from: (A) producing suicidal ideation, suicide, or self-harm content to the user; and (B) ignoring a user that is expressing thoughts of suicidal ideation, suicide, or self-harm. (2) The protocol required in subdivision (1) of this subsection shall: (A) at minimum, provide a notification to the user that refers the user to crisis service providers if the user expresses suicidal ideation, suicide, or self-harm; (B) be developed using commercially reasonable and technically feasible methods; and (C) be published on the operator's website.
T-01 AI Identity Disclosure · T-01.1–T-01.2 · Deployer · Chatbot · Minors
9 V.S.A. § 4193b(c)(1)-(2)
Plain Language
When the operator knows a user is a minor (17 or younger), two unconditional obligations apply: (1) immediately disclose in a clear and conspicuous manner that the user is interacting with AI — no reasonable-person trigger, this is absolute; and (2) send a prominent reminder at least every 30 minutes during continuing interactions that the chatbot is AI and the user should take a break. The 30-minute interval is a minimum floor — operators may remind more frequently. These obligations are triggered only by actual knowledge that the user is a minor.
Statutory Text
An operator shall, for a user that the operator knows is a minor, do the following: (1) immediately disclose to the user in a clear and conspicuous manner that the user is interacting with artificial intelligence; (2) provide a clear and conspicuous notification to the user at least every 30 minutes for continuing companion chatbot interactions that reminds the user to take a break and that the companion chatbot is artificially generated and not human;
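The 30-minute reminder in subdivision (c)(2) reduces to simple elapsed-time bookkeeping per session. A minimal sketch, assuming the operator already tracks when the last reminder was shown (the statute sets only the minimum interval, so shorter intervals also comply):

```python
from datetime import datetime, timedelta

# Statutory minimum interval for known-minor sessions (§ 4193b(c)(2));
# operators may remind more frequently.
REMINDER_INTERVAL = timedelta(minutes=30)

AI_REMINDER = (
    "Reminder: you are chatting with an AI, not a human. "
    "Consider taking a break."
)

def reminder_due(last_reminder: datetime, now: datetime) -> bool:
    """True when at least 30 minutes have elapsed since the last
    reminder in a continuing interaction with a known minor."""
    return now - last_reminder >= REMINDER_INTERVAL
```

The reminder text above is a hypothetical example; the statute requires only that the notification be clear, conspicuous, and convey both elements (take a break; the chatbot is artificially generated and not human).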
MN-01 Minor User AI Safety Protections · MN-01.6 · Deployer · Chatbot · Minors
9 V.S.A. § 4193b(c)(3)
Plain Language
Operators must institute a protocol to prevent their companion chatbot from producing visual material of sexually explicit conduct for users known to be minors, and from directly telling a minor to engage in sexually explicit conduct. 'Sexually explicit conduct' is defined by reference to 18 U.S.C. § 2256, the federal child exploitation statute. This is a distinct protocol obligation from the suicide/self-harm protocol in § 4193b(b) — operators need separate or combined protocols addressing both categories. Triggered only by actual knowledge that the user is a minor.
Statutory Text
(3) institute a protocol to prevent its companion chatbot from producing visual material of sexually explicit conduct or directly stating that the minor should engage in sexually explicit conduct.
R-03 Operational Performance Reporting · R-03.1–R-03.2 · Deployer · Chatbot
9 V.S.A. § 4193c(a)-(c)
Plain Language
Beginning July 1, 2027 (one year after the act's effective date), operators must submit an annual report to the Office of the Attorney General covering: (1) the number of crisis service provider referral notifications issued in the prior calendar year, and (2) the protocols in place to detect and respond to user expressions of suicidal ideation or self-harm and to prohibit the chatbot from producing such content. Reports must contain no user identifiers or personal information. The Attorney General's office will publish the reported data on its website. Because the report covers the preceding calendar year, operators should begin tracking crisis referral counts from July 1, 2026.
Statutory Text
(a) Beginning one year after the effective date of this act, an operator shall annually report to the Office of the Attorney General all of the following: (1) the number of times in the preceding calendar year the operator has issued a crisis service provider referral notification pursuant to subdivision 4193b(b)(2)(A) of this subchapter; and (2) the protocols put in place by the operator to: (A) detect and respond to expressions of suicidal ideation or self-harm by users; and (B) prohibit the companion chatbot from producing content about suicidal ideation, suicide, or self-harm with the user. (b) The reporting required by this section shall include only the information listed in subsection (a) of this section and shall not include any identifiers or personal information about users. (c) The Office of the Attorney General shall post on its website the data from a report received pursuant to this section.
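Operationally, subsection (a)(1) is a count of referral notifications over the preceding calendar year, and subsection (b) forbids including user identifiers. A sketch of the aggregation, assuming the operator logs only the date of each referral event (the field names are hypothetical; the bill specifies the report's content, not its format):

```python
from datetime import date

def build_annual_report(referral_dates: list[date], report_year: int,
                        protocol_url: str) -> dict:
    """Aggregate crisis-referral notifications for one calendar year
    into a report containing no user identifiers (§ 4193c(a)-(b))."""
    count = sum(1 for d in referral_dates if d.year == report_year)
    return {
        "reporting_year": report_year,
        "crisis_referral_notifications": count,
        "protocol_description_url": protocol_url,
    }
```

Logging only event dates, rather than per-user records, keeps the data source itself identifier-free, which simplifies compliance with subsection (b) and with the Attorney General's publication of the data under subsection (c).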