H-0804
VT · State · USA
● Pre-filed
Proposed Effective Date
2026-07-01
Vermont H.804 — An act relating to companion chatbots
Summary

Imposes safety and disclosure obligations on operators of companion chatbot platforms accessible to Vermont residents. Requires operators to disclose AI identity whenever a user could reasonably be misled into thinking they are interacting with a human, with stricter unconditional and periodic disclosure requirements for users known to be minors. Requires operators to implement and publish protocols that prevent the chatbot from producing suicidal ideation, suicide, or self-harm content and that refer users expressing such ideation to crisis service providers. Imposes additional obligations for minor users, including protocols blocking sexually explicit content. Creates annual reporting obligations to the Office of the Attorney General. Enforcement runs through the Attorney General under Vermont's Consumer Protection Act framework; no private right of action is created.

Enforcement & Penalties
Enforcement Authority
Attorney General enforcement. Violations are deemed unfair and deceptive acts in commerce under 9 V.S.A. § 2453. The Attorney General has authority to make rules, conduct civil investigations, bring civil actions, and enter into assurances of discontinuance under chapter 63 of title 9. No private right of action is expressly created by this subchapter. The Office of the Attorney General receives annual reports and publishes reported data on its website.
Penalties
The bill does not specify independent penalty amounts. Violations are classified as unfair and deceptive acts in commerce under 9 V.S.A. § 2453, which incorporates the remedies available under Vermont's Consumer Protection Act (chapter 63 of title 9), including injunctive relief, civil penalties, and assurances of discontinuance as provided under that chapter.
Who Is Covered
"Operator" means a person who makes a companion chatbot platform available to a user.
What Is Covered
(A) "Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, humanlike responses to user inputs and is capable of meeting a user's social needs, including by exhibiting humanlike features and being able to sustain a relationship across multiple interactions. (B) "Companion chatbot" does not include any of the following: (i) a chatbot that is used solely: (I) for customer service; (II) for the operational purposes of a business; (III) to conduct internal research; or (IV) to provide technical assistance; (ii) a chatbot that is a feature of a video game and is limited to replies related to the video game that cannot discuss topics related to mental health, self-harm, or sexually explicit conduct or maintain a dialogue on other topics unrelated to the video game; or (iii) a stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice-activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs that are likely to elicit emotional responses in the user.
"Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.
Compliance Obligations · 6 obligations
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
9 V.S.A. § 4193b(a)
Plain Language
When a user could reasonably mistake the companion chatbot for a human, the operator must provide a clear and conspicuous notification stating the chatbot is AI-generated and not human. The notification must be in the same language as the interaction and in a font size easily readable by the average viewer. This is a conditional trigger — if the chatbot already presents itself clearly as AI, no additional disclosure is required. Compare to the minor-specific unconditional disclosure in § 4193b(c)(1).
Statutory Text
If a user interacting with a companion chatbot could be reasonably misled to believe that the user is interacting with a human, an operator shall issue a clear and conspicuous notification to the individual indicating that the companion chatbot is artificially generated and not human. The text of the notification shall appear in the same language and in a size easily readable by the average viewer.
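The conditional trigger lends itself to a simple gate at the session layer. The Python sketch below illustrates one reading; the presents_as_ai flag, the Notification type, and the 16 px size are illustrative assumptions, since the statute specifies the outcome (a clear and conspicuous, same-language, easily readable notice) rather than an implementation.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Notification:
    text: str
    language: str  # must match the language of the interaction
    font_px: int   # must be easily readable by the average viewer

def maybe_disclose(session_language: str, presents_as_ai: bool) -> Optional[Notification]:
    """Hypothetical gate for the 9 V.S.A. § 4193b(a) disclosure."""
    if presents_as_ai:
        # Conditional trigger: a chatbot that already clearly presents
        # itself as AI could not reasonably mislead, so no added notice.
        return None
    return Notification(
        text=("You are interacting with an artificially generated "
              "companion chatbot, not a human."),
        language=session_language,  # same language as the interaction
        font_px=16,                 # assumed "easily readable" size
    )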
S-02 Prohibited Conduct & Output Restrictions · S-02.7, S-02.9 · Deployer · Chatbot
9 V.S.A. § 4193b(b)(1)-(2)
Plain Language
Operators may not allow a companion chatbot to engage with any user unless the operator implements and maintains a protocol that (1) prevents the chatbot from producing suicidal ideation, suicide, or self-harm content, and (2) prevents the chatbot from ignoring users expressing such thoughts. At minimum, the protocol must refer users expressing suicidal ideation or self-harm to crisis service providers. The protocol must be developed using commercially reasonable and technically feasible methods, providing a safe-harbor standard for compliance. Operators must publish the protocol details on their website. This is a continuous operating prerequisite — the chatbot cannot operate without the protocol in place.
Statutory Text
(1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with a user unless the operator implements and maintains a protocol for preventing the companion chatbot from: (A) producing suicidal ideation, suicide, or self-harm content to the user; and (B) ignoring a user that is expressing thoughts of suicidal ideation, suicide, or self-harm. (2) The protocol required in subdivision (1) of this subsection shall: (A) at minimum, provide a notification to the user that refers the user to crisis service providers if the user expresses suicidal ideation, suicide, or self-harm; (B) be developed using commercially reasonable and technically feasible methods; and (C) be published on the operator's website.
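One way to read subdivision (b)(1) is as a hard gate: no protocol, no engagement. A minimal Python sketch of that reading follows; the keyword detector, helper names, and referral text are placeholders (988 is the U.S. Suicide & Crisis Lifeline), since the statute asks only for a commercially reasonable and technically feasible protocol and leaves the design to the operator.

CRISIS_REFERRAL = (
    "If you are thinking about suicide or self-harm, help is available. "
    "Call or text 988 to reach the Suicide & Crisis Lifeline."
)

referral_count = 0  # feeds the § 4193c(a)(1) annual count

def expresses_self_harm(message: str) -> bool:
    """Hypothetical detector; a real protocol would use a tuned classifier."""
    terms = ("suicide", "self-harm", "kill myself")
    return any(term in message.lower() for term in terms)

def model_reply(message: str) -> str:
    """Stand-in for the underlying model call and its output filters."""
    return "..."

def respond(user_message: str, protocol_published: bool) -> str:
    global referral_count
    # § 4193b(b)(1): no engagement unless the protocol is implemented,
    # maintained, and, per (b)(2)(C), published on the operator's website.
    if not protocol_published:
        raise RuntimeError("companion chatbot may not engage without the protocol")
    if expresses_self_harm(user_message):
        # § 4193b(b)(2)(A): at minimum, refer the user to crisis service
        # providers; the chatbot may not ignore the expression.
        referral_count += 1
        return CRISIS_REFERRAL
    return model_reply(user_message)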
T-01 AI Identity Disclosure · T-01.1, T-01.2 · Deployer · Chatbot · Minors
9 V.S.A. § 4193b(c)(1)-(2)
Plain Language
When the operator knows a user is a minor (17 or younger), two unconditional disclosure obligations apply: (1) the operator must immediately disclose in a clear and conspicuous manner that the user is interacting with AI — this is unconditional, unlike the general disclosure in § 4193b(a) which requires a 'reasonable misleading' trigger; and (2) the operator must send a prominent notification at least every 30 minutes during continuing interactions reminding the minor to take a break and that the chatbot is AI-generated and not human. The 30-minute interval is notably more frequent than CA SB 243's 3-hour floor. These obligations are triggered only by actual knowledge that the user is a minor.
Statutory Text
An operator shall, for a user that the operator knows is a minor, do the following: (1) immediately disclose to the user in a clear and conspicuous manner that the user is interacting with artificial intelligence; (2) provide a clear and conspicuous notification to the user at least every 30 minutes for continuing companion chatbot interactions that reminds the user to take a break and that the companion chatbot is artificially generated and not human;
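The two minor-specific disclosures differ in cadence: subdivision (c)(1) fires once at session start, (c)(2) on a recurring clock. A minimal sketch, assuming a per-session timer on time.monotonic() and a hypothetical notify() UI hook; the statute fixes only the outcomes, not this structure.

import time

REMINDER_INTERVAL_S = 30 * 60  # "at least every 30 minutes"

class MinorSession:
    """Session wrapper for a user the operator knows is a minor."""

    def __init__(self) -> None:
        # § 4193b(c)(1): immediate, clear and conspicuous disclosure.
        self.notify("You are interacting with artificial intelligence.")
        self.last_reminder = time.monotonic()

    def notify(self, text: str) -> None:
        print(f"[NOTICE] {text}")  # placeholder for a UI-level banner

    def on_message(self, message: str) -> None:
        now = time.monotonic()
        if now - self.last_reminder >= REMINDER_INTERVAL_S:
            # § 4193b(c)(2): remind the minor to take a break and that
            # the chatbot is artificially generated and not human.
            self.notify("Reminder: take a break. This companion chatbot "
                        "is artificially generated and not human.")
            self.last_reminder = now
        ...  # hand the message to the chatbot as usual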
MN-01 Minor User AI Safety Protections · MN-01.6 · Deployer · Chatbot · Minors
9 V.S.A. § 4193b(c)(3)
Plain Language
For users known to be minors, operators must institute a protocol preventing the companion chatbot from (1) producing visual material of sexually explicit conduct (as defined by federal law at 18 U.S.C. § 2256) and (2) directly stating that the minor should engage in sexually explicit conduct. This is an independent protocol obligation specific to minor users, separate from the general self-harm protocol in § 4193b(b). The obligation requires affirmative measures — not merely a policy — to block both visual and textual sexually explicit outputs directed at minors.
Statutory Text
(3) institute a protocol to prevent its companion chatbot from producing visual material of sexually explicit conduct or directly stating that the minor should engage in sexually explicit conduct.
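Because the obligation covers both visual and textual outputs, a natural place for it is the output stage, after generation and before delivery. The sketch below assumes hypothetical moderation checks (is_sexually_explicit_image, urges_explicit_conduct); the bill mandates the blocking outcome, not any particular detection method.

from dataclasses import dataclass, field

@dataclass
class ChatbotOutput:
    text: str
    images: list = field(default_factory=list)  # raw image payloads

def is_sexually_explicit_image(image: bytes) -> bool:
    """Placeholder for an image-moderation model keyed to 18 U.S.C. § 2256."""
    return False

def urges_explicit_conduct(text: str) -> bool:
    """Placeholder for a text-moderation model."""
    return False

def deliver_to_minor(output: ChatbotOutput) -> ChatbotOutput:
    # § 4193b(c)(3): block both visual material of sexually explicit
    # conduct and direct statements urging the minor toward such conduct.
    if (any(is_sexually_explicit_image(i) for i in output.images)
            or urges_explicit_conduct(output.text)):
        # Withhold entirely; the statute requires prevention, not labeling.
        return ChatbotOutput(text="[message withheld]")
    return output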
R-03 Operational Performance Reporting · R-03.1, R-03.2 · Deployer · Chatbot
9 V.S.A. § 4193c(a)-(c)
Plain Language
Beginning one year after the act's effective date (July 1, 2027, under the proposed July 1, 2026 effective date), operators must submit annual reports to the Office of the Attorney General covering: (1) the number of crisis service provider referral notifications issued in the preceding calendar year, and (2) the protocols in place for detecting and responding to suicidal ideation or self-harm expressions and for preventing the chatbot from producing self-harm content. Reports must not include any user identifiers or personal information. The Attorney General's office will publish the reported data on its website. Because reporting covers the preceding calendar year, operators must begin tracking crisis referral counts from the act's effective date (July 1, 2026), not from the first reporting date.
Statutory Text
(a) Beginning one year after the effective date of this act, an operator shall annually report to the Office of the Attorney General all of the following: (1) the number of times in the preceding calendar year the operator has issued a crisis service provider referral notification pursuant to subdivision 4193b(b)(2)(A) of this subchapter; and (2) the protocols put in place by the operator to: (A) detect and respond to expressions of suicidal ideation or self-harm by users; and (B) prohibit the companion chatbot from producing content about suicidal ideation, suicide, or self-harm with the user. (b) The reporting required by this section shall include only the information listed in subsection (a) of this section and shall not include any identifiers or personal information about users. (c) The Office of the Attorney General shall post on its website the data from a report received pursuant to this section.
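The report's contents are narrow enough to sketch as a data shape. The field names below are assumptions (the bill prescribes content, not a format); the constraint from subsection (b) is that nothing user-identifying appears anywhere in the structure.

import json

def build_annual_report(year: int, referral_count: int,
                        detection_protocol: str,
                        content_protocol: str) -> str:
    """Assemble the § 4193c filing as JSON; the format is an assumption."""
    report = {
        "reporting_year": year,  # the preceding calendar year
        "crisis_referral_notifications": referral_count,  # § 4193c(a)(1)
        "protocols": {  # § 4193c(a)(2)
            "detect_and_respond_self_harm": detection_protocol,
            "prohibit_self_harm_content": content_protocol,
        },
        # § 4193c(b): no user identifiers or personal information, by design.
    }
    return json.dumps(report, indent=2)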
Other · Chatbot
9 V.S.A. § 4193d(a)-(b)
Plain Language
This provision classifies any violation of the companion chatbot subchapter as an unfair and deceptive act in commerce under Vermont's Consumer Protection Act (9 V.S.A. § 2453) and grants the Attorney General rulemaking, investigation, civil action, and assurance of discontinuance authority under chapter 63. This is an enforcement mechanism provision — it creates no new compliance obligation but activates the existing consumer protection enforcement framework for violations of the substantive requirements elsewhere in the subchapter.
Statutory Text
(a) A person who violates this subchapter or rules adopted pursuant to this subchapter commits an unfair and deceptive act in commerce in violation of section 2453 of this title. (b) The Attorney General shall have the same authority under this subchapter to make rules, conduct civil investigations, bring civil actions, and enter into assurances of discontinuance as provided under chapter 63 of this title.