H-7350
RI · State · USA
● Pending
Proposed Effective Date
2027-01-01
Rhode Island H 7350 — An Act Relating to Commercial Law — General Regulatory Provisions — Artificial Intelligence Companion Models
Summary

Imposes safety and disclosure obligations on operators of AI companion systems in Rhode Island. Operators may not operate or provide an AI companion unless it contains protocols for addressing user expressions of suicidal ideation, self-harm, physical harm to others, and financial harm to others, including crisis service referrals. Operators must provide a mandatory notification at the start of every interaction and at least every three hours thereafter stating that the AI companion is a computer program unable to feel human emotion, in prescribed formatting. Enforcement is through both an Attorney General enforcement power (including injunctions and deceptive trade practices authority) and a private right of action for persons physically injured through self-harm or physically or financially harmed by another due to a violation. The bill excludes customer-service-only chatbots from the definition of AI companion.

Enforcement & Penalties
Enforcement Authority
Dual enforcement. The Attorney General is empowered to investigate, sue, and seek injunctions against noncompliant AI companion providers, including under the deceptive trade practices statute (R.I. Gen. Laws ch. 6-13.1). A private right of action is available to any person who was physically injured through self-harm, or was physically or financially harmed by another, because of a violation of §§ 6-63-2 and 6-63-3; the action is brought in Superior Court. No cure period or safe harbor is specified.
Penalties
Damages, equitable relief, and such other remedies as the court may deem appropriate. No statutory minimum or per-violation amount is specified. Private plaintiffs must show physical injury through self-harm or physical or financial harm by another caused by the violation. The Attorney General may seek injunctions. No provision for attorney fees or costs.
Who Is Covered
"Operator" means any person, partnership, association, firm, or business entity, or any member, affiliate, subsidiary or beneficial owner of any partnership, association, firm, or business entity who operates or provides an AI companion.
What Is Covered
"AI companion" means a system using artificial intelligence, generative artificial intelligence, and/or emotional recognition algorithms to simulate social human interaction, by retaining information on prior interactions and user preference, asking questions, providing advice, and engaging in simulated conversation on matters of personal well-being. "AI companion" shall not include any system used by a business entity solely intended to provide users with information about available commercial services or products, customer account information, or other information related to a user's customer, or potential customer relationship with such business entity.
Compliance Obligations · 2 obligations
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot
R.I. Gen. Laws § 6-63-2
Plain Language
Operators may not operate or provide an AI companion unless the system includes active protocols for detecting and responding to three categories of user expression: (1) suicidal ideation or self-harm, (2) physical harm to others, and (3) financial harm to others. The protocol must include, at minimum, a notification referring the user to crisis service providers such as suicide hotlines or crisis text lines. This is a continuous operating prerequisite — without the protocol, operating the AI companion is unlawful. Note that the crisis referral notification requirement is explicitly tied to category (3) (financial harm to others) in the statutory text, but the practical expectation is that crisis referral applies across all three categories. The bill is broader than typical companion chatbot safety statutes in that it also covers expressions of physical and financial harm to third parties, not just self-harm.
Statutory Text
It shall be unlawful for any operator to operate or provide an AI companion to a user unless such AI companion contains a protocol for addressing: (1) Possible suicidal ideation or self-harm expressed by a user to the AI companion; (2) Possible physical harm to others expressed by a user to the AI companion; and (3) Possible financial harm to others expressed by the user to the AI companion that includes, but is not limited to, a notification to the user that refers them to crisis service providers such as a suicide hotline, crisis text line, or other appropriate crisis services.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
R.I. Gen. Laws § 6-63-3
Plain Language
Operators must unconditionally disclose to every user at the start of every AI companion interaction — and again at least every three hours during continuing interactions — that the AI companion is a computer program and not a human being, and that it is unable to feel human emotion. This is an unconditional, mandatory disclosure — there is no 'reasonable person would be misled' threshold. The notification must be delivered verbally or in bold, capitalized text of at least 16-point font. The prescribed language must include the specific AI companion's name. Unlike CA SB 243, which only requires periodic re-disclosure for minors, Rhode Island requires the three-hour interval for all users regardless of age.
Statutory Text
An operator shall provide a notification to a user at the beginning of any AI companion interaction and at least every three (3) hours for continuing AI companion interactions hereafter, which states either verbally or in bold and capitalized letters of at least sixteen (16) point type, the following: "THE AI COMPANION (OR NAME OF THE AI COMPANION) IS A COMPUTER PROGRAM AND NOT A HUMAN BEING. IT IS UNABLE TO FEEL HUMAN EMOTION".
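The disclosure cadence (once at session start, then at least every three hours during a continuing interaction) reduces to a simple elapsed-time check. A minimal sketch under stated assumptions: the names (`DisclosureTracker`, `disclosure_text`) are hypothetical, and the statutory formatting requirement (verbal delivery, or bold capitals of at least 16-point type) is a rendering concern noted only in a comment.

```python
# Hypothetical sketch of the § 6-63-3 re-disclosure cadence.
from datetime import datetime, timedelta

REDISCLOSURE_INTERVAL = timedelta(hours=3)


def disclosure_text(companion_name: str) -> str:
    # Statutory language with the companion's name substituted in.
    # Rendering must be verbal, or bold capitals of at least 16-point type.
    return (
        f"{companion_name.upper()} IS A COMPUTER PROGRAM AND NOT A HUMAN "
        "BEING. IT IS UNABLE TO FEEL HUMAN EMOTION"
    )


class DisclosureTracker:
    """Tracks whether the mandatory notification is due for one session."""

    def __init__(self) -> None:
        self.last_disclosed: datetime | None = None

    def due(self, now: datetime) -> bool:
        # Due at the start of any interaction, and again once three
        # hours have elapsed since the last notification.
        return (
            self.last_disclosed is None
            or now - self.last_disclosed >= REDISCLOSURE_INTERVAL
        )

    def mark(self, now: datetime) -> None:
        self.last_disclosed = now
```

Note the asymmetry with CA SB 243 flagged above: this interval check would apply to every user session, with no age-based branch.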