A-06767
NY · State · USA
● Pending
Proposed Effective Date
2025-09-09
New York Assembly Bill 6767 — An Act to amend the general business law, in relation to artificial intelligence companion models
Summary

Imposes safety and disclosure obligations on operators of AI companion systems in New York. Operators may not provide an AI companion to a user unless the companion contains a protocol for addressing possible suicidal ideation or self-harm, possible physical harm to others, and possible financial harm to others expressed by the user, including crisis service referral notifications. Operators must also provide a mandatory disclosure at the beginning of every interaction and at least every three hours thereafter, in bold capitalized text of at least 16-point type, informing the user that the AI companion is a computer program and not a human being. Enforcement is through a private right of action available to persons who were physically injured through self-harm or physically or financially harmed by another as a result of a violation. No statutory minimum damages are specified; courts may award damages, equitable relief, and other appropriate remedies.

Enforcement & Penalties
Enforcement Authority
Private right of action. No designated agency enforcer. Any person who was physically injured through self-harm or was physically or financially harmed by another as a result of a violation may bring an action in a court of competent jurisdiction. Standing requires actual physical injury through self-harm or physical or financial harm caused by another person, traceable to the operator's violation.
Penalties
Damages, equitable relief, and such other remedies as the court may deem appropriate. No statutory minimum or per-violation amount is specified. Plaintiff must have been physically injured through self-harm or physically or financially harmed by another as a result of the violation. The statute does not address attorney fees or costs.
Who Is Covered
"Operator" means any person, partnership, association, firm, or business entity, or any member, affiliate, subsidiary or beneficial owner of any partnership, association, firm, or business entity who operates or provides an AI companion.
What Is Covered
"AI companion" means a system using artificial intelligence, generative artificial intelligence, and/or emotional recognition algorithms to simulate social human interaction, by retaining information on prior interactions and user preference, asking questions, providing advice, and engaging in simulated conversation on matters of personal well-being. "AI companion" shall not include any system used by a business entity solely intended to provide users with information about available commercial services or products, customer account information, or other information related to a user's customer, or potential customer, relationship with such business entity.
Compliance Obligations · 3 obligations
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot
Gen. Bus. Law § 1701
Plain Language
Operators may not operate or provide an AI companion at all unless the system includes a protocol for addressing three categories of user-expressed risk: (1) suicidal ideation or self-harm, (2) physical harm to others, and (3) financial harm to others. The protocol must include, at minimum, a notification referring the user to crisis service providers such as a suicide hotline or crisis text line. This is broader than CA SB 243's crisis protocol, which covers only suicidal ideation and self-harm — this bill adds protocols for physical harm to others and financial harm to others. This is a continuous operating prerequisite: an operator cannot lawfully run the companion without the protocol in place.
Statutory Text
It shall be unlawful for any operator to operate or provide an AI companion to a user unless such AI companion contains a protocol for addressing: 1. possible suicidal ideation or self-harm expressed by a user to the AI companion, 2. possible physical harm to others expressed by a user to the AI companion, and 3. possible financial harm to others expressed by the user to the AI companion, that includes but is not limited to, a notification to the user that refers them to crisis service providers such as a suicide hotline, crisis text line, or other appropriate crisis services.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Gen. Bus. Law § 1702
Plain Language
Operators must unconditionally disclose to every user at the start of every AI companion interaction that the system is a computer program and not a human being, and that it is unable to feel human emotion. This disclosure must be repeated at least every three hours during continuing interactions. The disclosure must be provided either verbally or in bold, capitalized text of at least 16-point type. Unlike CA SB 243's conditional trigger (disclosure only when a reasonable person could be misled), this is unconditional — every interaction, every user, regardless of whether the user could reasonably be misled. The statute prescribes the exact mandatory language to be used.
Statutory Text
An operator shall provide a notification to a user at the beginning of any AI companion interaction and at least every three hours for continuing AI companion interactions thereafter, which states either verbally or in bold and capitalized letters of at least sixteen point type, the following: "THE AI COMPANION (OR NAME OF THE AI COMPANION) IS A COMPUTER PROGRAM AND NOT A HUMAN BEING. IT IS UNABLE TO FEEL HUMAN EMOTION".
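The timing rule in § 1702 (once at the start of an interaction, then at least every three hours) reduces to a simple scheduling check. A minimal sketch, assuming a per-interaction tracker object (the class and method names are illustrative; the presentation requirement of verbal delivery or bold, capitalized, 16-point-minimum text is a rendering-layer concern not modeled here):

```python
from datetime import datetime, timedelta

# Exact mandated language from Gen. Bus. Law § 1702 (the companion's name
# may be substituted for "THE AI COMPANION").
DISCLOSURE = ("THE AI COMPANION (OR NAME OF THE AI COMPANION) IS A COMPUTER "
              "PROGRAM AND NOT A HUMAN BEING. IT IS UNABLE TO FEEL HUMAN "
              "EMOTION")

# "at least every three hours for continuing AI companion interactions"
DISCLOSURE_INTERVAL = timedelta(hours=3)


class DisclosureTracker:
    """Tracks when the § 1702 notification is due within one interaction."""

    def __init__(self) -> None:
        self._last_shown: datetime | None = None

    def disclosure_due(self, now: datetime) -> bool:
        # Due at the beginning of the interaction (nothing shown yet) and
        # whenever three hours have elapsed since the last disclosure.
        return (self._last_shown is None
                or now - self._last_shown >= DISCLOSURE_INTERVAL)

    def mark_shown(self, now: datetime) -> None:
        self._last_shown = now
```

Checking `disclosure_due` before rendering each response (and calling `mark_shown` whenever the notice is displayed) satisfies both the start-of-interaction and recurring prongs of the timing rule.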
Other · Chatbot
Gen. Bus. Law § 1703
Plain Language
This provision creates the statute's enforcement mechanism — a private right of action — rather than a new compliance obligation. A natural person who was physically injured through self-harm, or physically or financially harmed by another person, as a result of a violation of § 1701 (crisis protocol requirement) or § 1702 (notification requirement), may sue for damages, equitable relief, and other remedies. Standing is limited to persons who suffered actual harm traceable to the violation.
Statutory Text
Any person who was physically injured through self-harm or was physically or financially harmed by another as a result of a violation of section seventeen hundred one or seventeen hundred two of this article may bring an action in a court of competent jurisdiction for damages, equitable relief, and such other remedies as the court may deem appropriate.