AB-6767
NY · State · USA
● Pending
New York Assembly Bill 6767 — An Act to amend the general business law, in relation to artificial intelligence companion models
Summary

NY AB 6767 imposes safety and disclosure obligations on operators of AI companion systems — AI systems that simulate social human interaction, retain user preferences, and engage in conversations about personal well-being. Operators must maintain protocols for addressing user expressions of suicidal ideation, self-harm, physical harm to others, and financial harm to others, including crisis service referrals. Operators must also provide a mandatory disclosure at the start of every interaction and every three hours thereafter stating that the AI companion is a computer program, not a human, and cannot feel human emotion. Enforcement is exclusively through a private right of action available to persons physically injured through self-harm or physically or financially harmed by another as a result of a violation. The bill is currently pending in the Assembly Committee on Consumer Affairs and Protection.

Enforcement & Penalties
Enforcement Authority
Private right of action; no designated agency enforcer. A person who was physically injured through self-harm, or physically or financially harmed by another, as a result of a violation may bring an action in a court of competent jurisdiction; that injury requirement is also the standing requirement. No cure period or safe harbor is specified.
Penalties
Remedies include damages, equitable relief, and such other remedies as the court may deem appropriate. No statutory minimum or per-violation amount is specified. Because standing requires that the plaintiff was physically injured through self-harm or physically or financially harmed by another, actual harm is effectively a prerequisite to recovery. Attorney fees are not addressed.
Who Is Covered
"Operator" means any person, partnership, association, firm, or business entity, or any member, affiliate, subsidiary or beneficial owner of any partnership, association, firm, or business entity who operates or provides an AI companion.
What Is Covered
"AI companion" means a system using artificial intelligence, generative artificial intelligence, and/or emotional recognition algorithms to simulate social human interaction, by retaining information on prior interactions and user preference, asking questions, providing advice, and engaging in simulated conversation on matters of personal well-being. "AI companion" shall not include any system used by a business entity solely intended to provide users with information about available commercial services or products, customer account information, or other information related to a user's customer, or potential customer, relationship with such business entity.
Compliance Obligations · 3 obligations
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot
Gen. Bus. Law § 1701
Plain Language
Operators may not operate or provide an AI companion at all unless the system contains a protocol for addressing three categories of user-expressed risk: (1) suicidal ideation or self-harm, (2) physical harm to others, and (3) financial harm to others. The protocol must include, at minimum, a notification referring the user to crisis service providers such as a suicide hotline or crisis text line. This is a continuous operating prerequisite — the protocol must remain active as a condition of operation. Notably, the bill extends crisis protocols beyond self-harm to cover expressions of intent to physically or financially harm others, which is broader than comparable companion chatbot statutes like CA SB 243.
Statutory Text
It shall be unlawful for any operator to operate or provide an AI companion to a user unless such AI companion contains a protocol for addressing: 1. possible suicidal ideation or self-harm expressed by a user to the AI companion, 2. possible physical harm to others expressed by a user to the AI companion, and 3. possible financial harm to others expressed by the user to the AI companion, that includes but is not limited to, a notification to the user that refers them to crisis service providers such as a suicide hotline, crisis text line, or other appropriate crisis services.
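The § 1701 protocol requirement can be illustrated with a minimal sketch. Everything here is hypothetical: the statute does not prescribe an implementation, the risk detection itself (a classifier or similar) is assumed to exist upstream, and the referral wording is illustrative only — § 1701 requires, at minimum, a notification referring the user to crisis service providers such as a suicide hotline or crisis text line.

```python
from enum import Enum


class RiskCategory(Enum):
    """The three categories of user-expressed risk that an AI companion's
    protocol must address under Gen. Bus. Law § 1701."""
    SELF_HARM = "possible suicidal ideation or self-harm"
    PHYSICAL_HARM_TO_OTHERS = "possible physical harm to others"
    FINANCIAL_HARM_TO_OTHERS = "possible financial harm to others"


def crisis_referral(category: RiskCategory) -> str:
    """Return the statutory minimum response: a notification referring the
    user to crisis services. Wording is illustrative, not statutory."""
    referrals = {
        RiskCategory.SELF_HARM: "a suicide hotline or crisis text line",
        RiskCategory.PHYSICAL_HARM_TO_OTHERS: "appropriate crisis services",
        RiskCategory.FINANCIAL_HARM_TO_OTHERS: "appropriate crisis services",
    }
    return f"You may wish to contact {referrals[category]}."


def handle_detected_risks(detected: set[RiskCategory]) -> list[str]:
    """Given risk categories flagged by an upstream detector (not shown),
    emit the required crisis referral notification for each."""
    return [crisis_referral(c) for c in sorted(detected, key=lambda c: c.name)]
```

Note that the protocol is a continuous operating prerequisite: under § 1701 it must cover all three categories at all times, so a conforming system would route every user message through a check of this kind rather than enabling it selectively.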
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Gen. Bus. Law § 1702
Plain Language
Operators must provide a mandatory disclosure to every user at the start of every AI companion interaction — unconditionally, not only when a reasonable person could be misled. For continuing interactions, the same disclosure must be repeated at least every three hours. The disclosure must state that the AI companion is a computer program, not a human being, and that it cannot feel human emotion. The disclosure must be delivered either verbally or in bold, capitalized text of at least 16-point type. The statute prescribes exact mandatory language, which is notably more prescriptive than comparable statutes like CA SB 243 that leave the specific wording to the operator. This obligation applies to all users regardless of age.
Statutory Text
An operator shall provide a notification to a user at the beginning of any AI companion interaction and at least every three hours for continuing AI companion interactions thereafter, which states either verbally or in bold and capitalized letters of at least sixteen point type, the following: "THE AI COMPANION (OR NAME OF THE AI COMPANION) IS A COMPUTER PROGRAM AND NOT A HUMAN BEING. IT IS UNABLE TO FEEL HUMAN EMOTION".
Other · Chatbot
Gen. Bus. Law § 1703
Plain Language
This provision creates the private right of action that is the sole enforcement mechanism for the statute. It grants standing to any person who was physically injured through self-harm or was physically or financially harmed by another as a result of a violation of the crisis protocol (§ 1701) or notification (§ 1702) requirements. Available remedies include damages, equitable relief, and any other remedies the court deems appropriate. This provision does not create a new compliance obligation — it establishes how violations of the operative sections are enforced.
Statutory Text
Any person who was physically injured through self-harm or was physically or financially harmed by another as a result of a violation of section seventeen hundred one or seventeen hundred two of this article may bring an action in a court of competent jurisdiction for damages, equitable relief, and such other remedies as the court may deem appropriate.