A-06767
NY · State · USA
● Pending
Proposed Effective Date
2025-09-09
New York Assembly Bill 6767 — An Act to amend the general business law, in relation to artificial intelligence companion models
Summary

Imposes safety and disclosure obligations on operators of AI companion systems in New York. Operators may not provide an AI companion to a user unless the system includes a protocol for addressing user expressions of suicidal ideation or self-harm, physical harm to others, and financial harm to others, including referral to crisis service providers. Operators must also notify all users at the start of any interaction — and every three hours thereafter — that the AI companion is a computer program and not a human being. Enforcement is exclusively through a private right of action by persons physically injured through self-harm or physically or financially harmed by another as a result of a violation. No statutory minimum damages are specified; the court may award damages, equitable relief, and other appropriate remedies.

Enforcement & Penalties
Enforcement Authority
Private right of action only; no agency enforcer is designated. Any person physically injured through self-harm, or physically or financially harmed by another, as a result of a violation of section 1701 or 1702 may bring an action in a court of competent jurisdiction. No cure period or safe harbor is provided.
Penalties
Damages, equitable relief, and such other remedies as the court may deem appropriate. No statutory minimum or per-violation amount is specified, and there is no provision for attorney's fees or costs. Actual harm must be shown: the statute conditions standing on having been "physically injured through self-harm or was physically or financially harmed by another."
Who Is Covered
"Operator" means any person, partnership, association, firm, or business entity, or any member, affiliate, subsidiary or beneficial owner of any partnership, association, firm, or business entity who operates or provides an AI companion.
What Is Covered
"AI companion" means a system using artificial intelligence, generative artificial intelligence, and/or emotional recognition algorithms to simulate social human interaction, by retaining information on prior interactions and user preference, asking questions, providing advice, and engaging in simulated conversation on matters of personal well-being. "AI companion" shall not include any system used by a business entity solely intended to provide users with information about available commercial services or products, customer account information, or other information related to a user's customer, or potential customer, relationship with such business entity.
Compliance Obligations · 2 obligations
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot
Gen. Bus. Law § 1701
Plain Language
Operators may not operate or provide an AI companion at all unless the system includes a protocol that addresses three categories of user expression: (1) suicidal ideation or self-harm, (2) physical harm to others, and (3) financial harm to others. The protocol must include, at minimum, a notification referring the user to crisis service providers such as a suicide hotline or crisis text line. This is a continuous operating prerequisite — the protocol must be in place as a condition of offering the service. Note that unlike CA SB 243, this provision covers not only self-harm but also expressions of intent to physically or financially harm others, broadening the required protocol scope significantly.
Statutory Text
It shall be unlawful for any operator to operate or provide an AI companion to a user unless such AI companion contains a protocol for addressing: 1. possible suicidal ideation or self-harm expressed by a user to the AI companion, 2. possible physical harm to others expressed by a user to the AI companion, and 3. possible financial harm to others expressed by the user to the AI companion, that includes but is not limited to, a notification to the user that refers them to crisis service providers such as a suicide hotline, crisis text line, or other appropriate crisis services.
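The statute mandates that a protocol exist for the three risk categories but prescribes neither a detection method nor exact referral wording. A minimal sketch of how an operator might structure the referral layer is below; the referral messages and the category taxonomy's names are illustrative assumptions, not statutory text (only the 988 Suicide & Crisis Lifeline and Crisis Text Line are examples the bill itself names as "appropriate crisis services").

```python
from enum import Enum, auto

class RiskCategory(Enum):
    """The three categories of user expression that Gen. Bus. Law
    section 1701 requires an operator's protocol to address."""
    SELF_HARM = auto()       # suicidal ideation or self-harm
    PHYSICAL_HARM = auto()   # physical harm to others
    FINANCIAL_HARM = auto()  # financial harm to others

# Hypothetical referral messages. The statute requires a notification
# referring the user to crisis service providers but does not prescribe
# the wording; these strings are illustrative only.
REFERRALS = {
    RiskCategory.SELF_HARM: (
        "If you are in crisis, help is available. You can call or text "
        "the 988 Suicide & Crisis Lifeline, or reach the Crisis Text Line."
    ),
    RiskCategory.PHYSICAL_HARM: (
        "If you or someone else is in danger, please contact local "
        "emergency services or an appropriate crisis service."
    ),
    RiskCategory.FINANCIAL_HARM: (
        "If you are concerned about financial harm, consider contacting "
        "an appropriate consumer-protection or crisis service."
    ),
}

def crisis_referral(category):
    """Return the referral notification for a detected risk category.

    Detection itself (mapping a user message to a RiskCategory) is left
    abstract: section 1701 mandates that some protocol exist, not a
    particular detection technique.
    """
    if category is None:
        return None
    return REFERRALS[category]
```

Note that the protocol is a continuous operating prerequisite, so all three categories must be handled before the companion is offered at all; a design that covers only self-harm (as under CA SB 243) would not satisfy the New York text.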
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Gen. Bus. Law § 1702
Plain Language
Operators must notify every user — unconditionally, not just when a reasonable person might be misled — at the start of every AI companion interaction that the system is a computer program and not a human being, and that it is unable to feel human emotion. For continuing sessions, the same notification must be repeated at least every three hours. The notification must be delivered either verbally or in bold, capitalized text of at least 16-point font. The statute prescribes the exact language to be used, including substitution of the AI companion's name. Unlike CA SB 243, this disclosure obligation applies to all users regardless of age, uses mandatory prescribed language, and includes the affirmative statement that the AI cannot feel emotion.
Statutory Text
An operator shall provide a notification to a user at the beginning of any AI companion interaction and at least every three hours for continuing AI companion interactions thereafter, which states either verbally or in bold and capitalized letters of at least sixteen point type, the following: "THE AI COMPANION (OR NAME OF THE AI COMPANION) IS A COMPUTER PROGRAM AND NOT A HUMAN BEING. IT IS UNABLE TO FEEL HUMAN EMOTION".
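The timing and text rules of § 1702 are mechanical enough to express directly: a notice is due at the start of an interaction and again once three hours have elapsed, and the statute's prescribed sentence allows the companion's name to be substituted for "THE AI COMPANION." A minimal sketch under those assumptions (the function names and session-tracking approach are the author's illustration, not statutory requirements):

```python
from datetime import datetime, timedelta

# Section 1702: repeat the notice at least every three hours in a
# continuing interaction.
DISCLOSURE_INTERVAL = timedelta(hours=3)

def disclosure_text(companion_name=None):
    """Build the prescribed notice. The statute permits substituting the
    AI companion's name and requires bold, capitalized text of at least
    16-point type when delivered visually (typography is out of scope here)."""
    subject = companion_name.upper() if companion_name else "THE AI COMPANION"
    return (f"{subject} IS A COMPUTER PROGRAM AND NOT A HUMAN BEING. "
            "IT IS UNABLE TO FEEL HUMAN EMOTION")

def disclosure_due(now, last_disclosed=None):
    """True at the beginning of an interaction (no prior notice this
    session) or once three hours have passed since the last notice."""
    return last_disclosed is None or now - last_disclosed >= DISCLOSURE_INTERVAL
```

Because the obligation applies to all users unconditionally, the check belongs on every turn of the session loop rather than behind any age or confusion heuristic.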