SB-3008
NY · State · USA
● Enacted
Effective Date
2025-11-05
New York S. 3008-C / A. 3008-C — New York State Budget Bill (Part U: Artificial Intelligence Companion Models)
Summary

Part U of the New York State Budget Bill creates Article 47 of the General Business Law, imposing two operative obligations on operators of AI companion systems: (1) a mandatory crisis protocol for detecting and responding to user expressions of suicidal ideation or self-harm, including referral to the 988 hotline and other crisis services; and (2) an unconditional AI identity disclosure at the start of every interaction and at least every three hours during continuing interactions. NY's disclosure obligation is unconditional — it applies regardless of whether a reasonable person would be misled — and the periodic re-disclosure floor applies to all users, not just minors. Enforcement is AG-only with civil penalties up to $15,000 per day per violation; there is no private right of action. Penalties are deposited into a newly created Suicide Prevention Fund.

Enforcement & Penalties
Enforcement Authority
New York Attorney General. The AG may bring an action in supreme court for violations of § 1701 (crisis protocol) or § 1702 (AI identity disclosure). No private right of action.
Penalties
Civil penalty of up to $15,000 per day for each violation of § 1701 (crisis protocol) or § 1702 (AI identity disclosure). Injunctive relief and other remedies as the court may deem appropriate. All fees, fines, and penalties deposited into the Suicide Prevention Fund established by § 99-ss of the State Finance Law.
Who Is Covered
Operator: any person, partnership, association, firm, or business entity, or any member, affiliate, subsidiary or beneficial owner of any partnership, association, firm, or business entity who operates for or provides an AI companion to a user.
What Is Covered
AI companion: a system using artificial intelligence, generative artificial intelligence, and/or emotional recognition algorithms designed to simulate a sustained human or human-like relationship with a user by: (i) retaining information on prior interactions or user sessions and user preferences to personalize the interaction and facilitate ongoing engagement with the AI companion; (ii) asking unprompted or unsolicited emotion-based questions that go beyond a direct response to a user prompt; and (iii) sustaining an ongoing dialogue concerning matters personal to the user.
'AI companion' shall not include: (i) any system used by a business entity solely for customer service or to strictly provide users with information about available commercial services or products provided by such entity, customer service account information, or other information strictly related to its customer service; (ii) any system that is primarily designed and marketed for providing efficiency improvements, research, or technical assistance; or (iii) any system used by a business entity solely for internal purposes or employee productivity.
Emotional recognition algorithms: artificial intelligence that detects and interprets human emotional signals in text (using natural language processing and sentiment analysis), audio (using voice emotion AI), video (using facial movement analysis, gait analysis, or physiological signals), or a combination thereof.
Compliance Obligations
MN-02 AI Crisis Response Protocols · MN-02.1 · Deployer · Chatbot
General Business Law § 1701
Plain Language
Operators must build a crisis response protocol into the AI companion, and maintain it, before the companion may be offered to users. The protocol must include, at minimum: (1) detection of user expressions of suicidal ideation or self-harm, and (2) upon detection, a notification to the user directing them to the 988 Suicide & Crisis Lifeline, a crisis text line, or other appropriate crisis services. This is a product-design obligation — the protocol must exist in the system itself, not just as an operator policy. The obligation is unconditional and applies to all operators of AI companions regardless of user demographics.
Statutory Text
It shall be unlawful for any operator to operate for or provide an AI companion to a user unless such AI companion contains a protocol to take reasonable efforts for detecting and addressing suicidal ideation or expressions of self-harm expressed by a user to the AI companion, that includes but is not limited to, detection of user expressions of suicidal ideation or self-harm, and a notification to the user that refers them to crisis service providers such as the 9-8-8 suicide prevention and behavioral health crisis hotline under section 36.03 of the mental hygiene law, a crisis text line, or other appropriate crisis services upon detection of such user's expressions of suicidal ideation or self-harm.
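For engineering teams mapping the § 1701 minimum onto a product, the obligation can be sketched as a middleware step that screens each user message and attaches a crisis referral when self-harm signals are detected. Everything below is a hypothetical illustration: the function names, the keyword list, and the resource text are assumptions, not drawn from the statute, and a production system would use a trained classifier with clinical review rather than keyword matching.

```python
from dataclasses import dataclass

# Referral text naming the crisis services the statute lists (988, crisis
# text line). Exact wording here is an assumption, not statutory language.
CRISIS_RESOURCES = (
    "If you are in crisis, help is available: call or text 988 "
    "(Suicide & Crisis Lifeline), or reach a crisis text line."
)

# Placeholder signal list for illustration only; real detection would use
# an NLP classifier, not substring matching.
_SELF_HARM_SIGNALS = ("want to die", "kill myself", "hurt myself", "end my life")

@dataclass
class CompanionReply:
    text: str
    crisis_referral: bool  # True when the crisis notification was attached

def apply_crisis_protocol(user_message: str, model_reply: str) -> CompanionReply:
    """Attach a crisis-service referral when the user's message expresses
    suicidal ideation or self-harm, per the detect-and-notify minimum."""
    lowered = user_message.lower()
    if any(signal in lowered for signal in _SELF_HARM_SIGNALS):
        return CompanionReply(
            text=f"{CRISIS_RESOURCES}\n\n{model_reply}",
            crisis_referral=True,
        )
    return CompanionReply(text=model_reply, crisis_referral=False)
```

Because the statute requires the protocol to exist in the AI companion itself, a check like this would sit in the serving path of every interaction, not in a separate moderation queue.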
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
General Business Law § 1702
Plain Language
Operators must deliver a clear and conspicuous disclosure, either verbal or written, that the user is not communicating with a human. This disclosure is required: (1) at the start of every AI companion interaction (initial disclosure), though it need not be given more than once per day, and (2) at least every three hours during any continuing interaction (periodic re-disclosure). The obligation is unconditional — it applies to all users regardless of whether a reasonable person would be misled.
Statutory Text
An operator shall provide a clear and conspicuous notification to a user at the beginning of any AI companion interaction which need not exceed once per day and at least every three hours for continuing AI companion interactions which states either verbally or in writing that the user is not communicating with a human.