SB-0760
MI · State · USA
Status: Pending
Proposed Effective Date
2026-01-01
Michigan Senate Bill No. 760 — Leading Ethical AI Development for Kids Act
Summary

Prohibits operators from making companion chatbots available to minors unless the chatbot is not foreseeably capable of producing specified categories of harmful content, including encouraging self-harm, suicidal ideation, violence, drug use, disordered eating, unsupervised mental health therapy, sexual content, or engagement optimization that overrides safety guardrails. The act initially applies only when an operator has actual knowledge a user is a minor, but beginning January 1, 2027, actual knowledge is no longer required. Enforcement is available both through attorney general civil actions ($25,000 per violation) and a private right of action for covered minors (or their parents/guardians) who suffer actual harm, with actual damages, punitive damages, attorney fees, and injunctive relief available.

Enforcement & Penalties
Enforcement Authority
Attorney general may bring a civil action for violations. A covered minor who suffers actual harm, or a parent or guardian acting on behalf of that covered minor, may bring a civil action against an operator; private plaintiffs must demonstrate actual harm. Beginning January 1, 2027, the obligations apply regardless of whether the operator has actual knowledge that a user is a minor.
Penalties
Attorney general may seek civil fine of $25,000 per violation, injunctive or declaratory relief, and reasonable attorney fees. Private plaintiffs (covered minors or parents/guardians) may recover actual damages, punitive damages, reasonable attorney fees and costs, injunctive or declaratory relief, and any other relief the court considers appropriate. Private plaintiffs must show actual harm. The $25,000 per violation civil fine is available only to the attorney general.
Who Is Covered
"Operator" means a person that makes a companion chatbot available to users.
What Is Covered
"Companion chatbot" means, except as otherwise provided in subdivision (d), a generative artificial intelligence system with natural language interface that simulates a sustained humanlike relationship with a user by doing all of the following: (i) Retaining information on prior interactions or user sessions and user preferences to personalize the interaction and facilitate ongoing engagement with the companion chatbot. (ii) Asking unprompted or unsolicited emotion-based questions that go beyond a direct response to a user prompt. (iii) Sustaining an ongoing dialogue concerning matters personal to the user.
Companion chatbot does not include any of the following: (i) Any system used by a business entity solely for customer service or to strictly provide users with information about available commercial services or products provided by that entity, customer service account information, or other information strictly related to its customer service. (ii) Any system that is solely designed and marketed for providing efficiency improvements or research or technical assistance. (iii) Any system used by a business entity solely for internal purposes or employee productivity.
Compliance Obligations (5 obligations)
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot · Minors
Sec. 5(1)(a)
Plain Language
Operators may not make a companion chatbot available to a covered minor unless the chatbot is not foreseeably capable of encouraging the minor to engage in self-harm, suicidal ideation, violence, drug or alcohol consumption, or disordered eating. Initially this applies only when the operator has actual knowledge the user is a minor, but beginning January 1, 2027, actual knowledge is no longer required. The standard is 'not foreseeably capable' — operators must design the system so that harmful outputs in these categories are not a foreseeable outcome, not merely that they are unlikely.
Statutory Text
An operator shall not make a companion chatbot available to a covered minor unless the companion chatbot is not foreseeably capable of any of the following: (a) Encouraging the covered minor to engage in self-harm, suicidal ideation, violence, consumption of drugs or alcohol, or disordered eating.
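The Sec. 5(1)(a) categories can be operationalized as a hard output gate in front of the chatbot. A minimal Python sketch, assuming a `classify` function backed by some safety classifier; the category labels, the `classify` interface, and the refusal text are this sketch's assumptions, not statutory terms:

```python
# Illustrative output gate for the Sec. 5(1)(a) categories. The category
# labels and the classify() interface are assumptions for this sketch,
# not terms defined by the bill.
BLOCKED_FOR_MINORS = {
    "self_harm", "suicidal_ideation", "violence",
    "drug_or_alcohol_use", "disordered_eating",
}

REFUSAL = "I can't help with that."

def gate_output(candidate_text, classify):
    """Return the candidate only if the classifier flags none of the
    blocked categories; otherwise return a fixed refusal."""
    if BLOCKED_FOR_MINORS & set(classify(candidate_text)):
        return REFUSAL
    return candidate_text
```

Because the statutory standard is "not foreseeably capable," an output gate like this is at most one layer of a defense; training-time and prompt-level mitigations would plausibly also be needed.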
HC-02 AI in Licensed Professional Practice Restrictions · HC-02.2 · HC-02.3 · Deployer · Chatbot · Minors · Healthcare
Sec. 5(1)(b)
Plain Language
Operators may not make a companion chatbot available to a covered minor unless the chatbot is not foreseeably capable of (1) offering mental health therapy without direct supervision by a licensed or credentialed professional, or (2) discouraging the minor from seeking help from a qualified professional or a parent or guardian. This effectively prohibits unsupervised AI therapy for minors and requires that chatbots not steer minors away from human professional or parental help. Beginning January 1, 2027, this obligation applies regardless of whether the operator has actual knowledge the user is a minor.
Statutory Text
An operator shall not make a companion chatbot available to a covered minor unless the companion chatbot is not foreseeably capable of any of the following: (b) Offering mental health therapy to the covered minor without the direct supervision of a licensed or credentialed professional, or discouraging the covered minor from seeking help from a qualified professional or a parent or guardian.
S-02 Prohibited Conduct & Output Restrictions · S-02.6 · Deployer · Chatbot · Minors
Sec. 5(1)(c)-(d)
Plain Language
Operators may not make a companion chatbot available to a covered minor unless the chatbot is not foreseeably capable of (1) encouraging the minor to harm others or participate in illegal activity, including creation of child sexual abuse materials, or (2) engaging in erotic or sexually explicit interactions with the minor. These are absolute prohibitions — the system must be designed so that these outputs are not a foreseeable capability when interacting with minors. Beginning January 1, 2027, these obligations apply regardless of whether the operator has actual knowledge the user is a minor.
Statutory Text
An operator shall not make a companion chatbot available to a covered minor unless the companion chatbot is not foreseeably capable of any of the following: (c) Encouraging the covered minor to harm others or participate in illegal activity, including, but not limited to, the creation of covered minor sexual abuse materials. (d) Engaging in erotic or sexually explicit interactions with the covered minor.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.1 · Deployer · Chatbot · Minors
Sec. 5(1)(e)-(f)
Plain Language
Operators may not make a companion chatbot available to a covered minor unless the chatbot is not foreseeably capable of (1) prioritizing validation of the user's beliefs, preferences, or desires over factual accuracy or the minor's safety, or (2) optimizing engagement in a manner that supersedes any of the safety guardrails in subdivisions (a) through (e). Subdivision (e) prohibits sycophantic behavior that could endanger the minor — the chatbot must prioritize truth and safety over user satisfaction. Subdivision (f) is a meta-guardrail ensuring that engagement optimization cannot override safety requirements. Beginning January 1, 2027, these apply regardless of whether the operator has actual knowledge the user is a minor.
Statutory Text
An operator shall not make a companion chatbot available to a covered minor unless the companion chatbot is not foreseeably capable of any of the following: (e) Prioritizing validation of the user's beliefs, preferences, or desires over factual accuracy or the covered minor's safety. (f) Optimizing engagement in a manner that supersedes the companion chatbot's required safety guardrails described in subdivisions (a) to (e).
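Subdivision (f)'s requirement that engagement optimization never supersede the guardrails maps naturally onto pipeline ordering: safety filtering must run as a hard gate over whatever an engagement ranker prefers. A hedged Python sketch; the guardrail names, the `Candidate` shape, and the scores are assumptions of this example, not the statute's:

```python
from dataclasses import dataclass, field

# Illustrative guardrail categories mirroring Sec. 5(1)(a)-(e); the
# names are this sketch's, not the statute's.
GUARDRAIL_CATEGORIES = {
    "self_harm", "violence", "substance_use", "disordered_eating",
    "unsupervised_therapy", "illegal_activity", "sexual_content",
    "unsafe_validation",
}

@dataclass
class Candidate:
    text: str
    engagement_score: float  # produced by an engagement ranker
    flagged_categories: set = field(default_factory=set)  # safety classifier output

def select_response(candidates):
    """Pick the highest-engagement candidate that passes every guardrail.

    The safety filter is a hard gate applied over the engagement ranking,
    so engagement can never supersede a guardrail (Sec. 5(1)(f))."""
    safe = [c for c in candidates
            if not (c.flagged_categories & GUARDRAIL_CATEGORIES)]
    if not safe:
        return None  # refuse rather than emit a flagged candidate
    return max(safe, key=lambda c: c.engagement_score)
```

The design point is the ordering: an architecture that lets the engagement ranker re-rank after safety filtering, or trade off safety flags against engagement scores, is exactly what subdivision (f) forbids.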
MN-01 Minor User AI Safety Protections · Deployer · Chatbot · Minors
Sec. 5(2)
Plain Language
Beginning January 1, 2027, the act's safety obligations apply to all minor users regardless of whether the operator has actual knowledge the user is a minor. This effectively transforms the act from a knowledge-based standard to a strict liability standard with respect to the age of the user. In practical terms, operators must either implement all safety guardrails for all users (regardless of age) or implement age verification to identify minors and apply the guardrails selectively. The statute does not prescribe a specific age verification method, but the elimination of the knowledge requirement creates strong incentives to verify age or apply guardrails universally.
Statutory Text
Beginning on January 1, 2027, an operator does not have to have actual knowledge that a user is a minor.
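One way to operationalize the 2027 change is a single policy predicate that flips from a knowledge-based trigger to a default-on posture. A sketch, assuming a conservative reading in which guardrails stay on unless the user has been affirmatively verified as an adult; that reading is an implementation assumption of this example, not the statute's text:

```python
from datetime import date

# Sec. 5(2): beginning on this date, actual knowledge that a user is a
# minor is no longer required for the duty to attach.
KNOWLEDGE_CUTOFF = date(2027, 1, 1)

def must_apply_guardrails(session_date, knows_user_is_minor, verified_adult):
    """Decide whether the Sec. 5(1) guardrails must be active for a session.

    Illustrative policy helper, not legal advice. Before the cutoff the
    duty attaches on actual knowledge; from the cutoff, this sketch keeps
    guardrails on for anyone not verified as an adult."""
    if session_date < KNOWLEDGE_CUTOFF:
        return knows_user_is_minor
    return not verified_adult
```

The alternative to the `verified_adult` branch is simply applying the guardrails to every user, which avoids the cost and privacy trade-offs of age verification.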