SF-1857
MN · State · USA
● Pending
Proposed Effective Date
2027-01-15
Minnesota S.F. No. 1857 — A bill for an act relating to commerce; prohibiting persons from allowing minors to access chatbots for conversational purposes; providing civil penalties; proposing coding for new law in Minnesota Statutes, chapter 325M
Summary

Categorically prohibits any person from making chatbots — defined as generative AI systems that simulate conversation in a manner conveying humanity, sentience, emotions, or desires — available to minors under 18. Imposes a parallel prohibition on operators of AI companion systems. The bill uses a generic 'person' obligor rather than a formally defined entity term. Enforcement is dual-track: injured individuals may bring private civil actions for damages up to $1,000 plus injunctive relief and attorney fees, and the attorney general may seek civil penalties up to $5,000,000. A transition period requires persons currently making chatbots available to minors to begin winding down services in a manner that does not harm minors before the January 15, 2027 effective date.

Enforcement & Penalties
Enforcement Authority
Dual enforcement. The attorney general may enforce the statute under Minnesota Statutes, section 8.31. Any individual injured by a violation may also bring a private civil action. No cure period or safe harbor is specified.
Penalties
Private plaintiffs: actual damages, statutory damages not to exceed $1,000, injunctive relief, and costs and reasonable attorney fees. Attorney general enforcement: civil penalty not to exceed $5,000,000 against the person who owns or controls the website, application, software, or program. Statutory damages do not require proof of actual monetary harm.
Who Is Covered
The bill applies to any "person" who operates or distributes a chatbot, and to any person operating artificial intelligence systems that primarily function as AI companions. It does not use a formally defined entity term such as "developer" or "deployer."
What Is Covered
"Chatbot" means a generative artificial intelligence system that users can interact with or through an interface that approximates or simulates conversation through a text, audio, or visual medium that behave in a way that would lead a reasonable person to believe the chatbot is conveying that it has humanity, sentience, emotions, or desires.
"AI companion" means artificial intelligence systems that are specifically designed, marketed, or optimized to form ongoing social or emotional bonds with individuals, whether or not such systems also provide information, complete tasks, or assist with specific functions. AI companions seek to build or engage in an emotional relationship with the user by: (1) expressing or inviting emotional attachment; (2) reminding, prompting, or nudging the user to return for emotional support or companionship; (3) depicting nonverbal forms of emotional support; (4) behaving in a way that a reasonable user would consider excessive praise designed to foster emotional attachment; or (5) enabling or purporting to enable increased intimacy based on engagement or pay.
Compliance Obligations · 3 obligations
S-02 Prohibited Conduct & Output Restrictions · Deployer · ChatbotMinors
Minn. Stat. § 325M.40, subd. 2(a)
Plain Language
Any person who operates or distributes a chatbot must ensure that minors under 18 cannot use, interact with, purchase, or converse with the chatbot. This is a categorical prohibition — not a content restriction or feature limitation, but a complete ban on minor access to covered chatbots. The statute does not specify which age-verification method must be used, but the obligation to 'ensure' minors cannot access the system implies the person must implement some effective mechanism to prevent minor access. The chatbot definition does not reach all conversational AI; it covers only generative AI systems that behave in a way that would lead a reasonable person to believe the system is conveying humanity, sentience, emotions, or desires.
Statutory Text
A person must ensure that any chatbot operated or distributed by the person is not available to minors to use, interact with, purchase, or converse with.
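Because the 'ensure' obligation is access-based rather than content-based, a compliant gate must deny a minor any session with the chatbot, not merely filter its outputs. A minimal sketch of such a fail-closed gate, assuming a hypothetical `AgeCheck` result produced by whatever verification mechanism the operator selects (the statute names none, and all identifiers here are illustrative):

```python
from dataclasses import dataclass
from typing import Optional

MINIMUM_AGE = 18  # Minn. Stat. 325M.40: chatbots may not be made available to minors


@dataclass
class AgeCheck:
    verified: bool           # did the chosen verification mechanism complete?
    age: Optional[int] = None  # age asserted by that mechanism, if any


def may_access_chatbot(check: AgeCheck) -> bool:
    """Categorical gate: allow access only if the user is affirmatively 18+.

    Because the person must *ensure* minors cannot use, interact with,
    purchase, or converse with the chatbot, an unverified or unknown age
    fails closed (treated as a minor) rather than open.
    """
    return check.verified and check.age is not None and check.age >= MINIMUM_AGE


# Fail-closed behavior: only an affirmatively verified adult passes the gate.
assert may_access_chatbot(AgeCheck(verified=True, age=21)) is True
assert may_access_chatbot(AgeCheck(verified=True, age=16)) is False
assert may_access_chatbot(AgeCheck(verified=False)) is False
```

The design choice to treat an unknown age as a minor follows from the categorical wording of subdivision 2(a); an opt-out or self-attestation default that admits unverified users would arguably fail the 'ensure' standard.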
S-02 Prohibited Conduct & Output Restrictions · Deployer · ChatbotMinors
Minn. Stat. § 325M.40, subd. 2(b)
Plain Language
Persons who operate AI systems that primarily function as AI companions have a parallel and overlapping obligation: they must ensure that any chatbots they operate or distribute are not available to minors. This provision targets companion AI operators specifically — even if a particular chatbot within an AI companion platform does not independently meet the chatbot definition (e.g., a task-completion bot that does not convey humanity), the operator of a platform that primarily functions as an AI companion must still block minors from all chatbots on the platform. This creates a broader sweep for companion AI operators than subdivision 2(a) standing alone.
Statutory Text
A person operating artificial intelligence systems that primarily function as AI companions must ensure that any chatbots operated or distributed by the person are not available to minors to use, interact with, purchase, or converse with.
Other · ChatbotMinors
Sec. 2 (Transition Period)
Plain Language
Persons currently making chatbots available to minors must begin a gradual wind-down of those services before the January 15, 2027 prohibition takes effect. The wind-down must be conducted in a manner that does not harm minors — suggesting operators cannot abruptly terminate services but must plan a transition that accounts for minors who may have developed dependencies on the chatbot. This provision takes effect the day following final enactment, meaning the transition obligation begins immediately upon the bill becoming law, well before the operative prohibition date.
Statutory Text
A person who makes a chatbot available to minors must begin decreasing services in a manner that does not harm minors who use chatbots before services end on January 15, 2027.