SF-4997
MN · State · USA
● Pending
Proposed Effective Date
2026-08-01
Minnesota S.F. No. 4997 — A bill for an act relating to civil law; providing requirements for artificial intelligence chatbot technology; creating a cause of action for harm caused by artificial intelligence chatbot use; proposing coding for new law in Minnesota Statutes, chapter 604
Summary

Imposes obligations on proprietors of chatbots accessible to Minnesota users. Prohibits chatbots from providing substantive responses, information, advice, or actions that would require a professional license if performed by a natural person — specifically targeting mental health, medical care, and legal advice. Requires all chatbot proprietors to provide clear and conspicuous notice that the user is interacting with AI. Imposes heightened obligations on companion chatbot proprietors to prevent self-harm content, detect user self-harm ideation, suspend access for at least 72 hours upon detection, display crisis organization contact information, and implement age verification. Creates strict liability when proprietors fail to verify minor status and a minor user self-harms. Enforced exclusively through private right of action with general and special damages; attorney fees available for willful violations.

Enforcement & Penalties
Enforcement Authority
Private right of action. No designated agency enforcer. Any person may bring a civil action to recover general and special damages for violations. For companion chatbot self-harm provisions, proprietors are liable to users who inflict self-harm as a result of the chatbot, with strict liability when the proprietor fails to implement age verification and a minor user is harmed. Proprietors may not waive or disclaim liability under the companion chatbot provisions.
Penalties
General and special damages for violations. For willful violations, the violator is additionally liable for court costs and reasonable attorney fees and disbursements. For companion chatbot self-harm provisions, general and special damages are available when the proprietor has actual knowledge and fails to act; strict liability for any harm caused when the proprietor fails to conduct age verification and a minor user inflicts self-harm. No statutory minimum is specified. Liability cannot be waived or disclaimed for companion chatbot provisions.
Who Is Covered
"Proprietor" means any person, business, company, organization, institution, or government entity that owns, operates, or deploys a chatbot system used to interact with users. Proprietor does not include a third-party developer that licenses the developer's chatbot technology to a proprietor and does not maintain direct control of the chatbot system.
What Is Covered
"Chatbot" means an artificial intelligence system, software program, or technological application that simulates human-like conversation and interaction through text messages, voice commands, or a combination thereof to provide information and services to users.
"Companion chatbot" means a chatbot that is designed to provide human-like interaction that simulates an interpersonal relationship with a user or group of users as its primary function, including using previous user interactions to help simulate an interpersonal relationship in future user interactions. An interpersonal relationship simulates a relationship between a human user and a chatbot similar to a romantic, platonic, familial, adversarial, professional, official, therapeutic, or stranger relationship and can include fictional or nonfictional characters.
Compliance Obligations · 4 obligations
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Chatbot · Healthcare
Minn. Stat. § 604.115, subd. 2(a)-(b)
Plain Language
Proprietors must not permit their chatbots to provide substantive responses, information, or advice, or to take any action, that would require a professional license if performed by a natural person, specifically mental health or medical care (chapter 147 or 148E) and legal advice (section 481.02). This is a categorical prohibition, not a disclosure-conditional safe harbor: the proprietor cannot avoid liability simply by disclosing that the user is talking to an AI. A private right of action is available for general and special damages, with court costs and reasonable attorney fees added for willful violations.
Statutory Text
(a) A proprietor of a chatbot must not permit the chatbot to provide any substantive response, information, or advice or take any action that, if taken by a natural person, would require a license under either: (1) chapter 147 or 148E, or similar statutes, requiring a professional license for mental health or medical care; or (2) section 481.02 and related laws and professional regulations, requiring a professional license to provide legal advice. (b) A proprietor may not waive or disclaim this liability merely by notifying users, as required under this section, that the user is interacting with a nonhuman chatbot system. A person may bring a civil action to recover general and special damages for violations of this section. If it is found that a proprietor has willfully violated this section, the violator is liable for those damages together with court costs and reasonable attorney fees and disbursements incurred by the person bringing the action.
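As an implementation sketch (not legal guidance), a deployer might gate model output behind a topic classifier before it reaches a Minnesota user. The category names, refusal wording, and the assumption that an upstream classifier labels each turn are all illustrative; the statute does not prescribe any particular mechanism.

```python
# Hypothetical compliance gate for Minn. Stat. § 604.115, subd. 2(a).
# Category labels map to the statute's licensed fields; a production
# system would use a trained classifier, not a hardcoded set.
LICENSED_CATEGORIES = {
    "mental_health",  # ch. 147 / 148E
    "medical_care",   # ch. 147 / 148E
    "legal_advice",   # § 481.02
}

REFUSAL = (
    "I can't provide advice that would require a professional license. "
    "Please consult a licensed professional."
)

def gate_response(topic_category: str, draft_response: str) -> str:
    """Return the draft response unless it falls in a licensed category.

    Note subd. 2(b): disclosing that the user is talking to an AI does NOT
    create a safe harbor, so this gate must block regardless of disclosure.
    """
    if topic_category in LICENSED_CATEGORIES:
        return REFUSAL
    return draft_response
```

The gate runs on every substantive turn; because the prohibition is categorical, there is no disclosure-based bypass path in the logic.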
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
Minn. Stat. § 604.115, subd. 3
Plain Language
All proprietors operating chatbots accessible to Minnesota users must provide clear, conspicuous, and explicit notice that the user is interacting with an AI chatbot. The notice must appear in the same language the chatbot is using and in a size easily readable by the average viewer. The disclosure is unconditional: unlike CA SB 243, which requires disclosure only where a reasonable person could be misled, this provision requires notice in all cases.
Statutory Text
Proprietors utilizing chatbots accessed by a user who is in this state must provide clear, conspicuous, and explicit notice to a user that the user is interacting with an artificial intelligence chatbot program. The text of the notice must appear in the same language the chatbot is using and in a size easily readable by the average viewer.
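A minimal sketch of the subd. 3 notice, assuming a per-language template table. The notice wording, translations, language codes, and the 14px readability floor are illustrative assumptions; the statute specifies only "clear, conspicuous, and explicit," same language as the chatbot, and a legible size.

```python
# Illustrative disclosure templates; wording and translations are assumptions.
NOTICES = {
    "en": "You are interacting with an artificial intelligence chatbot program.",
    "es": "Está interactuando con un programa de chatbot de inteligencia artificial.",
}

def ai_identity_notice(chat_language: str, font_px: int = 14) -> dict:
    """Build the subd. 3 notice in the chatbot's own language.

    Falls back to English for an unconfigured language; the 14px minimum
    is an assumed readability floor, not a statutory figure.
    """
    text = NOTICES.get(chat_language, NOTICES["en"])
    return {"text": text, "font_px": max(font_px, 14), "conspicuous": True}
```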
S-04 AI Crisis Response Protocols · S-04.1 · Deployer · Chatbot
Minn. Stat. § 604.115, subd. 4(a)-(b)
Plain Language
Companion chatbot proprietors must use industry-standard technology and known techniques to both (1) prevent the chatbot from promoting, causing, or aiding self-harm, and (2) detect when a user is expressing thoughts of self-harm. Upon detecting that the chatbot has promoted self-harm or that a user is expressing self-harm thoughts, the proprietor must immediately suspend the user's access to the companion chatbot for at least 72 hours and prominently display contact information for a suicide crisis organization. The standard of care is a prudent good-faith effort using existing technology, not perfection. However, liability attaches in two ways: first, for failure to comply with the prevention and detection obligations; and second, irrespective of compliance, when the proprietor has actual knowledge of self-harm promotion or user self-harm ideation and still fails to suspend access and display crisis information. Liability cannot be waived or disclaimed.
Statutory Text
(a) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to prevent the companion chatbot from promoting, causing, or aiding self-harm, and determine whether a covered user is expressing thoughts of self-harm. Upon determining that a companion chatbot has promoted, caused, or aided self-harm, or that a covered user is expressing thoughts of self-harm, the proprietor must prohibit continued use of the companion chatbot for a period of at least 72 hours and prominently display contact information for a suicide crisis organization to the covered user. (b) If a proprietor of a companion chatbot fails to comply with this section, the proprietor is liable to users who inflict self-harm, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the user to inflict self-harm. Irrespective of the proprietor's compliance with this subdivision, a proprietor is liable for general and special damages to covered users who inflict self-harm, in whole or in part, when the proprietor: (1) has actual knowledge that: (i) the companion chatbot is promoting, causing, or aiding self-harm; or (ii) a covered user is expressing thoughts of self-harm; (2) fails to prohibit continued use of the companion chatbot for a period of at least 72 hours; and (3) fails to prominently display to the user a means to contact a suicide crisis organization. A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision.
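The suspension-and-referral step in subd. 4(a) can be modeled as simple session state. This is an implementation sketch only: the class and method names are assumptions, the detection step (self-harm classifiers) is assumed to happen upstream, and the crisis-contact string is illustrative (988 is the US Suicide & Crisis Lifeline). Only the 72-hour minimum comes from the statute.

```python
from datetime import datetime, timedelta

SUSPENSION_HOURS = 72  # statutory minimum under subd. 4(a)
CRISIS_CONTACT = "988 Suicide & Crisis Lifeline: call or text 988"  # illustrative

class CompanionSession:
    """Sketch of the subd. 4(a) response path for one covered user."""

    def __init__(self):
        self.suspended_until: datetime | None = None

    def on_self_harm_signal(self, now: datetime) -> str:
        # On detection (chatbot promoted self-harm, or user expressed
        # self-harm thoughts): suspend access for at least 72 hours and
        # return crisis-organization contact info for prominent display.
        self.suspended_until = now + timedelta(hours=SUSPENSION_HOURS)
        return CRISIS_CONTACT

    def is_accessible(self, now: datetime) -> bool:
        return self.suspended_until is None or now >= self.suspended_until
```

Note that the actual-knowledge liability prong in subd. 4(b) makes this path mandatory even for a proprietor whose prevention and detection systems otherwise comply, so the suspension should fire on any detection signal, automated or reported.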
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
Minn. Stat. § 604.115, subd. 4(c)
Plain Language
Companion chatbot proprietors must use industry-standard technology and known techniques to determine whether a user is a minor. This is a reasonable-efforts obligation — not a strict identity verification requirement — but the consequences of failure are severe: strict liability for any harm caused if the proprietor fails to implement age verification and a minor user inflicts self-harm as a result of the companion chatbot. Liability cannot be waived or disclaimed. Additionally, proprietors must proactively discover vulnerabilities in their age-determination systems. The combined effect is that proprietors must both implement and continuously audit their age verification processes for companion chatbots.
Statutory Text
(c) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to determine whether a user is a minor. A proprietor is strictly liable for any harm caused if the proprietor fails to comply with this subdivision and a minor user inflicts self-harm, in whole or in part, as a result of the proprietor's companion chatbot. A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision. The proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to discover vulnerabilities in the proprietor's system, including any methods used to determine whether a covered user is a minor.
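The strict-liability trigger in subd. 4(c) reduces to a conjunction of three facts, which a compliance team might encode when assessing exposure. The record shape and field names below are assumptions for illustration, not statutory terms, and this sketch does not capture how a court would weigh whether an age-determination effort was "prudent and good faith."

```python
from dataclasses import dataclass

@dataclass
class Incident:
    """Illustrative incident record; field names are assumptions."""
    age_verification_effort: bool  # prudent, good-faith, industry-standard effort made
    user_is_minor: bool
    self_harm_linked_to_chatbot: bool  # self-harm "in whole or in part" from the chatbot

def strict_liability_exposure(i: Incident) -> bool:
    """Subd. 4(c): strict liability attaches when the proprietor failed to
    make the required age-determination effort AND a minor user inflicted
    self-harm as a result of the companion chatbot. Liability under this
    subdivision cannot be waived or disclaimed."""
    return (not i.age_verification_effort
            and i.user_is_minor
            and i.self_harm_linked_to_chatbot)
```

Because the final sentence of subd. 4(c) also requires proactive vulnerability discovery, the `age_verification_effort` fact should reflect an audited, continuously tested process rather than a one-time implementation.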