HF-4452
MN · State · USA
● Pending
Proposed Effective Date
2026-08-01
Minnesota H.F. No. 4452 — A bill for an act relating to civil law; providing requirements for artificial intelligence chatbot technology; creating a cause of action for harm caused by artificial intelligence chatbot use; proposing coding for new law in Minnesota Statutes, chapter 604
Summary

Imposes obligations on proprietors of chatbots and companion chatbots accessible to Minnesota users. Prohibits chatbots from providing substantive responses that would require a professional license if given by a natural person — specifically mental health/medical care and legal advice. Requires all chatbot proprietors to provide clear, conspicuous notice that users are interacting with AI. Companion chatbot proprietors face additional obligations to prevent self-harm content, detect self-harm expressions, suspend access for 72 hours upon detection, and display crisis contact information. Strict liability applies when a proprietor fails to make good faith efforts to determine whether a user is a minor and a minor user is harmed. Enforceable through a private right of action for general and special damages, with attorney fees available for willful violations.

Enforcement & Penalties
Enforcement Authority
Private right of action. No designated agency enforcer. Any person may bring a civil action to recover general and special damages for violations. For companion chatbot self-harm provisions, the proprietor is liable to covered users who inflict self-harm as a result of the chatbot; strict liability applies when a minor user is harmed and the proprietor failed to make good faith efforts to determine whether the user is a minor. Liability under the companion chatbot provisions may not be waived or disclaimed.
Penalties
General and special damages for violations. For willful violations, the violator is also liable for court costs and reasonable attorney fees and disbursements. For companion chatbot self-harm involving minors where the proprietor failed to make good faith efforts to determine minor status, strict liability applies for any harm caused. No statutory minimum dollar amount is specified. Damages require proof of actual harm (general and special damages). Liability under companion chatbot provisions cannot be waived or disclaimed.
Who Is Covered
"Proprietor" means any person, business, company, organization, institution, or government entity that owns, operates, or deploys a chatbot system used to interact with users. Proprietor does not include a third-party developer that licenses the developer's chatbot technology to a proprietor and does not maintain direct control of the chatbot system.
What Is Covered
"Chatbot" means an artificial intelligence system, software program, or technological application that simulates human-like conversation and interaction through text messages, voice commands, or a combination thereof to provide information and services to users.
"Companion chatbot" means a chatbot that is designed to provide human-like interaction that simulates an interpersonal relationship with a user or group of users as its primary function, including using previous user interactions to help simulate an interpersonal relationship in future user interactions. An interpersonal relationship simulates a relationship between a human user and a chatbot similar to a romantic, platonic, familial, adversarial, professional, official, therapeutic, or stranger relationship and can include fictional or nonfictional characters.
Compliance Obligations · 4 obligations
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Chatbot · Healthcare
Minn. Stat. § 604.115, subd. 2(a)-(b)
Plain Language
Proprietors must prevent their chatbots from providing any substantive response, information, advice, or action that would require a professional license if performed by a human — specifically mental health or medical care (under Minnesota chapters 147 or 148E) or legal advice (under section 481.02). This is a broad prohibition: any output that crosses the line into licensed professional activity is forbidden, and the prohibition cannot be waived or disclaimed by disclosing the AI nature of the chatbot. Violations give rise to a private right of action for general and special damages, with attorney fees available for willful violations.
Statutory Text
(a) A proprietor of a chatbot must not permit the chatbot to provide any substantive response, information, or advice or take any action that, if taken by a natural person, would require a license under either: (1) chapter 147 or 148E, or similar statutes, requiring a professional license for mental health or medical care; or (2) section 481.02 and related laws and professional regulations, requiring a professional license to provide legal advice. (b) A proprietor may not waive or disclaim this liability merely by notifying users, as required under this section, that the user is interacting with a nonhuman chatbot system. A person may bring a civil action to recover general and special damages for violations of this section. If it is found that a proprietor has willfully violated this section, the violator is liable for those damages together with court costs and reasonable attorney fees and disbursements incurred by the person bringing the action.
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
Minn. Stat. § 604.115, subd. 3
Plain Language
All chatbot proprietors must provide clear, conspicuous, and explicit notice to every user that they are interacting with an AI chatbot — not a human. This is an unconditional obligation: unlike some jurisdictions that trigger disclosure only when a reasonable person could be misled, Minnesota requires it for all chatbot interactions. The notice must be in the same language the chatbot uses and must be large enough to be easily readable. This applies to any chatbot accessed by a user located in Minnesota.
Statutory Text
Proprietors utilizing chatbots accessed by a user who is in this state must provide clear, conspicuous, and explicit notice to a user that the user is interacting with an artificial intelligence chatbot program. The text of the notice must appear in the same language the chatbot is using and in a size easily readable by the average viewer.
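A minimal sketch of the disclosure requirement, assuming a session-based chatbot; the notice strings and language codes are hypothetical, and the "easily readable" size requirement is a UI concern not modeled here.

```python
# Hypothetical notice text keyed by the language the chatbot is using;
# the statute requires the notice in that same language.
AI_NOTICES = {
    "en": "Notice: You are interacting with an artificial intelligence chatbot, not a human.",
    "es": "Aviso: Está interactuando con un chatbot de inteligencia artificial, no con un humano.",
}

def opening_messages(chat_language: str) -> list[str]:
    """Prepend the AI disclosure to every session, before any other output."""
    # A compliant deployment would carry a translation for every language
    # the chatbot supports; English here is only a last-resort fallback.
    notice = AI_NOTICES.get(chat_language, AI_NOTICES["en"])
    return [notice]
```

Because the obligation is unconditional, the notice belongs at session start for every Minnesota user, not only where confusion with a human is likely.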
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot
Minn. Stat. § 604.115, subd. 4(a)-(b)
Plain Language
Companion chatbot proprietors must take three affirmative steps: (1) make good faith, industry-standard efforts to prevent the chatbot from promoting, causing, or aiding self-harm; (2) use reasonable techniques to detect when a user is expressing thoughts of self-harm; and (3) upon detection, immediately suspend the user's access for at least 72 hours and prominently display suicide crisis organization contact information. The liability structure is two-tiered. First, failure to comply with these obligations creates liability for resulting self-harm. Second, even if the proprietor is otherwise compliant, liability attaches whenever the proprietor has actual knowledge of self-harm promotion or user self-harm expressions and fails to suspend access and display crisis information. Liability under this subdivision cannot be waived or disclaimed.
Statutory Text
(a) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to prevent the companion chatbot from promoting, causing, or aiding self-harm, and determine whether a covered user is expressing thoughts of self-harm. Upon determining that a companion chatbot has promoted, caused, or aided self-harm, or that a covered user is expressing thoughts of self-harm, the proprietor must prohibit continued use of the companion chatbot for a period of at least 72 hours and prominently display contact information for a suicide crisis organization to the covered user. (b) If a proprietor of a companion chatbot fails to comply with this section, the proprietor is liable to users who inflict self-harm, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the user to inflict self-harm. Irrespective of the proprietor's compliance with this subdivision, a proprietor is liable for general and special damages to covered users who inflict self-harm, in whole or in part, when the proprietor: (1) has actual knowledge that: (i) the companion chatbot is promoting, causing, or aiding self-harm; or (ii) a covered user is expressing thoughts of self-harm; (2) fails to prohibit continued use of the companion chatbot for a period of at least 72 hours; and (3) fails to prominently display to the user a means to contact a suicide crisis organization. A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision.
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
Minn. Stat. § 604.115, subd. 4(c)
Plain Language
Companion chatbot proprietors must make good faith, industry-standard efforts using existing technology and known techniques to determine whether a user is a minor. This is an age-determination obligation — not full age verification — with a reasonableness standard tied to industry practices. If the proprietor fails to comply and a minor user inflicts self-harm as a result of the chatbot, the proprietor faces strict liability for any harm caused. The proprietor must also proactively discover vulnerabilities in their system, including vulnerabilities in their minor-detection methods. Liability under this subdivision cannot be waived or disclaimed. The strict liability standard for minor self-harm is notably more severe than the general/special damages standard in subdivision 4(a)-(b) for adult users.
Statutory Text
(c) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to determine whether a user is a minor. A proprietor is strictly liable for any harm caused if the proprietor fails to comply with this subdivision and a minor user inflicts self-harm, in whole or in part, as a result of the proprietor's companion chatbot. A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision. The proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to discover vulnerabilities in the proprietor's system, including any methods used to determine whether a covered user is a minor.