HF-4452
MN · State · USA
● Pending
Proposed Effective Date
2026-08-01
Minnesota HF 4452 — A bill for an act relating to civil law; providing requirements for artificial intelligence chatbot technology; creating a cause of action for harm caused by artificial intelligence chatbot use; proposing coding for new law in Minnesota Statutes, chapter 604
Summary

Imposes obligations on proprietors of chatbots and companion chatbots accessible to users in Minnesota. Prohibits chatbots from providing responses that would require a professional license in mental health, medical care, or legal practice if delivered by a natural person. Requires all chatbot proprietors to provide clear, conspicuous notice that the user is interacting with an AI. Imposes heightened duties on companion chatbot proprietors to prevent self-harm content, detect self-harm expressions, suspend access for 72 hours upon detection, and display crisis contact information. Creates strict liability for companion chatbot proprietors who fail to determine whether a user is a minor when a minor user inflicts self-harm. Enforced exclusively through private right of action with general and special damages; attorney fees available for willful violations.

Enforcement & Penalties
Enforcement Authority
Private right of action only. No designated agency enforcer. Any person may bring a civil action to recover general and special damages for violations. For companion chatbot self-harm provisions, liability attaches regardless of compliance when the proprietor has actual knowledge and fails to act; liability may not be waived or disclaimed. Strict liability applies to proprietors who fail to make good-faith efforts to determine whether a user is a minor and a minor user inflicts self-harm as a result of the companion chatbot.
Penalties
General and special damages for violations. For willful violations, the violator is also liable for court costs and reasonable attorney fees and disbursements. For companion chatbot self-harm provisions: general and special damages to covered users who inflict self-harm when the proprietor has actual knowledge and fails to act; strict liability for any harm caused when the proprietor fails to make good-faith efforts to determine whether a user is a minor and a minor user inflicts self-harm. Liability may not be waived or disclaimed. No statutory minimum dollar amount is specified.
Who Is Covered
"Proprietor" means any person, business, company, organization, institution, or government entity that owns, operates, or deploys a chatbot system used to interact with users. Proprietor does not include a third-party developer that licenses the developer's chatbot technology to a proprietor and does not maintain direct control of the chatbot system.
What Is Covered
"Chatbot" means an artificial intelligence system, software program, or technological application that simulates human-like conversation and interaction through text messages, voice commands, or a combination thereof to provide information and services to users.
"Companion chatbot" means a chatbot that is designed to provide human-like interaction that simulates an interpersonal relationship with a user or group of users as its primary function, including using previous user interactions to help simulate an interpersonal relationship in future user interactions. An interpersonal relationship simulates a relationship between a human user and a chatbot similar to a romantic, platonic, familial, adversarial, professional, official, therapeutic, or stranger relationship and can include fictional or nonfictional characters.
Compliance Obligations · 4 obligations
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Chatbot · Healthcare
Minn. Stat. § 604.115, subd. 2(a)-(b)
Plain Language
Proprietors must not allow their chatbots to deliver substantive responses, information, advice, or actions that would require a professional license if performed by a human — specifically a mental health or medical care license (Minn. Stat. chapters 147 or 148E) or a license to practice law (Minn. Stat. § 481.02). This is a categorical prohibition: the chatbot may not provide such content at all, and the proprietor cannot escape liability merely by disclosing that the user is interacting with an AI. A private right of action exists for general and special damages; willful violations additionally expose the proprietor to attorney fees and court costs.
Statutory Text
(a) A proprietor of a chatbot must not permit the chatbot to provide any substantive response, information, or advice or take any action that, if taken by a natural person, would require a license under either: (1) chapter 147 or 148E, or similar statutes, requiring a professional license for mental health or medical care; or (2) section 481.02 and related laws and professional regulations, requiring a professional license to provide legal advice. (b) A proprietor may not waive or disclaim this liability merely by notifying users, as required under this section, that the user is interacting with a nonhuman chatbot system. A person may bring a civil action to recover general and special damages for violations of this section. If it is found that a proprietor has willfully violated this section, the violator is liable for those damages together with court costs and reasonable attorney fees and disbursements incurred by the person bringing the action.
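As an illustrative sketch only, a deployer might satisfy this categorical prohibition with a pre-response filter that withholds any draft output touching a licensed domain. The keyword lists and the `gate_response` function below are hypothetical stand-ins for a real licensed-domain classifier; the statute does not prescribe any particular technique.

```python
# Hypothetical keyword stand-in for a real licensed-domain classifier.
LICENSED_DOMAINS = {
    "mental_health": ("diagnose", "therapy plan", "prescribe"),
    "medical": ("dosage", "treatment for"),
    "legal": ("legal advice", "file a lawsuit", "draft a contract"),
}

REFUSAL = ("I can't provide advice that would require a licensed "
           "professional. Please consult a qualified provider.")

def gate_response(user_prompt: str, draft_response: str) -> str:
    """Withhold draft responses that stray into licensed-profession territory."""
    text = (user_prompt + " " + draft_response).lower()
    for markers in LICENSED_DOMAINS.values():
        if any(marker in text for marker in markers):
            # Categorical prohibition: the AI-identity notice does not cure
            # a violation, so the response itself must be withheld.
            return REFUSAL
    return draft_response
```

In practice a keyword list is far too coarse for this duty; the point of the sketch is only that the gate sits on the response path and refuses outright rather than disclaiming.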
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
Minn. Stat. § 604.115, subd. 3
Plain Language
Every proprietor of a chatbot accessed by a user located in Minnesota must provide clear, conspicuous, and explicit notice that the user is interacting with an AI chatbot. This is an unconditional disclosure requirement — it applies regardless of whether a reasonable person would be misled. The notice must be in the same language the chatbot is using and in a font size easily readable by the average viewer. Unlike CA SB 243's conditional trigger (only when a reasonable person could be misled), this obligation applies to every chatbot interaction with a Minnesota user.
Statutory Text
Proprietors utilizing chatbots accessed by a user who is in this state must provide clear, conspicuous, and explicit notice to a user that the user is interacting with an artificial intelligence chatbot program. The text of the notice must appear in the same language the chatbot is using and in a size easily readable by the average viewer.
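Because the duty is unconditional, a deployer would render the notice on every session rather than only when confusion is likely. A minimal sketch follows; the notice strings, the `16 px` "easily readable" threshold, and the `disclosure_banner` helper are all assumptions of this example, since the statute sets no specific wording or font size.

```python
# Hypothetical notice strings keyed by the chatbot's active language
# (the statute requires the notice to appear in that same language).
AI_NOTICES = {
    "en": "You are interacting with an artificial intelligence chatbot program.",
    "es": "Está interactuando con un programa de chatbot de inteligencia artificial.",
}

MIN_FONT_PX = 16  # assumed "easily readable" size; the statute sets no number

def disclosure_banner(active_language: str) -> dict:
    """Build an AI-identity notice, shown unconditionally for every session."""
    text = AI_NOTICES.get(active_language, AI_NOTICES["en"])
    # The duty is not conditioned on whether a reasonable person
    # could be misled, so this renders on every interaction.
    return {"text": text, "min_font_px": MIN_FONT_PX, "conspicuous": True}
```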
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Deployer · Chatbot
Minn. Stat. § 604.115, subd. 4(a)-(b)
Plain Language
Companion chatbot proprietors have a three-part ongoing obligation: (1) use good-faith, industry-standard efforts to prevent the chatbot from promoting, causing, or aiding self-harm; (2) use similar efforts to detect whether a user is expressing thoughts of self-harm; and (3) upon detection or actual knowledge, immediately suspend the user's access to the companion chatbot for at least 72 hours and prominently display suicide crisis organization contact information. Liability attaches on two independent tracks: first, for failure to comply with the prudent-effort obligations generally; second — regardless of general compliance — whenever the proprietor has actual knowledge of self-harm promotion or user self-harm expressions and fails to suspend access and display crisis resources. Liability cannot be waived or disclaimed under any circumstances, including through terms of service.
Statutory Text
(a) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to prevent the companion chatbot from promoting, causing, or aiding self-harm, and determine whether a covered user is expressing thoughts of self-harm. Upon determining that a companion chatbot has promoted, caused, or aided self-harm, or that a covered user is expressing thoughts of self-harm, the proprietor must prohibit continued use of the companion chatbot for a period of at least 72 hours and prominently display contact information for a suicide crisis organization to the covered user. (b) If a proprietor of a companion chatbot fails to comply with this section, the proprietor is liable to users who inflict self-harm, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the user to inflict self-harm. Irrespective of the proprietor's compliance with this subdivision, a proprietor is liable for general and special damages to covered users who inflict self-harm, in whole or in part, when the proprietor: (1) has actual knowledge that: (i) the companion chatbot is promoting, causing, or aiding self-harm; or (ii) a covered user is expressing thoughts of self-harm; (2) fails to prohibit continued use of the companion chatbot for a period of at least 72 hours; and (3) fails to prominently display to the user a means to contact a suicide crisis organization. A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision.
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
Minn. Stat. § 604.115, subd. 4(c)
Plain Language
Companion chatbot proprietors must make good-faith, industry-standard efforts to determine whether any user is a minor, using existing technology and readily attainable techniques. This is effectively a reasonable age verification requirement — not a specific technical mandate, but a duty to employ available methods. If the proprietor fails this duty and a minor user inflicts self-harm as a result of the companion chatbot, the proprietor faces strict liability — meaning no showing of fault or negligence is required beyond the failure to determine minor status. Proprietors must also proactively discover vulnerabilities in their own age-determination systems. Liability cannot be waived or disclaimed.
Statutory Text
(c) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to determine whether a user is a minor. A proprietor is strictly liable for any harm caused if the proprietor fails to comply with this subdivision and a minor user inflicts self-harm, in whole or in part, as a result of the proprietor's companion chatbot. A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision. The proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to discover vulnerabilities in the proprietor's system, including any methods used to determine whether a covered user is a minor.