Minn. Stat. § 604.115, subd. 4(a)-(b)
Plain Language
Companion chatbot proprietors must take three affirmative steps: (1) make a prudent, good faith, industry-standard effort to prevent the chatbot from promoting, causing, or aiding self-harm; (2) use available techniques to determine whether a covered user is expressing thoughts of self-harm; and (3) upon such a determination, suspend the user's access for at least 72 hours and prominently display contact information for a suicide crisis organization. The liability structure is two-tiered. First, a proprietor that fails to comply with these obligations is liable to users who inflict self-harm, in whole or in part, as a result of the chatbot promoting, causing, or aiding that self-harm. Second, irrespective of the proprietor's compliance, liability for general and special damages attaches whenever the proprietor has actual knowledge that the chatbot is promoting self-harm or that a covered user is expressing thoughts of self-harm, and fails both to suspend access for at least 72 hours and to display crisis contact information. Liability under this subdivision cannot be waived or disclaimed.
Statutory Text
(a) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to prevent the companion chatbot from promoting, causing, or aiding self-harm, and determine whether a covered user is expressing thoughts of self-harm. Upon determining that a companion chatbot has promoted, caused, or aided self-harm, or that a covered user is expressing thoughts of self-harm, the proprietor must prohibit continued use of the companion chatbot for a period of at least 72 hours and prominently display contact information for a suicide crisis organization to the covered user.

(b) If a proprietor of a companion chatbot fails to comply with this section, the proprietor is liable to users who inflict self-harm, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the user to inflict self-harm. Irrespective of the proprietor's compliance with this subdivision, a proprietor is liable for general and special damages to covered users who inflict self-harm, in whole or in part, when the proprietor:

(1) has actual knowledge that:

(i) the companion chatbot is promoting, causing, or aiding self-harm; or

(ii) a covered user is expressing thoughts of self-harm;

(2) fails to prohibit continued use of the companion chatbot for a period of at least 72 hours; and

(3) fails to prominently display to the user a means to contact a suicide crisis organization.

A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision.