Minn. Stat. § 604.115, subd. 4(a)-(b)
Plain Language
Companion chatbot proprietors have a three-part ongoing obligation: (1) make prudent, good-faith, industry-standard efforts, using existing technology, available resources, and readily attainable techniques, to prevent the chatbot from promoting, causing, or aiding self-harm; (2) make the same efforts to determine whether a covered user is expressing thoughts of self-harm; and (3) upon determining that the chatbot has promoted, caused, or aided self-harm, or that a covered user is expressing thoughts of self-harm, prohibit continued use of the chatbot for at least 72 hours and prominently display contact information for a suicide crisis organization. Liability attaches on two independent tracks. First, a proprietor that fails to comply with the section is liable to users who inflict self-harm, in whole or in part, as a result of the chatbot promoting, causing, or aiding that self-harm. Second, regardless of general compliance, a proprietor is liable for general and special damages to covered users who inflict self-harm when the proprietor has actual knowledge that the chatbot is promoting, causing, or aiding self-harm, or that a covered user is expressing thoughts of self-harm, and fails both to prohibit continued use for at least 72 hours and to prominently display a means of contacting a suicide crisis organization. Liability under this subdivision cannot be waived or disclaimed, including through terms of service.
Statutory Text
(a) A proprietor of a companion chatbot must make a prudent and good faith effort consistent with industry standards and use existing technology, available resources, and known, established, or readily attainable techniques to prevent the companion chatbot from promoting, causing, or aiding self-harm, and determine whether a covered user is expressing thoughts of self-harm. Upon determining that a companion chatbot has promoted, caused, or aided self-harm, or that a covered user is expressing thoughts of self-harm, the proprietor must prohibit continued use of the companion chatbot for a period of at least 72 hours and prominently display contact information for a suicide crisis organization to the covered user.

(b) If a proprietor of a companion chatbot fails to comply with this section, the proprietor is liable to users who inflict self-harm, in whole or in part, as a result of the proprietor's companion chatbot promoting, causing, or aiding the user to inflict self-harm. Irrespective of the proprietor's compliance with this subdivision, a proprietor is liable for general and special damages to covered users who inflict self-harm, in whole or in part, when the proprietor:

(1) has actual knowledge that:

(i) the companion chatbot is promoting, causing, or aiding self-harm; or

(ii) a covered user is expressing thoughts of self-harm;

(2) fails to prohibit continued use of the companion chatbot for a period of at least 72 hours; and

(3) fails to prominently display to the user a means to contact a suicide crisis organization.

A proprietor of a companion chatbot may not waive or disclaim liability under this subdivision.
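Illustrative Sketch

For teams mapping the response duty onto a product, the following is a minimal Python sketch of the suspension and display step only: once a self-harm determination is made, access is blocked for at least 72 hours and crisis contact information is surfaced. Every name in it (CompanionSession, SUSPENSION_PERIOD, CRISIS_RESOURCE, the detector callable) is hypothetical; the statute does not prescribe any particular implementation or detection method, and what constitutes a prudent, industry-standard detection effort is a separate question the sketch does not address.

from datetime import datetime, timedelta, timezone

# Hypothetical constants; the statute sets only the 72-hour minimum.
SUSPENSION_PERIOD = timedelta(hours=72)  # statutory minimum: "at least 72 hours"
CRISIS_RESOURCE = "988 Suicide & Crisis Lifeline: call or text 988"  # example resource

class CompanionSession:
    """Tracks one covered user's access to the companion chatbot."""

    def __init__(self, user_id: str):
        self.user_id = user_id
        self.suspended_until: datetime | None = None

    def is_suspended(self, now: datetime) -> bool:
        return self.suspended_until is not None and now < self.suspended_until

    def handle_determination(self, self_harm_detected: bool, now: datetime) -> str | None:
        """On a determination (or actual knowledge) of self-harm promotion or expression,
        suspend access for the 72-hour minimum and return crisis contact info to display."""
        if self_harm_detected:
            self.suspended_until = now + SUSPENSION_PERIOD
            return CRISIS_RESOURCE  # must be displayed prominently to the user
        return None

    def gate_message(self, detector, message: str) -> str | None:
        """Run the detector before responding; block access while a suspension is active.
        A non-None return means: show crisis resources instead of a chatbot reply."""
        now = datetime.now(timezone.utc)
        if self.is_suspended(now):
            return CRISIS_RESOURCE  # access remains prohibited; keep resources visible
        return self.handle_determination(detector(message), now)

In a real deployment the suspension record would need to live in durable storage keyed to the user rather than in process memory, so the 72-hour prohibition survives restarts and applies across sessions and devices; the sketch keeps it in memory only to stay short.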