Requires operators of companion chatbot platforms to implement a graduated crisis response system when a chatbot detects a credible crisis expression: a user statement reasonably indicating intent to harm themselves or others, as determined through contextual analysis rather than keyword detection alone. Upon first detection, the chatbot must acknowledge the user's distress, encourage them to seek human support, and provide contact information for the 988 Suicide and Crisis Lifeline. If the user reaffirms or escalates the crisis expression, the chatbot must initiate a 20-minute 'crisis interruption pause' that suspends conversational output, displays a de-escalation message, and prominently shows crisis resources. Operators must document all crisis detections and pauses and, beginning January 1, 2028, report annually to the Office of Suicide Prevention. The bill specifies no enforcement mechanism, penalties, or private right of action.
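The graduated response the bill describes is effectively a small per-session state machine: no crisis detected, first-detection response delivered, then the interruption pause. A minimal sketch of that logic follows; the class and method names, the boolean detector input, and the message wording are illustrative assumptions, not statutory language, and a real system would also need a timer for the 20-minute pause and the contextual crisis detector itself.

```python
from enum import Enum, auto

LIFELINE_MSG = "If you are in crisis, call or text 988 (Suicide and Crisis Lifeline)."
PAUSE_MINUTES = 20  # duration of the bill's 'crisis interruption pause'


class CrisisState(Enum):
    NONE = auto()            # no credible crisis expression detected yet
    FIRST_RESPONSE = auto()  # acknowledgment and 988 resources already shown
    PAUSED = auto()          # conversational output suspended


class CrisisSession:
    """Hypothetical per-conversation tracker for the graduated response."""

    def __init__(self) -> None:
        self.state = CrisisState.NONE
        self.log: list[str] = []  # operators must document detections and pauses

    def handle(self, is_crisis_expression: bool) -> str:
        """Return the chatbot's response for one user turn.

        `is_crisis_expression` stands in for the bill's contextual analysis,
        which is out of scope for this sketch.
        """
        if not is_crisis_expression:
            return "normal reply"
        if self.state is CrisisState.NONE:
            # First detection: acknowledge distress, encourage human
            # support, and provide 988 contact information.
            self.state = CrisisState.FIRST_RESPONSE
            self.log.append("crisis_detected")
            return ("I'm sorry you're feeling this way; please consider "
                    "reaching out to someone you trust. " + LIFELINE_MSG)
        # Reaffirmation or escalation: begin the interruption pause,
        # suspending output and prominently showing crisis resources.
        self.state = CrisisState.PAUSED
        self.log.append("pause_started")
        return (f"Conversation paused for {PAUSE_MINUTES} minutes. "
                + LIFELINE_MSG)
```

The recorded `log` entries correspond to the documentation the bill requires operators to keep for their annual reports.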