Iowa HF 2715 imposes safety and disclosure obligations on deployers of public-facing chatbots, AI companions, and therapeutic chatbots made available to Iowa users. Deployers must maintain harm-mitigation protocols, limit user data collection to what is necessary, disclose the chatbot's AI nature and lack of professional licensure at the start of each interaction and again after every three hours of continuous use, and implement suicide and self-harm crisis response protocols. Heightened requirements apply to AI companions and therapeutic chatbots used by minors, including commercially reasonable age-determination measures and parental notification when a minor's prompts indicate self-harm. Therapeutic chatbots may be made available to minors only under strict conditions, including a licensed professional's recommendation and supporting peer-reviewed clinical trial data. Enforcement rests exclusively with the attorney general, with civil penalties of up to $2,500 per violation ($7,500 per violation of an injunction) and a 30-day cure period except where imminent harm to a minor is involved. A safe harbor shields deployers that make commercially reasonable compliance efforts from liability for unforeseeable or emergent outputs.