Iowa HF 2715 imposes safety, disclosure, and conduct requirements on deployers of public-facing chatbots, with heightened obligations for AI companions and therapeutic chatbots used by minors. Deployers must maintain harm-detection and mitigation protocols, limit data collection to what is necessary, disclose AI identity and non-licensure status at the start of every interaction and every three hours during continuous use, and implement crisis referral protocols for suicidal ideation and self-harm. Deployers of AI companions and therapeutic chatbots must also implement commercially reasonable age-determination measures and parental notification protocols for when a minor expresses suicidal ideation. Therapeutic chatbots may be made available to minors only under strict conditions, including a licensed professional's recommendation, peer-reviewed clinical trial data, and deployer safety-testing protocols. Enforcement rests exclusively with the attorney general, with civil penalties of up to $2,500 per violation ($7,500 per violation of an injunction) and a 30-day cure period, except in cases of imminent harm to minors.
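The disclosure cadence above (at the start of every interaction, then again every three hours of continuous use) can be sketched as a simple timing check. This is an illustrative implementation only: the `DisclosureTracker` class and its method names are hypothetical, as the bill specifies the cadence but not any particular mechanism.

```python
from datetime import datetime, timedelta

# HF 2715 cadence: disclose at interaction start, then every 3 hours of continuous use.
DISCLOSURE_INTERVAL = timedelta(hours=3)

class DisclosureTracker:
    """Hypothetical helper tracking when an AI-identity and
    non-licensure disclosure is next due in a chat session."""

    def __init__(self):
        self.last_disclosed: datetime | None = None

    def disclosure_due(self, now: datetime) -> bool:
        # Due at the start of every interaction...
        if self.last_disclosed is None:
            return True
        # ...and again once three hours of continuous use have elapsed.
        return now - self.last_disclosed >= DISCLOSURE_INTERVAL

    def mark_disclosed(self, now: datetime) -> None:
        self.last_disclosed = now
```

A deployer's chat loop would call `disclosure_due()` before rendering each response and, when it returns `True`, emit the required disclosure and call `mark_disclosed()`.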