Imposes safety, data-minimization, and minor-protection obligations on deployers of chatbots in Iowa. All chatbot deployers must maintain harm-detection and mitigation protocols that prioritize user safety over deployer interests, and must limit data collection to what is necessary for the chatbot's stated purpose. Deployers must implement reasonable age verification to prevent minors from using AI companions. A chatbot knowingly designed to impersonate a real individual, living or deceased, may not be made publicly available without that individual's (or their estate's) consent, subject to narrow exceptions for chatbots of educational, research, artistic, cultural, or political value. Therapeutic chatbots may be made available to minors only if multiple conditions are satisfied, including a licensed professional's recommendation, peer-reviewed clinical trial data, and deployer safety protocols. Enforcement is through civil actions by the attorney general (up to $2,500 per violation) and a private right of action for minors (punitive damages of $100–$750 or actual damages, plus emotional distress damages and attorney fees). The chapter does not apply to chatbots that provide only generic responses where a reasonable person would not expect the responses to create an emotional bond.