HF-2204
IA · State · USA
● Pre-filed
Proposed Effective Date
2025-07-01
Iowa House File 2204 — A bill for an act relating to the requirements for chatbot deployers, including required protocols, limitations on data collection, and requirements for minors to interact with artificial intelligence companions and therapeutic chatbots, and providing civil penalties, punitive penalties, and civil causes of action
Summary

Imposes safety, data minimization, and minor-protection obligations on deployers of chatbots in Iowa. All chatbot deployers must implement and maintain harm-detection and harm-mitigation protocols that prioritize user safety over the deployer's interests, and must limit data collection and storage to what is necessary for the chatbot's purpose. Deployers must implement reasonable age verification to prevent minors from using or purchasing AI companions. Chatbots designed to impersonate real individuals (living or deceased) may not be made publicly available without consent from the individual or their estate, subject to narrow exceptions for educational, research, artistic, cultural, or political value. Therapeutic chatbots may not be made available to minors unless stringent conditions are met, including a licensed professional recommendation, peer-reviewed clinical trial data, testing documentation, and clear disclosures to parents. Enforced by the attorney general (civil penalties up to $2,500 per violation) and through a private right of action available to minors (punitive damages of $100–$750 or actual damages, plus emotional distress damages, costs, and attorney fees).

Enforcement & Penalties
Enforcement Authority
Attorney general enforcement. The attorney general may bring an action on behalf of the state to enforce the chapter and may seek an injunction for violations. A minor who uses a chatbot that does not comply with the chapter may also bring a civil action to recover damages. No cure period or safe harbor is specified.
Penalties
AG enforcement: civil penalty of up to $2,500 per violation, or $7,500 per violation of an injunction issued under the chapter; penalties deposited into the state general fund; injunctive relief available. Private right of action (minors only): the greater of punitive damages of not less than $100 but not more than $750, or actual damages; plus emotional distress damages, court costs, and reasonable attorney fees. Statutory punitive damages do not require proof of actual monetary harm.
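For a sense of how the private-right-of-action amounts combine, here is a back-of-the-envelope sketch; the dollar inputs are hypothetical, and only the $100 to $750 statutory range comes from the bill.

```python
def minor_recovery(actual_damages: float, emotional_distress: float,
                   court_costs: float, attorney_fees: float,
                   statutory_punitive: float = 750.0) -> float:
    """Greater of statutory punitive damages ($100-$750) or actual damages,
    plus emotional distress damages, court costs, and reasonable attorney fees."""
    statutory_punitive = min(max(statutory_punitive, 100.0), 750.0)
    return (max(statutory_punitive, actual_damages)
            + emotional_distress + court_costs + attorney_fees)


# Hypothetical figures: no provable monetary loss, $2,000 emotional distress,
# $300 costs, $1,500 fees -> 750 + 2,000 + 300 + 1,500 = 4,550
print(minor_recovery(0.0, 2000.0, 300.0, 1500.0))
```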
Who Is Covered
"Deployer" means a person that owns an artificial intelligence available for public use.
What Is Covered
"Chatbot" means an artificial intelligence that is made to simulate human conversation with a user through text or audio output.
"AI companion" means a chatbot that interacts with users to simulate a human-like romantic or emotional bond.
"Therapeutic chatbot" means a chatbot designed for the primary purpose of providing mental health support, counseling, or therapy by diagnosing, treating, mitigating, or preventing a mental health condition.
Compliance Obligations · 5 obligations
S-01 AI System Safety Program · S-01.4 · S-01.5 · Deployer · Chatbot
§ 554J.2(1)
Plain Language
Every deployer of a chatbot must establish, implement, and continuously maintain protocols designed to detect potential harms the chatbot may cause users, respond to those harms, report on them, and mitigate them. The statute expressly requires that these protocols prioritize user safety and well-being over the deployer's own commercial or operational interests. This is a continuing operational obligation — not a one-time pre-launch exercise.
Statutory Text
A deployer of a chatbot shall do all of the following: 1. Implement and maintain protocols meant to detect, respond to, report, and mitigate harm the chatbot may cause a user in a manner that prioritizes the safety and well-being of users over the deployer's interests.
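The statute does not prescribe any particular technical design for these protocols. As an illustrative sketch only (the class names, harm taxonomy, and escalation threshold below are hypothetical, not drawn from the bill), a deployer might wire detection, reporting, response, and mitigation into the chatbot's message pipeline roughly as follows.

```python
# Illustrative sketch only: the bill does not mandate this design. The harm
# taxonomy, severity scale, and threshold are hypothetical examples of one way
# a deployer could operationalize the detect / respond / report / mitigate duties.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class HarmSignal:
    category: str      # e.g. "self_harm", "harassment" (hypothetical taxonomy)
    severity: float    # 0.0 to 1.0, produced by whatever classifier the deployer runs
    user_id: str


class HarmProtocol:
    """Detect, report, respond to, and mitigate potential harm to a user."""

    def __init__(self, classifier, incident_log, escalation_threshold: float = 0.7):
        self.classifier = classifier          # model or rule set that scores messages
        self.incident_log = incident_log      # durable store reviewed by safety staff
        self.escalation_threshold = escalation_threshold

    def process(self, user_id: str, message: str) -> str | None:
        signal = self.classifier.score(user_id, message)          # detect
        if signal is None:
            return None
        self.incident_log.append({                                # report
            "time": datetime.now(timezone.utc).isoformat(),
            "category": signal.category,
            "severity": signal.severity,
            "user_id": signal.user_id,
        })
        if signal.severity >= self.escalation_threshold:          # respond / mitigate
            return ("This conversation is being paused out of concern for your "
                    "safety. Support resources are available to you right now.")
        return None
```

The one design constraint the statute does fix is priority: when a safety response and the deployer's interests conflict (for example, interrupting an engagement-driving conversation), the protocol must resolve the conflict in the user's favor.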
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
§ 554J.2(2)
Plain Language
Deployers must practice data minimization — they may only collect and store user information gathered through the chatbot to the extent necessary to fulfill the stated purpose for which the chatbot is made publicly available. Information collected beyond what is necessary for that purpose violates this provision. There is no exception for secondary uses or separate justification.
Statutory Text
A deployer of a chatbot shall do all of the following: 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
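The provision itself is technology-neutral. One way to make the limit auditable in software, sketched below under stated assumptions (the purpose label and field names are hypothetical, not taken from the bill), is to enforce a purpose-bound allowlist at the point where the chatbot persists user data.

```python
# Hypothetical purpose-bound data minimization filter; the purpose registry and
# field names are illustrative, not taken from the statute.
ALLOWED_FIELDS_BY_PURPOSE = {
    # Only fields necessary to fulfill the deployer's stated purpose for the chatbot.
    "customer_support": {"conversation_text", "order_id", "session_id"},
}


def store_user_record(storage, purpose: str, record: dict) -> dict:
    """Persist only the fields necessary for the declared purpose; drop the rest."""
    allowed = ALLOWED_FIELDS_BY_PURPOSE.get(purpose, set())
    minimized = {k: v for k, v in record.items() if k in allowed}
    storage.write(minimized)     # nothing outside the allowlist is ever stored
    return minimized
```

Logging which field names (not values) were discarded at write time can help document that collection and storage stayed within the stated purpose.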
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
§ 554J.3(1)
Plain Language
Deployers of AI companions must implement reasonable age verification to prevent anyone under 18 from using or purchasing an AI companion. The statute defines acceptable verification methods: government-issued ID, financial documents reliably evidencing age, or a widely accepted practice that reliably evidences age. This is a categorical prohibition on minor access to AI companions — there is no parental consent exception. Note this obligation applies only to AI companions (chatbots simulating romantic or emotional bonds), not to all chatbots.
Statutory Text
1. A deployer shall implement reasonable age verification measures to ensure that a minor cannot use or purchase an AI companion the deployer makes publicly available.
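The bill names the acceptable evidence of age but not how a deployer must integrate it. A minimal gating sketch follows; the verifier object and its result fields are hypothetical stand-ins for whatever identity-verification service the deployer uses, while the three evidence types mirror the verification methods described in the plain-language summary above and the 18-year threshold follows from the bill's treatment of minors.

```python
# Hypothetical age-verification gate for an AI companion. The verifier interface
# and result fields are illustrative, not specified by the bill.
from enum import Enum


class EvidenceType(Enum):
    GOVERNMENT_ID = "government_id"
    FINANCIAL_DOCUMENT = "financial_document"
    ACCEPTED_PRACTICE = "widely_accepted_practice"


def may_access_ai_companion(verifier, user_id: str) -> bool:
    """Allow use or purchase of an AI companion only with reliable evidence of age 18+."""
    result = verifier.check(user_id)   # assumed to return None or an object with
    if result is None:                 # .evidence (an EvidenceType) and .age (int)
        return False
    return isinstance(result.evidence, EvidenceType) and result.age >= 18
```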
CP-02 Non-Consensual Intimate Imagery · CP-02.4 · Deployer · Chatbot
§ 554J.3(2)
Plain Language
Deployers may not make publicly available any chatbot that was knowingly designed to impersonate a real person — living or dead — unless they first obtain express permission. For living individuals, permission must come from the person themselves or their legal representative. For deceased individuals, permission must come from whoever is responsible for the estate. A narrow exception exists for deceased individuals with no estate representative: the chatbot may be deployed without permission only if it was designed solely as an educational or research tool, or if a reasonable person would believe it has objective artistic, cultural, or political value. No comparable exception exists for living individuals or for deceased individuals whose estates have a responsible person.
Statutory Text
2. A deployer shall not make a chatbot publicly available if the chatbot was knowingly designed to impersonate a real individual, regardless of whether the individual is living or deceased, unless the deployer first obtains permission to impersonate the individual from any of the following:
a. For a living individual, from the individual or the individual's legal representative.
b. For a deceased individual, from the person responsible for the deceased individual's estate.
If no person is responsible for the deceased individual's estate, a deployer may make a chatbot that was designed to knowingly impersonate a deceased individual publicly available without permission if the chatbot was designed solely as an educational or research tool or if a reasonable person would believe the chatbot has objective artistic, cultural, or political value.
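Because the permission requirement and its single exception are pure decision logic, they can be expressed compactly. The sketch below is illustrative only; the record fields are shorthand for facts a deployer would establish and document before publication, not statutory terms.

```python
# Hypothetical pre-publication check for a chatbot knowingly designed to
# impersonate a real individual; field names are illustrative shorthand.
from dataclasses import dataclass


@dataclass
class ImpersonationFacts:
    permission_obtained: bool                 # from the individual, a legal
                                              # representative, or the estate
    subject_is_deceased: bool = False
    estate_has_responsible_person: bool = True
    solely_educational_or_research: bool = False
    objective_artistic_cultural_or_political_value: bool = False


def may_publish(facts: ImpersonationFacts) -> bool:
    if facts.permission_obtained:
        return True
    # The only no-permission path: a deceased individual with no one responsible
    # for the estate, where the chatbot fits one of the narrow purpose exceptions.
    if facts.subject_is_deceased and not facts.estate_has_responsible_person:
        return (facts.solely_educational_or_research
                or facts.objective_artistic_cultural_or_political_value)
    return False
```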
MN-01 Minor User AI Safety Protections · MN-01.6 · Deployer · Chatbot · Minors · Healthcare
§ 554J.3(3)
Plain Language
Deployers face a near-prohibition on making therapeutic chatbots available to minors, subject to six cumulative conditions that must all be satisfied:
1. The chatbot must display a clear and conspicuous disclaimer at the start of each interaction stating that it is AI and not a licensed professional.
2. A licensed psychologist (chapter 154B) or mental health professional (chapter 154D) must have evaluated the minor and recommended the chatbot.
3. The developer must have significant documentation of how the chatbot was tested.
4. Peer-reviewed clinical trial data must demonstrate the chatbot is a safe and effective tool for the minor's specific mental health condition.
5. The deployer must have provided clear disclosures of the chatbot's functions, limitations, and data privacy policies to both the recommending licensed professional and the minor's parents, guardians, or custodians.
6. The deployer must have developed and implemented protocols for testing for risks, identifying risks, mitigating risks, and quickly rectifying any harm caused.
All six conditions must be met; failure on any one means the therapeutic chatbot cannot be made available to the minor.
Statutory Text
3. A deployer shall not make a therapeutic chatbot available for a minor's use or purchase unless all of the following apply:
a. The therapeutic chatbot provides a clear and conspicuous disclaimer at the beginning of each interaction with the therapeutic chatbot that the therapeutic chatbot is an artificial intelligence and is not a licensed professional.
b. The therapeutic chatbot was recommended for the minor's use by an individual licensed under chapter 154B or 154D after performing an evaluation of the minor.
c. The therapeutic chatbot's developer has significant documentation of how the therapeutic chatbot was tested.
d. Peer-reviewed clinical trial data exists demonstrating the therapeutic chatbot would be a safe, effective tool for the minor's diagnosis, treatment, mitigation, or prevention of a mental health condition.
e. The therapeutic chatbot's deployer provided clear disclosures of the chatbot's functions, limitations, and data privacy policies to the individual recommending the therapeutic chatbot under paragraph "b", and to the minor's parents, guardians, or custodians.
f. The therapeutic chatbot's deployer developed and implemented protocols for testing the therapeutic chatbot for risks to users, identifying possible risks the therapeutic chatbot poses to users, mitigating risks the therapeutic chatbot poses to users, and quickly rectifying harm the therapeutic chatbot may have caused a user.
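Since all six conditions are cumulative, a deployer's availability check reduces to a conjunction. The sketch below is illustrative; the field names are shorthand for the statutory conditions, and how each fact is established and documented is left to the deployer.

```python
# Hypothetical gate mirroring the six cumulative conditions for offering a
# therapeutic chatbot to a minor; field names are illustrative shorthand.
from dataclasses import dataclass


@dataclass
class TherapeuticOfferFacts:
    ai_disclaimer_each_interaction: bool            # (a) clear and conspicuous, at session start
    licensed_recommendation_after_evaluation: bool  # (b) chapter 154B or 154D licensee
    developer_testing_documentation: bool           # (c)
    peer_reviewed_clinical_trial_data: bool         # (d) safe and effective for the condition
    disclosures_to_recommender_and_parents: bool    # (e) functions, limitations, privacy
    risk_testing_mitigation_rectification_protocols: bool  # (f)


def may_offer_to_minor(facts: TherapeuticOfferFacts) -> bool:
    """All six conditions must hold; failing any one bars availability to the minor."""
    return all((
        facts.ai_disclaimer_each_interaction,
        facts.licensed_recommendation_after_evaluation,
        facts.developer_testing_documentation,
        facts.peer_reviewed_clinical_trial_data,
        facts.disclosures_to_recommender_and_parents,
        facts.risk_testing_mitigation_rectification_protocols,
    ))
```

Note that condition (a) is a runtime behavior, a disclaimer at the start of every interaction, rather than a one-time fact, so in practice it would be enforced by the chatbot's session logic rather than a static compliance record.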