HSB-647
IA · State · USA
● Pending
Proposed Effective Date
2026-07-01
Iowa House Study Bill 647 — A bill for an act relating to chatbots, including deployer requirements and interactions with minors
Summary

Imposes safety, data minimization, and minor-protection obligations on deployers of chatbots in Iowa. Deployers must maintain protocols to detect, respond to, report, and mitigate user harm, prioritizing user safety over deployer interests, and must limit data collection to what is necessary for the chatbot's purpose. Deployers must implement reasonable age verification to prevent minors from using or purchasing chatbots, with a narrow exception for mental health chatbots that meet stringent clinical, disclosure, and testing requirements including peer-reviewed clinical trial data and a licensed professional's recommendation. Enforcement is through the attorney general (civil penalties up to $2,500 per violation, $7,500 for injunction violations) and a private right of action for parents, guardians, or custodians of minors (statutory damages of $100–$750 or actual damages, whichever is greater).

Enforcement & Penalties
Enforcement Authority
The attorney general may bring an action on behalf of the state to enforce the chapter and may seek an injunction. A private right of action is available to the parent, guardian, or custodian of a minor who uses a chatbot that does not comply with the chapter. Standing for private plaintiffs requires the minor to have used a non-compliant chatbot; no cure period or safe harbor is specified.
Penalties
Attorney general enforcement: a civil penalty of not more than $2,500 per violation, or $7,500 if a person violates an injunction issued under the chapter. Penalties are deposited into the state general fund; the attorney general may also seek injunctive relief. Private right of action (parent, guardian, or custodian of a minor): a civil penalty of not less than $100 but not more than $750, or actual damages, whichever is greater. Statutory damages do not require proof of actual monetary harm.
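The "whichever is greater" recovery rule for the private right of action can be sketched as a small Python function. This is purely illustrative; the function name, parameters, and range check are hypothetical labels for the statutory figures, not anything drawn from the bill itself.

```python
def private_action_recovery(statutory_amount: float, actual_damages: float) -> float:
    """Greater of statutory damages ($100-$750) or actual damages.

    Illustrative sketch of the recovery rule only; not an official calculation.
    """
    if not 100 <= statutory_amount <= 750:
        raise ValueError("statutory damages must fall between $100 and $750")
    # The statute awards whichever figure is greater, so the statutory amount
    # acts as a floor even when no monetary harm can be proven.
    return max(statutory_amount, actual_damages)
```

Under this sketch, a plaintiff who proves no monetary harm still recovers the statutory amount, while a plaintiff with $2,000 in actual damages recovers the larger actual figure.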
Who Is Covered
"Deployer" means a person that owns an artificial intelligence available for public use.
What Is Covered
"Chatbot" means an artificial intelligence that interacts with users by simulating human conversation.
Compliance Obligations · 3 obligations
S-01 AI System Safety Program · S-01.4 · S-01.5 · Deployer · Chatbot
§ 554J.2(1)
Plain Language
Deployers must implement and maintain ongoing protocols for detecting, responding to, reporting, and mitigating harms their chatbot may cause users. The protocols must prioritize user safety and well-being over the deployer's commercial or business interests. This is a continuing operational obligation — not a one-time pre-launch check. The statute does not specify to whom harm must be reported, leaving that to the deployer's protocol design.
Statutory Text
A deployer of a chatbot shall do all of the following: 1. Implement and maintain protocols meant to detect, respond to, report, and mitigate harm the chatbot may cause a user in a manner that prioritizes the safety and well-being of users over the deployer's interests.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
§ 554J.2(2)
Plain Language
Deployers must minimize the user information their chatbot collects and stores, limiting it to what is necessary to fulfill the deployer's stated purpose for making the chatbot publicly available. This is a data minimization obligation — deployers cannot collect data beyond what is required for the chatbot's core purpose. Secondary uses or excessive retention are implicitly prohibited.
Statutory Text
A deployer of a chatbot shall do all of the following: 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
§ 554J.3(1)–(2)
Plain Language
Deployers must implement reasonable age verification (using government-issued ID, financial documents, or another widely accepted practice that reliably evidences age) to prevent minors from using or purchasing their chatbot. The default rule is a categorical prohibition on minor access. A narrow exception allows minor access only when all seven of the following conditions are met simultaneously:
(a) the chatbot was designed primarily for mental health support by diagnosing, treating, mitigating, or preventing a mental health condition;
(b) each interaction begins with a clear disclaimer that the chatbot is AI, not a licensed professional;
(c) a professional licensed under Iowa chapter 154B (psychology) or 154D (behavioral science) recommended the chatbot after evaluating the minor;
(d) the developer has significant documentation of how the chatbot was tested;
(e) peer-reviewed clinical trial data demonstrates safety and efficacy;
(f) the deployer disclosed the chatbot's functions, limitations, and data privacy policies to both the recommending professional and the minor's parents, guardians, or custodians; and
(g) the deployer implemented protocols for testing, risk identification, risk mitigation, and harm rectification.
All seven conditions must be satisfied; failure on any one means the minor access prohibition applies.
Statutory Text
1. A deployer shall implement reasonable age verification measures to ensure that a minor cannot use or purchase a chatbot the deployer makes publicly available.
2. Notwithstanding subsection 1, a deployer may make a chatbot available for a minor's use or purchase if all of the following apply:
a. The chatbot was designed for the primary purpose of providing mental health support, counseling, or therapy by diagnosing, treating, mitigating, or preventing a mental health condition.
b. The chatbot provides a clear and conspicuous disclaimer at the beginning of each interaction with the chatbot that the chatbot is an artificial intelligence and is not a licensed professional.
c. The chatbot was recommended for the minor's use by an individual licensed under chapter 154B or 154D after performing an evaluation of the minor.
d. The chatbot's developer has significant documentation of how the chatbot was tested.
e. Peer-reviewed clinical trial data exists demonstrating the chatbot would be a safe, effective tool for the minor's diagnosis, treatment, mitigation, or prevention of a mental health condition.
f. The chatbot's deployer provided clear disclosures of the chatbot's functions, limitations, and data privacy policies to the individual recommending the chatbot under paragraph "c", and to the minor's parents, guardians, or custodians.
g. The chatbot's deployer developed and implemented protocols for testing the chatbot for risks to users, identifying possible risks the chatbot poses to users, mitigating risks the chatbot poses to users, and quickly rectifying harm the chatbot may have caused a user.
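The all-or-nothing structure of the § 554J.3(2) exception can be sketched as a simple conjunction check. The class and field names below are illustrative labels for paragraphs (a) through (g), not statutory terms, and the sketch is a compliance-tracking aid, not a legal determination.

```python
from dataclasses import dataclass, fields

@dataclass
class MinorAccessException:
    """Illustrative checklist for the seven conditions in section 554J.3(2)."""
    mental_health_primary_purpose: bool         # (a) designed primarily for mental health support
    ai_disclaimer_each_interaction: bool        # (b) clear disclaimer at the start of each interaction
    licensed_professional_recommendation: bool  # (c) recommended after an evaluation of the minor
    significant_testing_documentation: bool     # (d) developer documentation of testing
    peer_reviewed_clinical_trial_data: bool     # (e) evidence of safety and efficacy
    disclosures_to_professional_and_parents: bool  # (f) functions, limitations, privacy policies
    risk_and_harm_protocols: bool               # (g) testing, risk mitigation, harm rectification

    def minor_access_permitted(self) -> bool:
        # All seven conditions must hold simultaneously; a single failure
        # restores the default prohibition on minor access.
        return all(getattr(self, f.name) for f in fields(self))
```

Flipping any one field to False makes `minor_access_permitted()` return False, mirroring the rule that the default prohibition applies unless every condition is met.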