HSB-647
IA · State · USA
● Pending
Iowa House Study Bill 647 — A bill for an act relating to chatbots, including deployer requirements and interactions with minors
Summary

Imposes safety, data minimization, and minor-protection obligations on deployers of chatbots in Iowa. Deployers must maintain harm-detection and response protocols prioritizing user safety and must limit user data collection and storage to what is necessary. Deployers must implement reasonable age verification to prevent minors from using or purchasing chatbots, with a narrow exception for mental health chatbots that meet seven cumulative conditions including clinical trial evidence, licensed professional recommendation, AI identity disclosure, developer testing documentation, deployer risk protocols, and disclosures to parents. Enforcement is through the attorney general (injunctive relief and civil penalties up to $2,500/$7,500 per violation) and a private right of action for parents, guardians, or custodians of minors ($100–$750 or actual damages).

Enforcement & Penalties
Enforcement Authority
The attorney general may bring an action on behalf of the state to enforce the chapter and may seek an injunction. A private right of action is available to a parent, guardian, or custodian of a minor who uses a chatbot that does not comply with the chapter. Standing requires the minor to have used a non-compliant chatbot; no cure period or safe harbor is specified.
Penalties
Attorney general enforcement: civil penalty of up to $2,500 per violation, or up to $7,500 per violation of an injunction issued under the chapter. Injunctive relief is available. Penalties are deposited into the state general fund. Private right of action by parent, guardian, or custodian of a minor: civil penalty of not less than $100 but not more than $750, or actual damages, whichever is greater. Statutory damages do not require proof of actual monetary harm.
Who Is Covered
"Deployer" means a person that owns an artificial intelligence available for public use.
What Is Covered
"Chatbot" means an artificial intelligence that interacts with users by simulating human conversation.
Compliance Obligations · 4 obligations
S-01 AI System Safety Program · S-01.4 · S-01.5 · Deployer · Chatbot
§ 554J.2(1)
Plain Language
Deployers of chatbots must establish and continuously maintain protocols that detect, respond to, report on, and mitigate harms the chatbot may cause users. The protocols must prioritize user safety and well-being over the deployer's own commercial or operational interests. This is a continuing operational obligation — not a one-time pre-deployment check. The statute does not specify the form, content, or review cadence of these protocols, giving deployers discretion on implementation details.
Statutory Text
A deployer of a chatbot shall do all of the following: 1. Implement and maintain protocols meant to detect, respond to, report, and mitigate harm the chatbot may cause a user in a manner that prioritizes the safety and well-being of users over the deployer's interests.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
§ 554J.2(2)
Plain Language
Deployers must apply data minimization principles to all user information collected by the chatbot. Collection and storage must be limited to what is necessary to fulfill the deployer's stated purpose for making the chatbot publicly available. This prohibits collecting user data for purposes beyond the chatbot's core function — secondary uses such as training other models, cross-product profiling, or advertising would need to be justified as necessary to the stated purpose. The statute does not define "necessary" or specify retention periods.
Statutory Text
A deployer of a chatbot shall do all of the following: 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
§ 554J.3(1)-(2)
Plain Language
Deployers must implement reasonable age verification — which the statute defines as government ID, financial documents, or a widely accepted age-evidencing practice — to prevent minors from using or purchasing their chatbots. The default rule is a complete bar on minor access. A narrow exception exists for mental health chatbots, but only if all seven conditions are met simultaneously: (1) the chatbot's primary purpose is mental health support/therapy; (2) a clear AI disclaimer is shown at each interaction; (3) a licensed psychologist or mental health professional recommended the chatbot after evaluating the specific minor; (4) the developer has significant testing documentation; (5) peer-reviewed clinical trial data supports the chatbot's safety and efficacy for that mental health use; (6) the deployer disclosed functions, limitations, and data privacy policies to both the recommending professional and the minor's parents/guardians; and (7) the deployer has risk-testing and harm-rectification protocols in place. Failure to meet any one condition means the minor-access prohibition applies.
Statutory Text
1. A deployer shall implement reasonable age verification measures to ensure that a minor cannot use or purchase a chatbot the deployer makes publicly available. 2. Notwithstanding subsection 1, a deployer may make a chatbot available for a minor's use or purchase if all of the following apply: a. The chatbot was designed for the primary purpose of providing mental health support, counseling, or therapy by diagnosing, treating, mitigating, or preventing a mental health condition. b. The chatbot provides a clear and conspicuous disclaimer at the beginning of each interaction with the chatbot that the chatbot is an artificial intelligence and is not a licensed professional. c. The chatbot was recommended for the minor's use by an individual licensed under chapter 154B or 154D after performing an evaluation of the minor. d. The chatbot's developer has significant documentation of how the chatbot was tested. e. Peer-reviewed clinical trial data exists demonstrating the chatbot would be a safe, effective tool for the minor's diagnosis, treatment, mitigation, or prevention of a mental health condition. f. The chatbot's deployer provided clear disclosures of the chatbot's functions, limitations, and data privacy policies to the individual recommending the chatbot under paragraph "c", and to the minor's parents, guardians, or custodians. g. The chatbot's deployer developed and implemented protocols for testing the chatbot for risks to users, identifying possible risks the chatbot poses to users, mitigating risks the chatbot poses to users, and quickly rectifying harm the chatbot may have caused a user.
Other · Chatbot · Minors
§ 554J.4(1)-(3)
Plain Language
This provision establishes two enforcement channels: (1) the attorney general may bring actions on behalf of the state and seek injunctions, with courts authorized to impose civil penalties of up to $2,500 per violation or $7,500 for violating an injunction; and (2) a parent, guardian, or custodian of a minor who used a non-compliant chatbot may bring a private action for the greater of $100–$750 in statutory damages or actual damages. Penalties collected by the state go to the general fund. This creates no new compliance obligation — it is the enforcement and remedy framework for the substantive obligations in §§ 554J.2 and 554J.3.
Statutory Text
1. The attorney general may bring an action on behalf of the state to enforce the provisions of this chapter and may seek an injunction for violations of this chapter. 2. a. A court may issue a civil penalty of not more than two thousand five hundred dollars for each violation of this chapter, or seven thousand five hundred dollars if a person violates an injunction issued under this chapter. b. Penalties assessed under this subsection shall be deposited into the general fund of the state. 3. The parent, guardian, or custodian of a minor who uses a chatbot that does not comply with this chapter may bring an action to recover a civil penalty of not less than one hundred dollars but not more than seven hundred fifty dollars or actual damages, whichever is greater.