HSB-611
IA · State · USA
● Pending
Proposed Effective Date
2025-07-01
Iowa House Study Bill 611 — A bill for an act establishing requirements and guidelines for chatbots, making appropriations, and providing civil penalties
Summary

Imposes safety and disclosure obligations on any person who designs, develops, or makes a chatbot available. Prohibits making a chatbot available with knowledge or reckless disregard that it encourages suicide, self-injury, or physical or sexual violence. Requires all chatbots to disclose their non-human identity at the start of each conversation and at thirty-minute intervals, to respond truthfully when asked if they are human, to disclose they do not provide professional services, and to not represent themselves as licensed professionals. Enforced exclusively by the Iowa attorney general, who may seek civil penalties up to $100,000 per violation, restitution, injunctive relief, or other appropriate relief. No private right of action is created.

Enforcement & Penalties
Enforcement Authority
Attorney general enforcement. The attorney general may bring a civil action to enjoin a violation of or enforce compliance with the chapter or rules adopted pursuant to the chapter. No private right of action is created. The attorney general is also directed to adopt implementing rules pursuant to chapter 17A.
Penalties
Civil penalties of not more than $100,000 per violation. The attorney general may also seek restitution or other appropriate relief. Penalties collected are credited to the general fund and appropriated to the attorney general for performing duties under the chapter.
Who Is Covered
Any person who designs, develops, or makes a chatbot available.
What Is Covered
"Chatbot" means any interactive computer service or software application that does all of the following:
a. Produces new expressive content or responses not fully predetermined by the developer or operator of the interactive computer service or software application.
b. Accepts open-ended, natural-language, or multimodal user input and produces adaptive or context-responsive output.
"Chatbot" does not include an interactive computer service or a software application described by all of the following:
a. The responses of the interactive computer service or software application are limited to information only contained within the interactive computer service or software application, including user input, except for information necessary to make the interactive computer service or software application coherent.
b. The interactive computer service or software application is only able to respond to topics in a narrow, specified field.
Compliance Obligations · 6 obligations
S-02 Prohibited Conduct & Output Restrictions · S-02.7 · Developer · Deployer · Chatbot
§ 554J.2(1)
Plain Language
Any person who designs, develops, or makes a chatbot available is prohibited from doing so if they know — or recklessly disregard the possibility — that the chatbot encourages, promotes, or coerces users to commit suicide, perform self-injury, or perform acts of physical or sexual violence against humans or animals. The mens rea threshold is knowledge or reckless disregard, not strict liability. This covers the full lifecycle: design, development, and deployment. The scope of prohibited conduct extends beyond self-harm to include encouragement of violence against others and animals, which is broader than typical self-harm-only provisions.
Statutory Text
It shall be unlawful for a person to design, develop, or make a chatbot available with the knowledge, or with reckless disregard for the possibility, that the chatbot encourages, promotes, or coerces a user to commit suicide, perform acts of self-injury, or perform acts of physical or sexual violence on humans or animals.
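The statute sets a knowledge or reckless-disregard standard rather than mandating any particular technical control, but one common mitigation against making prohibited outputs available is to screen candidate replies before they are emitted. The sketch below is illustrative only: `classify` stands in for an assumed external moderation model, and the category labels are hypothetical names keyed to the conduct the bill enumerates.

```python
# Hypothetical output-moderation hook. § 554J.2(1) does not require any
# specific filter; this sketch only shows one way a developer might screen
# candidate replies against the categories the bill names.
PROHIBITED_CATEGORIES = {
    "encouragement_of_suicide",
    "encouragement_of_self_injury",
    "encouragement_of_physical_violence",
    "encouragement_of_sexual_violence",
}

def is_emittable(candidate_reply: str, classify) -> bool:
    """classify() is an assumed moderation model mapping text to a set of
    category labels; refuse to emit a reply flagged with any prohibited
    category."""
    return not (classify(candidate_reply) & PROHIBITED_CATEGORIES)
```

A screening layer like this is evidence of diligence, not a safe harbor: the provision turns on what the person knew or recklessly disregarded, so the filter's coverage and the developer's awareness of its gaps both matter.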
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Developer · Deployer · Chatbot
§ 554J.2(2)(a)
Plain Language
Every chatbot must provide a clear and conspicuous disclosure that the user is interacting with a chatbot — not a human — at two points: (1) at the beginning of each conversation, and (2) at thirty-minute intervals during the conversation. This is an unconditional requirement — it applies regardless of whether a reasonable person would be misled. The thirty-minute interval is notably more frequent than the three-hour interval in comparable legislation such as CA SB 243.
Statutory Text
Each chatbot shall meet all of the following requirements:
a. Clearly and conspicuously disclose that the chatbot is a chatbot and not a human being at the beginning of each conversation and at thirty-minute intervals.
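The timing rule above (disclose at the start of each conversation and every thirty minutes thereafter) reduces to simple elapsed-time bookkeeping per session. A minimal sketch, assuming a per-conversation wrapper and an injectable clock; the class name, disclosure wording, and clock parameter are all hypothetical:

```python
import time

DISCLOSURE = "You are chatting with an automated chatbot, not a human being."

# Hypothetical per-conversation tracker for § 554J.2(2)(a): issue the
# identity disclosure at the start of the conversation and re-issue it
# once thirty minutes have elapsed since the last disclosure.
class DisclosureScheduler:
    INTERVAL_SECONDS = 30 * 60  # thirty minutes

    def __init__(self, clock=time.monotonic):
        self._clock = clock          # injectable clock for testing
        self._last_disclosed = None  # None => conversation not yet started

    def maybe_disclose(self):
        """Return the disclosure text if one is due, else None."""
        now = self._clock()
        if (self._last_disclosed is None
                or now - self._last_disclosed >= self.INTERVAL_SECONDS):
            self._last_disclosed = now
            return DISCLOSURE
        return None
```

In use, the caller would check `maybe_disclose()` before sending each bot message and prepend the returned text when it is non-None, so a long-running conversation carries the disclosure at least every thirty minutes.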
T-01 AI Identity Disclosure · T-01.3 · Developer · Deployer · Chatbot
§ 554J.2(2)(b)
Plain Language
Chatbots must be programmed so they cannot claim to be human and must respond truthfully when a user asks whether the chatbot is a human. This is both a proactive design requirement (prevent claiming to be human) and an on-demand disclosure obligation (respond accurately when asked). The obligation is framed as a programming requirement, meaning it must be built into the chatbot's behavior, not merely addressed through a terms-of-service disclosure.
Statutory Text
Be programmed to prevent the chatbot from claiming to be a human or respond deceptively when asked by a user if the chatbot is a human.
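Because this is framed as a programming requirement, one illustrative shape is an override layer that intercepts identity questions before the model answers. The keyword pattern below is a deliberately crude stand-in: a production system would likely use an intent classifier, and the function name and reply text are assumptions, not anything the bill prescribes.

```python
import re

TRUTHFUL_IDENTITY_REPLY = "No, I am not a human. I am a chatbot."

# Hypothetical heuristic for detecting "are you human?"-style questions,
# illustrating the on-demand half of § 554J.2(2)(b): answer truthfully
# when a user asks whether the chatbot is a human.
_IDENTITY_QUESTION = re.compile(
    r"\bare\s+you\s+(a\s+)?(human|person|real|bot|ai|chatbot)\b",
    re.IGNORECASE,
)

def identity_override(user_message: str):
    """Return a truthful identity reply if the user asks, else None."""
    if _IDENTITY_QUESTION.search(user_message):
        return TRUTHFUL_IDENTITY_REPLY
    return None
```

The proactive half of the provision (never claiming to be human in the first place) is not satisfied by an override like this alone; it also requires constraining the model's own generations, for example through system instructions and output checks.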
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Developer · Deployer · Chatbot
§ 554J.2(2)(c)-(d)
Plain Language
Chatbots must (1) clearly and conspicuously disclose at the beginning of each conversation and at regular intervals that they do not provide medical, legal, financial, or psychological services and that users should consult a licensed professional for such services, and (2) be programmed to prevent the chatbot from representing itself as a licensed professional of any type — including therapists, physicians, lawyers, and financial advisors. The first obligation is a recurring disclosure requirement; the second is a design-level prohibition. Both target the same risk: users mistaking chatbot output for professional advice or service.
Statutory Text
c. Clearly and conspicuously disclose that the chatbot does not provide medical, legal, financial, or psychological services and that the user should consult a licensed professional for such services at the beginning of each conversation and at regular intervals.
d. Be programmed to prevent the chatbot from representing that the chatbot is a licensed professional, including but not limited to a therapist, physician, lawyer, financial advisor, or other professional.
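Paragraph (d)'s design-level prohibition can be sketched as an output filter over draft replies. The phrase list below is illustrative only (the statute's "including but not limited to" language means any fixed list is incomplete), and a real deployment would pair it with model-level instructions and a trained classifier; the disclaimer string is likewise an assumed wording for paragraph (c), not statutory text.

```python
PROFESSIONAL_DISCLAIMER = (
    "This chatbot does not provide medical, legal, financial, or "
    "psychological services. Please consult a licensed professional "
    "for such services."
)

# Hypothetical output filter for § 554J.2(2)(d): flag draft replies in
# which the chatbot represents itself as a licensed professional. The
# phrase list is a non-exhaustive illustration.
_SELF_REPRESENTATION_PHRASES = (
    "i am a licensed",
    "i am a therapist",
    "i am a physician",
    "i am a lawyer",
    "i am a financial advisor",
    "as your doctor",
    "as your attorney",
)

def claims_professional_status(candidate_reply: str) -> bool:
    """True if the draft reply appears to claim licensed-professional status."""
    lowered = candidate_reply.lower()
    return any(phrase in lowered for phrase in _SELF_REPRESENTATION_PHRASES)
```

Note the asymmetry the filter must preserve: telling a user to consult a lawyer is exactly what paragraph (c) requires, while the chatbot calling itself a lawyer is what paragraph (d) forbids.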
Other · Chatbot
§ 554J.3(1)-(3)
Plain Language
This section establishes the enforcement and penalty structure for violations of § 554J.2. Violators face civil penalties up to $100,000 per violation. The attorney general may bring civil actions for injunctive relief, civil penalties, restitution, or other appropriate relief. Collected penalties are deposited into the state general fund and appropriated to the attorney general. This creates no new compliance obligation — it is the enforcement mechanism for the chatbot requirements.
Statutory Text
1. A person found in violation of section 554J.2 shall be fined not more than one hundred thousand dollars for each violation.
2. a. The attorney general may bring a civil action to enjoin a violation of or enforce compliance with this chapter or rules adopted pursuant to this chapter.
b. In a case brought pursuant to this subsection, the attorney general may seek civil penalties, restitution, or other appropriate relief.
3. Penalties collected under this section shall be credited to the general fund of the state and appropriated to the attorney general for the purpose of performing duties under this chapter.
Other · Chatbot
§ 554J.4
Plain Language
The attorney general is required to adopt implementing rules under Iowa's Administrative Procedure Act (chapter 17A). This delegates rulemaking authority to the attorney general and signals that the operative requirements in § 554J.2 may be supplemented by administrative rules. This creates no new compliance obligation on chatbot developers or operators — it is a directive to the attorney general.
Statutory Text
The attorney general shall adopt rules pursuant to chapter 17A to implement this chapter.