HB-324
AL · State · USA
● Pending
Proposed Effective Date
2026-10-01
Alabama HB 324 — Relating to artificial intelligence (AI) chatbots; to require AI chatbot deployers to implement a reasonable age verification process and verify the age of all AI chatbot users; to provide prohibitions on the provision of certain AI chatbots to minors; to require AI chatbot deployers to provide alternative versions of the platform without human-like features to minors; to require AI chatbot deployers to adopt protocols for AI chatbots to detect and mitigate emergency situations; to limit the amount and type of information AI chatbot deployers are allowed to collect and store; to allow therapeutic AI chatbots meeting certain requirements to be made available to minors; to provide a private right of action for certain users; and to authorize the Attorney General to bring suit to enforce this act.
Summary

Imposes safety and access-control obligations on covered entities that own, operate, or make AI chatbots available to individuals in the United States. Requires reasonable age verification of all users, classification as minor or adult, and prohibition of 'human-like features' — including expressions of sentience, emotional relationship-building, and impersonation — for users classified as minors. Covered entities must implement emergency situation detection and response systems when users express intent to harm themselves or others. Data collection and storage are restricted to the minimum necessary for a legitimate purpose. Therapeutic chatbots may be made available to minors only if prescribed by a licensed mental health professional and supported by peer-reviewed clinical trial data. Creates a private right of action for minors (or parents/guardians) with statutory damages up to $750 per violation, and authorizes Attorney General enforcement with civil penalties up to $7,500 per intentional violation.

Enforcement & Penalties
Enforcement Authority
Dual enforcement. The Attorney General may bring an action against an operator upon complaint or otherwise whenever it appears a person has engaged in or is about to engage in prohibited acts. A minor who uses a noncompliant AI chatbot, or a parent or guardian acting on the minor's behalf, may bring a civil action individually or as a class action. No cure period or safe harbor is specified.
Penalties
Private right of action: greater of actual damages or statutory damages not to exceed $750 per violation, plus injunctive relief. Class actions are available. Attorney General enforcement: injunctive relief; civil penalties up to $2,500 per violation or up to $7,500 per intentional violation; and any other remedies the court deems appropriate. Statutory damages under the private right of action do not require proof of actual monetary harm.
Who Is Covered
COVERED ENTITY. Any person who owns, operates, or otherwise makes available an AI chatbot to individuals in the United States.
What Is Covered
AI CHATBOT. a. Any generative artificial intelligence interactive computer service or software application that: 1. Produces new expressive content or responses not fully predetermined by the developer or operator of the service or application; and 2. Accepts open-ended, natural language or multimodal user input and produces adaptive or context-responsive output. b. The term does not include an interactive computer service or software application that: 1. Limits the responses to contextualized replies; or 2. Is unable to respond on a range of topics outside of a narrow specified purpose.
Compliance Obligations · 8 obligations
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · ChatbotMinors
Section 2(a)-(b)
Plain Language
Covered entities must require all users to create accounts and undergo age verification before accessing an AI chatbot. For existing accounts, the entity must freeze the account until the user completes verification. For new accounts, verification must occur at sign-up. Each user must be classified as a minor or adult. The entity must also periodically re-verify previously verified accounts. Acceptable verification requires government-issued ID or a commercial age verification system, plus user confirmation of non-minor status — merely entering a birth date or inferring age from IP address or hardware identifiers does not qualify. Third-party verification contractors may be used but do not relieve the covered entity of liability.
Statutory Text
(a) Each covered entity shall require each individual accessing an AI chatbot to make a user account in order to use or otherwise interact with the AI chatbot. (b)(1) With respect to each existing user account of an AI chatbot, a covered entity shall: a. Freeze existing user accounts; b. Require that the user is age verified through a reasonable age verification process to restore the functionality of the account; and c. Classify each age-verified user as a minor or an adult based on the reasonable age verification process. (2) At the time an individual creates a new user account to use an AI chatbot, a covered entity shall: a. Require that each individual is age verified through a reasonable age verification process; and b. Classify each individual as a minor or an adult based on the reasonable age verification process. (3) A covered entity shall periodically review previously age-verified user accounts using a reasonable age verification process, subject to subsection (d).
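The account-gating sequence in subsection (b) can be sketched as a few state transitions. All class and function names below are illustrative, not drawn from the bill, and a real system would also need the periodic re-verification loop required by (b)(3):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AgeClass(Enum):
    MINOR = "minor"
    ADULT = "adult"

@dataclass
class UserAccount:
    user_id: str
    verified: bool = False
    age_class: Optional[AgeClass] = None
    frozen: bool = False

def freeze_unverified(account: UserAccount) -> None:
    # Sec. 2(b)(1)a: an existing account is frozen until age-verified.
    if not account.verified:
        account.frozen = True

def complete_verification(account: UserAccount, age_class: AgeClass) -> None:
    # Sec. 2(b)(1)b-c and 2(b)(2): functionality is restored only after a
    # reasonable age verification process classifies the user as minor or adult.
    account.verified = True
    account.age_class = age_class
    account.frozen = False

def may_access_chatbot(account: UserAccount) -> bool:
    # No verified classification, no chatbot access.
    return account.verified and not account.frozen
```

Note that under subsection (d), delegating `complete_verification` to a third-party vendor would not shift liability away from the covered entity.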
MN-01 Minor User AI Safety Protections · MN-01.5 · Deployer · ChatbotMinors
Section 2(c)
Plain Language
Covered entities must either (1) block minors from accessing any human-like features in their AI chatbots, or (2) provide an alternative version of the chatbot stripped of human-like features, if that is reasonable given the chatbot's purpose. 'Human-like features' is broadly defined to cover any expression suggesting sentience, emotions, desires, emotional relationship-building, impersonation, excessive praise fostering attachment, nudging the user to return for companionship, depicting nonverbal emotional support, or gating intimacy behind engagement or payment. Functional evaluations, generic social formalities, generic encouragement that does not create an ongoing bond, and neutral offers of further help are carved out.
Statutory Text
(c) Each covered entity shall: (1) Ensure that any AI chatbot operated or distributed by the platform does not make human-like features available to minors to use, interact with, purchase, or converse with; or (2) Provide an alternative version of the AI chatbot to minors without human-like features, if reasonable given the purpose of the AI chatbot.
S-04 AI Crisis Response Protocols · S-04.1 · Deployer · ChatbotMinors
Section 2(e)
Plain Language
Covered entities must implement and maintain systems that detect when a user indicates intent to harm themselves or others, and must promptly respond to, report, and mitigate such situations. The systems must prioritize user safety over the covered entity's other interests (e.g., commercial or engagement interests). This is a continuous operating obligation — not a one-time implementation exercise. Notably, this applies to all users, not just minors, and includes an obligation to 'report' emergency situations, though the statute does not specify to whom the report must be made.
Statutory Text
(e) Each covered entity shall implement and maintain reasonably effective systems to detect, promptly respond to, report, and mitigate emergency situations in a manner that prioritizes the safety and well-being of users over the covered entity's other interests.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · ChatbotMinors
Section 2(f)
Plain Language
Covered entities are restricted to collecting and storing only the minimum amount of user information needed for a legitimate purpose. The information must be sufficient for, relevant to, and the minimum needed for that purpose — a classic data minimization standard. Additionally, the collected information must not conflict with a 'trusted party's best interests,' though the statute does not define 'trusted party.' This applies to all data collection in the chatbot context, not only to minor users.
Statutory Text
(f) Each covered entity shall collect and store only information that does not conflict with a trusted party's best interests, which must be: (1) Sufficient to fulfill a legitimate purpose of the covered entity; (2) Relevant to the legitimate purpose of the covered entity; and (3) The minimum amount of information needed for the legitimate purpose of the covered entity.
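One way to operationalize the Section 2(f) minimization standard is a purpose-based whitelist; the purposes and field names below are invented for illustration, since the bill defines neither:

```python
# Hypothetical mapping of legitimate purposes to the minimum fields
# needed for each; the entries are assumptions, not statutory terms.
PURPOSE_FIELDS = {
    "age_verification": {"age_class", "verification_token"},
    "account_management": {"user_id", "email"},
}

def collectable_fields(purpose: str, requested: set) -> set:
    # Keep only fields that are relevant to, and the minimum needed for,
    # the stated legitimate purpose; everything else is dropped.
    allowed = PURPOSE_FIELDS.get(purpose, set())
    return requested & allowed
```

An unrecognized purpose yields an empty set, i.e., no collection at all, which mirrors the statute's default posture.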
Other · Deployer · ChatbotMinorsHealthcare
Section 3
Plain Language
Therapeutic chatbots may be made available to minors — an exception to the general ban on human-like features for minors — but only if six cumulative conditions are met: (1) clear disclaimer at every interaction that the chatbot is AI, not a licensed professional; (2) no marketing as a substitute for a human professional; (3) a licensed mental health professional prescribes and monitors the tool as part of a treatment plan; (4) peer-reviewed clinical trial data supports the tool's safety and efficacy; (5) the system's functions, limitations, and data policies are transparent to both the professional and the user; and (6) the covered entity establishes clear accountability for any harm. All six conditions must be satisfied simultaneously for the exception to apply.
Statutory Text
A therapeutic AI chatbot may be made available to minors if the therapeutic AI chatbot meets all of the following requirements: (1) The therapeutic AI chatbot provides a clear and conspicuous disclaimer, verbally or in writing, at the beginning of each interaction that the AI chatbot is an artificial intelligence and not a licensed professional. (2) The AI chatbot is not marketed or designated as a substitute for a human professional. (3) A licensed mental health professional assesses a user's suitability and prescribes the tool as part of a comprehensive treatment plan and monitors its use and impact. (4) The covered entity provides robust, independent, and peer-reviewed clinical trial data demonstrating the safety and efficacy of the tool for specific conditions and populations. (5) The system's functions, limitations, and data privacy policies are transparent to both the licensed mental health professional and the user. (6) The covered entity establishes clear lines of accountability for any harm caused by the therapeutic AI chatbot.
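Because the six Section 3 conditions are cumulative, eligibility reduces to a simple conjunction. A hedged sketch follows; the field names are paraphrases of the statutory conditions, not statutory terms:

```python
from dataclasses import dataclass

@dataclass
class TherapeuticChatbot:
    # Section 3(1)-(6), modeled as booleans for illustration only.
    discloses_ai_each_interaction: bool      # (1) disclaimer at each interaction
    not_marketed_as_substitute: bool         # (2) no substitute marketing
    prescribed_and_monitored: bool           # (3) clinician prescribes and monitors
    peer_reviewed_trial_data: bool           # (4) clinical trial evidence
    transparent_to_clinician_and_user: bool  # (5) transparency of functions/limits/data
    accountability_established: bool         # (6) clear lines of accountability

def available_to_minors(bot: TherapeuticChatbot) -> bool:
    # All six conditions are cumulative: a single failure closes the exception.
    return all((
        bot.discloses_ai_each_interaction,
        bot.not_marketed_as_substitute,
        bot.prescribed_and_monitored,
        bot.peer_reviewed_trial_data,
        bot.transparent_to_clinician_and_user,
        bot.accountability_established,
    ))
```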
Other · ChatbotMinors
Section 4
Plain Language
This provision creates the private right of action for minors (or parents/guardians acting on their behalf) to sue covered entities that violate the act. Available relief includes injunctive relief and the greater of actual damages or statutory damages up to $750 per violation. Class actions are permitted. This is an enforcement mechanism, not an independent compliance obligation.
Statutory Text
A minor who uses an AI chatbot that does not comply with the provisions of this act, or a parent or guardian acting on the minor's behalf, may bring a civil action on his or her own or in a class action to recover the following relief: (1) Injunctive relief. (2) Damages in an amount equal to the sum of any actual damages or statutory damages not to exceed seven hundred fifty dollars ($750) per violation, whichever is greater.
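Under one plausible reading of Section 4(2), recovery is the greater of actual damages or statutory damages of up to $750 per violation. A sketch of that arithmetic (how a court would actually compute the statutory figure may differ):

```python
STATUTORY_CAP = 750  # dollars per violation, Section 4(2)

def private_action_recovery(actual_damages: float, violations: int) -> float:
    # Statutory damages require no proof of actual monetary harm;
    # the plaintiff recovers whichever figure is greater.
    statutory = STATUTORY_CAP * violations
    return max(actual_damages, statutory)
```

For example, a plaintiff with $100 in actual damages and three violations would recover $2,250 under this reading, while one with $5,000 in actual damages would recover the $5,000 instead.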
Other · ChatbotMinors
Section 5
Plain Language
This provision authorizes the Alabama Attorney General to bring enforcement actions against operators who violate the act, either upon complaint or on the AG's own initiative. Available remedies include injunctive relief, civil penalties up to $2,500 per violation ($7,500 per intentional violation), and any other court-ordered remedies. This is an enforcement mechanism, not an independent compliance obligation.
Statutory Text
Whenever it appears to the Attorney General, either upon complaint or otherwise, that a person has engaged in or is about to engage in any of the acts or practices prohibited by this act, the Attorney General may bring an action against an operator to: (1) Enjoin the person from continuing the unlawful acts or practices; (2) Seek civil penalties of not more than two thousand five hundred dollars ($2,500) for a violation under this act or not more than seven thousand five hundred dollars ($7,500) for an intentional violation under this act; and (3) Seek any other remedies as the court may deem appropriate.
MN-01 Minor User AI Safety Protections · Deployer · ChatbotMinors
Section 2(d)
Plain Language
Covered entities may outsource age verification to third-party vendors, but this delegation does not transfer or reduce the covered entity's legal obligations or liability under the act. The covered entity remains fully responsible for compliance even when using a contractor. This clarifies the liability framework for the age verification obligation in Section 2(b) rather than creating a separate compliance obligation.
Statutory Text
(d) For purposes of subsection (b), a covered entity may contract with a third party to implement the covered entity's reasonable age verification process. However, the use of a third party for a reasonable age verification process shall not relieve the covered entity of its obligations or from liability under this act.