HB-4083
OK · State · USA
● Pending
Proposed Effective Date
2026-11-01
Oklahoma House Bill 4083 — An Act relating to technology; providing definitions; directing deployers of chatbots to ensure AI chatbots do not make human-like features available to minors; directing deployers to implement reasonable age verification systems; permitting deployers to provide alternative versions of chatbot without human-like features; directing deployers to ensure social AI companions are not available to minors; providing exemptions for certain therapeutic chatbots; directing deployers to implement and maintain effective systems to detect emergency situations; directing deployers to only collect information that does not conflict with a trusting party's best interest; directing the Attorney General to bring action against businesses or persons who are in violation; creating a private right of action; providing for codification; and providing an effective date.
Summary

Imposes obligations on deployers of AI chatbots to protect minors from human-like features — defined broadly to include simulated sentience, emotional relationship-building, and impersonation of real persons. Deployers must ensure chatbots do not make human-like features available to minors and must implement reasonable age verification systems. Social AI companions are categorically prohibited for minors. A narrow exemption allows therapeutic chatbots to be made available to minors only under licensed professional supervision with peer-reviewed clinical evidence. Deployers must also maintain emergency detection and response systems and limit data collection to what is adequate, relevant, and necessary. Enforcement is through the Attorney General and a private right of action for minors or their parents/guardians with statutory damages of $100–$750 per user per incident.

Enforcement & Penalties
Enforcement Authority
Attorney General enforcement. The Attorney General may bring a civil action against any business or person that violates the act. A private right of action is available to any minor who uses a noncompliant chatbot, or to a parent or guardian acting on the minor's behalf, individually or on a class-wide basis. No cure period or safe harbor is specified.
Penalties
AG enforcement: injunction, disgorgement of unjust gains, and civil penalties of up to $2,500 per violation or $7,500 per intentional violation. Private right of action: damages of not less than $100 and not greater than $750 per user per incident, or actual damages, whichever is greater; injunctive or declaratory relief also available. Statutory damages do not require proof of actual monetary harm.
Who Is Covered
"Deployers" means any person, partnership, state or local governmental agency, corporation, or developer that operates or distributes a chatbot.
What Is Covered
"Chatbot" means a generative artificial intelligence system with which users can interact by or through an interface that approximates or simulates conversation through a text, audio, or visual medium;
"Social AI Companion" means generative artificial intelligence systems that are specifically designed, marketed, or optimized to form ongoing social or emotional bonds with users, whether or not such systems also provide information, complete tasks, or assist with specific functions;
"Therapy chatbot" means any chatbot modified or designed with a primary purpose of providing mental health support, counseling, or therapeutic intervention through the diagnosis, treatment, mitigation, or prevention of mental health conditions;
Compliance Obligations · 7 obligations
MN-01 Minor User AI Safety Protections · MN-01.5 · Deployer · Chatbot · Minors
75A O.S. § 701(A)(1), (A)(3)
Plain Language
Deployers must ensure that no generative AI chatbot they operate or distribute makes human-like features available to minors. Human-like features include simulated sentience or emotions, emotional relationship-building behaviors (such as inviting attachment, nudging users to return for companionship, excessive praise, or pay-gated intimacy), and impersonation of real persons — but exclude functional evaluations, generic social formalities, and neutral offers of further help. Deployers may provide a stripped-down version of the chatbot without human-like features for minors and unverified users.
Statutory Text
A. Each deployer: 1. Shall ensure that any generative AI chatbot operated or distributed by the deployer does not make human-like features available to minors to use, interact with, purchase, or converse with; ... 3. May, if reasonable given the purpose of the chatbot, provide an alternative version of the chatbot available to minors and non-verified users without human-like features.
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
75A O.S. § 701(A)(2)
Plain Language
Deployers must implement reasonable age verification systems to prevent minors from accessing chatbots with human-like features. The statute does not specify the particular verification method required — it must be 'reasonable.' This is the operational mechanism by which the substantive prohibition on minors accessing human-like features is enforced.
Statutory Text
2. Shall implement reasonable age verification systems to ensure that generative AI chatbots with human-like features are not provisioned to minors;
MN-01 Minor User AI Safety Protections · MN-01.1 · MN-01.6 · Deployer · Chatbot · Minors
75A O.S. § 701(B)(1)-(2)
Plain Language
Social AI companion systems — those specifically designed, marketed, or optimized to form ongoing social or emotional bonds — are categorically prohibited for minors. Unlike subsection A (which prohibits only human-like features), subsection B prohibits the entire companion product for minors, even if a stripped-down version without human-like features could theoretically be offered. Deployers must also implement reasonable age verification to enforce this prohibition. This is a more restrictive standard than the general chatbot rule in subsection A.
Statutory Text
B. Deployers operating generative AI systems that primarily function as companions shall: 1. Ensure that any such chatbots operated or distributed by the deployer are not available to minors to use, interact with, purchase, or converse with; and 2. Implement reasonable age verification systems to ensure that such chatbots are not provisioned to minors.
HC-02 AI in Licensed Professional Practice Restrictions · HC-02.1 · HC-02.2 · HC-02.3 · HC-02.5 · Deployer · Developer · Professional · Chatbot · Healthcare · Minors
75A O.S. § 701(C)(1)-(5)
Plain Language
Therapeutic chatbots may only be made available to minors if all five conditions are satisfied: (1) the chatbot displays a clear, conspicuous AI disclaimer at the start of every interaction stating it is not a licensed professional; (2) the chatbot is not marketed as a substitute for a human professional; (3) a licensed mental health professional assesses the minor user's suitability, prescribes the tool within a comprehensive treatment plan, and monitors use; (4) developers produce robust, independent, peer-reviewed clinical trial data on safety and efficacy for the specific conditions and populations served; and (5) the system's functions, limitations, and data privacy policies are transparent to both the professional and user, with clear accountability lines. This is a conditional exemption from the general prohibition — if any condition is unmet, the therapeutic chatbot cannot be used by minors.
Statutory Text
C. Therapeutic chatbots that meet all of the following requirements may be made available to minors: 1. The chatbot provides a clear and conspicuous disclaimer at the beginning of each individual interaction that it is AI and not a licensed professional; 2. The chatbot is not marketed or designated as a substitute for a human professional; 3. A licensed mental health professional (such as a clinical psychologist) assesses a user's suitability and prescribes the tool as part of a comprehensive treatment plan, and monitors its use and impact; 4. Developers provide robust, independent, peer-reviewed clinical trial data demonstrating both the safety and efficacy of the tool for specific conditions and populations; and 5. The system's functions, limitations, and data privacy policies are transparent to both the licensed mental health professional and the user. Clear lines of accountability are established for any harms caused by the system.
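The exemption is conjunctive: all five conditions must hold simultaneously, and failure of any one removes the chatbot from minors' availability. A minimal sketch of that logic, with field names invented for illustration (they do not appear in the statute):

```python
from dataclasses import dataclass, fields

@dataclass
class TherapyChatbotExemption:
    """Models the five cumulative conditions of 75A O.S. § 701(C).
    Attribute names are illustrative, not statutory terms."""
    ai_disclaimer_each_interaction: bool        # § 701(C)(1)
    not_marketed_as_substitute: bool            # § 701(C)(2)
    professional_prescribed_and_monitored: bool # § 701(C)(3)
    peer_reviewed_clinical_evidence: bool       # § 701(C)(4)
    transparent_with_accountability: bool       # § 701(C)(5)

    def available_to_minors(self) -> bool:
        # Conjunctive test: every condition must be satisfied.
        return all(getattr(self, f.name) for f in fields(self))

compliant = TherapyChatbotExemption(True, True, True, True, True)
missing_evidence = TherapyChatbotExemption(True, True, True, False, True)
```

Here `compliant.available_to_minors()` returns `True`, while `missing_evidence` fails on the clinical-evidence condition alone and returns `False`.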
S-04 AI Crisis Response Protocols · S-04.1 · Deployer · Chatbot · Minors
75A O.S. § 702(A)
Plain Language
Deployers must implement and maintain reasonably effective systems that detect when a user indicates intent to harm themselves or others, and must promptly respond to, report, and mitigate such situations. The statute explicitly requires that user safety and well-being be prioritized over the deployer's other interests (including commercial interests). This is a continuing operational obligation — the systems must be maintained, not merely installed. The obligation covers detection, response, reporting, and mitigation as four distinct functions.
Statutory Text
A. Deployers shall implement and maintain reasonably effective systems to detect, promptly respond to, report, and mitigate emergency situations in a manner that prioritizes the safety and well-being of users over the deployer's other interests.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot · Minors
75A O.S. § 702(B)(1)-(3)
Plain Language
Deployers must limit their data collection and storage to information that does not conflict with the trusting party's (i.e., the user's) best interests. Collected information must satisfy all three tests: it must be adequate (sufficient for a legitimate purpose), relevant (linked to that purpose), and necessary (the minimum amount needed). This is a data minimization obligation with a fiduciary-like framing — the reference to 'trusting party's best interests' implies a duty of loyalty in data handling. The term 'trusting party' is not defined in the statute.
Statutory Text
B. Deployers shall collect and store only that information that does not conflict with a trusting party's best interests. Such information must be: 1. Adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the deployer; 2. Relevant, in the sense that the information has a relevant link to that legitimate purpose; and 3. Necessary, in the sense that it is the minimum amount of information which is needed for that legitimate purpose.
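Like the therapeutic exemption, the minimization standard is cumulative: a data field may be collected only if it passes all three tests. A sketch of how a deployer might screen proposed fields, where the field list and flag values are hypothetical examples, not statutory text:

```python
def passes_minimization(field: dict) -> bool:
    """Cumulative test of 75A O.S. § 702(B): a field is collectible
    only if it is adequate, relevant, AND necessary for a legitimate
    purpose. Any single failure excludes it."""
    return field["adequate"] and field["relevant"] and field["necessary"]

# Hypothetical screening inputs for illustration only.
proposed_fields = [
    {"name": "session_transcript",   "adequate": True, "relevant": True,  "necessary": True},
    {"name": "precise_geolocation",  "adequate": True, "relevant": False, "necessary": False},
]

collectible = [f["name"] for f in proposed_fields if passes_minimization(f)]
```

In this hypothetical, only `session_transcript` survives the screen; `precise_geolocation` fails the relevance and necessity tests and must not be collected or stored.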
Other · Chatbot · Minors
75A O.S. § 703(A)-(B)
Plain Language
This provision establishes two enforcement tracks: (1) the Attorney General may bring civil actions seeking injunction, disgorgement, and civil penalties of up to $2,500 per violation or $7,500 per intentional violation; and (2) any minor (or parent/guardian on their behalf) may bring a private civil action, including class actions, for statutory damages of $100–$750 per user per incident or actual damages (whichever is greater), plus injunctive or declaratory relief. This creates no new compliance obligation — it defines the consequences of violating the substantive obligations in Sections 701 and 702.
Statutory Text
A. Any business or person that violates this act shall be subject to an injunction and disgorgement of any unjust gains due to violation of this act, and shall be liable for a civil penalty of not more than Two Thousand Five Hundred Dollars ($2,500.00) for each violation or Seven Thousand Five Hundred Dollars ($7,500.00) for each intentional violation, which shall be assessed and recovered in a civil action brought by the Attorney General. B. Any minor who uses a chatbot that does not comply with the terms of this act, or a parent or guardian acting on their behalf, may institute a civil action on their own, or on a class-wide basis, to recover damages in an amount not less than One Hundred Dollars ($100.00) and not greater than Seven Hundred Fifty Dollars ($750.00) per user per incident or actual damages, whichever is greater; and to obtain injunctive or declaratory relief.
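The penalty arithmetic of the two tracks can be sketched as follows. The dollar figures come from the bill text; the scenario numbers (users, incidents, violation counts) are hypothetical, and the per-unit statutory amount within the $100–$750 band is left to the court's discretion:

```python
def private_action_recovery(users: int, incidents: int,
                            actual_damages: float,
                            per_unit: float = 100.0) -> float:
    """§ 703(B): statutory damages of $100-$750 per user per incident,
    or actual damages, whichever is greater. The per-unit figure is
    clamped to the statutory band."""
    per_unit = min(max(per_unit, 100.0), 750.0)
    statutory = per_unit * users * incidents
    return max(statutory, actual_damages)

def ag_penalty_ceiling(violations: int, intentional_violations: int) -> float:
    """§ 703(A): civil penalties of up to $2,500 per violation and
    up to $7,500 per intentional violation (ceiling, not a floor)."""
    return 2500.0 * violations + 7500.0 * intentional_violations

# Hypothetical scenario: 1,000 minor users, one incident each, no
# provable actual damages -> $100,000 at the statutory minimum.
exposure = private_action_recovery(users=1000, incidents=1, actual_damages=0.0)

# Hypothetical AG action: 10 violations, 2 of them intentional.
ceiling = ag_penalty_ceiling(violations=10, intentional_violations=2)
```

Because statutory damages require no proof of actual monetary harm, even the $100 floor scales quickly on a class-wide basis, which is the main exposure driver for deployers with large minor user bases.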