HB-3544
OK · State · USA
● Pending
Proposed Effective Date
2026-11-01
Oklahoma HB 3544 — Social AI Companion Protection Act
Summary

Prohibits deployers of social AI companions from knowingly making such systems available to minors (under 18) in Oklahoma and requires deployers to implement reasonable measures to prevent minor access. Deployers must also adopt crisis response protocols for detecting and responding to user expressions of suicidal ideation or self-harm, including referral to crisis services such as suicide hotlines. Enforcement is exclusively through civil actions brought by the Attorney General, with civil penalties of up to $2,500 per violation ($7,500 per intentional violation), injunctive relief, and disgorgement of attributable profits. The bill contains broad carve-outs for general-purpose AI, customer service systems, virtual assistants, and search engines.

Enforcement & Penalties
Enforcement Authority
Enforcement is exclusively by the Attorney General through civil actions. No private right of action is created. The Attorney General may promulgate rules necessary to enforce the act.
Penalties
Civil penalty of up to $2,500 per violation or up to $7,500 per intentional violation, assessed and recovered in a civil action brought by the Attorney General. Each day a violation continues constitutes a separate violation. Deployers are also subject to injunctive relief and disgorgement of profits directly attributable to the violation.
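Because each day a continuing violation counts as a separate violation, potential exposure compounds daily. A minimal sketch of that arithmetic (the helper name is illustrative, not from the bill; disgorgement and injunctive relief are not modeled):

```python
def max_penalty_exposure(days: int, intentional: bool = False) -> int:
    """Estimate maximum civil penalty under the per-day rule.

    Each day a violation continues is a separate violation:
    up to $2,500 per violation, or up to $7,500 if intentional.
    """
    per_violation = 7_500 if intentional else 2_500
    return days * per_violation

# A violation continuing for 30 days: up to $75,000.
print(max_penalty_exposure(30))                     # 75000
# The same period, if intentional: up to $225,000.
print(max_penalty_exposure(30, intentional=True))   # 225000
```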
Who Is Covered
"Deployer" means any person, partnership, corporation, or governmental entity that operates, controls, or makes available a social AI companion to users in this state. A deployer does not include a mobile application store, search engine, Internet service provider, or provider of general-purpose artificial intelligence models solely because such entity provides access to, hosts, or transmits a system developed or controlled by another person.
What Is Covered
"Social AI companion" means an artificial intelligence system primarily designed or marketed to simulate sustained interpersonal companionship and emotional attachment or romantic interaction with a user as the system's core functionality. The term "Social AI companion" does not include:
a. a system used solely for customer service, business operations, productivity, analysis related to source information, internal purposes, research purposes, or technical assistance,
b. a stand-alone consumer electronic device that incorporates a speaker or voice or text command interface, acts as a virtual assistant, and does not sustain a relationship across multiple interactions and generate outputs intended to create emotional attachment with the user,
c. a search engine feature that provides information in response to user queries and is not designed or marketed to simulate companionship,
d. a system designed to provide outputs relating to a narrow and discrete functional topic and not primarily intended to simulate interpersonal companionship,
e. a system that is not primarily designed or marketed for companionship where the developer does not control the specific deployment context in which the system interacts with end users, or
f. a general-purpose artificial intelligence system that is not primarily designed or marketed to simulate interpersonal companionship, including systems used for education, counseling, research, productivity, or professional assistance.
Compliance Obligations (2 obligations)
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
Section 3(A)(1)-(2), (B) (75A Okl. St. § 11)
Plain Language
Deployers of social AI companions are categorically prohibited from knowingly — or where they reasonably should know — making their systems available to minors. Beyond the knowledge-based prohibition, deployers must also affirmatively implement reasonable measures designed to prevent minor access, creating a dual obligation: both a mens rea-based prohibition and a proactive technical/procedural obligation. The statute expressly preserves lawful adult access to these systems. The bill does not specify what constitutes 'reasonable measures,' leaving room for the Attorney General to define standards via rulemaking.
Statutory Text
A. Each deployer:
1. Shall not knowingly, or under circumstances where the deployer reasonably should know, make a social AI companion available to a minor; and
2. Shall implement reasonable measures designed to prevent minors from accessing a social AI companion.
B. Nothing in this section shall be construed to restrict lawful access to such systems by adults.
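The bill does not define 'reasonable measures,' so any implementation is a design choice pending Attorney General rulemaking. As a purely hypothetical sketch (the field names and gating logic below are assumptions, not statutory terms), a deployer's access check might combine an age attestation with independent verification:

```python
from dataclasses import dataclass

@dataclass
class User:
    """Illustrative user record; fields are assumptions, not statutory terms."""
    claimed_age: int
    age_verified: bool  # e.g., confirmed via a third-party age-assurance service

def may_access_companion(user: User) -> bool:
    """Deny access to users under 18, and require independent verification
    rather than relying on a bare self-attested age. Whether this satisfies
    'reasonable measures' would depend on future AG rules."""
    return user.age_verified and user.claimed_age >= 18

print(may_access_companion(User(claimed_age=17, age_verified=True)))   # False
print(may_access_companion(User(claimed_age=21, age_verified=True)))   # True
print(may_access_companion(User(claimed_age=21, age_verified=False)))  # False
```

Note the dual obligation: even a well-designed gate does not cure a knowing violation, and the absence of actual knowledge does not excuse failing to implement preventive measures.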
MN-02 AI Crisis Response Protocols · MN-02.1 · Deployer · Chatbot
Section 4 (75A Okl. St. § 12)
Plain Language
Deployers must adopt and maintain a crisis response protocol for their social AI companions. The protocol must address user prompts indicating suicidal ideation or self-harm threats, and must at minimum include making reasonable efforts to refer users to crisis service providers — such as suicide hotlines, crisis text lines, or equivalent services. The 'includes, but is not limited to' language signals that crisis referral is a floor, not a ceiling — additional measures may be expected. Note that unlike CA SB 243, this bill does not require the deployer to publish the protocol publicly or to report crisis referral metrics to any agency.
Statutory Text
A deployer shall adopt a protocol for a social AI companion to respond to user prompts indicating suicidal ideation or threats of self-harm that includes, but is not limited to, making reasonable efforts to provide a response to the user that refers them to crisis service providers such as a suicide hotline, crisis text line, or other appropriate crisis services.