HB-1188
LA · State · USA
● Pending
Proposed Effective Date
2026-08-01
Louisiana House Bill No. 1188 — Protecting Minors from Chatbot Harm Act (Chapter 35 of Title 51, R.S. 51:2161–2163)
Summary

Imposes safety and parental consent obligations on companion chatbot platforms regarding minor users in Louisiana. Platforms must prohibit minors from becoming account holders unless a parent or guardian consents, and must provide consenting parents with tools to monitor interactions, set time and access limits, disable third-party interactions, and receive self-harm notifications. For all minor accounts, platforms must disclose that the user is interacting with AI, provide hourly reminders that the chatbot is not human, and institute reasonable measures to prevent harmful content. Knowing or reckless violations are deemed unfair trade practices enforceable by the attorney general with civil penalties up to $50,000 per violation, and a private right of action permits recovery of up to $10,000 in damages per minor account holder.

Enforcement & Penalties
Enforcement Authority
Attorney general enforcement. The attorney general may bring an action against a companion chatbot platform for a deceptive or unfair trade practice if the attorney general has reason to believe the platform is in violation of the chapter. A private right of action is available on behalf of a minor account holder. A civil action must be brought within two years of the date the complainant knew or reasonably should have known of the alleged violation. For jurisdictional purposes, a companion chatbot platform that allows a minor account holder in Louisiana to create an account is considered to be doing business in the state and subject to the jurisdiction of Louisiana courts.
Penalties
The attorney general may impose a civil penalty of up to $50,000 per violation, plus reasonable attorney fees and court costs. Punitive damages may be assessed if the platform's failure to comply is part of a consistent pattern of knowing or reckless conduct. Private action: a platform that knowingly or recklessly violates the chapter is liable to a minor account holder for up to $10,000 in damages, plus court costs and reasonable attorney fees as ordered by the court. Violations are also deemed deceptive or unfair trade practices under the Unfair Trade Practices and Consumer Protection Law (R.S. 51:1401 et seq.), making additional remedies under that law available. The statute does not preclude any other available remedy at law or equity.
Who Is Covered
What Is Covered
"Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions. "Companion chatbot" does not include any of the following: (i) A bot used only for customer service or for a business's operational purposes, productivity, or analysis related to source information, internal research, or technical assistance. (ii) A bot that is a feature of a video game, is limited to replies related to that video game, and does not discuss topics related to mental health, self-harm, or material harmful to minors, or maintain a dialogue on other topics unrelated to the video game. (iii) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice-activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs likely to elicit emotional responses in the user.
"Companion chatbot platform" means a platform that allows a user to engage with a companion chatbot.
Compliance Obligations (6 obligations)
MN-01 Minor User AI Safety Protections · MN-01.2 · Deployer · ChatbotMinors
R.S. 51:2162(A)
Plain Language
Companion chatbot platforms must prohibit minors from creating accounts or maintaining existing accounts unless the minor's parent or guardian provides consent. This is a gating requirement — a minor cannot access the platform at all without parental consent. The platform bears responsibility for enforcing this prohibition, though the statute does not specify a particular age verification mechanism.
Statutory Text
A. A companion chatbot platform shall prohibit a minor from entering into a contract with the platform to become an account holder or from maintaining an existing account, unless the minor's parent or guardian provides consent for the minor to become an account holder or maintain an existing account.
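One way a platform might operationalize this gating requirement is a simple eligibility check at account creation and on existing accounts. This is a minimal sketch, not drawn from the statute: the `Applicant` fields and function name are illustrative, and the statute does not prescribe how minority or consent is determined or recorded.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    user_id: str
    is_minor: bool                  # however the platform determines minority
    parental_consent_on_file: bool  # recorded parent/guardian consent

def may_hold_account(applicant: Applicant) -> bool:
    """Gate per R.S. 51:2162(A): a minor may become or remain an
    account holder only with parent/guardian consent."""
    if applicant.is_minor:
        return applicant.parental_consent_on_file
    return True  # adults fall outside this gating requirement
```

The same check would run both at signup and against existing minor accounts, since the statute covers maintaining an account as well as creating one.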
MN-01 Minor User AI Safety Protections · MN-01.3 · Deployer · ChatbotMinors
R.S. 51:2162(A)(1)(a)-(e)
Plain Language
When a parent or guardian has consented to a minor's account, the platform must provide that parent or guardian with a suite of parental control tools: (a) access to full copies of all interactions between the minor and the chatbot; (b) daily time limits; (c) scheduling controls for days and times of access; (d) ability to disable interactions with third-party account holders on the platform; and (e) timely notifications when the minor expresses a desire or intent to self-harm or harm others. These are mandatory platform features — the platform must offer all five, though the parent chooses whether and how to use them. Notably, the self-harm notification in (e) is directed to the parent/guardian, not to crisis services.
Statutory Text
(1) If the minor's parent or guardian provides consent for the minor to become an account holder or maintain an existing account, the companion chatbot platform shall allow the consenting parent or guardian of the minor account holder to do all of the following: (a) Obtain copies of all interactions between the account holder and the companion chatbot. (b) Limit the amount of time that the account holder may interact with the companion chatbot each day. (c) Limit the days of the week and the times during the day when the account holder may interact with the companion chatbot. (d) Disable any of the interactions between the account holder and third-party account holders on the companion chatbot platform. (e) Receive timely notifications if the account holder expresses to the companion chatbot a desire or an intent to engage in self-harm or to harm others.
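The five mandatory parental tools can be thought of as a settings object the platform must expose, with the platform enforcing the time controls in (b) and (c) at interaction time. The field names and the `interaction_allowed` helper below are illustrative assumptions, not statutory terms:

```python
from dataclasses import dataclass, field
from typing import Dict, Optional, Tuple

@dataclass
class ParentalControls:
    """The five tools R.S. 51:2162(A)(1)(a)-(e) requires the platform
    to offer a consenting parent/guardian. Field names are illustrative."""
    transcript_access: bool = True             # (a) copies of all interactions
    daily_minutes_limit: Optional[int] = None  # (b) per-day time cap
    allowed_hours: Dict[str, Tuple[int, int]] = field(default_factory=dict)
                                               # (c) weekday -> (start hr, end hr)
    third_party_chat_disabled: bool = False    # (d) disable third-party interactions
    harm_alerts_enabled: bool = True           # (e) self-harm/harm-to-others alerts

def interaction_allowed(c: ParentalControls, weekday: str,
                        hour: int, minutes_used_today: int) -> bool:
    """Enforce the time controls in (b) and (c)."""
    if c.daily_minutes_limit is not None and minutes_used_today >= c.daily_minutes_limit:
        return False
    if c.allowed_hours:  # if a schedule is set, only listed windows are allowed
        window = c.allowed_hours.get(weekday)
        if window is None:
            return False
        start, end = window
        if not (start <= hour < end):
            return False
    return True
```

Note the statute makes the platform offer all five tools but leaves their use to the parent, which is why every field has a permissive default here.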
MN-01 Minor User AI Safety Protections · MN-01.9 · Deployer · ChatbotMinors
R.S. 51:2162(A)(2)(a)-(d)
Plain Language
Platforms must implement four account termination procedures for minor accounts: (a) If the platform already treats or categorizes an account as belonging to a minor for content/ad targeting purposes but the parent has not consented, the platform must terminate the account — but must give the account holder 90 days to dispute the termination before it takes effect. (b) A minor account holder may request their own account termination, which must be completed within 5 business days. (c) A consenting parent or guardian may request termination of the minor's account, effective within 10 business days. (d) Upon any termination, the platform must permanently delete all personal information associated with the terminated account unless retention is required by other law. Note the asymmetric timelines: minor self-requests get 5 days, parent requests get 10 days, and platform-initiated terminations for lack of consent get a 90-day dispute period.
Statutory Text
(2) A companion chatbot platform shall do all of the following: (a) Terminate an account belonging to an account holder who is a minor if the companion chatbot platform treats or categorizes that account as belonging to a minor for purposes of targeting content or advertising and if the minor's parent or guardian has not provided consent for that minor to become an account holder or to maintain an existing account. The companion chatbot platform shall provide ninety days for the account holder to dispute the termination. Termination shall be effective upon the expiration of the ninety-day period if the account holder fails to effectively dispute the termination. (b) Allow an account holder who is a minor to request termination of the account. Termination shall be effective within five business days of the request. (c) Allow the consenting parent or guardian of an account holder who is a minor to request that the minor's account be terminated. Termination shall be effective within ten business days following the request. (d) Permanently delete all personal information held by the companion chatbot platform relating to the terminated account, unless state or federal law requires the platform to maintain the information.
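The asymmetric timelines can be captured in a small deadline calculator. This sketch assumes "ninety days" in (a) means calendar days (the statute does not say "business days" there, unlike (b) and (c)); the enum and function names are illustrative:

```python
from datetime import date, timedelta
from enum import Enum

class TerminationPath(Enum):
    PLATFORM_NO_CONSENT = "platform"  # (a) 90-day dispute window
    MINOR_REQUEST = "minor"           # (b) within 5 business days
    PARENT_REQUEST = "parent"         # (c) within 10 business days

def add_business_days(start: date, n: int) -> date:
    """Advance n weekdays (Mon-Fri); holidays are ignored in this sketch."""
    d = start
    while n > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:
            n -= 1
    return d

def termination_deadline(path: TerminationPath, requested: date) -> date:
    """Latest effective-termination date under R.S. 51:2162(A)(2)(a)-(c).
    Calendar days are assumed for the 90-day dispute period."""
    if path is TerminationPath.PLATFORM_NO_CONSENT:
        return requested + timedelta(days=90)
    if path is TerminationPath.MINOR_REQUEST:
        return add_business_days(requested, 5)
    return add_business_days(requested, 10)
```

Permanent deletion of personal information under (d) would then be triggered at whichever deadline applies, subject to any state or federal retention requirement.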
T-01 AI Identity Disclosure · T-01.1, T-01.2 · Deployer · ChatbotMinors
R.S. 51:2162(B)(1)-(2)
Plain Language
For all minor account holders, the platform must (1) unconditionally disclose that the user is interacting with AI — there is no 'reasonable person' trigger here; and (2) provide a clear, conspicuous notification by default at the start of each interaction and at least every hour during ongoing sessions reminding the minor to take a break and that the chatbot is AI-generated and not human. The every-hour cadence is more frequent than CA SB 243's every-three-hours requirement. These obligations apply to all minor accounts regardless of whether a reasonable person would be misled.
Statutory Text
B. In connection with all accounts held by account holders who are minors, a companion chatbot platform shall do all of the following: (1) Disclose to the account holder that he is interacting with artificial intelligence. (2) Provide by default a clear and conspicuous notification to the account holder, at the beginning of companion chatbot interactions and at least once every hour during continuing interactions, reminding the minor to take a break and that the companion chatbot is artificially-generated and not human.
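The default-on hourly cadence reduces to a simple check the platform could run before each chatbot turn. The interval constant is set at the statutory minimum; a platform could remind more often. The function name and parameter are illustrative:

```python
from typing import Optional

REMINDER_INTERVAL_S = 3600  # statutory minimum: "at least once every hour"

def reminder_due(seconds_since_last_reminder: Optional[int]) -> bool:
    """True when the clear and conspicuous notification must be shown:
    at the start of the interaction (none shown yet) and at least
    hourly thereafter (R.S. 51:2162(B)(2))."""
    if seconds_since_last_reminder is None:  # beginning of the interaction
        return True
    return seconds_since_last_reminder >= REMINDER_INTERVAL_S
```

Because the statute says "by default," this check would need to be active without any opt-in by the minor or the parent.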
MN-01 Minor User AI Safety Protections · MN-01.6 · Deployer · ChatbotMinors
R.S. 51:2162(B)(3)
Plain Language
Platforms must implement reasonable measures to prevent their companion chatbot from (1) producing or sharing material harmful to minors, and (2) encouraging minors to engage in conduct described or depicted in such material. 'Material harmful to minors' is defined by cross-reference to R.S. 51:2121 (Louisiana's existing harmful-to-minors definition). The standard is 'reasonable measures' — not an absolute prohibition — giving platforms some implementation flexibility. This obligation applies to all minor accounts on the platform.
Statutory Text
(3) Institute reasonable measures to prevent its companion chatbot from producing or sharing material harmful to minors or encouraging the account holder to engage in any of the conduct described or depicted in materials harmful to minors.
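The "reasonable measures" standard leaves the implementation open; one common pattern is to gate each chatbot response through a harmful-to-minors classifier before delivery. The classifier here is a placeholder callable, and the fixed fallback string is an illustrative assumption — a real system would likely log the event and retry generation:

```python
from typing import Callable

def gate_response(draft: str,
                  is_harmful_to_minors: Callable[[str], bool],
                  fallback: str = "I can't discuss that topic.") -> str:
    """Block or replace chatbot output flagged as material harmful to
    minors (R.S. 51:2162(B)(3)) before it reaches a minor account."""
    if is_harmful_to_minors(draft):
        return fallback
    return draft
```

The cross-referenced definition in R.S. 51:2121 would drive what the classifier flags; the statute requires reasonable prevention, not a guarantee that nothing harmful ever gets through.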
Other · ChatbotMinors
R.S. 51:2163(A)-(E)
Plain Language
This section establishes the enforcement and remedies framework for the entire chapter. Knowing or reckless violations are deemed unfair trade practices under Louisiana's existing UDAP law. The attorney general may bring enforcement actions with civil penalties up to $50,000 per violation. A private right of action allows recovery of up to $10,000 per minor account holder, with a two-year limitations period and standing limited to claims on behalf of minor account holders. Punitive damages are available for consistent patterns of knowing or reckless conduct. For jurisdictional purposes, allowing a Louisiana minor to create an account is deemed doing business in the state. This provision creates no independent compliance obligation.
Statutory Text
A.(1) A knowing or reckless violation of this Chapter is deemed a deceptive or unfair trade practice or act pursuant to the Unfair Trade Practices and Consumer Protection Law, R.S. 51:1401 et seq. (2) If the attorney general has reason to believe that a companion chatbot platform is in violation of this Chapter, the attorney general may bring an action against that platform for a deceptive or unfair trade practice or act. (3) In addition to other remedies provided for in this Section, the attorney general may impose a civil penalty of up to fifty thousand dollars per violation as well as reasonable attorney fees and court costs. (4) If the companion chatbot platform's failure to comply with this Chapter is part of a consistent pattern of knowing or reckless conduct, punitive damages may be assessed against the companion chatbot platform. B.(1) A companion chatbot platform that knowingly or recklessly violates this Chapter shall be liable to a minor account holder for up to ten thousand dollars in damages, as well as court costs and reasonable attorney fees, as ordered by the court. (2) A civil action for a claim pursuant to this Subsection may be brought within two years of the date the complainant knew, or reasonably should have known, of the alleged violation. (3) An action brought pursuant to this Subsection may be brought only on behalf of a minor account holder. C. For purposes of bringing an action in accordance with this Chapter, a companion chatbot platform that allows a minor account holder in this state to create an account on the platform is considered to be both engaged in substantial and not isolated activities within this state and operating, conducting, engaging in, or carrying on a business and doing business in this state, and is therefore subject to the jurisdiction of the courts of this state. D. If a companion chatbot platform allows a minor account holder to use that companion chatbot platform, the parties have entered into a contract. E. This Section does not preclude any other available remedy at law or equity.