H-5476
SC · State · USA
● Pending
South Carolina H. 5476 — Protecting Children from Chatbots Act (Adding Chapter 81 to Title 39)
Summary

Imposes safety and access-control obligations on operators of chatbots with 500,000+ monthly active users worldwide, with a primary focus on protecting minors. Requires covered entities to offer a limited-access mode (no account, no restricted features) as the default for unverified users, implement reasonable age verification before enabling restricted features (personalization, proactive outreach, relationship simulation, explicit content), and obtain verifiable parental consent before granting minors access to restricted features. Prohibits features designed to prioritize engagement at the expense of user wellbeing and requires systems to detect emotional dependence, prevent false claims of humanity, and respond to crisis situations with crisis referrals and emergency service notification. Requires incident reporting to the Attorney General within 15 days of knowledge of a covered incident (death, suicide attempt, self-harm requiring medical attention, psychiatric emergency, or serious physical injury). Enforced by the Attorney General (up to $50,000 per violation per day) and through a private right of action for harmed persons with attorney fees, injunctive relief, and punitive damages available.

Enforcement & Penalties
Enforcement Authority
Attorney General may initiate an action in the name of the State seeking an injunction and civil penalties. Private right of action available to any person harmed by a violation, or to a parent or legal guardian of a minor harmed by a violation. No cure period. Rights and remedies may not be waived by contract, and mandatory arbitration clauses purporting to cover claims under this act are void and unenforceable.
Penalties
AG enforcement: civil penalties up to $50,000 per violation; each day of noncompliance is a separate violation; injunctive relief. Private action: monetary damages for harm caused by the violation; reasonable attorney fees and costs; injunctive or declaratory relief; punitive damages if the violation was willful and wanton, reckless, or grossly negligent. Private plaintiffs must prove actual harm — the statute provides for 'monetary damages for the harm caused by the violation' rather than statutory minimum damages.
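To illustrate how the per-violation, per-day structure compounds, here is a minimal arithmetic sketch. Only the $50,000 ceiling comes from the bill; the scenario, function name, and figures are hypothetical, and actual penalties are set by the court.

```python
# Illustrative only: maximum AG-enforcement exposure under the act's
# per-violation, per-day structure. The statute sets a $50,000 ceiling
# per violation; courts may impose far less.
MAX_PENALTY_PER_VIOLATION = 50_000

def max_exposure(distinct_violations: int, days_noncompliant: int) -> int:
    # Each day of noncompliance counts as a separate violation.
    return distinct_violations * days_noncompliant * MAX_PENALTY_PER_VIOLATION

# Hypothetical scenario: three distinct violations, unremediated for 30 days.
print(f"${max_exposure(3, 30):,}")  # $4,500,000
```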
Who Is Covered
"Covered entity" means an operator of a chatbot that has five hundred thousand or more monthly active users worldwide. A covered entity does not include an operator of a chatbot that is: (i) not offered to the general public, such as internal workplace tools, clinician-supervised clinical tools, or university research systems; or (ii) used by a business entity solely for customer service or to strictly provide users with information about available commercial services or products provided by that entity, customer service account information, or other information strictly related to its customer service.
What Is Covered
"Chatbot" means any artificial intelligence, algorithmic, or automated system that: (a) produces new expressive content or responses not fully predetermined by the operator of the service or application; (b) accepts open-ended, natural-language, or multimodal user input and produces adaptive or context-responsive natural language output; and (c) maintains a conversational state across exchanges and is designed to facilitate multi-turn dialogue rather than to respond to discrete information requests.
Compliance Obligations (12 obligations)
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · ChatbotMinors
S.C. Code § 39-81-20(A)(1), (B), (C), (F), (G), (H)
Plain Language
Covered entities must offer a limited-access mode as the default for all users who have not completed age verification. Before enabling any restricted feature — personalization, proactive outreach, extended sessions, relationship simulation, or explicit content — the entity must require account creation, verify the user's age through a reasonable process, and classify the user as a minor or adult. Age verification data must be minimized, used only for verification, not shared or combined with other data, and deleted within 24 hours (except a record that the user is a minor). Users must have a process to appeal age-verification decisions. Covered entities must also proactively monitor for misclassified accounts (e.g., minors using adult accounts) and re-verify them. Existing accounts must have restricted features disabled within 60 days of the act's effective date unless the user completes age verification. A safe harbor protects entities from liability when a minor incidentally uses a correctly verified adult account, provided the entity maintains its monitoring obligations.
Statutory Text
(A)(1) A covered entity shall make a limited-access mode available and shall ensure that any unverified user may only access and interact with a chatbot in limited-access mode. (B) Before enabling any restricted feature for a user, a covered entity shall: (1) require the user to create a user account; (2) verify the user's age using a reasonable age verification process, subject to item (3); and (3) using the age data, classify the user as a minor or an adult. (C) When conducting reasonable age verification process under this section, an operator shall: (1) collect only the age verification data that is strictly necessary to reasonably verify age; (2) use age verification data only for age verification; (3) not sell, rent, share, or otherwise disclose age verification data to any third party, except to a service provider performing age verification under a contract prohibiting further disclosure; (4) not combine age verification data with any other personal data about the user; (5) delete age verification data within twenty-four hours of completing the age verification process, except that the operator may retain a record that the user has been verified as a minor; and (6) provide a simple process for a user to appeal or correct an age-verification decision. (F) A covered entity shall implement reasonable systems and processes to identify user accounts that may be inaccurately classified by age, such as patterns of use suggesting a minor is using an adult account or credible reports that an account was created using false age data, and shall re-verify any such account before enabling any restricted feature. (G) A covered entity shall not be liable under this chapter solely because a minor incidentally uses a user account that has been correctly verified and classified as an adult account, provided the covered entity is otherwise in compliance with subsection (F). (H) With respect to each user account of a covered entity that exists as of the effective date of this act, a covered entity shall, within sixty days, disable access to restricted features for any account that has not been classified as an authorized minor account or a verified adult account, unless and until the user completes age verification.
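A minimal sketch of how an operator might wire up the gating and age-data lifecycle above, assuming an internal account classification of unverified, verified adult, or authorized minor. All names are hypothetical; the bill mandates outcomes (a "reasonable age verification process", 24-hour deletion), not an implementation.

```python
# Hypothetical sketch of the § 39-81-20 gating model; nothing below is
# prescribed by the bill beyond the behavior it describes.
from dataclasses import dataclass
from datetime import datetime, timedelta

RESTRICTED_FEATURES = {
    "personalization", "proactive_outreach", "extended_sessions",
    "relationship_simulation", "explicit_content",
}

@dataclass
class Account:
    classification: str  # "unverified" | "verified_adult" | "authorized_minor"

def may_use_feature(account: Account | None, feature: str) -> bool:
    if feature not in RESTRICTED_FEATURES:
        return True  # limited-access mode is available to everyone
    if account is None or account.classification == "unverified":
        return False  # default: unverified users stay in limited-access mode
    if account.classification == "authorized_minor":
        # Parental consent unlocks restricted features for minors, except
        # explicit content, which stays blocked (§ 39-81-30(C)(3)).
        return feature != "explicit_content"
    return account.classification == "verified_adult"

def age_data_deletion_deadline(verified_at: datetime) -> datetime:
    # § 39-81-20(C)(5): delete age-verification data within 24 hours of
    # completing verification; at most a "user is a minor" flag survives.
    return verified_at + timedelta(hours=24)
```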
MN-01 Minor User AI Safety Protections · MN-01.2 · MN-01.3 · Deployer · ChatbotMinors
S.C. Code § 39-81-30(A)-(D)
Plain Language
Minors may always use a chatbot in limited-access mode without parental consent. If a minor wants restricted features (personalization, proactive outreach, relationship simulation, etc.), the covered entity must obtain verifiable parental consent — freely given, specific, informed, and unambiguous — from a parent who has themselves passed age verification. Even with parental consent, explicit content must remain blocked for minors. The entity must implement parental control functions (time limits, content restrictions, notifications, data deletion) and offer parents the option to establish a linked parental account and access chat logs. For users classified as under sixteen, establishing a linked parental account or providing contact information is mandatory rather than optional. This creates a tiered consent model: no consent needed for limited-access, parental consent for restricted features, and enhanced parental linkage for under-16 users.
Statutory Text
(A) Nothing in this act shall be construed to require parental consent for a minor to access or interact with a chatbot in limited-access mode. (B) If the age verification process described in Section 39-81-20 classifies a user as a minor and the user seeks to access any restricted feature, then a covered entity shall offer the user the option of continuing to use the chatbot in limited-access mode or to obtain parental consent to access the restricted features. (C) If the user chooses to get parental consent, then the covered entity shall: (1) obtain verifiable parental consent; (2) remove limited-access mode and enable access to restricted features; (3) ensure that the chatbot continues to restrict access to any explicit content; (4) implement reasonable parental control functions, which may restrict the minor's access to features enabled under item (2); (5) offer the parent the option to provide contact information or establish a linked parental account in order to receive notifications; and (6) offer the parent the option to receive access to chat logs of any interactions between the minor and the chatbot conducted through the authorized minor account. (D) If the age verification process classifies the user as under sixteen, then a covered entity also shall require the consenting parent to provide contact information or establish a linked parental account.
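The tiered model lends itself to a simple gap check before restricted features are enabled for a minor account. A sketch under invented names; the statute defines the tiers, not this structure.

```python
# Hypothetical sketch of the § 39-81-30 consent tiers for minor accounts.
from dataclasses import dataclass

@dataclass
class MinorAccount:
    age: int
    parental_consent: bool        # verifiable parental consent obtained
    parent_contact_on_file: bool  # linked parental account or contact info

def consent_gaps(acct: MinorAccount) -> list[str]:
    """What is still missing before restricted features may be enabled.
    Limited-access mode never requires consent (§ 39-81-30(A))."""
    gaps = []
    if not acct.parental_consent:
        gaps.append("verifiable parental consent (§ 39-81-30(C)(1))")
    if acct.age < 16 and not acct.parent_contact_on_file:
        # For under-16s the parental linkage is mandatory, not optional.
        gaps.append("parent contact or linked parental account (§ 39-81-30(D))")
    return gaps
```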
MN-01 Minor User AI Safety Protections · MN-01.10 · Deployer · ChatbotMinors
S.C. Code § 39-81-30(E)
Plain Language
When a minor user triggers a crisis message — because they expressed suicidal thoughts, intent to self-harm, or signs of an acute mental health crisis — and the covered entity has a linked parental account or parent contact information on file, the entity must immediately notify the parent. This obligation is triggered by the crisis detection protocol required under § 39-81-40(B)(3) and applies only when parental contact information is available through the parental consent process.
Statutory Text
(E) If the covered entity has a way to reach the parent through a parental account or contact information provided under subsection (C) or (D), then the covered entity shall notify the parent immediately in the case of any incident provoking a crisis message, pursuant to Section 39-81-40(B)(3).
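In operational terms the duty is conditional: it attaches only when the crisis protocol fires and a parent is reachable. A minimal sketch; the notification channel and message text are invented.

```python
# Hypothetical sketch of § 39-81-30(E): immediate parental notification.
from typing import Callable, Optional

def on_crisis_message(parent_contact: Optional[str],
                      send: Callable[[str, str], None]) -> bool:
    """Invoked whenever the § 39-81-40(B)(3) crisis protocol triggers for a
    minor account. `send` stands in for whatever channel the operator uses
    (linked parental account, SMS, email)."""
    if parent_contact is None:
        return False  # no contact info under (C) or (D); duty does not attach
    send(parent_contact, "A safety system flagged a crisis during your "
                         "child's chatbot session. Please check on them.")
    return True
```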
CP-01 Deceptive & Manipulative AI Conduct · CP-01.2 · Deployer · ChatbotMinors
S.C. Code § 39-81-40(A)(1)-(2)
Plain Language
Covered entities are prohibited from implementing chatbot features designed to prioritize engagement, revenue, or retention metrics — such as session length, frequency of use, or emotional engagement — at the expense of user wellbeing. This is a broad anti-dark-pattern prohibition that applies to all users, not just minors. Separately, features designed to encourage or facilitate minors or unverified users hiding their chatbot use from parents or guardians are prohibited. The statute also articulates a 'duty of loyalty' concept, reinforcing that entities may not place their own interests in material conflict with users' interests.
Statutory Text
(A) A covered entity shall not implement features designed to: (1) prioritize engagement, revenue, or retention metrics, such as session length, frequency of use, or emotional engagement, at the expense of user wellbeing; or (2) encourage or facilitate a minor user or unverified user concealing the user's use of the chatbot from a parent or guardian.
MN-01 Minor User AI Safety Protections · MN-01.5 · Deployer · ChatbotMinors
S.C. Code § 39-81-40(B)(1)
Plain Language
Covered entities must implement reasonable systems to detect when any user — not just minors — is developing emotional dependence on the chatbot, defined as relying on the chatbot as a primary source of emotional support or social connection. Upon detection, the entity must take reasonable steps to reduce the dependence and associated risks of harm. This is a continuous monitoring and intervention obligation applicable to all users.
Statutory Text
(B) A covered entity shall implement reasonable systems and processes to: (1) identify when a user is developing emotional dependence on the chatbot and take reasonable steps to reduce that dependence and associated risks of harm;
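The bill requires "reasonable systems" without defining signals or thresholds. As one possible shape such a system could take, a heuristic sketch; every signal, field name, and cutoff below is invented for illustration and appears nowhere in the bill.

```python
# Hypothetical dependence heuristic for § 39-81-40(B)(1).
from dataclasses import dataclass

@dataclass
class UsageWindow:
    days: int                     # length of the observation window
    active_days: int              # days with at least one session
    emotional_support_turns: int  # turns classified as seeking emotional support
    total_turns: int

def dependence_signal(w: UsageWindow) -> bool:
    if w.days == 0 or w.total_turns == 0:
        return False
    daily_reliance = w.active_days / w.days
    support_share = w.emotional_support_turns / w.total_turns
    # Flag near-daily use where most turns seek emotional support. A positive
    # signal would then trigger "reasonable steps to reduce that dependence",
    # e.g. break reminders or referrals to human support.
    return daily_reliance > 0.9 and support_share > 0.5
```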
T-01 AI Identity Disclosure · T-01.1 · Deployer · ChatbotMinors
S.C. Code § 39-81-40(B)(2)
Plain Language
Covered entities must implement reasonable systems to ensure their chatbot does not make a materially false representation that it is a human being. Unlike some jurisdictions that require affirmative disclosure at the start of every interaction, this provision is narrower: it prohibits the chatbot from affirmatively and materially misrepresenting itself as human, but does not mandate unprompted AI identity disclosure. The obligation applies to all users.
Statutory Text
(B) A covered entity shall implement reasonable systems and processes to: (2) ensure that a chatbot does not make a materially false representation that it is a human being;
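Because the duty is framed as preventing materially false claims rather than mandating disclosure, one natural enforcement point is an output-side check. A toy sketch only; a production system would rely on model training plus classifiers, and the pattern below is invented.

```python
# Toy output guard for § 39-81-40(B)(2). Naive by design: it catches
# affirmative "I am a human/person" claims. Negations such as "I am not
# a human" happen not to match this pattern, but a benign reply like
# "I'm a person who listens" would still be flagged.
import re

HUMANITY_CLAIM = re.compile(
    r"\bI(?:'m| am)\s+(?:a\s+)?(?:real\s+)?(?:human|person)\b",
    re.IGNORECASE,
)

def violates_identity_rule(candidate_reply: str) -> bool:
    return bool(HUMANITY_CLAIM.search(candidate_reply))
```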
S-04 AI Crisis Response Protocols · S-04.1 · Deployer · ChatbotMinors
S.C. Code § 39-81-40(B)(3)
Plain Language
Covered entities must implement reasonable systems to detect when any user expresses suicidal thoughts, intent to self-harm, or signs of an acute mental health crisis. Upon detection, the entity must promptly provide a clear and prominent crisis message including crisis services information. This is a continuous operating requirement applicable to all users, not just minors. The crisis detection also triggers the parental notification obligation under § 39-81-30(E) for minor users whose parents are reachable.
Statutory Text
(B) A covered entity shall implement reasonable systems and processes to: (3) identify when a user is expressing suicidal thoughts, intent to self-harm, or showing signs of an acute mental health crisis and shall promptly provide a clear and prominent crisis message, including crisis services information to any such user.
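A minimal sketch of the detect-and-message flow. Real detection would be classifier-based; the phrase list and message text are illustrative, with 988 (the U.S. Suicide & Crisis Lifeline) as an example of crisis services information.

```python
# Hypothetical sketch of § 39-81-40(B)(3): detect crisis expressions and
# return the clear-and-prominent crisis message. Keyword matching is a
# stand-in for whatever "reasonable systems" an operator actually deploys.
CRISIS_PHRASES = ("want to die", "kill myself", "hurt myself", "end it all")

CRISIS_MESSAGE = (
    "It sounds like you may be going through a crisis. You are not alone. "
    "In the U.S., call or text 988 (Suicide & Crisis Lifeline) any time, "
    "or call 911 in an emergency."
)

def check_for_crisis(user_message: str) -> str | None:
    """Return the crisis message if the user appears to be in crisis. For a
    minor whose parent is reachable, a positive result here also triggers
    the § 39-81-30(E) parental notification."""
    text = user_message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return CRISIS_MESSAGE
    return None
```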
S-02 Prohibited Conduct & Output Restrictions · S-02.6 · Deployer · ChatbotMinors
S.C. Code § 39-81-20(E), § 39-81-30(C)(3), § 39-81-10(11), (16)(e)
Plain Language
Minors are categorically prohibited from accessing explicit content through a chatbot, even with parental consent. Explicit content is broadly defined to include: obscene sexual material as applied to minors (using a minor-specific obscenity standard), content providing specific instructions for or glorifying suicide, self-injury, or disordered eating, and gratuitous extreme violence. Because explicit content is classified as a restricted feature, unverified users also cannot access it. For authorized minor accounts (with parental consent), restricted features may be unlocked but explicit content must remain blocked. This creates a hard floor: no minor user may access explicit content under any circumstances.
Statutory Text
Section 39-81-20(E): If the age verification process classifies the user as a minor, then a covered entity shall not enable any restricted feature unless the user is using an authorized minor account subject to Section 39-81-30. Section 39-81-30(C)(3): [If the user chooses to get parental consent, then the covered entity shall:] (3) ensure that the chatbot continues to restrict access to any explicit content; Section 39-81-10(11): "Explicit content" means: (a) any description or representation of nudity, sexual conduct, sexual excitement, or sadomasochistic abuse when the content predominantly appeals to the prurient, shameful, or morbid interest of minors; is patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable material for minors; and is, when taken as a whole, lacking in serious literary, artistic, political, or scientific value for minors; (b) content that provides specific instructions for, or that glorifies or promotes suicide, self-injury, or disordered eating behaviors; or (c) graphic depictions of extreme violence that lack serious literary, artistic, political, or scientific value for minors. Section 39-81-10(16)(e): ["Restricted feature" means:] (e) access to explicit content.
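The hard floor is easy to express in code: no combination of consent flags unlocks explicit content for anyone but a verified adult. A sketch with invented names.

```python
# Hypothetical sketch of the explicit-content hard floor
# (§ 39-81-20(E), § 39-81-30(C)(3)).
def explicit_content_allowed(classification: str, parental_consent: bool) -> bool:
    # `parental_consent` is deliberately ignored: consent can unlock other
    # restricted features for a minor, but never explicit content.
    return classification == "verified_adult"
```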
R-01 Incident Reporting · R-01.1 · Deployer · ChatbotMinors
S.C. Code § 39-81-50(A)(1)-(3)
Plain Language
When a covered entity learns that a user faces imminent risk of death or serious physical injury, it must make reasonable efforts within 24 hours to notify emergency services or law enforcement, using information it already has or can obtain through reasonable user-facing prompts. If the operator lacks sufficient information to enable an emergency response, it must instead promptly display a clear and prominent message urging the user to contact emergency services, provide crisis services information, make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services, and document the steps taken and why direct notification was not practicable. Good-faith notifications are protected from liability absent willful misconduct or gross negligence. This is an emergency escalation obligation distinct from the crisis messaging requirement in § 39-81-40(B)(3) — it requires affirmative outreach to external emergency services, not just providing crisis information to the user.
Statutory Text
(A)(1) If a covered entity obtains knowledge that a user faces an imminent risk of death or serious physical injury, then the operator must make reasonable efforts, within twenty-four hours, to notify appropriate emergency services or law enforcement, to the extent practicable based on information the operator already possesses or can obtain through reasonable, user-facing prompts for the purpose of facilitating emergency assistance. (2) If the operator cannot make a notification under item (1) because the operator lacks sufficient information to enable an emergency response, then the operator shall: (a) promptly provide a clear and prominent message urging the user to contact emergency services and provide crisis services information, (b) make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services, and (c) document the steps taken and the basis for the operator's determination that notification was not practicable. (3) An operator that makes a notification in good faith under this subsection is not liable for damages solely for making the notification, unless the operator acted with willful misconduct or gross negligence.
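The escalation path pairs a statutory deadline with a documented fallback. A sketch of that decision; the 24-hour window and the fallback steps come from the bill, while every name and the return values are invented.

```python
# Hypothetical sketch of the § 39-81-50(A) emergency escalation path.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class EscalationRecord:
    knowledge_at: datetime  # when the imminent risk became known
    steps_taken: list[str] = field(default_factory=list)

    @property
    def deadline(self) -> datetime:
        # Reasonable efforts to notify emergency services within 24 hours.
        return self.knowledge_at + timedelta(hours=24)

def escalate(record: EscalationRecord, can_reach_responders: bool) -> str:
    if can_reach_responders:
        record.steps_taken.append("notified emergency services / law enforcement")
        return "notified"  # good-faith notifications are liability-protected, (A)(3)
    # (A)(2) fallback when direct notification is not practicable:
    record.steps_taken += [
        "displayed prominent message urging user to contact emergency services",
        "provided crisis services information",
        "encouraged user to seek help from a trusted adult or emergency services",
        "documented why direct notification was not practicable",
    ]
    return "fallback"
```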
R-01 Incident Reporting · R-01.1 · Deployer · ChatbotMinors
S.C. Code § 39-81-50(B)(1)-(2), (C)(1)-(2)
Plain Language
Covered entities must report covered incidents to the Attorney General within 15 days of obtaining knowledge. A covered incident is one where a user suffered a covered harm — death, suicide attempt, self-harm requiring medical attention, psychiatric emergency requiring urgent treatment, or serious physical injury requiring medical attention — arising from chatbot interactions. The report must include the date of knowledge, incident date, description of the incident and its chatbot connection, and responsive actions taken. Supplemental reports may be filed within 60 days. All reports are confidential and FOIA-exempt, though the Attorney General may publish aggregate statistics that do not identify users or disclose trade secrets.
Statutory Text
(B)(1) A covered entity shall submit a report to the Attorney General within fifteen days of obtaining knowledge of a covered incident connected to one or more of its chatbots, which, to the extent known at the time of the report, shall include: (a) the date the operator obtained knowledge of the incident; (b) the date of the incident, if known; (c) a brief description of the incident and the basis for the operator's belief that the incident is connected to the chatbot; and (d) a description of any actions the operator took in response. (2) A covered entity may submit a supplemental report within sixty days after the initial report to update or correct information learned through investigation. (C)(1) Reports submitted under this section shall be confidential and are not subject to disclosure pursuant to Chapter 4, Title 30, the Freedom of Information Act. (2) The Attorney General may publish aggregate information and statistics derived from the reports, so long as the publication does not identify individual users or disclose trade secrets.
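In practice the report reduces to four statutory fields plus two clocks. A sketch of a payload; the field names are invented, while the required contents and the 15/60-day windows come from the bill.

```python
# Hypothetical § 39-81-50(B) report payload.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class CoveredIncidentReport:
    knowledge_date: date           # (B)(1)(a): when the operator learned of it
    incident_date: Optional[date]  # (B)(1)(b): if known
    description: str               # (B)(1)(c): incident + chatbot connection
    responsive_actions: str        # (B)(1)(d): actions taken in response

    def filing_deadline(self) -> date:
        # Initial report due within 15 days of obtaining knowledge.
        return self.knowledge_date + timedelta(days=15)

    @staticmethod
    def supplemental_deadline(initial_filing: date) -> date:
        # (B)(2): supplemental report allowed within 60 days of the initial.
        return initial_filing + timedelta(days=60)
```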
MN-01 Minor User AI Safety Protections · MN-01.11 · Deployer · ChatbotMinors
S.C. Code § 39-81-20(A)(1), (E), § 39-81-10(12), (16)
Plain Language
Minors without parental consent are effectively prohibited from accessing the full-featured chatbot product — they are restricted to limited-access mode, which strips out personalization, proactive outreach, extended sessions, relationship simulation, and explicit content. This functions as a categorical prohibition on minors accessing the companion-style features of chatbots absent parental consent. Unlike some jurisdictions that merely restrict specific content types, this bill restricts the product category itself for minors.
Statutory Text
Section 39-81-20(A)(1): A covered entity shall make a limited-access mode available and shall ensure that any unverified user may only access and interact with a chatbot in limited-access mode. Section 39-81-20(E): If the age verification process classifies the user as a minor, then a covered entity shall not enable any restricted feature unless the user is using an authorized minor account subject to Section 39-81-30.
Other · ChatbotMinors
S.C. Code § 39-81-60(C)(1)-(2)
Plain Language
All rights and remedies under this act are non-waivable. Any contractual term that purports to waive or limit statutory rights, shorten the claims period, prevent court enforcement, or require mandatory arbitration is void and unenforceable as against public policy. This means covered entities cannot use terms of service, user agreements, or arbitration clauses to shield themselves from claims under this chapter. This provision protects the enforcement mechanism but creates no new affirmative compliance obligation.
Statutory Text
(C)(1) The rights and remedies provided by this act may not be waived by contract. (2) Any term in a contract or agreement that purports to do any of the following is void and unenforceable as against public policy: (a) waive or limit a right or remedy under this act; (b) shorten the time to bring a claim under this act; (c) prevent a person from enforcing a claim under this act in court; or (d) require arbitration of a claim under this act.