H-5138
SC · State · USA
Status: Pre-filed
Proposed Effective Date
2026-01-01
South Carolina H. 5138 — Chatbot Protection Act (adding Chapter 80 to Title 39)
Summary

The Chatbot Protection Act imposes data governance, transparency, and safety obligations on chatbot providers — any person that creates, distributes, or makes a chatbot available to users. Core obligations include: prohibitions on processing personal data without affirmative consent, restrictions on using chat logs for advertising or profiling, a ban on selling chat logs, mandatory AI identity disclosure before and during every interaction (hourly and on-demand), monthly risk-of-harm evaluations with public disclosure, and a requirement to maintain a publicly available data security program. The bill classifies chatbots as products for product liability purposes and imposes strict liability on providers for user injuries. Enforcement is through the Attorney General, county attorneys, or private right of action, with civil penalties up to $5,000 per violation and punitive damages for reckless or knowing conduct.

Enforcement & Penalties
Enforcement Authority
The Attorney General or a county attorney may bring a civil action against a chatbot provider that violates the chapter. A user who is injured by a violation of Section 39-80-20 or 39-80-30 may bring a civil action against the chatbot provider. A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user, lowering the standing threshold for private plaintiffs. No cure period or safe harbor is specified.
Penalties
Civil penalty of not more than $5,000 per violation. Punitive damages available for reckless and knowing conduct. Injunctive relief and declaratory relief available. Reasonable attorney's fees and litigation costs recoverable by prevailing plaintiff. A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact, so proof of actual monetary harm is not required for standing. The Attorney General or county attorney may also obtain damages, civil penalties, restitution, or other remedies, plus reasonable attorney's fees and litigation costs.
Who Is Covered
"Chatbot provider" means any person that creates, distributes, or otherwise makes a chatbot available to a user.
What Is Covered
"Chatbot" means an algorithmic or automated system that generates information through text, audio, image, or video in a manner that simulates interpersonal interactions or conversations including artificial intelligence.
Compliance Obligations · 15 obligations
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not process personal data to inform chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill an express user request, and (2) the user has provided affirmative consent. Affirmative consent has strict requirements — it cannot be obtained through terms of service, dark patterns, or inaction. This effectively creates a purpose-limitation and consent-gating obligation for all personal data processing in chatbot outputs.
Statutory Text
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising purpose — whether to decide whether to show an ad, select what to advertise, or customize ad content. This is an absolute prohibition with no consent override, unlike the personal data processing restriction in subsection (A)(1). The advertising prohibition extends to any use of chat log data to facilitate targeted or personalized advertising.
Statutory Text
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot · Minors
S.C. Code § 39-80-20(A)(3)(a)-(d)
Plain Language
Chatbot providers face layered restrictions on processing chat logs and personal data. For users the provider knows or reasonably should know are minors, all processing of chat logs and personal data requires the parent's or legal guardian's affirmative consent, with no exception. For adults, training use of chat logs and personal data requires affirmative consent. For all users, profiling beyond what is necessary to fulfill an express request is prohibited. Training is defined narrowly to exclude safety testing and compliance-related adjustments, meaning providers can still process data for safety purposes without consent.
Statutory Text
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
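For illustration only (nothing in the bill prescribes any implementation), the layered rules of subsection (A)(3) can be read as a small decision table. All flag names below are hypothetical, and the "knows or reasonably should have known" standard for minors is collapsed into a single boolean:

```python
def processing_allowed(is_minor: bool,
                       guardian_consent: bool,
                       user_consent: bool,
                       purpose: str,
                       fulfills_express_request: bool = False) -> bool:
    """Hypothetical sketch of S.C. Code § 39-80-20(A)(3)(a)-(d)."""
    if is_minor:
        # (a)-(b): any processing of a minor's chat log and personal data
        # requires the parent's or legal guardian's affirmative consent.
        return guardian_consent
    if purpose == "training":
        # (c): adult training use requires prior affirmative consent.
        return user_consent
    if purpose == "profiling":
        # (d): profiling is allowed only as necessary to fulfill an
        # express user request.
        return fulfills_express_request
    # Other purposes are governed elsewhere, e.g. subsection (A)(1).
    return True
```

The sketch highlights the asymmetry in the text: the minor rule gates all processing, while the adult consent requirement attaches only to training use.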
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not profile users — classifying or designating personality traits or behavioral characteristics — beyond what is strictly necessary to fulfill an express user request. This is a purpose-limitation restriction on profiling activity that applies to all users regardless of age. Safety-related processing is carved out from the definition of profiling and is therefore not restricted by this provision.
Statutory Text
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(5)-(6)
Plain Language
Chatbot providers are categorically prohibited from selling user chat logs — no consent override is available. Chat logs must also not be retained for more than ten years, unless retention is required for compliance with this chapter or other law. The definition of 'sell' has carve-outs for service-provider disclosures, user-directed disclosures with affirmative consent, and data the user made publicly available without restrictions.
Statutory Text
(A) A chatbot provider may not: (5) sell a user's chat logs; (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.3 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(7)
Plain Language
Chatbot providers may not discriminate or retaliate against users who refuse to consent to the use of their chat logs or personal data for training. Protected actions include denying services, charging different rates, or providing lower quality products. This prevents providers from using coercive pricing or service degradation as leverage to obtain training data consent.
Statutory Text
(A) A chatbot provider may not: (7) discriminate or retaliate against a user, including: (a) denying products or services to the user; (b) charging different prices or rates for products or services to the user; or (c) providing lower quality products or services to the user for refusing to consent to the use of chat logs or personal data for training purposes.
D-01 Automated Processing Rights & Data Controls · D-01.1 · D-01.2 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(B)
Plain Language
Users have a right to access their own chat logs at any time. Upon request, the chatbot provider must deliver the chat log in a downloadable, easily readable format. Providers may not discriminate or retaliate against users who exercise this access right. This is a data access right analogous to CCPA-style 'right to know' but specific to chatbot interaction records.
Statutory Text
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
Other · Chatbot
S.C. Code § 39-80-20(C)
Plain Language
Government entities may not compel chatbot providers to produce or grant access to user input data or chat logs unless they obtain a wiretap warrant. This creates a heightened Fourth Amendment-style protection for chatbot communications, treating them comparably to wiretapped communications rather than standard subpoena-accessible business records.
Statutory Text
(C) A governmental entity may not compel the production of or access to input data or chat logs from a chatbot provider, except as pursuant to a wiretap warrant.
G-01 AI Governance Program & Documentation · G-01.1 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(D)
Plain Language
Chatbot providers must develop, implement, and maintain a written, comprehensive data security program covering administrative, technical, and physical safeguards. Safeguards must be proportionate to the volume and nature of personal data and chat logs maintained. The written program must be made publicly available on the provider's website. This is both a governance obligation (formal program documentation) and a transparency obligation (public publication).
Statutory Text
(D) A chatbot provider shall develop, implement, and maintain a comprehensive data security program that contains administrative, technical, and physical safeguards that are proportionate to the volume and nature of personal data and chat logs that are maintained by the chatbot provider. The program must be written and made publicly available on the chatbot provider's website.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical measures to prevent reidentification of deidentified data throughout its lifecycle — during processing, retention, and transfer. The data must be maintained without any reasonable means of reidentification. This is a continuing safeguard obligation, not a one-time deidentification step.
Statutory Text
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(A)(1)
Plain Language
Chatbot providers may not use any language in chatbot advertising, interface design, or output that states or implies the chatbot's content is endorsed by or equivalent to services from a licensed professional — including healthcare professionals, lawyers, CPAs, investment advisors, or licensed fiduciaries. This covers the full surface area of the chatbot experience: marketing, UI, and generated content.
Statutory Text
(A) A chatbot provider may not: (1) use any term, letter, or phrase in the advertising, interface, or output data of a chatbot that states or implies that the advertising, interface, or output data of a chatbot is endorsed by or equivalent to any of the following: (a) any certified, registered, or licensed professional; (b) a licensed legal professional; (c) a certified public accountant; (d) an investment advisor or an investment advisor representative; or (e) a licensed fiduciary;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.5 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(A)(2)
Plain Language
Chatbot providers may not represent — in advertising, the chatbot interface, or chatbot outputs — that user input data or chat logs are confidential. This prevents providers from creating a false impression of confidentiality or privilege (such as attorney-client or therapist-patient confidentiality) in the chatbot interaction, given that chat logs are by their nature accessible to the provider and potentially subject to disclosure.
Statutory Text
(A) A chatbot provider may not: (2) include any representation in the advertising, interface, or output data of a chatbot that states or implies the user's input data or chat log is confidential.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · T-01.3 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(B)
Plain Language
Before a chatbot generates any output, the provider must give the user clear, conspicuous, and explicit notice that they are interacting with a chatbot, not a human. This notice must be repeated at the beginning of each communication, every hour during continuing interactions, and each time a user asks whether the chatbot is a natural person. The notice must be in the chatbot's communication language, in a font at least as large as the largest font used elsewhere in chatbot communications, and must comply with Attorney General regulations. This is an unconditional disclosure — it applies regardless of whether a reasonable person would be misled.
Statutory Text
(B) A chatbot provider shall provide clear, conspicuous, and explicit notice to a user that the user is interacting with a chatbot rather than a natural person before the chatbot may generate any output data. The chatbot provider shall include this notice at the beginning of each chatbot communication with a user, every hour thereafter, and each time a user asks whether the chatbot is a natural person. The text of the notice must: (1) be written in the same language that the chatbot communicates with the user and must appear in a font size that is easily readable by an average user and is not smaller than the largest font size used for other chatbot communications; and (2) must comply with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40.
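The notice cadence in subsection (B) has three independent triggers. Purely as an illustrative sketch (the class, field names, and the keyword match for "are you a natural person" questions are all hypothetical stand-ins; real detection would need something far more robust):

```python
from datetime import datetime, timedelta

HOUR = timedelta(hours=1)

class DisclosureTracker:
    """Hypothetical sketch of the § 39-80-30(B) cadence: notice before the
    first output / at the start of each communication, every hour during a
    continuing interaction, and whenever the user asks if the bot is human."""

    def __init__(self) -> None:
        self.last_notice: datetime | None = None

    def notice_due(self, user_message: str, now: datetime) -> bool:
        if self.last_notice is None:
            return True                        # before any output is generated
        if now - self.last_notice >= HOUR:
            return True                        # hourly repeat
        # Crude hypothetical stand-in for "asks whether the chatbot is
        # a natural person."
        msg = user_message.lower()
        return "are you human" in msg or "are you a person" in msg

    def mark_sent(self, now: datetime) -> None:
        self.last_notice = now
```

Because the on-demand trigger fires "each time" the user asks, it is not suppressed by a recent hourly notice; the sketch reflects that by checking it independently of the timer.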
S-01 AI System Safety Program · S-01.4 · S-01.7 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(C)
Plain Language
Chatbot providers must, on a monthly basis, evaluate their chatbot for potential risk of harm to users and publish information about the chatbot on their website. They must also mitigate any identified risks of harm. The specific form and content of the evaluation and public disclosure will be governed by Attorney General regulations. This creates a continuous monthly safety evaluation cycle — not a one-time pre-deployment assessment — paired with an ongoing public transparency obligation and a duty to remediate identified risks.
Statutory Text
(C) In compliance with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40, a chatbot provider shall: (1) on a monthly basis: (a) evaluate its chatbot for potential risk of harm to users; and (b) make information about its chatbot publicly available on its website; and (2) mitigate any risk of harm to users.
Other · Chatbot
S.C. Code § 39-80-50(A)-(C)
Plain Language
Chatbots are classified as products for product liability purposes. Chatbot providers have a duty to ensure their chatbot does not cause user injury. Critically, liability attaches even if the provider exercised all reasonable care (strict liability, not negligence) and even if the provider had no direct contractual relationship with the user (no privity requirement). This is a significant expansion of tort liability — it eliminates both the reasonable care defense and the privity defense for chatbot-related injuries.
Statutory Text
(A) A chatbot is considered a product for the purposes of a product liability action. (B) A chatbot provider has a duty to ensure that the use of the chatbot provider does not cause injury to a user. (C) A chatbot provider is liable for any injury that the chatbot causes to a user if: (1) the chatbot provider exercised all reasonable care in the design and distribution of the chatbot; or (2) the chatbot provider did not directly distribute the chatbot to the user or otherwise enter into a contractual relationship with the user.