S-0896
SC · State · USA
● Pending
South Carolina S. 896 — Chatbot Protection Act (adding Chapter 80 to Title 39)
Summary

The Chatbot Protection Act imposes data governance, transparency, and safety obligations on chatbot providers — defined broadly as any person that creates, distributes, or makes a chatbot available to users in South Carolina. Core obligations include prohibitions on using personal data or chat logs for targeted advertising, profiling, and training without affirmative consent; special protections for minor users requiring parental consent; mandatory AI identity disclosure before any output and hourly thereafter; monthly risk-of-harm evaluations with public reporting; and a prohibition on implying chatbot output is equivalent to licensed professional services. The bill creates strict product liability for chatbot-caused injuries and provides both AG enforcement and a private right of action with up to $5,000 per violation, punitive damages, and attorney's fees.

Enforcement & Penalties
Enforcement Authority
The Attorney General or a county attorney may bring a civil action against a chatbot provider for violations. A private right of action is also available: a violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user, and an injured user may bring a civil action. No cure period or safe harbor is specified.
Penalties
A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact, so actual harm need not be independently proven. In a private action, a court may award a civil penalty of not more than $5,000 per violation, punitive damages for reckless and knowing conduct, injunctive relief, declaratory relief, and reasonable attorney's fees and litigation costs. In an AG or county attorney action, available remedies include injunctive relief, damages, civil penalties, restitution, other remedies, and reasonable attorney's fees and litigation costs. Chatbot providers are also subject to strict product liability for any injury a chatbot causes to a user under Section 39-80-50.
Who Is Covered
"Chatbot provider" means any person that creates, distributes, or otherwise makes a chatbot available to a user.
What Is Covered
"Chatbot" means an algorithmic or automated system that generates information through text, audio, image, or video in a manner that simulates interpersonal interactions or conversations including artificial intelligence.
Compliance Obligations · 19 obligations
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not process personal data to shape chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill a specific, express user request, and (2) the user has given affirmative consent. The affirmative consent standard is stringent — it requires a standalone disclosure in plain language, accessible to disabled users, in the chatbot's language, with the decline option at least as prominent as the consent option. Consent cannot be inferred from inaction, continued use, or general terms of service. This effectively creates a purpose limitation and opt-in consent requirement for all personal data used in chatbot responses.
Statutory Text
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
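The two-part gate in § 39-80-20(A)(1), combined with the bill's affirmative-consent standard, can be sketched as a simple predicate. This is an illustrative compliance-logic sketch only; the class, field names, and function below are hypothetical, not drawn from the bill:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    """Hypothetical record of how a consent flow was presented to the user."""
    standalone_disclosure: bool        # disclosure was standalone, not buried in ToS
    plain_language: bool               # written in plain language
    accessible_to_disabled_users: bool
    in_chatbot_language: bool          # presented in the chatbot's operating language
    decline_equally_prominent: bool    # decline option at least as prominent as accept
    inferred_from_inaction: bool       # inferred from inaction or continued use (invalid)

    def is_valid(self) -> bool:
        # All affirmative-consent criteria must hold, and consent may never be
        # inferred from inaction, continued use, or general terms of service.
        return (self.standalone_disclosure
                and self.plain_language
                and self.accessible_to_disabled_users
                and self.in_chatbot_language
                and self.decline_equally_prominent
                and not self.inferred_from_inaction)

def may_process_personal_data(necessary_for_express_request: bool,
                              consent: Optional[ConsentRecord]) -> bool:
    # § 39-80-20(A)(1): BOTH conditions must hold, i.e. necessity for an express
    # user request AND valid affirmative consent.
    return (necessary_for_express_request
            and consent is not None
            and consent.is_valid())
```

Note that either condition failing alone blocks processing: consent without necessity is not enough, and necessity without consent is not enough.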
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using chat logs for any advertising purpose — whether to decide whether to show an ad, what type of ad to show, or how to customize an ad for a particular user. This is an absolute prohibition with no consent exception. Chat logs include both user inputs and chatbot outputs from the interaction.
Statutory Text
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(3)(c)-(d)
Plain Language
Adult users' chat logs and personal data may not be used for model training without affirmative consent. Additionally, chat logs and personal data may not be used for profiling beyond what is necessary to fulfill an express user request. Training is defined broadly as adjusting or modifying a model using input data, but excludes safety testing and harm-mitigation adjustments. Profiling is defined as classifying personality traits and behavioral characteristics, but excludes safety-related processing. These provisions together create a consent-gated training restriction and a necessity-limited profiling restriction for adult users.
Statutory Text
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
MN-01 Minor User AI Safety Protections · MN-01.2 · Deployer · Developer · Chatbot · Minors
S.C. Code § 39-80-20(A)(3)(a)-(b)
Plain Language
When a chatbot provider knows or reasonably should know that a user is a minor, the provider may not process the minor's chat logs and personal data at all — and may not use them for training — without affirmative consent from the minor's parent or legal guardian. The knowledge standard is constructive: it applies when the provider should have known based on objective circumstances, not just actual knowledge. This effectively requires parental opt-in consent before any processing of minor user data.
Statutory Text
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent;
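The constructive-knowledge trigger in § 39-80-20(A)(3)(a)-(b) can be expressed as a small decision function. A hypothetical sketch, assuming the provider separately tracks actual and constructive knowledge of minority:

```python
def may_process_minor_data(knows_user_is_minor: bool,
                           should_know_user_is_minor: bool,
                           has_parental_consent: bool) -> bool:
    # § 39-80-20(A)(3)(a)-(b): the gate uses a constructive-knowledge standard.
    # It triggers on actual knowledge OR on what the provider reasonably should
    # have known based on objective circumstances.
    if knows_user_is_minor or should_know_user_is_minor:
        # Processing and training both require affirmative consent from the
        # minor's parent or legal guardian.
        return has_parental_consent
    # The minor-specific gate does not apply; the adult-user rules
    # (e.g., § 39-80-20(A)(3)(c)) still govern.
    return True
```

The key design point is the `or`: a provider cannot avoid the gate by declining to verify age when objective circumstances already indicate a minor.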
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not classify users by personality traits or behavioral characteristics beyond what is strictly necessary to fulfill the user's express request. This is a standalone profiling restriction that applies to all users regardless of age. Safety-related processing is carved out from the definition of profiling and is therefore exempt.
Statutory Text
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
Other · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(5)
Plain Language
Chatbot providers are categorically prohibited from selling users' chat logs — meaning they may not exchange chat log data for monetary or other valuable consideration, nor make it available to third parties for such consideration. Exceptions exist for service-provider disclosures (processors acting on behalf of the chatbot provider), user-directed disclosures with affirmative consent, and data the user intentionally made public. This is an absolute prohibition, not merely a consent-gated restriction.
Statutory Text
(A) A chatbot provider may not: (5) sell a user's chat logs;
Other · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(6)
Plain Language
Chatbot providers must delete users' chat logs after ten years at the latest, unless retention is necessary to comply with this chapter or another legal requirement. This creates an outer-bound retention limit — providers may choose shorter retention periods. The exception for legal compliance ensures providers can retain data needed for litigation, regulatory obligations, or other statutory mandates.
Statutory Text
(A) A chatbot provider may not: (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
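The retention rule in § 39-80-20(A)(6) is an outer bound with a legal-hold exception, which a deletion job might check like this. A minimal sketch; the fixed-day approximation of "ten years" is an assumption, and production code would use calendar arithmetic:

```python
from datetime import datetime, timedelta

# Approximately ten years including leap days; the statute speaks in years,
# so this constant is an illustrative simplification.
TEN_YEARS = timedelta(days=3653)

def must_delete_chat_log(created: datetime, now: datetime,
                         retention_required_by_law: bool = False) -> bool:
    # § 39-80-20(A)(6): ten years is a ceiling, not a target; providers may
    # delete sooner. Retention past the limit is allowed only when necessary
    # to comply with this chapter or otherwise required by law.
    if retention_required_by_law:
        return False
    return now - created > TEN_YEARS
```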
Other · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(A)(7)
Plain Language
Chatbot providers may not discriminate or retaliate against users who decline to consent to having their chat logs or personal data used for training. Prohibited retaliation includes denying access to products or services, charging different prices or rates, or degrading service quality. This ensures users can meaningfully exercise their consent rights without fear of adverse consequences.
Statutory Text
(A) A chatbot provider may not: (7) discriminate or retaliate against a user, including: (a) denying products or services to the user; (b) charging different prices or rates for products or services to the user; or (c) providing lower quality products or services to the user for refusing to consent to the use of chat logs or personal data for training purposes.
D-01 Automated Processing Rights & Data Controls · D-01.1 · D-01.2 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(B)
Plain Language
Users have an unconditional right to access their own chat logs at any time. On request, the chatbot provider must deliver the chat log in a downloadable, easy-to-read format. Providers may not discriminate or retaliate against users who exercise this access right. This is broader than a typical data access right because it covers both user input data and chatbot output data (per the chat log definition), giving users access to the full conversation record.
Statutory Text
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
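One way to satisfy the "downloadable and easy to read format" requirement in § 39-80-20(B) is a structured export of the full conversation record. The statute does not name a format; indented JSON is an assumed choice for this sketch, and the function name is hypothetical:

```python
import json

def export_chat_log(chat_log: list) -> str:
    # § 39-80-20(B): the chat log covers the full conversation record, i.e.
    # both user inputs and chatbot outputs. Indented JSON is one plausible
    # "downloadable and easy to read" format, not a statutory mandate.
    return json.dumps(chat_log, indent=2, ensure_ascii=False)
```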
Other · Chatbot
S.C. Code § 39-80-20(C)
Plain Language
Government entities cannot compel chatbot providers to produce user input data or chat logs except through a wiretap warrant. This elevates the legal standard for government access to chatbot interaction data to the highest standard used for private communications, treating chat logs similarly to wiretap-protected communications. This is a limitation on government power rather than a new compliance obligation on chatbot providers.
Statutory Text
(C) A governmental entity may not compel the production of or access to input data or chat logs from a chatbot provider, except as pursuant to a wiretap warrant.
G-01 AI Governance Program & Documentation · G-01.1 · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(D)
Plain Language
Chatbot providers must develop, implement, and maintain a written, comprehensive data security program with administrative, technical, and physical safeguards proportionate to the volume and nature of the personal data and chat logs they hold. The program must be publicly available on the provider's website. This is both a governance obligation (maintain a written program) and a transparency obligation (publish it publicly). The proportionality standard means the security measures must scale with the sensitivity and volume of data processed.
Statutory Text
(D) A chatbot provider shall develop, implement, and maintain a comprehensive data security program that contains administrative, technical, and physical safeguards that are proportionate to the volume and nature of personal data and chat logs that are maintained by the chatbot provider. The program must be written and made publicly available on the chatbot provider's website.
Other · Deployer · Developer · Chatbot
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical safeguards to ensure that deidentified data cannot be reidentified. All processing, retention, and transfer of deidentified data must occur without any reasonable means of reidentification. This is a continuing obligation — it applies not just at the point of deidentification but throughout the data lifecycle.
Statutory Text
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(A)(1)
Plain Language
Chatbot providers may not use any language in their advertising, interface design, or chatbot outputs that states or implies the chatbot's content is endorsed by or equivalent to that of a licensed professional — including any certified/registered/licensed professional, legal professionals, CPAs, investment advisors, or licensed fiduciaries. This prohibits implicit professional endorsement through terms, letters, or phrases, not just explicit claims. The coverage is broad: it applies to advertising about the chatbot, the chatbot's user interface, and the chatbot's actual outputs.
Statutory Text
(A) A chatbot provider may not: (1) use any term, letter, or phrase in the advertising, interface, or output data of a chatbot that states or implies that the advertising, interface, or output data of a chatbot is endorsed by or equivalent to any of the following: (a) any certified, registered, or licensed professional; (b) a licensed legal professional; (c) a certified public accountant; (d) an investment advisor or an investment advisor representative; or (e) a licensed fiduciary;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.3 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(A)(2)
Plain Language
Chatbot providers may not represent — whether in advertising, the interface, or chatbot outputs — that a user's input data or chat log is confidential. This prevents providers from creating a false expectation of attorney-client-style or therapist-patient-style confidentiality that does not actually exist. The prohibition covers any statement or implication of confidentiality, not just explicit claims.
Statutory Text
(A) A chatbot provider may not: (2) include any representation in the advertising, interface, or output data of a chatbot that states or implies the user's input data or chat log is confidential.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · T-01.3 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(B)
Plain Language
Before generating any output, chatbot providers must give users clear, conspicuous, and explicit notice that they are interacting with a chatbot, not a human. This notice is unconditional — it applies regardless of whether a reasonable person would be misled. The notice must be repeated at the beginning of each communication, every hour during ongoing sessions, and whenever a user asks if the chatbot is human. The notice must be in the chatbot's operating language, in a font at least as large as the largest font used in chatbot communications. The notice must also comply with AG-promulgated regulations specifying form and content requirements.
Statutory Text
(B) A chatbot provider shall provide clear, conspicuous, and explicit notice to a user that the user is interacting with a chatbot rather than a natural person before the chatbot may generate any output data. The chatbot provider shall include this notice at the beginning of each chatbot communication with a user every hour thereafter and each time a user asks whether the chatbot is a natural person. The text of the notice must: (1) be written in the same language that the chatbot communicates with the user and must appear in a font size that is easily readable by an average user and is not smaller than the largest font size used for other chatbot communications; and (2) must comply with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40.
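The three disclosure triggers in § 39-80-30(B) (before any output, hourly, and on a direct question) reduce to a small timing check. A hedged sketch with hypothetical names; it covers only the timing logic, not the form-and-content requirements set by AG regulation:

```python
from datetime import datetime, timedelta
from typing import Optional

NOTICE_INTERVAL = timedelta(hours=1)

def identity_notice_required(last_notice: Optional[datetime],
                             now: datetime,
                             user_asked_if_human: bool) -> bool:
    # § 39-80-30(B): disclose before any output (no notice yet this session),
    # re-disclose at least hourly, and always re-disclose when the user asks
    # whether the chatbot is a natural person.
    if last_notice is None:
        return True
    if user_asked_if_human:
        return True
    return now - last_notice >= NOTICE_INTERVAL
```

Because the "are you human?" trigger is unconditional, it fires even minutes after the last notice.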
S-01 AI System Safety Program · S-01.4 · S-01.7 · Deployer · Developer · Chatbot
S.C. Code § 39-80-30(C)
Plain Language
Chatbot providers must conduct monthly evaluations of their chatbot for potential risk of harm to users, publish information about the chatbot on their website on a monthly basis, and mitigate any identified risks. The specific evaluation methodology and risk categories will be further defined by AG regulations (Section 39-80-40). The monthly cadence is notably frequent compared to most AI safety evaluation requirements. The mitigation obligation is ongoing and not limited to specific risk categories — any identified risk of harm must be addressed.
Statutory Text
(C) In compliance with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40, a chatbot provider shall: (1) on a monthly basis: (a) evaluate its chatbot for potential risk of harm to users; and (b) make information about its chatbot publicly available on its website; and (2) mitigate any risk of harm to users.
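The monthly cadence in § 39-80-30(C)(1) could be tracked with a simple due-date check. This sketch assumes a calendar-month cycle, which is an interpretation; the AG's implementing regulations may define the cadence differently:

```python
from datetime import date
from typing import Optional

def evaluation_overdue(last_evaluation: Optional[date], today: date) -> bool:
    # § 39-80-30(C)(1): risk-of-harm evaluation and public reporting are due
    # "on a monthly basis". This sketch flags any calendar month that has not
    # yet seen an evaluation.
    if last_evaluation is None:
        return True
    return (today.year, today.month) != (last_evaluation.year, last_evaluation.month)
```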
Other · Chatbot
S.C. Code § 39-80-40(A)-(B)
Plain Language
The Attorney General is directed to adopt implementing regulations covering: the form and content of the AI identity notice, an example notice template, descriptions of potential risks of harm to users, and risk-reduction requirements for chatbot providers. The AG also has general authority to adopt any other regulations necessary to implement the chapter. This is a rulemaking delegation provision — it creates obligations on the AG to develop regulations, not on chatbot providers directly (though providers will need to comply with the resulting regulations).
Statutory Text
(A) The Attorney General shall adopt rules and promulgate regulations to implement this chapter. The rules and regulations must: (1) describe the form and content of the notice that is required pursuant to Section 39-80-30; (2) provide an example template for the notice that is required pursuant to Section 39-80-30; (3) describe any potential risk of harm to users; and (4) provide requirements for a chatbot provider to implement to reduce the risk of harm to users. (B) The Attorney General may adopt any other rules or promulgate regulations necessary to implement this chapter.
Other · Deployer · Developer · Chatbot
S.C. Code § 39-80-50(A)-(C)
Plain Language
This provision classifies chatbots as products for product liability purposes and imposes strict liability on chatbot providers for injuries to users. Critically, liability attaches even if the provider exercised all reasonable care (eliminating a due diligence defense) and even if the provider had no direct distribution or contractual relationship with the injured user (reaching upstream developers and distributors). This is an absolute liability standard — there is no care-based defense. Product counsel should note this is significantly more aggressive than typical product liability frameworks, which usually allow reasonable care defenses for non-manufacturing defects.
Statutory Text
(A) A chatbot is considered a product for the purposes of a product liability action. (B) A chatbot provider has a duty to ensure that the use of the chatbot provider does not cause injury to a user. (C) A chatbot provider is liable for any injury that the chatbot causes to a user if: (1) the chatbot provider exercised all reasonable care in the design and distribution of the chatbot; or (2) the chatbot provider did not directly distribute the chatbot to the user or otherwise enter into a contractual relationship with the user.
Other · Chatbot
S.C. Code § 39-80-60(A)-(C)
Plain Language
This provision establishes the enforcement framework. The Attorney General or a county attorney may bring civil actions seeking injunctive relief, damages, civil penalties, restitution, and attorney's fees. Additionally, any user injured by a violation of the data protection (Section 39-80-20) or transparency/safety (Section 39-80-30) provisions may bring a private civil action. Notably, a violation of either section is deemed an injury in fact, eliminating the need to prove actual harm for standing purposes. Private plaintiffs may recover up to $5,000 per violation, punitive damages for reckless and knowing conduct, injunctive and declaratory relief, and attorney's fees. This creates no independent compliance obligation.
Statutory Text
(A) The Attorney General or a county attorney may bring a civil action against a chatbot provider that violates this chapter and that includes any of the following: (1) enjoining an act or practice in violation of this chapter; (2) enforcing compliance with this chapter or a rule adopted or regulation promulgated pursuant to this chapter; (3) obtaining damages, civil penalties, restitution, or other remedies; or (4) obtaining reasonable attorney's fees and other litigation costs. (B) A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user. (C) A user who is injured by a violation of Section 39-80-20 or 39-80-30 may bring a civil action against the chatbot provider, and a court of competent jurisdiction may award a prevailing plaintiff any of the following: (1) a civil penalty of not more than five thousand dollars per violation of this chapter; (2) punitive damages for reckless and knowing conduct; (3) injunctive relief; (4) declaratory relief; or (5) reasonable attorney's fees and litigation costs.