S-0896
SC · State · USA
● Pending
Proposed Effective Date
2025-01-01
South Carolina S 896 — Chatbot Protection Act (Adding Chapter 80 to Title 39)
Summary

The Chatbot Protection Act imposes data processing restrictions, transparency requirements, and safety obligations on chatbot providers operating in South Carolina. Chatbot providers may not process personal data or chat logs without affirmative consent, may not use chat logs for advertising purposes, may not sell chat logs, and must obtain parental consent before processing minor users' data. Providers must disclose to users that they are interacting with a chatbot before any output is generated, with hourly re-disclosure and on-demand disclosure when asked. Monthly safety evaluations are required. The bill creates a private right of action with up to $5,000 per violation in civil penalties, classifies chatbots as products for product liability purposes with strict liability, and authorizes the Attorney General and county attorneys to enforce the chapter.

Enforcement & Penalties
Enforcement Authority
The Attorney General or a county attorney may bring a civil action against a chatbot provider that violates the chapter. A user who is injured by a violation of Section 39-80-20 or 39-80-30 may bring a civil action against the chatbot provider. A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user, establishing standing for private plaintiffs without requiring proof of additional harm. No cure period or safe harbor is specified.
Penalties
For private actions: civil penalty of not more than $5,000 per violation; punitive damages for reckless and knowing conduct; injunctive relief; declaratory relief; reasonable attorney's fees and litigation costs. A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact, so statutory damages do not require proof of actual monetary harm. For AG/county attorney actions: injunctive relief, damages, civil penalties, restitution, other remedies, and reasonable attorney's fees and litigation costs. Chatbot providers are also subject to strict product liability under Section 39-80-50.
Who Is Covered
"Chatbot provider" means any person that creates, distributes, or otherwise makes a chatbot available to a user.
What Is Covered
"Chatbot" means an algorithmic or automated system that generates information through text, audio, image, or video in a manner that simulates interpersonal interactions or conversations including artificial intelligence.
Compliance Obligations · 15 obligations
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not process any personal data to shape chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill a specific express request made by the user, and (2) the user has provided affirmative consent. The definition of affirmative consent is rigorous — it requires a clear standalone disclosure in accessible and multilingual format, with the decline option equally prominent as the accept option. Consent cannot be inferred from inaction, continued use, or broad terms of service. This effectively prohibits ambient personalization based on personal data unless users specifically ask for and consent to it.
Statutory Text
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising purpose. This covers targeting (deciding whether to show an ad), selection (deciding which product category to advertise), and customization (tailoring the ad's content to the user). This is an absolute prohibition — there is no consent exception. The definition of chat log covers both user input data and all chatbot-generated output data.
Statutory Text
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
D-01 Automated Processing Rights & Data Controls · D-01.4, D-01.6 · Deployer · Chatbot, Minors
S.C. Code § 39-80-20(A)(3)(a)-(d)
Plain Language
Chatbot providers face tiered restrictions on processing chat logs and personal data. For known or reasonably-known minor users: all processing of chat logs and personal data requires parental or guardian affirmative consent, and training use specifically requires parental consent. For adult users: training use requires affirmative consent. For all users: profiling is prohibited beyond what is necessary to fulfill an express user request. The definition of training explicitly excludes safety testing, harm-mitigation adjustments, and compliance-related actions, so providers may use chat data for those safety purposes without consent. Similarly, processing chat logs for user safety is excluded from the profiling prohibition.
Statutory Text
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
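The tiered consent rules in § 39-80-20(A)(3) reduce to a small decision table. The sketch below is a hypothetical simplification for illustration only: it ignores the "knows or reasonably should have known" knowledge standard for minors and the safety-testing and compliance carve-outs from the definition of training.

```python
def processing_allowed(is_minor, purpose, parental_consent=False,
                       user_consent=False, express_request=False):
    """Hypothetical sketch of the S.C. Code § 39-80-20(A)(3) consent tiers.

    Simplified: the statute's knowledge standard for minor status and the
    safety/compliance exclusions from "training" are out of scope here.
    """
    if is_minor:
        # (a)-(b): all processing of a minor's chat log and personal data,
        # including training, requires parental or guardian affirmative consent.
        return parental_consent
    if purpose == "training":
        # (c): adults must give affirmative consent before training use.
        return user_consent
    if purpose == "profiling":
        # (d): profiling only to the extent needed for an express request.
        return express_request
    # Baseline affirmative-consent rule from § 39-80-20(A)(1).
    return user_consent
```

The function treats minor status as the dominant branch, mirroring the statute's structure: subsections (a) and (b) gate everything on parental consent before the adult-specific tiers in (c) and (d) apply.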
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not build user profiles based on personality traits or behavioral characteristics beyond what is strictly necessary to satisfy a specific user request. This prohibits speculative or pre-emptive profiling, background personality modeling, and behavioral classification that the user did not ask for. Processing chat logs for user safety or legal compliance is excluded from this prohibition.
Statutory Text
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
D-01 Automated Processing Rights & Data Controls · Deployer · Chatbot
S.C. Code § 39-80-20(A)(5)-(6)
Plain Language
Chatbot providers are categorically prohibited from selling user chat logs — there is no consent exception for sales. The definition of 'sell' is broad, covering any exchange for monetary or other valuable consideration, but carves out processor disclosures, user-directed disclosures with affirmative consent, and data the user intentionally made public. Additionally, providers must delete chat logs after ten years unless retention is required for compliance with this chapter or other law. These are absolute restrictions with no opt-in override.
Statutory Text
(A) A chatbot provider may not: (5) sell a user's chat logs; (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
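The ten-year retention limit in § 39-80-20(A)(6) is a simple age check with a legal-hold exception. A minimal sketch, using a hypothetical helper name and approximating "ten years" as 3,650 days (the statute does not define the measurement):

```python
from datetime import datetime, timedelta

# Approximation of the statutory ten-year limit; the bill does not
# specify how the period is computed.
TEN_YEARS = timedelta(days=365 * 10)

def must_delete(chat_log_created, now, retention_required_by_law=False):
    """Hypothetical sketch of § 39-80-20(A)(6): a chat log older than ten
    years must be deleted unless retention is necessary to comply with the
    chapter or is otherwise required by law."""
    if retention_required_by_law:
        return False  # statutory exception: compliance or other legal mandate
    return now - chat_log_created > TEN_YEARS
```

Note that the exception is narrow: only retention required by this chapter or other law overrides the limit, not business convenience or user consent.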
D-01 Automated Processing Rights & Data Controls · Deployer · Chatbot
S.C. Code § 39-80-20(A)(7)
Plain Language
Chatbot providers may not discriminate or retaliate against users who refuse to consent to training use of their chat logs or personal data. This includes denying services, charging higher prices, or degrading service quality. Users must be able to decline consent to training use without any service consequences.
Statutory Text
(A) A chatbot provider may not: (7) discriminate or retaliate against a user, including: (a) denying products or services to the user; (b) charging different prices or rates for products or services to the user; or (c) providing lower quality products or services to the user for refusing to consent to the use of chat logs or personal data for training purposes.
D-01 Automated Processing Rights & Data Controls · Deployer · Chatbot
S.C. Code § 39-80-20(B)
Plain Language
Users have a right to access and download their own chat logs at any time. Chatbot providers must produce the chat log in a downloadable, easy-to-read format upon request. Providers may not discriminate or retaliate against users who exercise this access right. This is an on-demand data portability right with no limit on frequency of requests.
Statutory Text
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat log.
G-01 AI Governance Program & Documentation · G-01.1 · Deployer · Chatbot
S.C. Code § 39-80-20(D)
Plain Language
Chatbot providers must develop, implement, and maintain a written comprehensive data security program covering administrative, technical, and physical safeguards. The program must be proportionate to the volume and nature of personal data and chat logs the provider maintains. The written program must be made publicly available on the provider's website. This is both an operational requirement (implement and maintain) and a transparency requirement (publish on website).
Statutory Text
(D) A chatbot provider shall develop, implement, and maintain a comprehensive data security program that contains administrative, technical, and physical safeguards that are proportionate to the volume and nature of personal data and chat logs that are maintained by the chatbot provider. The program must be written and made publicly available on the chatbot provider's website.
D-01 Automated Processing Rights & Data Controls · Deployer · Chatbot
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical measures to ensure deidentified data cannot be reidentified. The provider must process, retain, and transfer deidentified data without any reasonable means of reidentification. This is a continuing technical obligation — not a one-time deidentification step — requiring ongoing safeguards throughout the data lifecycle.
Statutory Text
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Chatbot
S.C. Code § 39-80-30(A)(1)
Plain Language
Chatbot providers may not use any language in their advertising, chatbot interface, or chatbot outputs that states or implies the content is endorsed by or equivalent to services from a licensed professional. This covers a broad range of licensed professions: any certified, registered, or licensed professional; attorneys; CPAs; investment advisors and their representatives; and licensed fiduciaries. The prohibition applies across three surfaces — advertising, the chatbot interface, and the chatbot's output data.
Statutory Text
(A) A chatbot provider may not: (1) use any term, letter, or phrase in the advertising, interface, or output data of a chatbot that states or implies that the advertising, interface, or output data of a chatbot is endorsed by or equivalent to any of the following: (a) any certified, registered, or licensed professional; (b) a licensed legal professional; (c) a certified public accountant; (d) an investment advisor or an investment advisor representative; or (e) a licensed fiduciary;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.3 · Deployer · Chatbot
S.C. Code § 39-80-30(A)(2)
Plain Language
Chatbot providers are prohibited from representing — in advertising, the chatbot interface, or chatbot outputs — that a user's input data or chat logs are confidential. This prevents providers from creating a false impression of confidentiality analogous to attorney-client or doctor-patient privilege that does not exist in the chatbot context. Providers must not state or imply confidentiality protections they cannot actually deliver.
Statutory Text
(A) A chatbot provider may not: (2) include any representation in the advertising, interface, or output data of a chatbot that states or implies the user's input data or chat log is confidential.
T-01 AI Identity Disclosure · T-01.1, T-01.2, T-01.3 · Deployer · Chatbot
S.C. Code § 39-80-30(B)
Plain Language
Chatbot providers must display a clear, conspicuous, and explicit notice that the user is interacting with a chatbot — not a human — before the chatbot generates any output. This is an unconditional obligation; it does not depend on whether a reasonable person could be misled. The notice must be repeated at the beginning of each communication session, every hour during continuing interactions, and each time a user asks whether the chatbot is a natural person. The notice must appear in the chatbot's operating language, in a font size at least as large as the largest font used for other chatbot communications, and must comply with Attorney General regulations. This is among the most demanding AI identity disclosure requirements in the U.S., with hourly re-disclosure and format specifications.
Statutory Text
(B) A chatbot provider shall provide clear, conspicuous, and explicit notice to a user that the user is interacting with a chatbot rather than a natural person before the chatbot may generate any output data. The chatbot provider shall include this notice at the beginning of each chatbot communication with a user every hour thereafter and each time a user asks whether the chatbot is a natural person. The text of the notice must: (1) be written in the same language that the chatbot communicates with the user and must appear in a font size that is easily readable by an average user and is not smaller than the largest font size used for other chatbot communications; and (2) must comply with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40.
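The notice-timing rules in § 39-80-30(B) amount to a small state machine: disclose before any output, re-disclose hourly, and disclose whenever the user asks. A minimal sketch of that timing logic (hypothetical class and method names; the font-size, language, and Attorney General formatting requirements in (B)(1)-(2) are out of scope here):

```python
from datetime import datetime, timedelta

# § 39-80-30(B): the notice must recur every hour of a continuing interaction.
REDISCLOSE_EVERY = timedelta(hours=1)

class DisclosureTracker:
    """Hypothetical tracker for when the chatbot-identity notice is due."""

    def __init__(self):
        self.last_notice = None  # None => no notice shown yet this session

    def notice_required(self, now, user_asked_if_human=False):
        # Before the chatbot generates any output (start of each session).
        if self.last_notice is None:
            return True
        # Each time the user asks whether the chatbot is a natural person.
        if user_asked_if_human:
            return True
        # Every hour during a continuing interaction.
        return now - self.last_notice >= REDISCLOSE_EVERY

    def mark_shown(self, now):
        self.last_notice = now
```

In use, a provider would call `notice_required` before emitting each response and `mark_shown` whenever the notice is actually rendered; the on-demand branch fires regardless of how recently the notice was last shown.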
S-01 AI System Safety Program · S-01.4, S-01.7 · Deployer · Chatbot
S.C. Code § 39-80-30(C)
Plain Language
Chatbot providers must conduct monthly safety evaluations of their chatbots to identify potential risks of harm to users and must publish information about their chatbot on their website on the same monthly cadence. When risks are identified, the provider must mitigate them. The specific content and form of the evaluations and public disclosures will be defined by Attorney General regulations. The monthly frequency makes this one of the most frequent mandatory safety evaluation cadences in U.S. AI legislation. The mitigation obligation is open-ended — any risk of harm identified must be addressed.
Statutory Text
(C) In compliance with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40, a chatbot provider shall: (1) on a monthly basis: (a) evaluate its chatbot for potential risk of harm to users; and (b) make information about its chatbot publicly available on its website; and (2) mitigate any risk of harm to users.
Other · Chatbot
S.C. Code § 39-80-50(A)-(C)
Plain Language
Chatbots are classified as products under South Carolina product liability law. Chatbot providers have a duty to ensure their chatbot does not injure users. Critically, the provider is liable for user injuries even if it exercised all reasonable care in design and distribution (strict liability, no negligence defense) and even if the provider had no direct relationship with the injured user (no privity requirement). This is a remarkably broad liability standard — essentially absolute liability for any user injury caused by the chatbot, regardless of fault or contractual relationship.
Statutory Text
(A) A chatbot is considered a product for the purposes of a product liability action. (B) A chatbot provider has a duty to ensure that the use of the chatbot provider does not cause injury to a user. (C) A chatbot provider is liable for any injury that the chatbot causes to a user if: (1) the chatbot provider exercised all reasonable care in the design and distribution of the chatbot; or (2) the chatbot provider did not directly distribute the chatbot to the user or otherwise enter into a contractual relationship with the user.
Other · Chatbot
S.C. Code § 39-80-20(C)
Plain Language
Government entities cannot compel chatbot providers to produce or grant access to user input data or chat logs unless they obtain a wiretap warrant. This elevates the legal standard for government access to chatbot conversations to the highest level of judicial authorization — equivalent to the standard for intercepting real-time communications. Standard subpoenas and search warrants are insufficient. This provision protects user privacy vis-à-vis the government, not vis-à-vis the chatbot provider.
Statutory Text
(C) A governmental entity may not compel the production of or access to input data or chat logs from a chatbot provider, except as pursuant to a wiretap warrant.