H-5138
SC · State · USA
● Pending
South Carolina H. 5138 — Chatbot Protection Act (adding Chapter 80 to Title 39)
Summary

The Chatbot Protection Act imposes data processing, transparency, and safety obligations on chatbot providers operating in South Carolina. Chatbot providers are prohibited from processing personal data or chat logs for advertising, profiling, or training without affirmative consent, and may not sell chat logs. The bill requires clear AI identity disclosure before any output and hourly thereafter, prohibits chatbot providers from implying professional endorsement or confidentiality, and mandates monthly risk-of-harm evaluations with public disclosure. Chatbots are treated as products for product liability purposes with strict liability. Enforcement is through the Attorney General, county attorneys, or private right of action with up to $5,000 per violation and punitive damages for reckless and knowing conduct.

Enforcement & Penalties
Enforcement Authority
The Attorney General or a county attorney may bring a civil action against a chatbot provider for violations. A user who is injured by a violation of Section 39-80-20 or 39-80-30 may bring a private civil action. A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user, lowering the standing threshold for private plaintiffs. No cure period or safe harbor is specified.
Penalties
For AG/county attorney actions: injunctive relief, enforcement of compliance, damages, civil penalties, restitution, other remedies, and reasonable attorney's fees and litigation costs. For private actions: civil penalty of up to $5,000 per violation, punitive damages for reckless and knowing conduct, injunctive relief, declaratory relief, and reasonable attorney's fees and litigation costs. A violation constitutes an injury in fact, so actual monetary harm is not required for standing.
Who Is Covered
"Chatbot provider" means any person that creates, distributes, or otherwise makes a chatbot available to a user.
What Is Covered
"Chatbot" means an algorithmic or automated system that generates information through text, audio, image, or video in a manner that simulates interpersonal interactions or conversations including artificial intelligence.
Compliance Obligations (17 obligations)
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not use personal data to generate chatbot outputs unless the processing is strictly necessary to fulfill a specific user request and the user has given affirmative consent. Affirmative consent requires a clear, standalone disclosure with an equally prominent option to decline — consent cannot be inferred from inaction, buried in terms of service, or obtained through dark patterns. This effectively creates a necessity-plus-consent dual requirement: even with consent, the processing must be necessary for an express user request.
Statutory Text
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs — meaning the user's inputs and the chatbot's outputs — for any advertising purpose. This includes deciding whether to show an ad, selecting which product or service to advertise, and customizing ad content. Unlike the personal data processing restriction in § 39-80-20(A)(1), this is an absolute prohibition with no consent exception.
Statutory Text
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
D-01 Automated Processing Rights & Data Controls · D-01.4 · D-01.6 · Deployer · Chatbot · Minors
S.C. Code § 39-80-20(A)(3)
Plain Language
Chatbot providers face a tiered consent framework for processing chat logs and personal data. For minors (when the provider knows or should know the user is a minor): no processing of chat logs or personal data is permitted without parental or guardian affirmative consent, including for training. For adults: training use of chat logs and personal data requires the adult user's affirmative consent. For all users: profiling beyond what is necessary to fulfill an express request is prohibited. Training has a narrow definition that excludes safety testing, safety-related modifications, and compliance actions. Profiling also excludes safety-related processing.
Statutory Text
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
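The tiered rules above reduce to a simple decision function. The sketch below is illustrative only, not part of the bill: the `User` fields, the `may_process_chat_log` name, and the `purpose` strings are hypothetical, and it omits the statute's safety-testing and compliance carve-outs from "training" and "profiling".

```python
from dataclasses import dataclass

@dataclass
class User:
    is_known_minor: bool     # provider knows or reasonably should know the user is a minor
    guardian_consent: bool   # parent or guardian affirmative consent on file
    training_consent: bool   # adult's own affirmative consent to training use

def may_process_chat_log(user: User, purpose: str) -> bool:
    """Tiered consent check sketched from S.C. Code § 39-80-20(A)(3).
    'purpose' is 'training', 'profiling', or 'request' (fulfilling an express request)."""
    if user.is_known_minor and not user.guardian_consent:
        # (a)/(b): no processing of a known minor's data without guardian consent
        return False
    if purpose == "training" and not user.is_known_minor:
        # (c): training on an adult's data requires the adult's affirmative consent
        return user.training_consent
    if purpose == "profiling":
        # (d): profiling beyond an express request is prohibited outright
        return False
    return True  # remaining case: processing tied to an express request
```

Note the asymmetry the sketch captures: for a known minor, one guardian consent gates all processing, while for an adult only the training purpose requires separate consent.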
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not build personality or behavioral profiles of users beyond what is strictly necessary to fulfill a user's express request. This is a necessity limitation on profiling — any profiling that goes beyond the immediate request is prohibited. Processing chat logs for safety or regulatory compliance does not count as profiling under this provision.
Statutory Text
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
D-01 Automated Processing Rights & Data Controls · Deployer · Chatbot
S.C. Code § 39-80-20(A)(5)-(6)
Plain Language
Chatbot providers are absolutely prohibited from selling chat logs — meaning they may not exchange user input/output data for monetary or other valuable consideration or make it available to a third party for such consideration. Narrow carve-outs exist for processor disclosures, user-directed disclosures with affirmative consent, and information the user intentionally made public. Separately, chat logs may not be retained for more than ten years, except where retention is necessary for compliance with this chapter or other law.
Statutory Text
(A) A chatbot provider may not: (5) sell a user's chat logs; (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
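The retention limit in subsection (6) is mechanical enough to sketch as a check. This is a hypothetical helper (`must_delete` is not from the bill), and it approximates ten years as 3,650 days since the statute does not define the window in days.

```python
from datetime import datetime, timedelta

TEN_YEARS = timedelta(days=365 * 10)  # approximation; the bill says "ten years"

def must_delete(log_created: datetime, now: datetime,
                retention_required_by_law: bool = False) -> bool:
    """Flag chat logs held past ten years, absent the legal-retention
    exception in § 39-80-20(A)(6)."""
    if retention_required_by_law:
        return False  # retention necessary to comply with this chapter or other law
    return now - log_created > TEN_YEARS
```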
D-01 Automated Processing Rights & Data Controls · D-01.3 · Deployer · Chatbot
S.C. Code § 39-80-20(A)(7)
Plain Language
Chatbot providers may not punish users who decline to consent to the use of their chat logs or personal data for training purposes. Prohibited retaliation includes service denial, differential pricing, and reduced service quality. This protects the meaningfulness of the affirmative consent requirement by preventing providers from making refusal commercially disadvantageous.
Statutory Text
(A) A chatbot provider may not: (7) discriminate or retaliate against a user, including: (a) denying products or services to the user; (b) charging different prices or rates for products or services to the user; or (c) providing lower quality products or services to the user for refusing to consent to the use of chat logs or personal data for training purposes.
D-01 Automated Processing Rights & Data Controls · D-01.1 · D-01.2 · Deployer · Chatbot
S.C. Code § 39-80-20(B)
Plain Language
Users have an unconditional right to access their own chat logs at any time. Upon request, the chatbot provider must deliver the logs in a downloadable, easy-to-read format. Providers may not discriminate or retaliate against users who exercise this right. This is a straightforward data portability and access right — there is no limit on frequency of requests and no exception for trade secrets or proprietary information.
Statutory Text
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
Other · Chatbot
S.C. Code § 39-80-20(C)
Plain Language
Government entities may not compel chatbot providers to produce user input data or chat logs except via a wiretap warrant. This protects user privacy by requiring the highest level of judicial authorization for government access to chatbot conversations. It creates no new affirmative compliance obligation for chatbot providers — it constrains government actors.
Statutory Text
(C) A governmental entity may not compel the production of or access to input data or chat logs from a chatbot provider, except as pursuant to a wiretap warrant.
G-01 AI Governance Program & Documentation · G-01.1 · Deployer · Chatbot
S.C. Code § 39-80-20(D)
Plain Language
Chatbot providers must establish, document, and maintain a comprehensive written data security program with administrative, technical, and physical safeguards scaled to the volume and sensitivity of the personal data and chat logs they hold. The written program must be publicly posted on the provider's website. This is both a governance obligation (establish a formal program) and a transparency obligation (make it publicly available).
Statutory Text
(D) A chatbot provider shall develop, implement, and maintain a comprehensive data security program that contains administrative, technical, and physical safeguards that are proportionate to the volume and nature of personal data and chat logs that are maintained by the chatbot provider. The program must be written and made publicly available on the chatbot provider's website.
D-01 Automated Processing Rights & Data Controls · Deployer · Chatbot
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical safeguards to ensure that deidentified data cannot be reidentified. All processing, retention, and transfer of deidentified data must be conducted without any reasonable means of reidentification. This creates an ongoing obligation to maintain the irreversibility of deidentification.
Statutory Text
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · Chatbot
S.C. Code § 39-80-30(A)(1)
Plain Language
Chatbot providers may not use any language in their advertising, user interface, or chatbot output that states or implies the chatbot's output is endorsed by or equivalent to the services of any licensed, certified, or registered professional — including lawyers, CPAs, investment advisors, and fiduciaries. This covers the full spectrum of user-facing touchpoints: marketing materials, the interface itself, and the chatbot's generated outputs.
Statutory Text
(A) A chatbot provider may not: (1) use any term, letter, or phrase in the advertising, interface, or output data of a chatbot that states or implies that the advertising, interface, or output data of a chatbot is endorsed by or equivalent to any of the following: (a) any certified, registered, or licensed professional; (b) a licensed legal professional; (c) a certified public accountant; (d) an investment advisor or an investment advisor representative; or (e) a licensed fiduciary;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.5 · Deployer · Chatbot
S.C. Code § 39-80-30(A)(2)
Plain Language
Chatbot providers are prohibited from representing — in advertising, the interface, or chatbot outputs — that a user's input data or chat logs are confidential. This prevents providers from creating a false impression of professional-grade confidentiality (e.g., attorney-client privilege) that does not actually exist in the chatbot context.
Statutory Text
(A) A chatbot provider may not: (2) include any representation in the advertising, interface, or output data of a chatbot that states or implies the user's input data or chat log is confidential.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · T-01.3 · Deployer · Chatbot
S.C. Code § 39-80-30(B)
Plain Language
Chatbot providers must disclose — clearly, conspicuously, and explicitly — that the user is interacting with a chatbot rather than a human before the chatbot generates any output. The disclosure must be repeated at the beginning of each communication, every hour during ongoing interactions, and any time a user asks whether the chatbot is a human. The notice must be in the same language as the chatbot's communications and in a font at least as large as the largest font used elsewhere in the chatbot interface. The Attorney General will prescribe the specific form and content of the notice by rule.
Statutory Text
(B) A chatbot provider shall provide clear, conspicuous, and explicit notice to a user that the user is interacting with a chatbot rather than a natural person before the chatbot may generate any output data. The chatbot provider shall include this notice at the beginning of each chatbot communication with a user every hour thereafter and each time a user asks whether the chatbot is a natural person. The text of the notice must: (1) be written in the same language that the chatbot communicates with the user and must appear in a font size that is easily readable by an average user and is not smaller than the largest font size used for other chatbot communications; and (2) must comply with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40.
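The disclosure cadence above (before any output, again hourly, and on any "are you human?" query) can be sketched as a timing check. The function name and parameters are hypothetical, and the sketch covers only timing, not the form, font, or language requirements the Attorney General will specify by rule.

```python
from datetime import datetime, timedelta
from typing import Optional

HOURLY = timedelta(hours=1)

def notice_due(last_notice: Optional[datetime], now: datetime,
               user_asked_if_human: bool) -> bool:
    """Timing sketch of the § 39-80-30(B) notice cadence."""
    if last_notice is None:
        return True                     # notice required before the first output
    if user_asked_if_human:
        return True                     # required each time the user asks
    return now - last_notice >= HOURLY  # required hourly during an ongoing interaction
```

In practice a provider would also reset `last_notice` to `None` at the start of each new communication, since the statute requires the notice at the beginning of every chatbot communication.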
S-01 AI System Safety Program · S-01.4 · S-01.7 · Deployer · Chatbot
S.C. Code § 39-80-30(C)
Plain Language
Chatbot providers must conduct monthly safety evaluations of their chatbot for potential risk of harm to users, publish information about their chatbot on their website monthly, and mitigate any identified risks. The specific scope of these evaluations and the form of the public disclosures will be defined by Attorney General rulemaking. This creates a rolling monthly cycle of evaluation, disclosure, and mitigation — significantly more frequent than annual review requirements in other jurisdictions.
Statutory Text
(C) In compliance with the rules adopted and the regulations promulgated by the Attorney General pursuant to Section 39-80-40, a chatbot provider shall: (1) on a monthly basis: (a) evaluate its chatbot for potential risk of harm to users; and (b) make information about its chatbot publicly available on its website; and (2) mitigate any risk of harm to users.
Other · Chatbot
S.C. Code § 39-80-40(A)-(B)
Plain Language
The Attorney General is directed to adopt rules and regulations implementing the chapter, including specifying the form and content of the AI identity notice, providing a notice template, describing potential risks of harm to users, and setting risk-reduction requirements for chatbot providers. The AG may also adopt any other necessary implementing rules. This is a rulemaking delegation provision — it creates no direct compliance obligation for chatbot providers until rules are issued.
Statutory Text
(A) The Attorney General shall adopt rules and promulgate regulations to implement this chapter. The rules and regulations must: (1) describe the form and content of the notice that is required pursuant to Section 39-80-30; (2) provide an example template for the notice that is required pursuant to Section 39-80-30; (3) describe any potential risk of harm to users; and (4) provide requirements for a chatbot provider to implement to reduce the risk of harm to users. (B) The Attorney General may adopt any other rules or promulgate regulations necessary to implement this chapter.
Other · Chatbot
S.C. Code § 39-80-50(A)-(C)
Plain Language
Chatbots are classified as products for product liability purposes, and chatbot providers are subject to strict liability for user injuries. Critically, a provider is liable even if it exercised all reasonable care in design and distribution, and even if it had no direct relationship with the injured user. This eliminates both a due-care defense and a privity defense, establishing a strict liability regime that goes beyond negligence-based product liability. This is a liability classification provision — it does not create an affirmative compliance obligation but rather establishes the legal consequences of causing injury.
Statutory Text
(A) A chatbot is considered a product for the purposes of a product liability action. (B) A chatbot provider has a duty to ensure that the use of the chatbot provider does not cause injury to a user. (C) A chatbot provider is liable for any injury that the chatbot causes to a user if: (1) the chatbot provider exercised all reasonable care in the design and distribution of the chatbot; or (2) the chatbot provider did not directly distribute the chatbot to the user or otherwise enter into a contractual relationship with the user.
Other · Chatbot
S.C. Code § 39-80-60(A)-(C)
Plain Language
This section establishes the enforcement framework for the Chatbot Protection Act. The Attorney General or county attorneys may bring civil actions for injunctive relief, compliance enforcement, damages, civil penalties, restitution, and attorney's fees. Any violation of the substantive provisions (§§ 39-80-20 and 39-80-30) automatically constitutes an injury in fact, enabling private plaintiffs to sue without proving separate standing. Private plaintiffs may recover up to $5,000 per violation, punitive damages for reckless and knowing conduct, injunctive and declaratory relief, and attorney's fees. This creates no new compliance obligation; it provides the mechanism for enforcing the obligations established elsewhere in the chapter.
Statutory Text
(A) The Attorney General or a county attorney may bring a civil action against a chatbot provider that violates this chapter and that includes any of the following: (1) enjoining an act or practice in violation of this chapter; (2) enforcing compliance with this chapter or a rule adopted or regulation promulgated pursuant to this chapter; (3) obtaining damages, civil penalties, restitution, or other remedies; or (4) obtaining reasonable attorney's fees and other litigation costs. (B) A violation of Section 39-80-20 or 39-80-30 constitutes an injury in fact to a user. (C) A user who is injured by a violation of Section 39-80-20 or 39-80-30 may bring a civil action against the chatbot provider, and a court of competent jurisdiction may award a prevailing plaintiff any of the following: (1) a civil penalty of not more than five thousand dollars per violation of this chapter; (2) punitive damages for reckless and knowing conduct; (3) injunctive relief; (4) declaratory relief; or (5) reasonable attorney's fees and litigation costs.