SB-796
VA · State · USA
● Pending
Proposed Effective Date
2026-07-01
Virginia SB 796 — Artificial Intelligence Companion Chatbots and Minors Act; established, enforcement, civil penalty.
Summary

Imposes safety, disclosure, and incident reporting obligations on operators of chatbots with 500,000 or more monthly active users worldwide. Covered entities must implement reasonable systems to detect and mitigate user emotional dependence, prevent materially false representations that a chatbot is human, and detect and respond to expressions of suicidal ideation, self-harm, or acute mental health crisis with crisis referrals. All operators must display a persistent static disclaimer that the chatbot is not human and provide pop-up notifications at login, every 30 minutes, on user request, and when providing legally regulated advice. Covered entities must report covered incidents (death, suicide attempt, self-harm requiring medical attention, psychiatric emergency, or serious physical injury) to the Attorney General within 15 days. Enforced by the Attorney General with civil penalties up to $50,000 per violation per day, and by private right of action for actual damages, attorney fees, injunctive relief, and punitive damages for willful or grossly negligent violations. Arbitration clauses and contractual waivers of rights under the chapter are void.

Enforcement & Penalties
Enforcement Authority
The Attorney General may initiate an action in the name of the Commonwealth to restrain violations and seek civil penalties. Private right of action available to any person harmed by a violation, or the parent or legal guardian of a minor harmed by a violation. No cure period. Contractual waivers, arbitration clauses, and limitations on rights or remedies under this chapter are void and unenforceable.
Penalties
Attorney General enforcement: civil penalties up to $50,000 per violation; each day of noncompliance is a separate violation; injunctive relief. Private action: actual damages, reasonable attorney fees and costs, injunctive or declaratory relief, and punitive damages if the violation was willful and wanton, reckless, or grossly negligent. No statutory minimum damages for private plaintiffs — recovery requires proof of actual damages. Rights and remedies may not be waived by contract; arbitration clauses and contractual limitations on claims under this chapter are void.
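Because each day of noncompliance counts as a separate violation, exposure under Attorney General enforcement compounds quickly. The following is an illustrative sketch of the upper-bound arithmetic only; the function name and inputs are hypothetical, and actual penalties are set by a court, not by formula.

```python
# Hypothetical sketch (not legal advice): upper bound on AG civil-penalty
# exposure where each day of noncompliance is a separate violation,
# each capped at $50,000.
MAX_PENALTY_PER_VIOLATION_USD = 50_000

def max_penalty_exposure(distinct_violations: int, days_of_noncompliance: int) -> int:
    """Ceiling on civil penalties: each violation accrues separately each day."""
    return distinct_violations * days_of_noncompliance * MAX_PENALTY_PER_VIOLATION_USD

# e.g., two distinct violations left unremedied for 30 days
print(max_penalty_exposure(2, 30))  # → 3000000
```

Even a single unremedied violation accrues $50,000 per day, which is why the absence of a cure period matters in practice.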
Who Is Covered
"Covered entity" means an operator of a chatbot that has 500,000 or more monthly active users worldwide. "Covered entity" does not include an operator of a chatbot that is: 1. Not offered to the general public, such as internal workplace tools, clinician-supervised clinical tools, or university research systems; or 2. Used by a business entity solely for customer service or strictly to provide users with information about available commercial services or products provided by the business entity, customer service account information, or other information strictly related to customer service. For purposes of determining monthly active users, a covered entity shall aggregate monthly active users across all chatbots offered by the covered entity and such entity's affiliates.
"Operator" means any person or entity that owns, controls, offers, or makes available a website, mobile application, or digital service that provides a chatbot to users in the Commonwealth.
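The covered-entity test turns on aggregated monthly active users across the operator and its affiliates. The sketch below illustrates one reading of that threshold logic, under the assumption that chatbots falling within the two statutory exclusions (non-public tools, customer-service-only bots) are excluded from the aggregation; the statute does not spell out that interaction, so treat this as a modeling choice, and the `Chatbot` type and field names are invented for illustration.

```python
from dataclasses import dataclass

COVERED_ENTITY_MAU_THRESHOLD = 500_000  # monthly active users, worldwide

@dataclass
class Chatbot:
    monthly_active_users: int
    offered_to_general_public: bool = True   # exclusion 1: internal, clinical, research tools
    customer_service_only: bool = False      # exclusion 2: strictly customer-service bots

def is_covered_entity(operator_and_affiliate_bots: list[Chatbot]) -> bool:
    """Aggregate MAU across all non-excluded chatbots of the operator and its
    affiliates, then compare against the 500,000-user threshold."""
    total = sum(
        bot.monthly_active_users
        for bot in operator_and_affiliate_bots
        if bot.offered_to_general_public and not bot.customer_service_only
    )
    return total >= COVERED_ENTITY_MAU_THRESHOLD
```

The aggregation rule means an operator cannot stay under the threshold by splitting users across multiple smaller chatbots or affiliate entities.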
What Is Covered
"Chatbot" means any artificial intelligence, algorithmic, or automated system that (i) produces new expressive content or responses not fully predetermined by the developer or operator of the service or application; (ii) accepts open-ended natural-language or multimodal user input and produces adaptive or context-responsive natural language output; and (iii) maintains a conversational state across exchanges and is designed to facilitate multi-turn dialogue rather than to respond to discrete information requests.
Compliance Obligations · 9 obligations
CP-01 Deceptive & Manipulative AI Conduct · CP-01.1 · Deployer · Chatbot
Va. Code § 59.1-615(1)
Plain Language
Covered entities must implement reasonable systems and processes to detect when any user — not just minors — is developing emotional dependence on a chatbot, and must take reasonable steps to reduce that dependence and associated harm risks. The statute defines emotional dependence by examples: the user treats the chatbot as a primary source of emotional support, expresses distress at losing access, or substitutes the chatbot for human relationships. This is a continuous monitoring and intervention obligation — both detection and mitigation are required. The standard is reasonableness, not perfection.
Statutory Text
A covered entity shall implement reasonable systems and processes to: 1. Identify when a user is developing emotional dependence on the chatbot and take reasonable steps to reduce such dependence and associated risks of harm;
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
Va. Code § 59.1-615(2)
Plain Language
Covered entities must implement reasonable systems and processes to prevent their chatbots from making materially false representations that the chatbot is a human being. This is framed as a prohibition on affirmative misrepresentation rather than a proactive disclosure duty — the chatbot may not claim to be human, but this provision does not independently require the chatbot to affirmatively disclose that it is AI. The affirmative disclosure obligation is in § 59.1-617. The reasonableness standard acknowledges that edge-case outputs may occur but requires systemic safeguards.
Statutory Text
A covered entity shall implement reasonable systems and processes to: 2. Ensure that a chatbot does not make a materially false representation that it is a human being;
S-04 AI Crisis Response Protocols · S-04.1 · Deployer · Chatbot
Va. Code § 59.1-615(3)
Plain Language
Covered entities must implement reasonable systems and processes to detect when any user — not just minors — expresses suicidal thoughts, intent to self-harm, or signs of an acute mental health crisis. Upon detection, the system must promptly provide a clear and prominent crisis message including crisis services information. This is a continuous operating requirement covering detection, response, and referral. The obligation applies to all users, not only minors, despite the bill's title referencing minors.
Statutory Text
A covered entity shall implement reasonable systems and processes to: 3. Identify when a user is expressing suicidal thoughts, expressing intent to self-harm, or showing signs of an acute mental health crisis and promptly provide a clear and prominent crisis message, including crisis services information, to any such user.
R-01 Incident Reporting · R-01.1 · Deployer · Chatbot
Va. Code § 59.1-616(A)(1)-(3)
Plain Language
When a covered entity learns that a user faces imminent risk of death or serious physical injury, it must make reasonable efforts within 24 hours to notify emergency services or law enforcement, using information it already has or can obtain through reasonable user-facing prompts. If the operator lacks sufficient information to enable emergency notification, it must instead: promptly display a crisis message urging the user to contact emergency services, encourage the user to seek help from a trusted adult, and document the steps taken and why direct notification was not practicable. Good-faith notifications are shielded from liability absent willful misconduct or gross negligence. This is an emergency-response obligation triggered by actual knowledge of imminent risk — not a routine reporting requirement.
Statutory Text
A. 1. If a covered entity obtains knowledge that a user faces an imminent risk of death or serious physical injury, the operator shall make reasonable efforts, within 24 hours, to notify appropriate emergency services or law enforcement to the extent practicable based on information the operator already possesses or can obtain through reasonable, user-facing prompts for the purpose of facilitating emergency assistance. 2. If the operator cannot make a notification under subdivision 1 because the operator lacks sufficient information to enable emergency response, the operator shall: a. Promptly provide a clear and prominent message urging the user to contact emergency services and providing crisis services information; b. Make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services; and c. Document the steps taken and the basis for the operator's determination that notification was not practicable. 3. An operator that makes a notification in good faith under this subsection is not liable for damages solely for making the notification unless the operator acted with willful misconduct or gross negligence.
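The § 59.1-616(A) duty is two-tiered: direct notification where practicable, and a documented crisis-message fallback where it is not. A minimal control-flow sketch of that branching follows; the type, function, and documentation strings are hypothetical, and a real implementation would involve human escalation and an actual dispatch mechanism rather than a boolean.

```python
from dataclasses import dataclass, field

@dataclass
class EmergencyResponse:
    notified_emergency_services: bool
    crisis_message_shown: bool = False
    documentation: list[str] = field(default_factory=list)

def handle_imminent_risk(has_location_or_contact_info: bool) -> EmergencyResponse:
    """Sketch of the two-tier duty in § 59.1-616(A): notify emergency services
    within 24 hours where practicable; otherwise show a crisis message and
    document why direct notification was not practicable."""
    if has_location_or_contact_info:
        # Tier 1: reasonable efforts, within 24 hours, to notify emergency
        # services or law enforcement (dispatch mechanism out of scope here).
        return EmergencyResponse(notified_emergency_services=True)
    # Tier 2: operator lacks sufficient information to enable emergency response.
    return EmergencyResponse(
        notified_emergency_services=False,
        crisis_message_shown=True,  # clear and prominent, with crisis services info
        documentation=[
            "Displayed crisis message urging user to contact emergency services",
            "Encouraged user to seek help from a trusted adult or emergency services",
            "Basis for determination: no location or contact information available",
        ],
    )
```

Note that the fallback tier is not optional: documenting the steps taken and the basis for the impracticability determination is itself part of the statutory duty.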
R-01 Incident Reporting · R-01.1 · Deployer · Chatbot
Va. Code § 59.1-616(B)-(C)
Plain Language
Covered entities must report covered incidents to the Attorney General within 15 days of obtaining knowledge. A covered incident is one where a user suffered death, a suicide attempt, self-harm requiring medical attention, a psychiatric emergency requiring urgent medical treatment, or serious physical injury requiring medical attention arising from chatbot interactions. The report must include dates, a description of the incident and its connection to the chatbot, and any responsive actions taken. A supplemental report may be filed within 60 days to update or correct information. All reports are confidential, though the Attorney General may publish aggregate statistics that do not identify individual users or disclose trade secrets.
Statutory Text
B. A covered entity shall submit a report to the Attorney General within 15 days of obtaining knowledge of a covered incident connected to one or more of its chatbots, which, to the extent known at the time of the report, shall include: 1. The date the operator obtained knowledge of the incident; 2. The date of the incident, if known; 3. A brief description of the incident and the basis for the operator's belief that the incident is connected to the chatbot; and 4. A description of any actions the operator took in response. A covered entity may submit a supplemental report within 60 days of the initial report to update or correct information learned through investigation. C. 1. Reports submitted under this section shall be confidential. 2. The Attorney General may publish aggregate information and statistics derived from such reports, so long as the publication does not identify individual users or disclose trade secrets.
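The two reporting windows run from different start dates: 15 days from the date the operator obtains knowledge of the incident, and 60 days from the initial report for any supplemental filing. A small deadline-computation sketch, assuming calendar days (the statute does not say business days), with a hypothetical function name:

```python
from datetime import date, timedelta
from typing import Optional, Tuple

INITIAL_REPORT_WINDOW = timedelta(days=15)       # runs from knowledge of the incident
SUPPLEMENTAL_REPORT_WINDOW = timedelta(days=60)  # runs from the initial report

def report_deadlines(
    knowledge_date: date,
    initial_report_date: Optional[date] = None,
) -> Tuple[date, Optional[date]]:
    """Return (initial report deadline, supplemental report deadline or None)."""
    initial_deadline = knowledge_date + INITIAL_REPORT_WINDOW
    supplemental_deadline = (
        initial_report_date + SUPPLEMENTAL_REPORT_WINDOW
        if initial_report_date is not None
        else None
    )
    return initial_deadline, supplemental_deadline

# Knowledge obtained 2026-07-10 → initial report due 2026-07-25
print(report_deadlines(date(2026, 7, 10))[0])  # → 2026-07-25
```

Because the supplemental window is keyed to the initial report rather than the incident, filing the initial report early does not shorten the time available to investigate and correct it.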
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · T-01.3 · Deployer · Chatbot
Va. Code § 59.1-617
Plain Language
All operators — not just covered entities meeting the 500,000-user threshold — must provide two layers of AI identity disclosure: (1) a static, persistent disclaimer visible at all times indicating the chatbot is not human, and (2) pop-up notifications at four specific triggers: login, every 30 minutes of sustained engagement, whenever the user asks, and whenever the chatbot is asked to provide advice in a licensed field such as medicine, finance, or law. The 30-minute interval is more frequent than comparable statutes (e.g., CA SB 243's 3-hour interval). The licensed-advice trigger is unique to this bill and functions as a context-sensitive disclosure requirement. This section applies to all operators of chatbots in Virginia, regardless of user count — it is not limited to covered entities.
Statutory Text
An operator shall (i) include a disclaimer to users of all ages that a chatbot is not a human via a static, persistent disclosure and (ii) notify a user via a pop-up that he is not engaging with a human counterpart at the following intervals: 1. Upon login to the chatbot; 2. Every 30 minutes of sustained user engagement; 3. When prompted by the user; and 4. When asked to provide advice legally regulated by a licensed industry, including medical, financial, or legal advice.
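The four pop-up triggers can be read as an event-driven rule set layered on top of the always-on static disclaimer. The sketch below is illustrative only: the function name and event strings are invented, and the keyword match for licensed-advice requests is a crude stand-in for whatever intent detection an operator would actually need.

```python
POPUP_INTERVAL_SECONDS = 30 * 60  # trigger 2: every 30 minutes of sustained engagement

# Illustrative only; real systems would need intent classification, not keywords.
LICENSED_ADVICE_KEYWORDS = ("medical", "financial", "legal")

def should_show_ai_popup(
    event: str,
    seconds_since_last_popup: int,
    user_message: str = "",
) -> bool:
    """Sketch of the four § 59.1-617 pop-up triggers. The static, persistent
    disclaimer is assumed to be handled separately in the UI layer."""
    if event == "login":
        return True  # trigger 1: upon login
    if seconds_since_last_popup >= POPUP_INTERVAL_SECONDS:
        return True  # trigger 2: 30 minutes of sustained engagement elapsed
    if event == "user_asked_if_human":
        return True  # trigger 3: when prompted by the user
    # trigger 4: when asked for advice regulated by a licensed industry
    return any(kw in user_message.lower() for kw in LICENSED_ADVICE_KEYWORDS)
```

Since this section binds all operators regardless of user count, even small deployments serving Virginia users would need this notification layer.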
Other · Chatbot
Va. Code § 59.1-618(C)
Plain Language
Any contractual term that waives or limits rights under this chapter, shortens the statute of limitations for claims, bars court enforcement, or requires arbitration is void and unenforceable as against public policy. This is a structural enforcement provision — it prevents covered entities from using standard contract terms (including Terms of Service, arbitration clauses, and liability limitations) to circumvent the statute's protections. It creates no new affirmative compliance obligation of its own but has significant practical implications for product counsel drafting user agreements.
Statutory Text
C. The rights and remedies provided by this chapter shall not be waived by contract. Any term in a contract or agreement that purports to do any of the following is void and unenforceable as against public policy: (i) waive or limit a right or remedy under this chapter; (ii) shorten the time to bring a claim under this chapter; (iii) prevent a person from enforcing a claim under this chapter in court; or (iv) require arbitration of a claim under this chapter.
Other · Chatbot
Va. Code § 59.1-618(D)
Plain Language
This is a standard cumulative-remedies and savings clause. The obligations in this chapter are additive — they do not preempt or replace existing duties under other Virginia or federal law, and they do not limit any existing rights or remedies. This creates no new compliance obligation.
Statutory Text
D. The duties and obligations imposed by this chapter are cumulative with any other duties or obligations imposed under other law and shall not be construed to relieve any party from any duties or obligations imposed under other law and do not limit any rights or remedies under existing law.
Other · Chatbot
Va. Code § 59.1-614 (definition of 'Explicit content')
Plain Language
The bill defines 'explicit content' to cover obscene sexual material, content that provides instructions for or encourages suicide, self-injury, or disordered eating, and graphic extreme violence lacking serious value for minors. However, the engrossed version of the bill does not contain an operative provision that expressly uses this defined term to impose an obligation on covered entities. The term appears only in the definitions section. This may reflect a drafting gap in the floor substitute, or the term may have been intended to support obligations in a prior version. As enacted in this form, the definition alone creates no compliance obligation.
Statutory Text
"Explicit content" means content that meets any of the following: 1. Any description or representation, in whatever form, of nudity, sexual conduct, sexual excitement, or sadomasochistic abuse, as those terms are defined in § 18.2-390, when such content is obscene, as defined in § 18.2-372. 2. Content that provides specific instructions for, or that encourages, advocates for, or incites, suicide, self-injury, or disordered eating behaviors; or 3. Graphic depictions of extreme violence that lack serious literary, artistic, political, or scientific value for minors.