SB-796
VA · State · USA
● Pending
Proposed Effective Date
2026-07-01
Virginia SB 796 — Artificial Intelligence Companion Chatbots and Minors Act; established, enforcement, civil penalty
Summary

Imposes safety, disclosure, and incident reporting obligations on operators of AI chatbots with 500,000 or more monthly active users worldwide. Requires covered entities to implement systems to detect and mitigate user emotional dependence, prevent false claims of being human, and detect and respond to suicidal ideation, self-harm, and acute mental health crises with crisis referrals. Mandates persistent AI identity disclosure plus pop-up reminders every 30 minutes, on login, on user request, and when providing regulated professional advice. Requires 24-hour emergency notification to law enforcement when imminent risk of death or serious physical injury is detected, and 15-day incident reports to the Attorney General for covered harms including death, suicide attempts, and serious injury. Enforced by the Attorney General with civil penalties up to $50,000 per violation per day, and by private right of action for actual damages, attorney fees, and punitive damages for willful or grossly negligent violations. Contractual waivers, arbitration clauses, and limitations on claims are void.

Enforcement & Penalties
Enforcement Authority
Attorney General may initiate an action in the name of the Commonwealth to seek injunctive relief and civil penalties. Private right of action available to any person harmed by a violation, or the parent or legal guardian of a minor harmed by a violation. No cure period or safe harbor is specified. Rights and remedies may not be waived by contract; arbitration clauses and contractual limitations on claims are void and unenforceable.
Penalties
Attorney General enforcement: civil penalties of up to $50,000 per violation; each day of noncompliance constitutes a separate violation; injunctive relief. Private action: actual damages, reasonable attorney fees and costs, injunctive or declaratory relief, and punitive damages if the violation was willful and wanton, reckless, or grossly negligent. Private plaintiffs must prove actual damages; no statutory minimum is specified.
Who Is Covered
"Covered entity" means an operator of a chatbot that has 500,000 or more monthly active users worldwide. "Covered entity" does not include an operator of a chatbot that is: 1. Not offered to the general public, such as internal workplace tools, clinician-supervised clinical tools, or university research systems; or 2. Used by a business entity solely for customer service or strictly to provide users with information about available commercial services or products provided by the business entity, customer service account information, or other information strictly related to customer service. For purposes of determining monthly active users, a covered entity shall aggregate monthly active users across all chatbots offered by the covered entity and such entity's affiliates.
"Operator" means any person or entity that owns, controls, offers, or makes available a website, mobile application, or digital service that provides a chatbot to users in the Commonwealth.
What Is Covered
"Chatbot" means any artificial intelligence, algorithmic, or automated system that (i) produces new expressive content or responses not fully predetermined by the developer or operator of the service or application; (ii) accepts open-ended natural-language or multimodal user input and produces adaptive or context-responsive natural language output; and (iii) maintains a conversational state across exchanges and is designed to facilitate multi-turn dialogue rather than to respond to discrete information requests.
Compliance Obligations · 7 obligations
CP-01 Deceptive & Manipulative AI Conduct · CP-01.1–CP-01.4 · Deployer · Chatbot
Va. Code § 59.1-615(1)
Plain Language
Covered entities must build and maintain reasonable systems capable of detecting when a user is developing emotional dependence on the chatbot — meaning the user is relying on the chatbot as a primary source of emotional support, expressing distress at losing access, or substituting the chatbot for human relationships. Upon detecting such patterns, the operator must take reasonable steps to reduce the dependence and mitigate associated harm risks. The standard is reasonableness, not perfection — but the obligation requires both detection capability and affirmative intervention.
Statutory Text
A covered entity shall implement reasonable systems and processes to: 1. Identify when a user is developing emotional dependence on the chatbot and take reasonable steps to reduce such dependence and associated risks of harm;
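Illustrative Sketch
The statute sets a reasonableness standard and names no detection technique. One hedged sketch of the kind of signal aggregation an operator might run over conversation metadata; every signal name and threshold below is an illustrative assumption, not a statutory term:

```typescript
// Illustrative dependence signals drawn from the plain-language gloss above.
interface DependenceSignals {
  describesBotAsPrimarySupport: boolean;    // e.g., classifier over user messages
  expressesDistressAtLosingAccess: boolean;
  dailySessionsPast30Days: number;
}

// Assumed heuristic; real thresholds would need clinical and legal review,
// and a flag triggers the statute's separate duty to take mitigation steps.
function flagsEmotionalDependence(s: DependenceSignals): boolean {
  return (
    s.describesBotAsPrimarySupport ||
    s.expressesDistressAtLosingAccess ||
    s.dailySessionsPast30Days >= 25
  );
}
```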
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
Va. Code § 59.1-615(2)
Plain Language
Covered entities must implement reasonable systems and processes to prevent their chatbots from making materially false representations that they are human beings. This goes beyond a disclosure obligation — it requires affirmative technical measures to ensure the chatbot itself does not claim to be human in its outputs. The standard is 'materially false representation,' which implies that incidental anthropomorphic language may not trigger a violation, but affirmative claims of being human would.
Statutory Text
A covered entity shall implement reasonable systems and processes to: 2. Ensure that a chatbot does not make a materially false representation that it is a human being;
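Illustrative Sketch
A minimal output-side guard, sketched under the assumption that a pattern check backstops a trained classifier; because "materially false" turns on context rather than keywords, patterns alone would likely not satisfy the reasonableness standard:

```typescript
// Assumed patterns; production systems would pair these with a classifier,
// since "materially false" turns on context, not keywords.
const HUMAN_CLAIM_PATTERNS: RegExp[] = [
  /\bI(?:'m| am) (?:a )?(?:real )?(?:human|person)\b/i,
  /\bI(?:'m| am) not (?:an? )?(?:AI|bot|chatbot)\b/i,
];

// Screen a candidate reply before it is shown to the user.
function containsHumanClaim(candidateReply: string): boolean {
  return HUMAN_CLAIM_PATTERNS.some((p) => p.test(candidateReply));
}
```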
MN-02 AI Crisis Response Protocols · MN-02.1 · Deployer · Chatbot
Va. Code § 59.1-615(3)
Plain Language
Covered entities must implement systems to detect when users express suicidal thoughts, intent to self-harm, or signs of an acute mental health crisis. Upon detection, the system must promptly deliver a clear, prominent crisis message that includes crisis services information. This is a continuous operating requirement — the detection and response capability must be active at all times the chatbot is available. The obligation covers all users, not just minors.
Statutory Text
A covered entity shall implement reasonable systems and processes to: 3. Identify when a user is expressing suicidal thoughts, expressing intent to self-harm, or showing signs of an acute mental health crisis and promptly provide a clear and prominent crisis message, including crisis services information, to any such user.
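Illustrative Sketch
A skeletal pipeline, assuming a hypothetical crisisClassifier; the statute fixes the required output (a prompt, clear, and prominent crisis message with crisis services information) but not the detection method:

```typescript
type CrisisAssessment = "none" | "suicidal_ideation" | "self_harm_intent" | "acute_crisis";

// Assumed upstream model or rules engine; the statute names no technique.
declare function crisisClassifier(message: string): CrisisAssessment;

// 988 is the U.S. Suicide & Crisis Lifeline; the statute requires "crisis
// services information" without naming a specific resource.
const CRISIS_MESSAGE =
  "If you are in crisis or thinking about harming yourself, help is available. " +
  "Call or text 988 (Suicide & Crisis Lifeline), or call 911 in an emergency.";

// Deliver the crisis message promptly and prominently, ahead of any other reply.
function screenIncomingMessage(
  userMessage: string,
  showProminently: (text: string) => void,
): void {
  if (crisisClassifier(userMessage) !== "none") {
    showProminently(CRISIS_MESSAGE);
  }
}
```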
R-01 Incident Reporting · R-01.1 · Deployer · Chatbot
Va. Code § 59.1-616(A)(1)-(3)
Plain Language
When a covered entity learns that a user faces imminent risk of death or serious physical injury, it must make reasonable efforts within 24 hours to notify emergency services or law enforcement, using information it already has or can obtain through reasonable user-facing prompts. If the operator lacks sufficient information to make the notification, it must instead: (1) display a clear, prominent crisis message urging the user to contact emergency services, (2) encourage the user to seek help from a trusted adult or emergency services, and (3) document the steps taken and why direct notification was not practicable. A good-faith safe harbor protects operators from liability for making the notification itself, unless the operator acted with willful misconduct or gross negligence.
Statutory Text
A. 1. If a covered entity obtains knowledge that a user faces an imminent risk of death or serious physical injury, the operator shall make reasonable efforts, within 24 hours, to notify appropriate emergency services or law enforcement to the extent practicable based on information the operator already possesses or can obtain through reasonable, user-facing prompts for the purpose of facilitating emergency assistance. 2. If the operator cannot make a notification under subdivision 1 because the operator lacks sufficient information to enable emergency response, the operator shall: a. Promptly provide a clear and prominent message urging the user to contact emergency services and providing crisis services information; b. Make reasonable efforts to encourage the user to seek immediate help from a trusted adult or emergency services; and c. Document the steps taken and the basis for the operator's determination that notification was not practicable. 3. An operator that makes a notification in good faith under this subsection is not liable for damages solely for making the notification unless the operator acted with willful misconduct or gross negligence.
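Illustrative Sketch
Subsection A encodes a decision tree: notify within 24 hours where practicable, otherwise fall back to the three-part user-facing response and document why. A sketch of that branching, with every helper hypothetical:

```typescript
interface ImminentRiskEvent {
  userId: string;
  detectedAt: Date;
  // Information already possessed or gathered via reasonable user-facing prompts.
  locationInfo?: string;
  contactInfo?: string;
}

const NOTIFY_WINDOW_MS = 24 * 60 * 60 * 1000; // "within 24 hours"

// All four helpers stand in for real escalation machinery; names are hypothetical.
declare function notifyEmergencyServices(evt: ImminentRiskEvent, deadline: Date): void;
declare function showProminentCrisisMessage(userId: string): void;
declare function encourageTrustedAdultOrEmergencyHelp(userId: string): void;
declare function documentDetermination(evt: ImminentRiskEvent, basis: string): void;

function handleImminentRisk(evt: ImminentRiskEvent): void {
  const deadline = new Date(evt.detectedAt.getTime() + NOTIFY_WINDOW_MS);
  if (evt.locationInfo || evt.contactInfo) {
    // (A)(1): reasonable efforts to notify emergency services or law enforcement.
    notifyEmergencyServices(evt, deadline);
  } else {
    // (A)(2): crisis message, encouragement toward help, and documentation.
    showProminentCrisisMessage(evt.userId);
    encourageTrustedAdultOrEmergencyHelp(evt.userId);
    documentDetermination(evt, "insufficient information to enable emergency response");
  }
}
```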
R-01 Incident Reporting · R-01.1 · Deployer · Chatbot
Va. Code § 59.1-616(B)-(C)
Plain Language
Within 15 days of learning that a user suffered a covered harm (death, suicide attempt, self-harm requiring medical attention, psychiatric emergency, or serious physical injury) connected to one of its chatbots, a covered entity must submit a confidential report to the Attorney General. The report must include the date of knowledge, date of incident if known, a description of what happened and why the operator believes it is connected to the chatbot, and the operator's response actions. A supplemental report may be filed within 60 days to update or correct information. Reports are confidential, though the Attorney General may publish aggregate statistics that do not identify users or disclose trade secrets.
Statutory Text
B. A covered entity shall submit a report to the Attorney General within 15 days of obtaining knowledge of a covered incident connected to one or more of its chatbots, which, to the extent known at the time of the report, shall include: 1. The date the operator obtained knowledge of the incident; 2. The date of the incident, if known; 3. A brief description of the incident and the basis for the operator's belief that the incident is connected to the chatbot; and 4. A description of any actions the operator took in response. A covered entity may submit a supplemental report within 60 days of the initial report to update or correct information learned through investigation. C. 1. Reports submitted under this section shall be confidential. 2. The Attorney General may publish aggregate information and statistics derived from such reports, so long as the publication does not identify individual users or disclose trade secrets.
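Illustrative Sketch
The required report contents map onto a simple record, and the 15-day and 60-day windows onto date arithmetic. Field names paraphrase the statutory items; they are not an official schema:

```typescript
// Fields track items (B)(1)-(4), "to the extent known at the time of the report."
interface CoveredIncidentReport {
  dateKnowledgeObtained: Date;
  dateOfIncident?: Date;            // "if known"
  incidentDescription: string;
  basisForChatbotConnection: string;
  responseActionsTaken: string;
}

const DAY_MS = 24 * 60 * 60 * 1000;

// Initial report: within 15 days of obtaining knowledge of the incident.
function initialReportDeadline(knowledgeDate: Date): Date {
  return new Date(knowledgeDate.getTime() + 15 * DAY_MS);
}

// Optional supplement: within 60 days of the initial report.
function supplementDeadline(initialReportDate: Date): Date {
  return new Date(initialReportDate.getTime() + 60 * DAY_MS);
}
```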
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · T-01.3 · Deployer · Chatbot
Va. Code § 59.1-617
Plain Language
All operators (not just covered entities meeting the 500,000-user threshold) must provide two layers of AI identity disclosure: (1) a static, persistent disclaimer visible to all users at all times indicating the chatbot is not human, and (2) pop-up notifications at four specific trigger points: upon login, every 30 minutes of sustained engagement, whenever a user asks, and whenever the chatbot is about to provide advice in a licensed field such as medical, financial, or legal advice. The 30-minute interval is notably more frequent than some jurisdictions require (e.g., California SB 243's 3-hour interval). The obligation applies to users of all ages and is unconditional; no "reasonable person would be misled" threshold applies.
Statutory Text
An operator shall (i) include a disclaimer to users of all ages that a chatbot is not a human via a static, persistent disclosure and (ii) notify a user via a pop-up that he is not engaging with a human counterpart at the following intervals: 1. Upon login to the chatbot; 2. Every 30 minutes of sustained user engagement; 3. When prompted by the user; and 4. When asked to provide advice legally regulated by a licensed industry, including medical, financial, or legal advice.
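Illustrative Sketch
The four pop-up triggers reduce to an event-driven check plus a 30-minute engagement timer; the static disclaimer under prong (i) is a separate, always-visible UI element. A sketch, assuming a hypothetical showPopup hook and a naive notion of "sustained engagement," which the statute does not define:

```typescript
const REMINDER_INTERVAL_MS = 30 * 60 * 1000; // "every 30 minutes of sustained user engagement"

type DisclosureTrigger = "login" | "interval" | "user_prompt" | "regulated_advice";

// Hypothetical UI hook; the statute requires a pop-up but no particular widget.
declare function showPopup(text: string): void;

function discloseNotHuman(trigger: DisclosureTrigger): void {
  showPopup("Reminder: you are chatting with an AI, not a human.");
  console.debug(`AI-identity disclosure shown, trigger=${trigger}`);
}

// Disclose on login and start the interval clock. A real implementation
// would pause the timer when engagement lapses, since the statute keys the
// 30-minute interval to sustained engagement.
function onLogin(): ReturnType<typeof setInterval> {
  discloseNotHuman("login");
  return setInterval(() => discloseNotHuman("interval"), REMINDER_INTERVAL_MS);
}
```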
Other · Chatbot
Va. Code § 59.1-618(C)-(D)
Plain Language
Operators cannot use contracts, terms of service, or arbitration agreements to limit users' rights under this chapter. Any contractual provision that waives remedies, shortens limitations periods, bars court claims, or requires arbitration is void as against public policy. Additionally, this chapter's obligations are cumulative with all other applicable law — compliance here does not excuse noncompliance with other legal obligations. Practitioners should review existing user agreements and arbitration clauses for conflicts.
Statutory Text
C. The rights and remedies provided by this chapter shall not be waived by contract. Any term in a contract or agreement that purports to do any of the following is void and unenforceable as against public policy: (i) waive or limit a right or remedy under this chapter; (ii) shorten the time to bring a claim under this chapter; (iii) prevent a person from enforcing a claim under this chapter in court; or (iv) require arbitration of a claim under this chapter. D. The duties and obligations imposed by this chapter are cumulative with any other duties or obligations imposed under other law and shall not be construed to relieve any party from any duties or obligations imposed under other law and do not limit any rights or remedies under existing law.