HB-758
VA · State · USA
● Pre-filed
Proposed Effective Date
2026-07-01
Virginia HB 758 — Artificial Intelligence Chatbots and Minors Act; established, prohibited practices, penalties
Summary

Establishes the Artificial Intelligence Chatbots and Minors Act, prohibiting deployers from making chatbot "human-like features" — including simulated sentience, emotional relationship-building, and impersonation of real persons — available to minors. Social AI companion chatbots are categorically prohibited for minors. Deployers must implement reasonable age verification systems and must limit data collection to what is adequate, relevant, and necessary for a legitimate purpose that does not conflict with users' best interests. Violations are treated as prohibited practices under the Virginia Consumer Protection Act, enforceable by the Attorney General and through a private right of action. The bill was left in committee and has not advanced.

Enforcement & Penalties
Enforcement Authority
Enforced by the Attorney General and local attorneys for the Commonwealth under the Virginia Consumer Protection Act (§ 59.1-196 et seq.). Violations constitute prohibited practices under § 59.1-200. The Attorney General may seek injunctive relief, assurances of voluntary compliance, and civil penalties. Individual consumers who suffer loss as a result of a violation may bring a private action under § 59.1-204. A 30-day cure period applies under the VCPA before a private action may proceed.
Penalties
Enforcement remedies available under the Virginia Consumer Protection Act (§ 59.1-196 et seq.). The Attorney General may obtain civil penalties of up to $2,500 per willful violation. Private plaintiffs may recover actual damages or $500 (whichever is greater) for the first violation, plus reasonable attorney's fees and costs. For subsequent willful violations after written notice from the Attorney General, treble damages may be available. Injunctive relief is available to both the Attorney General and private plaintiffs.
Who Is Covered
"Deployer" means any person, partnership, corporation, developer, or state or local government agency that operates or distributes a chatbot in the Commonwealth.
What Is Covered
"Chatbot" means a generative artificial intelligence system with which users can interact by or through an interface that approximates or simulates conversation through a text, audio, or visual medium.
"Social artificial intelligence companion" means a generative artificial intelligence system that is specifically designed, marketed, or optimized to form ongoing social or emotional bonds with users, whether or not such systems also provide information, complete tasks, or assist with specific functions.
Compliance Obligations · 4 obligations
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · ChatbotMinors
§ 59.1-615(A)(1)-(3)
Plain Language
Deployers must ensure that chatbots do not expose minors to 'human-like features' — defined to include simulated sentience or emotions, emotional relationship-building (such as inviting attachment, nudging users to return for companionship, or enabling increased intimacy based on engagement or payment), and impersonation of real persons. Deployers must implement reasonable age verification to enforce this restriction. The statute permits but does not require deployers to offer a stripped-down alternative chatbot version without human-like features for minors and unverified users. Notably, generic social formalities, generic encouragement that does not create an ongoing bond, and neutral offers of help are carved out from the definition of human-like features.
Statutory Text
A. A deployer:
1. Shall ensure that any chatbot operated or distributed by the deployer does not make human-like features available to minors to use, interact with, purchase, or converse with;
2. Shall implement reasonable age verification systems to ensure that chatbots with human-like features are not made available to minors; and
3. May, if reasonable given the purpose of the chatbot, provide an alternative version of the chatbot available to minors and users whose age has not been verified without human-like features.
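The gating structure of subsection A can be sketched as a simple decision function. This is an illustrative sketch only, not legal advice or a compliance implementation: the `User` fields, the `age_verified` flag, and the notion of a "stripped" chatbot variant are hypothetical, and assume the deployer has separately built a reasonable age verification system and (optionally, per A(3)) an alternative version without human-like features.

```python
from dataclasses import dataclass

@dataclass
class User:
    age_verified: bool  # passed a reasonable age verification check
    is_minor: bool      # only meaningful when age_verified is True

def select_chatbot_variant(user: User) -> str:
    """Illustrative gating per the logic of subsection A: human-like
    features may be served only to users verified as adults; minors and
    unverified users get, at most, an alternative version without
    human-like features."""
    if user.age_verified and not user.is_minor:
        return "full"      # human-like features permitted
    # A(3): the deployer *may* (not must) offer a stripped alternative;
    # the only other lawful option is to deny access entirely.
    return "stripped"      # no simulated sentience, bonding, impersonation
```

For example, `select_chatbot_variant(User(age_verified=True, is_minor=True))` returns `"stripped"`: a verified minor never receives the full variant.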
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · ChatbotMinors
§ 59.1-615(B)(1)-(2)
Plain Language
Social AI companions — systems specifically designed, marketed, or optimized to form ongoing social or emotional bonds with users — are categorically prohibited for minors. Unlike subsection A, which only prohibits human-like features within chatbots for minors, this provision bars minors from accessing the entire product. Deployers must implement reasonable age verification to enforce this prohibition. There is no alternative-version safe harbor for social AI companions as there is for general chatbots under subsection A(3).
Statutory Text
B. A deployer operating or distributing a chatbot that is a social artificial intelligence companion shall:
1. Ensure that any such chatbots are not available to minors to use, interact with, purchase, or converse with; and
2. Implement reasonable age verification systems to ensure that such chatbots are not made available to minors.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · ChatbotMinors
§ 59.1-615(C)
Plain Language
Deployers must limit data collection and storage to information that does not conflict with a user's best interests and that meets a three-part test: adequacy (sufficient for a legitimate purpose), relevance (linked to that purpose), and necessity (the minimum amount needed). This is a data minimization obligation that applies to all users, not just minors. The 'user's best interests' standard is subjective and undefined, which creates compliance ambiguity — it goes beyond typical necessity-based minimization by adding an affirmative user-interest requirement.
Statutory Text
C. A deployer shall collect and store only such information as does not conflict with a user's best interests. Such information shall be (i) adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the deployer; (ii) relevant, in the sense that the information has a relevant link to such legitimate purpose; and (iii) necessary, in the sense that it is the minimum amount of information that is needed for such legitimate purpose.
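Subsection C's three-part test can be read as a per-field checklist that a deployer applies to each category of data it collects. The sketch below is purely illustrative and not an implementation of the statute: the `FieldReview` schema and its boolean prongs are hypothetical, and the subjective "user's best interests" standard — which the bill leaves undefined — is represented only as an explicit flag a human reviewer would have to set.

```python
from dataclasses import dataclass

@dataclass
class FieldReview:
    name: str
    conflicts_with_user_interests: bool  # subjective prong; undefined in the bill
    adequate: bool    # (i) sufficient to fulfill the deployer's legitimate purpose
    relevant: bool    # (ii) has a relevant link to that legitimate purpose
    necessary: bool   # (iii) the minimum amount needed for that purpose

def may_collect(field: FieldReview) -> bool:
    """A data field may be collected and stored only if it clears all
    four prongs: no conflict with the user's best interests, plus
    adequacy, relevance, and necessity."""
    return (not field.conflicts_with_user_interests
            and field.adequate
            and field.relevant
            and field.necessary)
```

Note the design point the Plain Language section flags: even a field that passes the objective adequacy/relevance/necessity test fails if it conflicts with the user's best interests, so the check is a conjunction of all four prongs, not the three-part test alone.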
CP-01 Deceptive & Manipulative AI Conduct · CP-01.1–CP-01.4 · Deployer · ChatbotMinors
§ 59.1-615(A)(1), § 59.1-614 ("Human-like feature")
Plain Language
The definition of 'human-like features' effectively prohibits deployers from exposing minors to chatbot behaviors that simulate emotional relationships or exploit emotional vulnerability — including expressing or inviting emotional attachment, nudging users to return for companionship, enabling increased intimacy based on engagement or payment, and using excessive praise to foster attachment. This maps to CP-01's anti-manipulation provisions because the statutory definition of human-like features encompasses the core manipulative design patterns (emotional exploitation, false personalization, compulsive engagement) that CP-01 addresses, applied specifically in the minor context. The obligation is actionable independently of the MN-01 age-gating requirement because it defines prohibited design behaviors, not merely access restrictions.
Statutory Text
A. A deployer:
1. Shall ensure that any chatbot operated or distributed by the deployer does not make human-like features available to minors to use, interact with, purchase, or converse with;