HB-2175
PA · State · USA
● Pending
Proposed Effective Date
2026-04-01
Pennsylvania HB 2175 — An Act Amending Title 12 (Commerce and Trade) of the Pennsylvania Consolidated Statutes, providing for consumer protection and for artificial intelligence and chatbots; imposing duties on the Bureau of Consumer Protection in the Office of Attorney General; and imposing penalties
Summary

PA HB 2175 imposes consumer protection obligations on suppliers and operators of AI chatbots that use generative AI to converse with consumers and are represented as being able to help manage situations or treat conditions, including mental health. Core obligations include: prohibiting sale or sharing of consumer health information and chatbot inputs except with consent for healthcare providers, health plans, or functional necessity; banning targeted advertising using chatbot conversations or consumer input; requiring suppliers to develop, maintain, and file with the Bureau of Consumer Protection a written disclosure policy covering the chatbot's purposes, capabilities, limitations, safety testing procedures, and anti-discrimination measures; requiring consumer acknowledgment and consent before chatbot access; and mandating documentation of foundation models, training data, privacy compliance, and data practices. Enforcement is through the Bureau of Consumer Protection with administrative fines up to $2,500 per violation and court-imposed civil penalties up to $5,000 for order violations. All collected fines are deposited into the 988 Suicide and Crisis Lifeline fund.

Enforcement & Penalties
Enforcement Authority
The Bureau of Consumer Protection in the Office of Attorney General administers and enforces the chapter. The bureau may impose administrative fines and bring court actions against suppliers. The Attorney General may bring civil actions on behalf of the bureau to collect fines or civil penalties. Enforcement is agency-initiated. No private right of action is explicitly created, though § 7106(f) provides that nothing in the enforcement section shall be construed to limit any other remedy available at law.
Penalties
Administrative fines up to $2,500 per act or omission constituting a violation. Court-imposed fines up to $2,500 per violation. Civil penalties up to $5,000 per violation of an administrative or court order. Courts may also grant declaratory relief, injunctive relief, disgorgement of money received in violation, payment of disgorged money to injured consumers, and other relief deemed reasonable and necessary. If the bureau prevails, the court shall award reasonable attorney fees, court costs, and investigative fees. All fines and civil penalties are deposited into the fund designated for the 988 Suicide and Crisis Lifeline.
Who Is Covered
"Operator." An individual or entity, including a corporation, partnership, limited liability company, business trust, estate, foundation, association, organization or trust, or an agent or subsidiary thereof, that offers the use of or provides a chatbot to a consumer, if the chatbot is bought from or otherwise provided by a supplier.
"Supplier." As follows: (1) A seller, lessor, assignor, offeror, broker or other person that regularly solicits, engages in or enforces transactions with a consumer regarding a chatbot, whether or not the person deals directly with the consumer. (2) The term includes an operator.
What Is Covered
"Chatbot." As follows: (1) Artificial intelligence technology that: (i) Uses generative artificial intelligence to engage in interactive conversations with a consumer. (ii) A supplier represents, or a reasonable person would believe, can or will provide information to a consumer to help a consumer manage a situation or treat a condition, including a situation or treatment involving mental health care. (2) The term does not include artificial intelligence technology that only: (i) provides scripted output, including a guided meditation or mindfulness exercise; or (ii) analyzes an individual's input for the purpose of connecting the individual with a human mental health professional.
Compliance Obligations (8 obligations)
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7103(a)-(d)
Plain Language
Suppliers may not sell or share with third parties either a consumer's individually identifiable health information or the content the consumer provides to the chatbot ('consumer input'). Three narrow exceptions exist: (1) a health care provider requests the health information and the consumer gives written consent; (2) the consumer requests that a health plan receive the information and consents in writing; or (3) the sharing is necessary for chatbot functionality with a contractually bound third party and the consumer consents in writing. When sharing under the functionality exception, both the supplier and the third party must comply with HIPAA privacy and security rules as if they were a covered entity and business associate, respectively. Written consent may be obtained via signature, checkbox, electronic signature, or button click.
Statutory Text
(a) Prohibition.--Except as provided under subsections (b) and (c), a supplier may not sell to or share with a third party the following: (1) Individually identifiable health information of a consumer. (2) Consumer input. (b) Applicability.--The prohibition under subsection (a) shall not apply if: (1) Either: (i) A health care provider requests access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (ii) The consumer requests that a health plan be provided access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (2) The individually identifiable health information is shared in accordance with subsection (c). (c) Sharing information.-- (1) A supplier may share a consumer's individually identifiable health information if: (i) the sharing of the information is necessary to ensure the effective functionality of the chatbot with a third party with which the supplier has a contract related to the functionality; and (ii) the consumer consents to the sharing of the information in accordance with subsection (d). (2) When sharing information in accordance with this subsection, the supplier and the third party shall comply with all applicable privacy and security provisions of 45 CFR Pts. 160 (relating to general administrative requirements) and 164 (relating to security and privacy), as if the supplier were a covered entity and the third party were a business associate. (d) Consent.-- (1) A consumer may consent to access to individually identifiable health information of the consumer by a health care provider or health plan in accordance with this section. (2) To be effective, the consent under this subsection must: (i) Be in writing. 
(ii) Acknowledge that the consumer understands and agrees to the access of the individually identifiable health information of the consumer by a health care provider or health plan. (3) The consent under this subsection may involve the consumer initialing or signing the acknowledgment described in paragraph (2)(ii), checking a box, providing an electronic signature or hitting a button.
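As an illustrative sketch only, the § 7103 sharing gate can be modeled as a simple decision function. All names here (`SharingRequest`, `may_share`, the recipient-type strings) are hypothetical, not statutory terms, and the sketch deliberately omits the separate HIPAA compliance duty that attaches under § 7103(c)(2).

```python
from dataclasses import dataclass

# Hypothetical model of the § 7103 prohibition and its three exceptions.
# Field and function names are illustrative, not drawn from the statute.

@dataclass
class SharingRequest:
    recipient_type: str          # "health_care_provider", "health_plan", or "functionality_vendor"
    written_consent: bool        # § 7103(d) consent: signature, checkbox, e-signature, or button click
    functionality_contract: bool # third party under a functionality contract with the supplier

def may_share(req: SharingRequest) -> bool:
    """Return True only if one of the three § 7103(b)-(c) exceptions applies."""
    if not req.written_consent:
        return False  # every exception requires written consent under § 7103(d)
    if req.recipient_type in ("health_care_provider", "health_plan"):
        return True   # § 7103(b)(1)(i)-(ii): provider request or consumer-requested health plan access
    if req.recipient_type == "functionality_vendor" and req.functionality_contract:
        return True   # § 7103(c): sharing necessary for functionality; HIPAA rules then apply to both parties
    return False      # default rule, § 7103(a): no sale or sharing
```

Note the default-deny shape: the statute states a blanket prohibition first and carves out exceptions, so a faithful model returns `False` unless an exception is affirmatively satisfied.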
CP-01 Deceptive & Manipulative AI Conduct · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7104(a)-(b)
Plain Language
Suppliers are prohibited from using chatbot conversations to serve advertisements for specific products or services to consumers. They also may not use consumer input to target, select, or customize advertisements — with one exception: advertising for the chatbot itself. This is a broad ban on in-conversation advertising and consumer-input-driven ad targeting. Importantly, the chatbot may still recommend that a consumer seek counseling, therapy, or other assistance from a mental health professional — that is not considered prohibited advertising.
Statutory Text
(a) Supplier.--A supplier may not: (1) Use a chatbot to advertise a specific product or service to a consumer in a conversation between the consumer and the chatbot. (2) Use consumer input to: (i) Determine whether to display an advertisement for a product or service to the consumer, unless the advertisement is for the chatbot itself. (ii) Determine a product, service or category of product or service to advertise to the consumer. (iii) Customize how an advertisement is presented to the consumer. (b) Construction.--This section shall not be construed to prohibit a chatbot from recommending a consumer to seek counseling, therapy or other assistance from a mental health professional.
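The § 7104 advertising ban reduces to two tests, sketched below under one reading of the section: in-conversation ads are barred outright under § 7104(a)(1), and consumer-input-driven targeting is barred under § 7104(a)(2) unless the ad is for the chatbot itself. The function name and parameters are hypothetical, and whether the chatbot-itself exception reaches in-conversation ads is an interpretive question this sketch resolves conservatively.

```python
# Illustrative gate for the § 7104 advertising ban; names are hypothetical.

def ad_permitted(in_chat_conversation: bool, uses_consumer_input: bool,
                 advertised_item: str, chatbot_name: str) -> bool:
    """An ad fails if served inside a chatbot conversation (§ 7104(a)(1)),
    or if consumer input drove its selection, targeting, or presentation,
    unless the ad is for the chatbot itself (§ 7104(a)(2)(i))."""
    if in_chat_conversation:
        return False  # conservative reading: no in-conversation ads at all
    if uses_consumer_input and advertised_item != chatbot_name:
        return False  # input-driven targeting only permitted for the chatbot itself
    return True
```

Per § 7104(b), a recommendation to seek counseling or therapy from a mental health professional is not an advertisement, so it never reaches this gate.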
T-01 AI Identity Disclosure · T-01.1, T-01.3 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(a)-(c)(1)-(3)
Plain Language
Suppliers must develop, implement, and maintain a written disclosure policy that clearly and conspicuously states the chatbot's intended purposes, its abilities and limitations, and that it is an AI and not a human. Consumers must acknowledge they have read, understood, and consent to this policy before accessing the chatbot. The AI identity statement must be restated each time a consumer asks or prompts the chatbot about whether AI is being used — creating an on-demand disclosure obligation. Written consent may be provided via signature, checkbox, electronic signature, or button click. Trade secrets and proprietary information must be protected in the policy.
Statutory Text
(a) Policy required.-- (1) Subject to paragraph (2), a supplier of a chatbot shall develop, implement and maintain a written policy containing disclosures regarding the chatbot in accordance with subsection (c). (2) In complying with paragraph (1), a supplier shall protect any trade secret or other proprietary information regarding the chatbot. (b) Consent required.-- (1) Before accessing the features of a chatbot or entering the chat page of a chatbot, a consumer must acknowledge that the consumer has read, understands and consents to the policy described under subsection (a) and the purpose, capabilities and limitations of the chatbot. (2) The consent under this subsection must be in writing and may involve the consumer initialing or signing the acknowledgment described in paragraph (1), checking a box, providing an electronic signature or hitting a button. (c) Specific disclosures.--The policy described under subsection (a) must clearly and conspicuously provide the following: (1) The intended purposes of the chatbot. (2) The abilities and limitations of the chatbot. (3) A statement that the chatbot is an artificial intelligence technology and is not a human, which must be provided each time that the consumer asks or otherwise prompts the chatbot about whether artificial intelligence is being used.
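The § 7105(b)-(c) flow has two moving parts: a consent gate before chat access, and an on-demand AI identity restatement. A minimal sketch follows; all names are hypothetical, and the keyword trigger is a stand-in for whatever intent detection a real deployment would need to reliably catch a consumer asking whether AI is in use.

```python
# Hypothetical sketch of the § 7105(b)-(c) consent and disclosure flow.

AI_DISCLOSURE = ("This chatbot is an artificial intelligence technology "
                 "and is not a human.")  # the § 7105(c)(3) statement

def may_enter_chat(acknowledged_policy: bool, consent_in_writing: bool) -> bool:
    """§ 7105(b): access requires a written acknowledgment (initialing,
    signature, checkbox, e-signature, or button click) that the consumer
    has read, understands, and consents to the disclosure policy."""
    return acknowledged_policy and consent_in_writing

def respond(user_message: str, model_reply: str) -> str:
    """§ 7105(c)(3): restate the AI identity statement each time the
    consumer asks or otherwise prompts about whether AI is being used.
    (Naive keyword matching; a production system would need more.)"""
    asks_about_ai = any(kw in user_message.lower()
                        for kw in ("are you an ai", "are you a bot",
                                   "are you human", "artificial intelligence"))
    if asks_about_ai:
        return f"{AI_DISCLOSURE} {model_reply}"
    return model_reply
```

The design point is that the disclosure duty is recurring, not one-time: the statement must be prepended every time the trigger fires, regardless of how many times it has already been shown.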
S-01 AI System Safety Program · S-01.1, S-01.4, S-01.5, S-01.7 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(c)(4)(i)-(xi)
Plain Language
Suppliers must disclose in their written policy the specific procedures they use to ensure chatbot safety, covering eleven enumerated topics: pre-launch and ongoing testing ensuring outputs pose no greater risk than human interaction; identification of foreseeable adverse outcomes and harmful interactions; a mechanism for consumers to report harmful interactions; protocols for assessing and responding to risk of harm; actions taken to prevent or mitigate adverse outcomes; protocols for responding promptly to acute physical harm risks; regular objective reviews of safety, accuracy, and efficacy (including possible audits); instructions for safe use of the chatbot; prioritization of consumer mental health and safety over engagement metrics or profit; measures to prevent discriminatory treatment; and compliance with HIPAA privacy and security rules as if the supplier were a covered entity. This is both a disclosure obligation (requiring these procedures to be described in the policy) and a substantive safety obligation (requiring the procedures to actually exist and be implemented, per § 7105(g)).
Statutory Text
(4) The procedures by which the supplier: (i) Conducts testing, prior to making the chatbot publicly available and regularly thereafter, to ensure that the output of the chatbot poses no greater risk to a consumer than that posed to an individual communicating with a human. (ii) Identifies reasonably foreseeable adverse outcomes to, and potentially harmful interactions with, consumers that could result from using the chatbot. (iii) Provides a mechanism for a consumer to report any potentially harmful interactions from the use of the chatbot. (iv) Implements protocols to assess and respond to risk of harm to consumers or other individuals. (v) Details actions taken to prevent or mitigate any adverse outcomes or potentially harmful interactions. (vi) Implements protocols to respond, as soon as practicable, to acute risks of physical harm. (vii) Reasonably ensures regular, objective reviews of safety, accuracy and efficacy, which may include internal or external audits. (viii) Provides consumers with instructions on the safe use of the chatbot. (ix) Prioritizes consumer mental health and safety over engagement metrics or profit. (x) Implements measures to prevent discriminatory treatment of consumers. (xi) Ensures compliance with the security and privacy provisions of 45 CFR Pts. 160 (relating to general administrative requirements) and 164 (relating to security and privacy), as if the supplier were a covered entity.
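Because § 7105(c)(4) enumerates exactly eleven topics the policy must address, it lends itself to a completeness check. The sketch below is a hypothetical compliance checklist; the topic descriptions are paraphrases, not statutory language, and the clause numerals are the only part taken directly from the section.

```python
# Checklist sketch for the eleven § 7105(c)(4) safety-procedure disclosures.
# Topic descriptions are paraphrases of the statute, not quotations.

SAFETY_PROCEDURE_TOPICS = {
    "i":    "pre-launch and ongoing testing (risk no greater than human interaction)",
    "ii":   "identification of foreseeable adverse outcomes and harmful interactions",
    "iii":  "consumer mechanism to report harmful interactions",
    "iv":   "protocols to assess and respond to risk of harm",
    "v":    "actions to prevent or mitigate adverse outcomes",
    "vi":   "protocols for prompt response to acute physical-harm risks",
    "vii":  "regular objective reviews of safety, accuracy and efficacy (audits)",
    "viii": "instructions for safe use of the chatbot",
    "ix":   "prioritizing consumer mental health and safety over engagement or profit",
    "x":    "measures to prevent discriminatory treatment",
    "xi":   "HIPAA 45 CFR Pts. 160 and 164 compliance as if a covered entity",
}

def missing_topics(policy_sections: set[str]) -> list[str]:
    """Return the clause numerals whose topics the filed policy omits."""
    return [k for k in SAFETY_PROCEDURE_TOPICS if k not in policy_sections]
```

Recall from the plain-language note that § 7105(g) makes the filed policy binding, so a policy that passes this completeness check still creates substantive obligations to actually run each listed procedure.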
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(d)
Plain Language
Suppliers must create and maintain internal documentation covering five categories related to the chatbot's development and implementation: the foundation models used, training data, privacy law compliance, consumer data collection and sharing practices, and ongoing accuracy/reliability/fairness/safety efforts. This documentation is distinct from the consumer-facing disclosure policy — it is an internal recordkeeping obligation. The statute does not specify a retention period or require the documentation to be produced to regulators on demand, but it must be maintained.
Statutory Text
(d) Documentation.--A supplier shall maintain documentation regarding the development and implementation of the chatbot that describes: (1) Foundation models used in development. (2) Training data used. (3) Compliance with Federal and State privacy law. (4) Consumer data collection and sharing practices. (5) Ongoing efforts to ensure accuracy, reliability, fairness and safety.
R-02 Regulatory Disclosure & Submissions · R-02.3 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(e)-(f)
Plain Language
Suppliers must file their written chatbot disclosure policy with the Bureau of Consumer Protection, along with the supplier's name and address, the chatbot's name, and an annual filing fee set by the bureau. This is a mandatory registration-style obligation — suppliers cannot operate without filing. Suppliers may also voluntarily submit policy revisions and any other documentation they deem appropriate. The bureau prescribes the form and manner of filing.
Statutory Text
(e) Filing.--A supplier shall file the policy described under subsection (a) with the bureau, in the form and manner as prescribed by the bureau, along with: (1) The name and address of the supplier. (2) The name of the chatbot. (3) An annual filing fee as prescribed by the bureau. (f) Additional information.--A supplier may provide to the bureau, in the form and manner prescribed by the bureau: (1) Any revision to the policy described under subsection (a) and filed in accordance with subsection (e). (2) Any other documentation that the supplier deems appropriate to provide.
G-01 AI Governance Program & Documentation · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(g)
Plain Language
Suppliers are legally bound to comply with the disclosure policy they file with the Bureau of Consumer Protection. This transforms the filed policy from a mere disclosure document into an enforceable set of commitments — any deviation from the filed policy's stated procedures constitutes a violation of the chapter. This is a compliance pass-through mechanism: the specific obligations vary by supplier based on what they disclosed in their policy, but the requirement to adhere to it is mandatory.
Statutory Text
(g) Compliance.--A supplier shall comply with the requirements of the policy filed in accordance with this section.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7107(2)
Plain Language
The chapter may not be interpreted as endorsing or implying that a chatbot is equivalent to, or can replace, a mental health professional or emotional support professional. While framed as a construction clause, this operates as a prohibition: no party may claim, imply, advertise, or otherwise represent that a chatbot is or replaces a licensed mental health or emotional support professional. This reinforces the boundary between AI chatbot services and licensed professional mental health practice.
Statutory Text
Nothing in this chapter shall be construed to: (2) Claim, imply, advertise or otherwise recognize that a chatbot is, or replaces services rendered by, a mental health professional or emotional support professional.