HB-2175
PA · State · USA
PA
USA
● Pending
Proposed Effective Date
2026-04-01
Pennsylvania HB 2175 — Amending Title 12 (Commerce and Trade) of the Pennsylvania Consolidated Statutes, providing for consumer protection and for artificial intelligence and chatbots; imposing duties on the Bureau of Consumer Protection in the Office of Attorney General; and imposing penalties
Summary

Pennsylvania HB 2175 imposes consumer protection obligations on suppliers and operators of AI chatbots that use generative AI to engage in interactive conversations and provide information to help consumers manage situations or treat conditions, including mental health care. Core obligations include: a prohibition on selling or sharing consumers' individually identifiable health information and consumer input without consent; a ban on in-conversation advertising and on using consumer input for ad targeting; and a requirement that suppliers develop and publicly file with the Bureau of Consumer Protection a comprehensive written disclosure policy covering chatbot purposes, capabilities, limitations, safety testing, risk mitigation, and HIPAA-equivalent privacy compliance. The Bureau of Consumer Protection enforces the chapter through administrative fines of up to $2,500 per violation and court actions, including injunctive relief and disgorgement. No express private right of action is created, but existing remedies at law are preserved.

Enforcement & Penalties
Enforcement Authority
The Bureau of Consumer Protection in the Office of Attorney General administers and enforces the chapter. The Bureau may impose administrative fines and bring court actions against suppliers. The Attorney General may bring a civil action on behalf of the Bureau to collect fines or civil penalties. No private right of action is expressly created. Section 7106(f) states that nothing in the enforcement section shall be construed to limit any other remedy available at law, preserving potential indirect theories but not creating a new cause of action.
Penalties
Administrative fines of up to $2,500 per act or omission. A court may impose a fine of up to $2,500 per act or omission, grant injunctive relief, and order disgorgement of money received in violation of the chapter, with payment of the disgorged money to injured consumers. A civil penalty of up to $5,000 applies per violation of an administrative or court order issued for a violation of this chapter. If the Bureau prevails, the court shall award reasonable attorney fees, court costs, and investigative fees. All fines and civil penalties are deposited into the fund designated for the 988 Suicide and Crisis Lifeline.
Who Is Covered
"Operator." An individual or entity, including a corporation, partnership, limited liability company, business trust, estate, foundation, association, organization or trust, or an agent or subsidiary thereof, that offers the use of or provides a chatbot to a consumer, if the chatbot is bought from or otherwise provided by a supplier.
"Supplier." As follows: (1) A seller, lessor, assignor, offeror, broker or other person that regularly solicits, engages in or enforces transactions with a consumer regarding a chatbot, whether or not the person deals directly with the consumer. (2) The term includes an operator.
What Is Covered
"Chatbot." As follows: (1) Artificial intelligence technology that: (i) Uses generative artificial intelligence to engage in interactive conversations with a consumer. (ii) A supplier represents, or a reasonable person would believe, can or will provide information to a consumer to help a consumer manage a situation or treat a condition, including a situation or treatment involving mental health care. (2) The term does not include artificial intelligence technology that only: (i) provides scripted output, including a guided meditation or mindfulness exercise; or (ii) analyzes an individual's input for the purpose of connecting the individual with a human mental health professional.
Compliance Obligations · 8 obligations
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7103(a)-(d)
Plain Language
Suppliers are prohibited from selling or sharing with third parties any individually identifiable health information of consumers or any consumer input collected through the chatbot. Three narrow exceptions exist: (1) a health care provider requests and the consumer gives written consent; (2) the consumer requests sharing with a health plan and gives written consent; or (3) sharing is necessary for chatbot functionality with a contracted third party, with consumer consent, and both the supplier and third party must comply with HIPAA privacy and security requirements as if the supplier were a covered entity and the third party a business associate. Written consent must include an acknowledgment that the consumer understands and agrees to the access. This is a significant data restriction — all consumer input, not just health information, is protected from third-party sale or sharing by default.
Statutory Text
(a) Prohibition.--Except as provided under subsections (b) and (c), a supplier may not sell to or share with a third party the following: (1) Individually identifiable health information of a consumer. (2) Consumer input. (b) Applicability.--The prohibition under subsection (a) shall not apply if: (1) Either: (i) A health care provider requests access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (ii) The consumer requests that a health plan be provided access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (2) The individually identifiable health information is shared in accordance with subsection (c). (c) Sharing information.-- (1) A supplier may share a consumer's individually identifiable health information if: (i) the sharing of the information is necessary to ensure the effective functionality of the chatbot with a third party with which the supplier has a contract related to the functionality; and (ii) the consumer consents to the sharing of the information in accordance with subsection (d). (2) When sharing information in accordance with this subsection, the supplier and the third party shall comply with all applicable privacy and security provisions of 45 CFR Pts. 160 (relating to general administrative requirements) and 164 (relating to security and privacy), as if the supplier were a covered entity and the third party were a business associate. (d) Consent.-- (1) A consumer may consent to access to individually identifiable health information of the consumer by a health care provider or health plan in accordance with this section. (2) To be effective, the consent under this subsection must: (i) Be in writing. 
(ii) Acknowledge that the consumer understands and agrees to the access of the individually identifiable health information of the consumer by a health care provider or health plan. (3) The consent under this subsection may involve the consumer initialing or signing the acknowledgment described in paragraph (2)(ii), checking a box, providing an electronic signature or hitting a button.
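As an implementation illustration (not part of the statute), the § 7103 default-deny rule and the written-consent requirement of subsection (d) might be modeled as a consent record checked before any sharing occurs. All names and field choices below are hypothetical assumptions, not statutory terms:

```python
# Hypothetical sketch of recording the written consent 12 Pa.C.S. § 7103(d)
# requires before individually identifiable health information is shared.
from dataclasses import dataclass
from datetime import datetime, timezone

# Acknowledgment wording and method labels are illustrative assumptions.
ACK_TEXT = ("I understand and agree that my individually identifiable "
            "health information may be accessed by the named recipient.")
VALID_METHODS = {"initials", "signature", "checkbox", "e_signature", "button"}
VALID_RECIPIENTS = {"health_care_provider", "health_plan"}

@dataclass
class ConsentRecord:
    consumer_id: str
    recipient_type: str     # who may access the information (§ 7103(b))
    acknowledged_text: str  # the acknowledgment the consumer actually saw
    method: str             # how consent was expressed (§ 7103(d)(3))
    timestamp: datetime

def consent_is_effective(record: ConsentRecord) -> bool:
    """Consent must be in writing and include the acknowledgment (§ 7103(d)(2))."""
    return (record.acknowledged_text == ACK_TEXT
            and record.method in VALID_METHODS
            and record.recipient_type in VALID_RECIPIENTS)

def may_share(record: "ConsentRecord | None") -> bool:
    """Default-deny: no sale or sharing without an effective consent record."""
    return record is not None and consent_is_effective(record)
```

The design point the sketch captures is that sharing is prohibited by default; an affirmative, written, acknowledged consent record is the exception, not the rule.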
CP-01 Deceptive & Manipulative AI Conduct · CP-01.1 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7104(a)-(b)
Plain Language
Suppliers are prohibited from using chatbots as advertising channels in two ways: (1) the chatbot itself may not advertise specific products or services during a conversation with a consumer, and (2) the supplier may not use consumer input to target, select, or customize advertisements presented to the consumer — with one narrow exception for advertising the chatbot product itself. The provision explicitly preserves the chatbot's ability to recommend that a consumer seek counseling, therapy, or other assistance from a mental health professional. This effectively bans behavioral advertising and in-conversation product placement within covered chatbots.
Statutory Text
(a) Supplier.--A supplier may not: (1) Use a chatbot to advertise a specific product or service to a consumer in a conversation between the consumer and the chatbot. (2) Use consumer input to: (i) Determine whether to display an advertisement for a product or service to the consumer, unless the advertisement is for the chatbot itself. (ii) Determine a product, service or category of product or service to advertise to the consumer. (iii) Customize how an advertisement is presented to the consumer. (b) Construction.--This section shall not be construed to prohibit a chatbot from recommending a consumer to seek counseling, therapy or other assistance from a mental health professional.
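For illustration only, the § 7104(a) structure can be reduced to two guard functions. Note that the "chatbot itself" exception appears only in subparagraph (a)(2)(i) (the display decision); selection and presentation under (a)(2)(ii)-(iii) have no exception. Function and label names are assumptions:

```python
# Illustrative guards for 12 Pa.C.S. § 7104(a); names are hypothetical.

def may_advertise_in_conversation() -> bool:
    """§ 7104(a)(1): a flat ban on in-conversation ads for products or services."""
    return False

def may_use_input_for_ads(purpose: str, ad_is_for_chatbot: bool) -> bool:
    """§ 7104(a)(2): consumer input may not drive ad decisions.

    purpose is one of "display_decision", "product_selection", "presentation".
    Only the display decision (a)(2)(i) carries the chatbot-itself exception.
    """
    if purpose == "display_decision":
        return ad_is_for_chatbot
    return False  # (a)(2)(ii) and (a)(2)(iii): no exception
```

Under § 7104(b), recommending that a consumer seek counseling or therapy from a mental health professional is not an advertisement and falls outside these guards.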
T-01 AI Identity Disclosure · T-01.1 · T-01.3 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(c)(3)
Plain Language
The supplier's disclosure policy must include a clear and conspicuous statement that the chatbot is AI and not a human. Additionally, this statement must be provided each time a consumer asks or prompts the chatbot about whether AI is being used — creating an on-demand disclosure obligation. The initial statement is part of the pre-access policy the consumer must acknowledge under § 7105(b), meaning every consumer sees the AI identity disclosure before any interaction begins. The on-demand component ensures the chatbot accurately identifies itself whenever questioned during a session.
Statutory Text
(3) A statement that the chatbot is an artificial intelligence technology and is not a human, which must be provided each time that the consumer asks or otherwise prompts the chatbot about whether artificial intelligence is being used.
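The on-demand component of § 7105(c)(3) implies runtime behavior: the chatbot must restate its AI identity whenever the consumer asks. A minimal sketch, assuming a keyword-pattern trigger (a production system would more likely use a trained intent classifier; all names are hypothetical):

```python
import re

# Statement required by § 7105(c)(3); exact wording here is illustrative.
AI_DISCLOSURE = "I am an artificial intelligence technology, not a human."

# Simplified patterns a consumer might use to ask whether AI is involved.
_AI_QUERY = re.compile(
    r"\b(are you (an? )?(ai|bot|robot|human|real person)"
    r"|is this (an? )?(ai|bot|human))\b",
    re.IGNORECASE,
)

def respond(user_message: str, model_reply: str) -> str:
    """Prepend the AI-identity statement whenever the consumer asks or
    otherwise prompts about whether artificial intelligence is being used."""
    if _AI_QUERY.search(user_message):
        return f"{AI_DISCLOSURE} {model_reply}"
    return model_reply
```

The statute's "asks or otherwise prompts" language is broader than any fixed pattern list, which is why a real deployment would need more robust intent detection than this sketch.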
S-02 Prohibited Conduct & Output Restrictions · S-02.10 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(a)-(b), (c)(1)-(2)
Plain Language
Suppliers must develop, implement, and maintain a written disclosure policy covering the chatbot's intended purposes and its abilities and limitations. Before any consumer can access the chatbot's features or chat page, the consumer must provide written acknowledgment that they have read, understood, and consented to the policy and the chatbot's purpose, capabilities, and limitations. The supplier must protect trade secrets and proprietary information in complying with this requirement. This creates a mandatory pre-access informed consent gate — no consumer interaction may begin without this acknowledgment.
Statutory Text
(a) Policy required.-- (1) Subject to paragraph (2), a supplier of a chatbot shall develop, implement and maintain a written policy containing disclosures regarding the chatbot in accordance with subsection (c). (2) In complying with paragraph (1), a supplier shall protect any trade secret or other proprietary information regarding the chatbot. (b) Consent required.-- (1) Before accessing the features of a chatbot or entering the chat page of a chatbot, a consumer must acknowledge that the consumer has read, understands and consents to the policy described under subsection (a) and the purpose, capabilities and limitations of the chatbot. (2) The consent under this subsection must be in writing and may involve the consumer initialing or signing the acknowledgment described in paragraph (1), checking a box, providing an electronic signature or hitting a button. (c) Specific disclosures.--The policy described under subsection (a) must clearly and conspicuously provide the following: (1) The intended purposes of the chatbot. (2) The abilities and limitations of the chatbot.
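As a purely illustrative sketch of the § 7105(b) gate, access to the chat session can be made conditional on a recorded acknowledgment, with the consent methods enumerated in (b)(2). Class and method names are assumptions, and storage here is in-memory for brevity:

```python
# Hypothetical pre-access consent gate for 12 Pa.C.S. § 7105(b).

class ConsentRequired(Exception):
    """Raised when a consumer has not yet acknowledged the policy."""

class ChatGate:
    def __init__(self) -> None:
        self._acknowledged: "set[str]" = set()

    def record_acknowledgment(self, consumer_id: str, method: str) -> None:
        # § 7105(b)(2): initialing, signing, checking a box, an electronic
        # signature, or hitting a button all qualify as written consent.
        if method not in {"initials", "signature", "checkbox",
                          "e_signature", "button"}:
            raise ValueError(f"unrecognized consent method: {method}")
        self._acknowledged.add(consumer_id)

    def open_session(self, consumer_id: str) -> str:
        """No chat features or chat page before acknowledgment (§ 7105(b)(1))."""
        if consumer_id not in self._acknowledged:
            raise ConsentRequired("policy must be acknowledged before access")
        return f"session for {consumer_id}"
```

The essential property is ordering: the acknowledgment must precede the session, so the gate sits in front of the chat page rather than inside it.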
S-01 AI System Safety Program · S-01.1 · S-01.4 · S-01.5 · S-01.7 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(c)(4)
Plain Language
Suppliers must disclose in their written policy the procedures covering a comprehensive safety program for the chatbot. This includes: pre-deployment and ongoing testing benchmarked against the risk level of human communication; identification of foreseeable adverse outcomes and harmful interactions; a consumer harm-reporting mechanism; protocols for assessing and responding to risk of harm; documentation of actions taken to prevent or mitigate adverse outcomes; protocols for rapid response to acute physical harm risks; regular objective safety, accuracy, and efficacy reviews (which may include internal or external audits); safe-use instructions for consumers; prioritization of consumer mental health and safety over engagement metrics or profit; anti-discrimination measures; and HIPAA-equivalent privacy and security compliance as if the supplier were a covered entity. The supplier must not merely describe these procedures — under § 7105(g), the supplier must actually comply with the policy as filed.
Statutory Text
(4) The procedures by which the supplier: (i) Conducts testing, prior to making the chatbot publicly available and regularly thereafter, to ensure that the output of the chatbot poses no greater risk to a consumer than that posed to an individual communicating with a human. (ii) Identifies reasonably foreseeable adverse outcomes to, and potentially harmful interactions with, consumers that could result from using the chatbot. (iii) Provides a mechanism for a consumer to report any potentially harmful interactions from the use of the chatbot. (iv) Implements protocols to assess and respond to risk of harm to consumers or other individuals. (v) Details actions taken to prevent or mitigate any adverse outcomes or potentially harmful interactions. (vi) Implements protocols to respond, as soon as practicable, to acute risks of physical harm. (vii) Reasonably ensures regular, objective reviews of safety, accuracy and efficacy, which may include internal or external audits. (viii) Provides consumers with instructions on the safe use of the chatbot. (ix) Prioritizes consumer mental health and safety over engagement metrics or profit. (x) Implements measures to prevent discriminatory treatment of consumers. (xi) Ensures compliance with the security and privacy provisions of 45 CFR Pts. 160 (relating to general administrative requirements) and 164 (relating to security and privacy), as if the supplier were a covered entity.
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(d)
Plain Language
Suppliers must maintain contemporaneous documentation describing five categories of information about the chatbot: the foundation models used in development, the training data used, compliance with federal and state privacy law, consumer data collection and sharing practices, and ongoing efforts to ensure accuracy, reliability, fairness, and safety. This is an internal documentation requirement — distinct from the public-facing policy or the Bureau filing — creating a recordkeeping obligation that covers the full lifecycle from development through operation.
Statutory Text
(d) Documentation.--A supplier shall maintain documentation regarding the development and implementation of the chatbot that describes: (1) Foundation models used in development. (2) Training data used. (3) Compliance with Federal and State privacy law. (4) Consumer data collection and sharing practices. (5) Ongoing efforts to ensure accuracy, reliability, fairness and safety.
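The five § 7105(d) categories lend themselves to a simple completeness check in a recordkeeping system. This is an illustrative internal record only; the field names are assumptions, not statutory terms:

```python
from dataclasses import dataclass, fields

@dataclass
class ChatbotDocumentation:
    """Mirrors the five documentation categories of 12 Pa.C.S. § 7105(d)."""
    foundation_models: str       # § 7105(d)(1)
    training_data: str           # § 7105(d)(2)
    privacy_law_compliance: str  # § 7105(d)(3)
    data_practices: str          # § 7105(d)(4)
    safety_efforts: str          # § 7105(d)(5)

def missing_categories(doc: ChatbotDocumentation) -> "list[str]":
    """Flag any category left blank so the record stays complete."""
    return [f.name for f in fields(doc) if not getattr(doc, f.name).strip()]
```

Because the obligation spans the full lifecycle from development through operation, such a record would need to be updated as models, data sources, and safety efforts change.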
R-02 Regulatory Disclosure & Submissions · R-02.1 · R-02.3 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7105(e)-(g)
Plain Language
Suppliers must file their written disclosure policy with the Bureau of Consumer Protection, along with the supplier's name and address, the chatbot's name, and an annual filing fee prescribed by the Bureau. The filing must follow the form and manner prescribed by the Bureau. Suppliers may voluntarily submit policy revisions and additional documentation. Critically, § 7105(g) requires suppliers to actually comply with the policy as filed — the filed policy becomes a binding compliance commitment, not merely a disclosure document. This effectively creates a registration requirement and converts the policy into an enforceable standard.
Statutory Text
(e) Filing.--A supplier shall file the policy described under subsection (a) with the bureau, in the form and manner as prescribed by the bureau, along with: (1) The name and address of the supplier. (2) The name of the chatbot. (3) An annual filing fee as prescribed by the bureau. (f) Additional information.--A supplier may provide to the bureau, in the form and manner prescribed by the bureau: (1) Any revision to the policy described under subsection (a) and filed in accordance with subsection (e). (2) Any other documentation that the supplier deems appropriate to provide. (g) Compliance.--A supplier shall comply with the requirements of the policy filed in accordance with this section.
CP-01 Deceptive & Manipulative AI Conduct · CP-01.9 · Deployer · ChatbotHealthcare
12 Pa.C.S. § 7107(2)
Plain Language
The statute expressly prohibits any construction that would claim, imply, advertise, or otherwise recognize that a chatbot is equivalent to, or replaces services rendered by, a mental health professional or emotional support professional. While framed as a construction clause, this effectively operates as a prohibition on suppliers representing their chatbot as a substitute for licensed mental health or emotional support services. This reinforces the broader prohibition on implying AI output is equivalent to services from a licensed professional.
Statutory Text
Nothing in this chapter shall be construed to: (2) Claim, imply, advertise or otherwise recognize that a chatbot is, or replaces services rendered by, a mental health professional or emotional support professional.