S-243
MA · State · USA
● Pre-filed
Proposed Effective Date
2025-01-17
Massachusetts Senate No. 243 — An Act requiring consumer notification for chatbot systems
Summary

Declares it an unfair and deceptive trade practice under Massachusetts Chapter 93A for any person to use a bot in a commercial transaction or trade practice with a consumer where the bot may mislead or deceive a reasonable person into believing they are engaging with a human — regardless of whether the consumer is actually misled or harmed. Provides a safe harbor: no liability attaches if the consumer is notified in a clear and conspicuous fashion that they are communicating with a computer. The bill imposes obligations on any "person" utilizing or deploying a bot, without a formal covered-entity definition. Enforcement runs through the existing Chapter 93A framework, including AG enforcement and a private right of action for injured consumers.

Enforcement & Penalties
Enforcement Authority
Enforced through Massachusetts General Laws Chapter 93A. The Attorney General has enforcement authority under § 4 of Chapter 93A. Private enforcement is available under § 9 (consumers) and § 11 (businesses) of Chapter 93A for persons who suffer a loss of money or property as a result of unfair or deceptive acts. No new enforcement mechanism is created by this bill — it declares the covered conduct to be a per se violation of the existing Chapter 93A framework.
Penalties
By declaring the covered conduct a violation of Chapter 93A § 2, the bill activates Chapter 93A remedies. Under § 9, consumers may recover actual damages or $25, whichever is greater; up to treble damages for willful or knowing violations; and reasonable attorney's fees and costs. Under § 11, business plaintiffs may recover actual damages or treble damages for willful or knowing violations. Injunctive relief is also available. The Attorney General may seek civil penalties of up to $5,000 per violation under § 4.
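The § 9 recovery arithmetic described above can be sketched in a few lines. This is a hypothetical illustration only: the $25 floor and the treble-damages ceiling come from the summary of Chapter 93A § 9, while the function name and the choice to apply the treble maximum (rather than double damages, which § 9 also permits for willful violations) are assumptions for the sketch.

```python
def consumer_recovery(actual_damages: float, willful: bool = False) -> float:
    """Sketch of Chapter 93A § 9 consumer recovery: the greater of
    actual damages or $25, trebled for willful or knowing violations.
    Illustrative only; § 9 permits between double and treble damages,
    and this sketch applies the treble maximum."""
    base = max(actual_damages, 25.0)
    if willful:
        # Up to three times actual damages for willful/knowing violations.
        return base * 3
    return base

# The statutory floor applies when actual damages are small:
print(consumer_recovery(10.0))                  # 25.0
# A willful violation with $100 in actual damages, trebled:
print(consumer_recovery(100.0, willful=True))   # 300.0
```

Attorney's fees, costs, and injunctive relief under § 9, and the Attorney General's civil penalties under § 4, sit outside this arithmetic.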
Who Is Covered
Any "person" utilizing or deploying a bot in a commercial transaction or trade practice with a consumer; the bill provides no formal covered-entity definition.
What Is Covered
"Bot", an automated online account wherein all or substantially all of the actions or posts of that account are not the result of a person including, but not limited to, a chatbot, artificial intelligence agent, avatar or other computer technology that engages in a textual or aural conversation.
Compliance Obligations · 1 obligation
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
Mass. Gen. Laws ch. 93, § 115(b)
Plain Language
Any person who deploys a bot in a commercial context where a reasonable person could be misled into thinking they are talking to a human commits a per se Chapter 93A violation — regardless of whether the consumer is actually misled or damaged. The safe harbor is clear and conspicuous disclosure: if you notify the consumer that they are communicating with a computer rather than a human, no liability attaches. Importantly, the trigger is objective (could a reasonable person be misled?) and does not require proof of actual deception or harm. The scope is limited to commercial transactions or trade practices — purely non-commercial bot interactions are not covered.
Statutory Text
(b) It is hereby declared to be an unfair and deceptive act or practice in violation of section 2 of chapter 93A for any person to engage in a commercial transaction or trade practice with a consumer of any kind in which the consumer is communicating or otherwise interacting with a bot that may mislead or deceive a reasonable person to believe they are engaging with a human, regardless of whether such consumer is in fact misled, deceived or damaged thereby; provided, however, that a person utilizing or deploying a bot shall not be liable under this section if the consumer is notified in a clear and conspicuous fashion that they are communicating with a computer rather than a human being.
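A minimal sketch of how a deployer might satisfy the disclosure safe harbor described above, by ensuring the computer-not-human notification is delivered before any substantive bot content. The class, method names, and disclosure wording are all hypothetical illustrations, not drawn from the bill; whether any particular wording or placement is "clear and conspicuous" is a legal question the sketch does not answer.

```python
# Hypothetical disclosure text; the bill does not prescribe specific wording.
DISCLOSURE = (
    "You are communicating with an automated chatbot, "
    "not a human representative."
)

class ChatSession:
    """Tracks whether the required bot disclosure has been delivered
    to the consumer in this session. Illustrative sketch only."""

    def __init__(self):
        self.disclosed = False
        self.transcript = []

    def send_bot_message(self, text: str) -> None:
        # Deliver the disclosure before the first substantive bot message,
        # so notification precedes the interaction it covers.
        if not self.disclosed:
            self.transcript.append(DISCLOSURE)
            self.disclosed = True
        self.transcript.append(text)

session = ChatSession()
session.send_bot_message("Hi! How can I help with your order?")
print(session.transcript[0])  # the disclosure is the first message shown
```

Gating every bot reply on a per-session `disclosed` flag keeps the notification from being skipped when conversations are resumed or routed through multiple entry points.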