HP-1154
ME · State · USA
Status: Enacted
Effective Date: 2025-09-13
Maine H.P. 1154 — An Act to Ensure Transparency in Consumer Transactions Involving Artificial Intelligence (Public Law Chapter 294)
Summary

Requires any person using an AI chatbot or other computer technology in trade and commerce with Maine consumers to provide clear and conspicuous notice that the consumer is not interacting with a human being, when the interaction could mislead or deceive a reasonable consumer. The disclosure obligation is conditional — it triggers only when a reasonable consumer could be misled into believing they are engaging with a human. Violations are treated as violations of the Maine Unfair Trade Practices Act, enforceable by the Attorney General. The law is notably brief and narrow, covering only trade and commerce contexts and imposing a single disclosure obligation.

Enforcement & Penalties
Enforcement Authority
Enforced by the Maine Attorney General as a violation of the Maine Unfair Trade Practices Act (5 MRSA § 205-A et seq.). The Attorney General may initiate enforcement actions. The statute does not expressly grant individual consumers a private right of action; private enforcement would be available only through indirect theories (e.g., if a court construes a violation as actionable under general tort or contract principles).
Penalties
Remedies are those available under the Maine Unfair Trade Practices Act (5 MRSA § 205-A et seq.), which authorizes the Attorney General to seek injunctive relief, civil penalties of up to $10,000 per violation, and restitution. The UTPA also permits recovery of costs of investigation and litigation.
Who Is Covered
Any person who uses an artificial intelligence chatbot or any other computer technology to engage in trade and commerce with a Maine consumer, where the interaction may mislead or deceive a reasonable consumer into believing they are engaging with a human being. Such a person must notify the consumer in a clear and conspicuous manner that the consumer is not engaging with a human being.
What Is Covered
"Artificial intelligence chatbot" means a software application, web interface or computer program that simulates human conversation and interaction through textual or aural communications.
Compliance Obligations (1 obligation)
T-01 AI Identity Disclosure · T-01.1 · Chatbot · General Consumer App
10 MRSA § 1500-Y(2)
Plain Language
Any person using an AI chatbot or other computer technology to interact with consumers in trade and commerce must provide clear and conspicuous notice that the consumer is not interacting with a human — but only when the interaction could mislead or deceive a reasonable consumer into believing they are dealing with a human. This is a conditional trigger: if the AI system clearly presents itself as non-human from the outset or no reasonable person would be confused, no disclosure is required. The scope is limited to trade and commerce contexts. A violation constitutes a violation of the Maine Unfair Trade Practices Act, enforceable by the Attorney General with civil penalties up to $10,000 per violation.
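The conditional trigger described above can be illustrated with a minimal sketch. This is a hypothetical compliance pattern, not language from the statute; all names (`open_session`, `DISCLOSURE`, `persona_presents_as_human`) are illustrative, and a conservative implementation simply discloses whenever the bot does not clearly present itself as non-human.

```python
# Hypothetical sketch of the conditional disclosure trigger.
# Assumption: if the bot's persona could read as human, a clear and
# conspicuous non-human notice is delivered before any other content.

DISCLOSURE = (
    "Notice: You are chatting with an automated assistant, "
    "not a human being."
)


def open_session(persona_presents_as_human: bool, greeting: str) -> list[str]:
    """Return the opening messages for a consumer-facing chat session.

    The statutory obligation is conditional: disclosure is required only
    when a reasonable consumer could be misled into believing they are
    engaging with a human. If the bot self-identifies as non-human from
    the outset, no separate notice is needed.
    """
    messages: list[str] = []
    if persona_presents_as_human:
        # Trigger met: place the disclosure first, before any content,
        # so it is clear and conspicuous at the start of the interaction.
        messages.append(DISCLOSURE)
    messages.append(greeting)
    return messages
```

Under this sketch, a session whose greeting already states "I'm an automated assistant" would set the flag to `False` and skip the separate notice, while a human-sounding persona would lead with the disclosure.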
Statutory Text
A person may not use an artificial intelligence chatbot or any other computer technology to engage in trade and commerce with a consumer in a manner that may mislead or deceive a reasonable consumer into believing that the consumer is engaging with a human being unless the consumer is notified in a clear and conspicuous manner that the consumer is not engaging with a human being.