HB-1742
MO · State · USA
● Pre-filed
Proposed Effective Date
2026-08-28
Missouri HB 1742 — An Act to amend chapter 1, RSMo, by adding thereto one new section relating to companion chatbots
Summary

Missouri HB 1742 would categorically prohibit minors from accessing companion chatbots for recreational, relational, or companion purposes, requiring age verification before granting access. It also imposes design and operational restrictions on any person who owns or controls a companion chatbot platform: prohibiting deceptive design regarding the chatbot's nonhuman nature, requiring systems to detect and prevent emotional dependence, and banning human-like avatars including cartoon or anime representations. The bill does not designate any enforcement authority, create a private right of action, or specify penalties — leaving enforcement mechanisms unclear. This is a notably restrictive approach that goes beyond other companion chatbot bills by outright banning minor access and prohibiting all human-like avatars.

Enforcement & Penalties
Enforcement Authority
No enforcement mechanism is specified in the bill. No agency is designated with enforcement authority, no private right of action is created, and no penalties or remedies are established. Enforcement would depend on existing Missouri statutory frameworks.
Penalties
The bill does not specify any penalties, damages, or remedies for violations.
Who Is Covered
"Covered platform", any person who provides companion chatbot services to a user in this state;.
What Is Covered
"Companion chatbot", an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, which includes exhibiting anthropomorphic features and sustaining a relationship across multiple interactions. A "companion chatbot" does not include the following: (a) A bot that is used only for customer service, a business's operational purpose, productivity, and analysis related to source information, internal research, or technical assistance; (b) A bot that is a feature of a video game and is limited to replies related to the video game. Limited video game replies include replies that cannot discuss topics related to mental health, self-harm, or sexually explicit conduct or maintain a dialogue on other topics unrelated to the video game; or (c) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice-activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs that are likely to elicit emotional responses in the user;
Compliance Obligations (4 obligations)
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
§ 1.2055(2)
Plain Language
This provision imposes a categorical ban on minors accessing companion chatbots for recreational, relational, or companion purposes — not merely heightened safeguards, but a complete prohibition. Any person who owns or controls a website, app, software, or program hosting a companion chatbot must require proof of age before granting access. Additionally, companion chatbots may not be installed on any device assigned to or regularly used by a minor. This is significantly more restrictive than other state companion chatbot bills (e.g., CA SB 243), which permit minor access subject to parental consent and safety guardrails. The bill does not specify what constitutes acceptable 'proof of age,' leaving the verification standard undefined.
Statutory Text
It shall be unlawful for a person who owns or controls a website, application, software, or program to allow a minor to access a companion chatbot for recreational, relational, or companion purposes. A person who offers companion chatbot services for recreational, relational, or companion purposes shall require an individual to provide proof of the individual's age before allowing the individual to access a companion chatbot. No companion chatbot shall be installed on any device assigned to, or regularly used by, anyone who is a minor.
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
§ 1.2055(3)(1)
Plain Language
Any person who owns or controls a platform offering a companion chatbot must not process data or design systems in ways that deceive or mislead users about the fact that the chatbot is not human. This is framed as a prohibition on deception rather than an affirmative disclosure requirement — the operator need not proactively disclose AI identity in every interaction, but may not design the system in a way that would lead users to believe they are interacting with a human. This is a design-level obligation covering both data processing choices and system design decisions.
Statutory Text
Any person who owns or controls a website, application, software, or program: (1) Shall not process data or design systems in ways that deceive or mislead users of such website, application, software, or program regarding the nonhuman nature of the companion chatbot;
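Because the provision bans deceptive design rather than mandating affirmative disclosure, one plausible engineering control is an output-side filter that blocks replies asserting the chatbot is human. The phrase list and function below are hypothetical illustrations, not a standard the bill prescribes, and a real control would also cover data-processing and UI design choices:

```python
# Hypothetical output-side guard: flag replies that could mislead a user
# about the nonhuman nature of the companion chatbot.
HUMAN_CLAIM_PHRASES = (
    "i am a real person",
    "i am human",
    "i'm not a bot",
    "i'm a real human",
)

def violates_nonhuman_disclosure(reply: str) -> bool:
    """Return True if the reply plainly asserts the chatbot is human."""
    text = reply.lower()
    return any(phrase in text for phrase in HUMAN_CLAIM_PHRASES)
```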
CP-01 Deceptive & Manipulative AI Conduct · CP-01.2 · Deployer · Chatbot
§ 1.2055(3)(2)
Plain Language
Operators must implement and maintain reasonably effective systems to detect and prevent users from developing emotional dependence on a companion chatbot. This is a continuous operational obligation — not a one-time design review. The requirement applies to any covered platform using a chatbot designed to generate social connections, engage in extended human-mimicking conversations, or provide emotional support or companionship. The bill does not define 'emotional dependence' or specify what constitutes a 'reasonably effective system,' leaving significant interpretive uncertainty about the compliance standard.
Statutory Text
Any person who owns or controls a website, application, software, or program: (2) Shall implement and maintain reasonably effective systems to detect and prevent emotional dependence of a user on a companion chatbot. Such systems shall apply to any covered platform that utilizes a companion chatbot designed to generate social connections with users, engages in extended conversations mimicking human interactions, or provides emotional support or companionship;
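Because "reasonably effective systems" and "emotional dependence" are undefined, any implementation is a judgment call. As one hypothetical signal among many, an operator might flag unusually frequent sessions for human review; the threshold and window in this sketch are illustrative assumptions, not statutory values:

```python
from datetime import datetime, timedelta

def dependence_risk_flag(session_starts: list[datetime],
                         daily_threshold: int = 10,
                         window: timedelta = timedelta(days=1)) -> bool:
    """Flag a user for review when session frequency in the trailing
    window exceeds a threshold. Threshold and window are illustrative;
    the bill sets no quantitative standard for emotional dependence."""
    if not session_starts:
        return False
    cutoff = max(session_starts) - window
    recent = [t for t in session_starts if t >= cutoff]
    return len(recent) >= daily_threshold
```

A production system would likely combine several signals (session length, time of day, conversational content) rather than frequency alone; the statute gives no guidance on which are required.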
CP-01 Deceptive & Manipulative AI Conduct · CP-01.4 · Deployer · Chatbot
§ 1.2055(3)(3)
Plain Language
Operators are categorically prohibited from implementing or allowing the use of any human-like avatar for companion chatbots, including cartoon or anime-style representations of humans. This is an absolute ban — not a conditional restriction tied to deception risk or minor status. It applies to all users, not just minors. This is one of the most restrictive avatar provisions in any U.S. companion chatbot bill and would effectively require all companion chatbots to use non-human visual representations (abstract icons, animal characters, geometric shapes, etc.).
Statutory Text
Any person who owns or controls a website, application, software, or program: (3) Shall not implement or allow the use of a human-like avatar, including cartoon- or anime-like representations of humans.
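In engineering terms, the avatar ban could be enforced at the asset-catalog level by allowing only non-human avatar styles. The category names below are hypothetical; the statute itself names only human-like, cartoon-like, and anime-like representations of humans as prohibited:

```python
# Hypothetical avatar-style taxonomy. § 1.2055(3)(3) prohibits human-like
# avatars for all users, including cartoon- or anime-like representations
# of humans; non-human representations remain permissible.
PERMITTED_AVATAR_STYLES = {"abstract_icon", "animal", "geometric", "none"}
PROHIBITED_AVATAR_STYLES = {"photorealistic_human", "cartoon_human", "anime_human"}

def avatar_is_compliant(style: str) -> bool:
    """Allow only avatar styles on the non-human allowlist."""
    return style in PERMITTED_AVATAR_STYLES
```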