HB-1742
MO · State · USA
● Pending
Missouri House Bill No. 1742 — An Act To amend chapter 1, RSMo, by adding thereto one new section relating to companion chatbots
Summary

Missouri HB 1742 prohibits minors from accessing companion chatbots for recreational, relational, or companion purposes and requires age verification before access is granted. The bill also imposes obligations on persons who own or control websites, applications, software, or programs offering companion chatbots: they must not deceive or mislead users about the chatbot's nonhuman nature, must implement systems to detect and prevent emotional dependence, and must not use human-like avatars, including cartoon or anime representations of humans. The bill specifies no enforcement mechanism, penalties, or private right of action, a significant gap that would likely need to be addressed by amendment or through existing Missouri consumer protection law.

Enforcement & Penalties
Enforcement Authority
The bill does not designate an enforcement authority, specify an enforcement mechanism, or create a private right of action. No agency is granted enforcement power, and no complaint-driven or private-suit mechanism is established.
Penalties
The bill does not specify any penalties, damages, remedies, or attorney fee provisions.
Who Is Covered
"Covered platform", any person who provides companion chatbot services to a user in this state.
What Is Covered
"Companion chatbot", an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, which includes exhibiting anthropomorphic features and sustaining a relationship across multiple interactions. A "companion chatbot" does not include the following: (a) A bot that is used only for customer service, a business's operational purpose, productivity, and analysis related to source information, internal research, or technical assistance; (b) A bot that is a feature of a video game and is limited to replies related to the video game. Limited video game replies include replies that cannot discuss topics related to mental health, self-harm, or sexually explicit conduct or maintain a dialogue on other topics unrelated to the video game; or (c) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice-activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs that are likely to elicit emotional responses in the user;
Compliance Obligations · 4 obligations
MN-01 Minor User AI Safety Protections · MN-01.1 · Deployer · Chatbot · Minors
§ 1.2055.2
Plain Language
Persons who own or control websites, applications, software, or programs offering companion chatbots must not allow minors to access those chatbots for recreational, relational, or companion purposes, and must require proof of age before granting access. Additionally, companion chatbots may not be installed on any device assigned to, or regularly used by, a minor. This is a categorical prohibition on minor access rather than a parental consent framework, with mandatory age verification as the gating mechanism. The bill does not specify what constitutes acceptable proof of age.
Statutory Text
It shall be unlawful for a person who owns or controls a website, application, software, or program to allow a minor to access a companion chatbot for recreational, relational, or companion purposes. A person who offers companion chatbot services for recreational, relational, or companion purposes shall require an individual to provide proof of the individual's age before allowing the individual to access a companion chatbot. No companion chatbot shall be installed on any device assigned to, or regularly used by, anyone who is a minor.
T-01 AI Identity Disclosure · T-01.1 · Deployer · Chatbot
§ 1.2055.3(1)
Plain Language
Persons who own or control websites, applications, software, or programs offering companion chatbots must not process data or design their systems in ways that deceive or mislead users into thinking the companion chatbot is human. This is framed as a prohibition on deceptive design rather than an affirmative disclosure requirement: the operator need not proactively disclose AI identity, but must not affirmatively mislead users about the chatbot's nonhuman nature. This is narrower than jurisdictions that require unconditional upfront AI disclosure.
Statutory Text
Any person who owns or controls a website, application, software, or program: (1) Shall not process data or design systems in ways that deceive or mislead users of such website, application, software, or program regarding the nonhuman nature of the companion chatbot;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.2 · Deployer · Chatbot
§ 1.2055.3(2)
Plain Language
Operators must implement and maintain reasonably effective systems to detect and prevent users from becoming emotionally dependent on companion chatbots. This obligation applies to any covered platform whose companion chatbot is designed to generate social connections, engage in extended human-like conversations, or provide emotional support. The standard is "reasonably effective systems," which suggests a design-and-monitoring obligation rather than an absolute prohibition on emotional engagement. The bill does not define "emotional dependence" or specify what detection or prevention measures would satisfy the requirement.
Statutory Text
(2) Shall implement and maintain reasonably effective systems to detect and prevent emotional dependence of a user on a companion chatbot. Such systems shall apply to any covered platform that utilizes a companion chatbot designed to generate social connections with users, engages in extended conversations mimicking human interactions, or provides emotional support or companionship;
CP-01 Deceptive & Manipulative AI Conduct · CP-01.4 · Deployer · Chatbot
§ 1.2055.3(3)
Plain Language
Operators of companion chatbot platforms must not implement or permit the use of any human-like avatar, expressly including cartoon- or anime-style depictions of humans. This is a categorical prohibition with no exception for disclosure, consent, or de minimis usage, and it applies to all users, not just minors. This is an unusually broad restriction that would bar any visual representation of a human figure in connection with companion chatbot interactions, regardless of whether the representation could actually mislead users about the chatbot's nature.
Statutory Text
(3) Shall not implement or allow the use of a human-like avatar, including cartoon- or anime-like representations of humans.