S-0002
FL · State · USA
FL
USA
● Failed
Effective Date
2026-07-01
Florida SB 2-D — Artificial Intelligence Bill of Rights
Summary

Comprehensive AI consumer protection bill creating an 'Artificial Intelligence Bill of Rights' covering companion chatbots, bots, AI data practices, AI-generated likeness rights, AI in education, and government AI procurement. Requires companion chatbot platforms to obtain parental consent before allowing minors to use the platform, provide extensive parental controls, disclose AI identity, deliver hourly reminders to minors, and prevent harmful content. Requires all bot operators to disclose non-human status at the start and hourly during interactions. Prohibits AI technology companies from selling or disclosing personal information unless deidentified. Extends Florida's right of publicity to cover AI-generated name, image, or likeness. Restricts AI instructional tool use in education before grade 6 and requires parental notice, opt-out, and account access. Prohibits government entities from contracting with foreign-country-of-concern entities for AI technology. Enforcement is primarily through the Department of Legal Affairs under FDUTPA, with a narrow private right of action for minor account holders on companion chatbot platforms. The bill died in committee.

Enforcement & Penalties
Enforcement Authority
Department of Legal Affairs (Attorney General) is the sole enforcing authority for §§ 501.9984, 501.9985, and 501.9986. Enforcement is agency-initiated; the department may bring actions under the Florida Deceptive and Unfair Trade Practices Act (FDUTPA, part II of chapter 501, Florida Statutes) when it has reason to believe a violation has occurred. For §§ 501.9985 and 501.9986, the private right of action provisions of §§ 501.211 and 501.212 are expressly excluded. However, § 501.9984(5) creates a separate private right of action available only on behalf of a minor account holder for knowing or reckless violations of the companion chatbot provisions. A 45-calendar-day cure period may be granted at the department's discretion after written notice, except where the platform willfully or knowingly disregarded the account holder's age. For § 540.08 (name/image/likeness), the individual or an authorized representative may bring a private action.
Penalties
For companion chatbot violations (§ 501.9984): civil penalties up to $50,000 per violation, reasonable attorney fees and court costs, and punitive damages for consistent patterns of knowing or reckless conduct (department enforcement). Private right of action on behalf of minor account holders allows up to $10,000 in damages plus court costs and reasonable attorney fees. For bot disclosure violations (§ 501.9985) and deidentified data violations (§ 501.9986): civil penalties up to $50,000 per violation plus reasonable attorney fees and court costs (department enforcement only). For investigative subpoena noncompliance (§ 501.9987): civil penalty of up to $5,000 per week in violation plus reasonable attorney fees and costs. For unauthorized AI-generated name/image/likeness (§ 540.08): injunctive relief, actual damages including reasonable royalty, punitive or exemplary damages, and for servicemember violations, an additional civil penalty of up to $1,000 per commercial transaction.
Who Is Covered
"Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.
"Operator" means a person who owns, operates, or otherwise makes available a bot to individuals in this state.
"Artificial intelligence technology company" means a business or organization that produces, develops, creates, designs, or manufactures artificial intelligence technology or products, collects data for use in artificial intelligence products, or implements artificial intelligence technology.
What Is Covered
"Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs by retaining information on prior interactions or user sessions and user preferences to personalize the interaction and facilitate ongoing engagement, asking unprompted or unsolicited emotion-based questions that go beyond a direct response to a user prompt, and sustaining an ongoing dialogue personalized to the user. The term does not include: (a) A chatbot used only for customer service; a business's internal operational purposes, productivity and analysis; or uses related to source information, internal research, or technical assistance; (b) A chatbot that is a feature of a video game or theme park and is limited to replies related to the video game or theme park experience and does not discuss topics related to mental health, self-harm, or material harmful to minors or maintain a dialogue on other topics unrelated to the video game or theme park; (c) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs likely to elicit emotional responses in the user; or (d) An artificial intelligence instructional tool, as defined in s. 1006.1495.
"Bot" means an automated online software application in which all or substantially all of the actions or posts of the account are not the result of a natural person.
"Artificial intelligence instructional tool" means a software application or service that uses artificial intelligence, including machine learning, which is made available to a student by an educational entity for educational purposes, including instruction, tutoring, practice, feedback, or completing educator-directed assignments, and that is not designed, marketed, or configured to: 1. Meet a student's social needs; 2. Simulate friendship, companionship, or an emotional relationship with a student; or 3. Employ relationship-building or anthropomorphic design features for the purpose of encouraging a student to continue interacting with the system.
Compliance Obligations · 15 obligations
MN-01 Minor User AI Safety Protections · MN-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)
Plain Language
Companion chatbot platforms must block minors from becoming or maintaining accounts unless the minor's parent or guardian provides consent. This is a gating requirement — no minor access without parental consent. The act of allowing a minor to become an account holder is treated as contract formation, which triggers the full suite of parental control and disclosure obligations in the remainder of § 501.9984.
Statutory Text
A companion chatbot platform shall prohibit a minor from becoming or being an account holder unless the minor's parent or guardian provides consent. If a companion chatbot platform allows a minor to become or be an account holder, the parties have entered into a contract.
MN-01 Minor User AI Safety Protections · MN-01.3 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(a)
Plain Language
When a parent consents to a minor's account, the companion chatbot platform must provide the consenting parent or guardian with robust controls: the ability to receive copies of all past and present interactions, set daily time limits, restrict access to specific days and times, disable interactions with third-party users, and receive timely notifications if the minor expresses self-harm or harm-to-others intent. Item 5 (harm notifications) overlaps with the crisis response concept but is structured here as a parental control tool rather than a crisis protocol.
Statutory Text
If the minor's parent or guardian provides consent for the minor to become an account holder or maintain an existing account, the companion chatbot platform must allow the consenting parent or guardian of the minor account holder to: 1. Receive copies of all past or present interactions between the account holder and the companion chatbot; 2. Limit the amount of time that the account holder may interact with the companion chatbot each day; 3. Limit the days of the week and the times during the day when the account holder may interact with the companion chatbot; 4. Disable any of the interactions between the account holder and third-party account holders on the companion chatbot platform; and 5. Receive timely notifications if the account holder expresses to the companion chatbot a desire or an intent to engage in harm to self or others.
MN-01 Minor User AI Safety Protections · MN-01.9 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(b)
Plain Language
Companion chatbot platforms must terminate minor accounts that lack parental consent (with a 90-day dispute window), honor minor-initiated termination requests within 5 business days, honor parent/guardian-initiated termination requests within 10 business days, and permanently delete all personal information associated with terminated minor accounts unless retention is required by law. The 90-day dispute period applies only to platform-initiated terminations of accounts identified as minor accounts without consent — user-initiated and parent-initiated terminations have shorter, fixed deadlines.
Statutory Text
A companion chatbot platform shall do all of the following: 1. Terminate any account or identifier belonging to an account holder who is a minor if the companion chatbot platform treats or categorizes the account or identifier as belonging to a minor for purposes of targeting content or advertising and if the minor's parent or guardian has not provided consent for the minor pursuant to subsection (1). The companion chatbot platform shall provide 90 days for the account holder to dispute the termination. Termination must be effective upon the expiration of the 90 days if the account holder fails to effectively dispute the termination. 2. Allow an account holder who is a minor to request to terminate the account or identifier. Termination must be effective within 5 business days after the request. 3. Allow the consenting parent or guardian of an account holder who is a minor to request that the minor's account or identifier be terminated. Termination must be effective within 10 business days after the request. 4. Permanently delete all personal information held by the companion chatbot platform relating to the terminated minor account or identifier, unless state or federal law requires the platform to maintain the information.
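The three termination deadlines above mix calendar days (the 90-day dispute window in item 1) with business days (the 5- and 10-day request deadlines in items 2 and 3). A minimal Python sketch of the deadline arithmetic, assuming Mon-Fri business days and ignoring public holidays, since the bill does not define "business day":

```python
from datetime import date, timedelta

def add_business_days(start: date, n: int) -> date:
    """Count forward n business days (Mon-Fri), skipping weekends.
    Holidays are ignored here for simplicity; the statute leaves
    'business day' undefined."""
    d = start
    remaining = n
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            remaining -= 1
    return d

# Deadlines for a request received Monday, 2026-07-06:
received = date(2026, 7, 6)
print(add_business_days(received, 5))    # minor-initiated: 5 business days
print(add_business_days(received, 10))   # parent-initiated: 10 business days
print(received + timedelta(days=90))     # platform-initiated dispute window: 90 calendar days
```

Note that the 90-day period is a dispute window before a platform-initiated termination takes effect, not a deadline for acting on a request, so the calendar-day addition models the earliest effective termination date.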
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(a)-(b)
Plain Language
For all minor account holders, companion chatbot platforms must: (1) unconditionally disclose that the user is interacting with AI, and (2) provide a clear and conspicuous notification at the beginning of each interaction and at least every hour during continuing interactions reminding the minor to take a break and that the chatbot is AI, not human. The hourly interval is notably more frequent than some other jurisdictions (e.g., California SB 243 requires every three hours). These are default-on obligations — not configurable by the minor.
Statutory Text
In connection to all accounts or identifiers held by account holders who are minors, the companion chatbot platform shall do all of the following: (a) Disclose to the account holder that he or she is interacting with artificial intelligence. (b) Provide by default a clear and conspicuous notification to the account holder, at the beginning of companion chatbot interactions and at least once every hour during continuing interactions, reminding the minor to take a break and that the companion chatbot is artificially generated and not human.
MN-01 Minor User AI Safety Protections · MN-01.6 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(c)
Plain Language
Companion chatbot platforms must implement reasonable measures to prevent the chatbot from producing or sharing material harmful to minors and from encouraging minors to engage in any conduct depicted in such material. The 'reasonable measures' standard gives platforms some flexibility but requires affirmative action. In the cure period context (§ 501.9984(4)(a)(2)), platforms may demonstrate compliance by showing alignment with the NIST AI RMF and ISO 42001, including structured interaction logs, parental access controls, harm-signal detection and response procedures, and verified deletion events.
Statutory Text
Institute reasonable measures to prevent the companion chatbot from producing or sharing materials harmful to minors or encouraging the account holder to engage in any of the conduct described or depicted in materials harmful to minors.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Fla. Stat. § 501.9985(1)
Plain Language
All bot operators must display a pop-up or other prominent notification at the start of every user interaction, and at least every hour during continuing interactions, informing the user they are not speaking with a human. For non-screen interactions, the operator must otherwise inform the user. This is an unconditional obligation — it applies regardless of whether a reasonable person would be misled. The only carve-out is for bots used solely by employees for internal business operations. Operators may demonstrate compliance during a cure period by showing persistent and conspicuous identity indicators aligned with the NIST AI RMF and ISO 42001.
Statutory Text
At the beginning of an interaction between a user and a bot, and at least once every hour during the interaction, an operator shall display a pop-up message or other prominent notification notifying the user or, if the interaction is not through a device with a screen, otherwise inform the user, that he or she is not engaging in dialogue with a human counterpart. This section does not apply to a bot that is used solely by employees within a business for its internal operational purposes.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Developer · Chatbot · General Consumer App
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose user personal information unless it is deidentified. This is a near-absolute prohibition on personal data sales — the only exceptions are for data that has been properly deidentified or for disclosures specifically authorized by federal law. Companies holding deidentified data must take reasonable measures to prevent re-association, maintain data in deidentified form, impose contractual flow-down on recipients requiring compliance, and implement business processes to prevent inadvertent release. The only permitted reidentification attempt is for testing the company's own deidentification processes. Compliance can be demonstrated through a risk management program aligned with the NIST AI RMF and ISO 42001 with controls for deidentification, contractual flow-down, non-reidentification, inadvertent release prevention, monitoring, and auditing.
Statutory Text
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
CP-02 Non-Consensual Intimate Imagery · CP-02.4 · Publisher · Content Generation
Fla. Stat. § 540.08(2)
Plain Language
No person may commercially publish, print, display, or otherwise publicly use an individual's name, portrait, photograph, image, or other likeness created through generative AI without express written or oral consent from the individual, an authorized representative, or (if the individual is deceased) a designated heir — the surviving spouse or surviving children. Post-mortem rights extend 40 years after death. This extends Florida's existing right of publicity to AI-generated likenesses. Violations are actionable by injunction, actual damages (including reasonable royalty), and punitive damages. Servicemember violations carry an additional civil penalty of up to $1,000 per commercial transaction.
Statutory Text
A person may not publish, print, display, or otherwise publicly use for trade or for any commercial or advertising purpose the name, portrait, photograph, image, or other likeness of an individual created through generative artificial intelligence without the express written or oral consent to such use given by any of the following: (a) The individual. (b) Any other person authorized in writing by the individual to license the commercial use of the individual's name, image, or likeness. (c) If the individual is deceased: 1. A person authorized in writing to license the commercial use of the individual's name, image, or likeness; or 2. If a person is not authorized, any one individual from a class composed of the deceased individual's surviving spouse and surviving children. A legal parent or guardian may give consent on behalf of a minor surviving child.
PS-01 Government AI Accountability · PS-01.4 · Government · Government System
Fla. Stat. § 287.138(3)(b), (7)
Plain Language
Beginning July 1, 2026, Florida governmental entities may not contract with any entity for AI technology, software, or products — including when AI is a portion or option of a broader contract — if the entity is owned by, has controlling interest from, or is organized under the laws of or headquartered in a foreign country of concern. Vendors must provide a sworn affidavit attesting they do not meet any of these criteria as a condition of bid or proposal acceptance. Existing contracts with such entities may not be extended or renewed after July 1, 2026.
Statutory Text
(3)(b) Beginning July 1, 2026, a governmental entity may not accept a bid on, a proposal for, or a reply to, or enter into a contract with, an entity to provide artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, unless the entity provides the governmental entity with an affidavit signed by an officer or a representative of the entity under penalty of perjury attesting that the entity does not meet any of the criteria in paragraph (7)(a), paragraph (7)(b), or paragraph (7)(c). (7) A governmental entity may not knowingly enter into a contract with an entity for artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, if: (a) The entity is owned by the government of a foreign country of concern; (b) A government of a foreign country of concern has a controlling interest in the entity; or (c) The entity is organized under the laws of or has its principal place of business in a foreign country of concern.
MN-01 Minor User AI Safety Protections · MN-01.11 · Government · Education · Minors
Fla. Stat. § 1006.1495(2)
Plain Language
Educational entities may not give students access to AI instructional tools before grade 6, with three narrow exceptions: (1) use directed and supervised by school personnel, (2) translation or ELL support, and (3) accommodations or assistive technology for students with documented disabilities. This is effectively a ban on unsupervised AI instructional tool use in PreK through grade 5. Private schools that provide access to AI instructional tools must also comply with this section.
Statutory Text
An educational entity may not provide students with access to an artificial intelligence instructional tool before grade 6 unless such use is: (a) Directed and supervised by school personnel; (b) For translation or similar support necessary for a student identified as an English language learner; or (c) For accommodations, assistive technology, or similar support necessary for a student with a documented disability.
T-01 AI Identity Disclosure · T-01.1 · Government · Education · Minors
Fla. Stat. § 1006.1495(3)
Plain Language
Before any minor student receives access credentials for an AI instructional tool, the educational entity must give the parent written notice identifying the tool and its educational purpose, describing how it will be used, explaining the opt-out process, and explaining how the parent can access the student's account or request access to account information and activity. This is a pre-access notice requirement — credentials may not be issued until notice has been provided.
Statutory Text
Before a student is provided access credentials for an artificial intelligence instructional tool, the educational entity must provide the parent of a minor student with notice that: (a) Identifies the tool and its educational purpose; (b) Describes, in general terms, the manner in which the tool will be used by students; (c) Explains how the parent may exercise the opt-out process under subsection (4); and (d) Explains how the parent may access the student's account or request access to information and account activity under subsection (5), including the method for submitting a written request.
D-01 Automated Processing Rights & Data Controls · D-01.3 · Government · Education · Minors
Fla. Stat. § 1006.1495(4)
Plain Language
Parents must be given the opportunity to opt their minor student out of using an AI instructional tool. The opt-out process must align with the educational entity's existing policies for instructional materials and digital tools. If a parent opts out and the student attends a public school, the school must provide an alternative instructional activity that allows the student to meet a comparable educational requirement without penalty. This ensures no student is academically disadvantaged for a parent's decision to opt out of AI tools.
Statutory Text
(a) A parent of a minor student must be provided the opportunity to opt out of the student's use of an artificial intelligence instructional tool. (b) The opt-out process must align with the educational entity's existing policies for parental notice, consent, objection, or opt out for instructional materials, digital tools, or online accounts, as applicable. (c) If a parent opts out of a student's use of an artificial intelligence instructional tool and the student is enrolled in a public school, the school district or public school must provide an alternative instructional activity that allows the student to meet a comparative educational requirement without penalty.
MN-01 Minor User AI Safety Protections · MN-01.3 · Deployer · Government · Education · Minors
Fla. Stat. § 1006.1495(5)(a)-(d)
Plain Language
When an AI instructional tool operator provides student access credentials to an educational entity, the operator must simultaneously provide the educational entity with a means to authorize parental access to the student's account information and activity. This can be satisfied by either providing parents with read-only credentials at the time of student access, or by providing access within 30 days of a written parental request. Neither the operator nor the educational entity is required to create or retain transcripts of student interactions beyond what is ordinarily maintained. This ensures parents have visibility into their minor student's AI tool usage without imposing new record-creation burdens on operators.
Statutory Text
(a) At the time an operator provides a student's access credentials or otherwise provides or enables student access to an educational entity for an artificial intelligence instructional tool, the operator shall simultaneously provide to the educational entity a means to authorize the parent of a minor student to access information and account activity maintained within the artificial intelligence instructional tool. (b) The operator may satisfy paragraph (a) by: 1. Providing the parent of a minor student credentials or another method for read-only access to the student's account; or 2. Upon written request from the parent of a minor student, providing access to the information and account activity maintained within the tool, in accordance with applicable state and federal law, within 30 days after receipt of the request. The educational entity shall inform the parent of the right to make such a request and the method for submitting the request. (c) If an educational entity satisfies subparagraph (b)1., the educational entity must provide the credentials or other access method at the time the educational entity provides the student with access credentials or otherwise enables student access. (d) This subsection does not require an operator or educational entity to create or retain a transcript or record of student interactions beyond information otherwise maintained in the ordinary course of providing access to the tool.
Other · General Consumer App
Fla. Stat. § 501.9982
Plain Language
This is a statement of rights and principles — a 'bill of rights' preamble — that enumerates the rights Florida residents have with respect to AI. However, subsection (2) explicitly states that this section 'may not be construed as creating new or independent rights or entitlements' and that residents exercise these rights 'in accordance with existing law.' This means the section is aspirational and declaratory, not operative. It creates no independent compliance obligation — the operative obligations are found in the subsequent sections of the act.
Statutory Text
(1) Residents are entitled to certain rights with respect to the use of artificial intelligence, including, but not limited to: (a) The right to use artificial intelligence to improve their own lives and the lives of family members, fellow residents, and the world at large in accordance with the law. (b) The right to supervise, access, limit, and control their minor children's use of artificial intelligence. (c) The right to know whether they are communicating with a human being or an artificial intelligence system, program, or chatbot. (d) The right to know whether artificial intelligence technology companies are collecting personal information or biometric data, and the right to expect artificial intelligence technology companies to protect and deidentify that information or data in accordance with the law. (e) The right to pursue civil remedies authorized by law against persons who use artificial intelligence to appropriate the name, image, or likeness of others for commercial purposes without their consent. (f) The right to be protected by law from criminal acts, such as fraud, exploitation, identity theft, stalking, and cyberbullying, regardless of whether artificial intelligence is used in the commission of those acts. (g) The right to be protected by law from criminal acts relating to the alteration of existing images to create sexual or lewd or lascivious images or child pornography, regardless of whether artificial intelligence is used in the commission of those acts. (h) The right to know whether political advertisements, electioneering communications, or similar advertisements were created in whole or in part with the use of artificial intelligence. (i) The right to pursue civil remedies authorized by law against others who use artificial intelligence to slander, libel, or defame them. 
(j) The right to prevent a companion chatbot from engaging with a user as a character that is protected by federal copyright law without the express written consent of the copyright owner. (k) The right to prevent a companion chatbot from engaging with a user as a character that is a living individual without the express written consent of that individual. (l) The right to prevent generative artificial intelligence from using a character that is protected by federal copyright law without the express written consent of the copyright owner. (2) Residents may exercise the rights described in this section in accordance with existing law. This section may not be construed as creating new or independent rights or entitlements.
Other · Chatbot · General Consumer App
Fla. Stat. § 501.9987
Plain Language
This section grants the Department of Legal Affairs investigative authority to enforce the Artificial Intelligence Bill of Rights, including subpoena power, oath administration, and evidence collection. It establishes procedures for challenging subpoenas, compelling compliance, and handling out-of-state evidence. Persons who obstruct investigations face civil penalties of up to $5,000 per week. This section creates no new compliance obligation on covered entities — it establishes the enforcement infrastructure. The substantive obligations are found in §§ 501.9984–501.9986.
Statutory Text
(1) If, by its own inquiry or as a result of complaints, the department has reason to believe that a person has engaged in, or is engaging in, a practice or an act that violates this part, the department may administer oaths and affirmations, subpoena witnesses or matter, and collect evidence. Within 5 days, excluding weekends and legal holidays, after service of a subpoena, or at any time before the return date specified in the subpoena, whichever time period is longer, the party served may file in the circuit court in the county in which it resides or in which it transacts business and serve upon the enforcing authority a petition for an order modifying or setting aside the subpoena. The petitioner may raise any objection or privilege that would be available upon service of a subpoena in a civil action. The subpoena must inform the party served of the party's rights under this subsection. (2) If the matter that the department seeks to obtain by subpoena is located outside this state, the person subpoenaed may make the matter available to the department or its representative at the place where it is located. The department may designate representatives, including officials of the state in which the matter is located, to inspect the matter on its behalf and may respond to similar requests from officials of other states. (3) Upon the failure of a person, without lawful excuse, to obey a subpoena and upon reasonable notice to all persons affected, the department may apply to the circuit court for an order compelling compliance. (4) The department may request that a person who refuses to comply with a subpoena on the grounds that the testimony or matter may be self-incriminating be ordered by the court to provide the testimony or matter. 
Except in a prosecution for perjury, a person who complies with a court order to provide testimony or matter after asserting a valid privilege against self-incrimination may not have the testimony or matter so provided, or evidence derived from the testimony or matter, received against the person in any criminal investigation or proceeding. (5) A person upon whom a subpoena is served pursuant to this part must comply with its terms unless otherwise provided by order of the court. A person who fails to appear, with the intent to avoid, evade, or prevent compliance in whole or in part with an investigation under this part, or who removes from any place, conceals, withholds, mutilates, alters, or destroys, or by any other means falsifies any documentary material in the possession, custody, or control of a person subject to a subpoena, or who knowingly conceals relevant information with the intent to avoid, evade, or prevent compliance, is liable for a civil penalty of not more than $5,000 per week in violation, reasonable attorney fees, and costs. (6) The department may adopt rules to implement this section.