S-0482
FL · State · USA
● Pending
Proposed Effective Date
2026-07-01
Florida CS for SB 482 — Artificial Intelligence Bill of Rights
Summary

Comprehensive AI consumer protection bill creating the Florida 'Artificial Intelligence Bill of Rights.' Imposes obligations on companion chatbot platforms regarding minor users, including parental consent requirements, parental controls, hourly break reminders, and restrictions on harmful content. Requires all bot operators to disclose AI identity at the start of and hourly during interactions. Prohibits AI technology companies from selling or disclosing user personal information unless deidentified. Extends Florida's right of publicity statute to cover AI-generated likenesses. Restricts AI instructional tool use for students below grade 6 and requires parental notice, opt-out, and account access. Prohibits government contracting with foreign-country-of-concern-affiliated AI vendors. Enforced primarily by the Department of Legal Affairs under the FDUTPA framework with civil penalties up to $50,000 per violation, with a limited private right of action for minor account holders on companion chatbot platforms.

Enforcement & Penalties
Enforcement Authority
The Department of Legal Affairs (Attorney General) is the sole enforcing authority for the companion chatbot (§ 501.9984), bot disclosure (§ 501.9985), and deidentified data (§ 501.9986) provisions. Enforcement is agency-initiated; the department may bring actions under the Florida Deceptive and Unfair Trade Practices Act (Part II, Ch. 501). Sections 501.211 and 501.212 (which provide private enforcement under FDUTPA) are expressly excluded for the bot disclosure and deidentified data provisions. For companion chatbot violations, a private right of action exists on behalf of a minor account holder for knowing or reckless violations (§ 501.9984(5)). The department may grant a 45-calendar-day cure period after written notice of an alleged violation; if the violation is cured to the department's satisfaction, the department may not bring an action but may issue a letter of guidance foreclosing future cure periods. The 45-day cure period does not apply where the platform willfully or knowingly disregarded the account holder's age. For the right of publicity provisions (§ 540.08), the individual or an authorized person may bring a private action. The department has investigative subpoena authority under § 501.9987.
Penalties
Companion chatbot provisions (§ 501.9984): The department may collect civil penalties up to $50,000 per violation plus reasonable attorney fees and court costs. Punitive damages may be assessed for a consistent pattern of knowing or reckless conduct. The private right of action on behalf of minor account holders allows up to $10,000 in damages plus court costs and reasonable attorney fees for knowing or reckless violations.
Bot disclosure provisions (§ 501.9985): Civil penalty up to $50,000 per violation plus reasonable attorney fees and court costs; no private right of action.
Deidentified data provisions (§ 501.9986): Civil penalty up to $50,000 per violation plus reasonable attorney fees and court costs; no private right of action.
Investigative subpoena noncompliance (§ 501.9987): Civil penalty up to $5,000 per week in violation plus reasonable attorney fees and costs.
Right of publicity (§ 540.08): Injunctive relief, damages for loss or injury including a reasonable royalty, and punitive or exemplary damages.
Servicemember violations: Additional civil penalty up to $1,000 per violation per commercial transaction.
Who Is Covered
"Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.
"Operator" means a person who owns, operates, or otherwise makes available a bot to individuals in this state.
"Artificial intelligence technology company" means a business or organization that produces, develops, creates, designs, or manufactures artificial intelligence technology or products, collects data for use in artificial intelligence products, or implements artificial intelligence technology.
What Is Covered
"Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs by retaining information on prior interactions or user sessions and user preferences to personalize the interaction and facilitate ongoing engagement, asking unprompted or unsolicited emotion-based questions that go beyond a direct response to a user prompt, and sustaining an ongoing dialogue personalized to the user. The term does not include: (a) A chatbot used only for customer service; a business's internal operational purposes, productivity and analysis; or uses related to source information, internal research, or technical assistance; (b) A chatbot that is a feature of a video game or theme park and is limited to replies related to the video game or theme park experience and does not discuss topics related to mental health, self-harm, or material harmful to minors or maintain a dialogue on other topics unrelated to the video game or theme park; (c) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs likely to elicit emotional responses in the user; or (d) An artificial intelligence instructional tool, as defined in s. 1006.1495.
"Bot" means an automated online software application in which all or substantially all of the actions or posts of the account are not the result of a natural person.
"Artificial intelligence instructional tool" means a software application or service that uses artificial intelligence, including machine learning, which is made available to a student by an educational entity for educational purposes, including instruction, tutoring, practice, feedback, or completing educator-directed assignments, and that is not designed, marketed, or configured to: 1. Meet a student's social needs; 2. Simulate friendship, companionship, or an emotional relationship with a student; or 3. Employ relationship-building or anthropomorphic design features for the purpose of encouraging a student to continue interacting with the system.
Compliance Obligations (14 obligations)
MN-01 Minor User AI Safety Protections · MN-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)
Plain Language
Companion chatbot platforms must block minors (17 and under) from creating or maintaining accounts unless the minor's parent or guardian provides consent. When consent is given and a minor is permitted to hold an account, a contractual relationship is deemed to exist between the platform and the minor. This is a gatekeeping obligation — platforms must verify or determine minor status and obtain parental consent before permitting access.
Statutory Text
A companion chatbot platform shall prohibit a minor from becoming or being an account holder unless the minor's parent or guardian provides consent. If a companion chatbot platform allows a minor to become or be an account holder, the parties have entered into a contract.
MN-01 Minor User AI Safety Protections · MN-01.3 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(a)
Plain Language
When a parent or guardian consents to a minor holding a companion chatbot account, the platform must provide the consenting parent or guardian with a suite of parental control tools: (1) access to all past and present interaction transcripts, (2) daily time limits, (3) day-of-week and time-of-day access restrictions, (4) the ability to disable interactions with third-party account holders, and (5) timely notifications when the minor expresses self-harm or harm-to-others intent. These are not optional features — all five must be made available to the consenting parent or guardian.
Statutory Text
If the minor's parent or guardian provides consent for the minor to become an account holder or maintain an existing account, the companion chatbot platform must allow the consenting parent or guardian of the minor account holder to: 1. Receive copies of all past or present interactions between the account holder and the companion chatbot; 2. Limit the amount of time that the account holder may interact with the companion chatbot each day; 3. Limit the days of the week and the times during the day when the account holder may interact with the companion chatbot; 4. Disable any of the interactions between the account holder and third-party account holders on the companion chatbot platform; and 5. Receive timely notifications if the account holder expresses to the companion chatbot a desire or an intent to engage in harm to self or others.
MN-01 Minor User AI Safety Protections · MN-01.9 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(b)
Plain Language
Companion chatbot platforms must terminate minor accounts lacking parental consent (with a 90-day dispute window), honor minor self-initiated termination requests within 5 business days, and honor parent/guardian-initiated termination requests within 10 business days. Upon termination, all personal information associated with the minor's account must be permanently deleted unless retention is required by law. The platform must proactively identify and terminate unconsented minor accounts that it already treats as belonging to minors for content or advertising targeting purposes.
Statutory Text
A companion chatbot platform shall do all of the following: 1. Terminate any account or identifier belonging to an account holder who is a minor if the companion chatbot platform treats or categorizes the account or identifier as belonging to a minor for purposes of targeting content or advertising and if the minor's parent or guardian has not provided consent for the minor pursuant to subsection (1). The companion chatbot platform shall provide 90 days for the account holder to dispute the termination. Termination must be effective upon the expiration of the 90 days if the account holder fails to effectively dispute the termination. 2. Allow an account holder who is a minor to request to terminate the account or identifier. Termination must be effective within 5 business days after the request. 3. Allow the consenting parent or guardian of an account holder who is a minor to request that the minor's account or identifier be terminated. Termination must be effective within 10 business days after the request. 4. Permanently delete all personal information held by the companion chatbot platform relating to the terminated minor account or identifier, unless state or federal law requires the platform to maintain the information.
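Note that the deadlines above mix calendar days (the 90-day dispute window) with business days (the 5- and 10-day effectiveness windows). A minimal sketch of the deadline arithmetic an implementer might use, assuming business days exclude only weekends (the statute does not define "business day", and this ignores legal holidays, which a production system should confirm with counsel):

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance `days` business days from `start`, skipping weekends only."""
    current, remaining = start, days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday-Friday
            remaining -= 1
    return current

def termination_deadlines(request_date: date) -> dict:
    """Deadlines per Fla. Stat. § 501.9984(1)(b) for a request received on `request_date`."""
    return {
        # (b)1.: unconsented minor account -- 90 calendar days to dispute
        "dispute_window_ends": request_date + timedelta(days=90),
        # (b)2.: minor-initiated termination -- effective within 5 business days
        "minor_request_effective_by": add_business_days(request_date, 5),
        # (b)3.: parent/guardian-initiated -- effective within 10 business days
        "parent_request_effective_by": add_business_days(request_date, 10),
    }
```

For a request received Friday, 2026-07-03, this yields a minor-request deadline of 2026-07-10 and a parent-request deadline of 2026-07-17, with the dispute window running to 2026-10-01.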
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(a)-(b)
Plain Language
For all minor account holders, companion chatbot platforms must: (1) unconditionally disclose that the user is interacting with AI, and (2) provide a clear and conspicuous notification at the beginning of each interaction and at least every hour during continuing interactions reminding the minor to take a break and that the chatbot is AI-generated, not human. The hourly notification is a default setting — it applies automatically without requiring the minor or parent to enable it. Compare to CA SB 243's three-hour interval; Florida's one-hour interval is more frequent.
Statutory Text
In connection to all accounts or identifiers held by account holders who are minors, the companion chatbot platform shall do all of the following: (a) Disclose to the account holder that he or she is interacting with artificial intelligence. (b) Provide by default a clear and conspicuous notification to the account holder, at the beginning of companion chatbot interactions and at least once every hour during continuing interactions, reminding the minor to take a break and that the companion chatbot is artificially generated and not human.
S-02 Prohibited Conduct & Output Restrictions · S-02.6 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(c)
Plain Language
Companion chatbot platforms must implement reasonable measures to prevent their chatbots from producing or sharing material harmful to minors, and from encouraging minor account holders to engage in conduct described in such material, when interacting with minor accounts. The standard is 'reasonable measures' — not an absolute prohibition — and a platform may demonstrate compliance by showing controls aligned with NIST AI RMF and ISO 42001 (per the cure provision in § 501.9984(4)(a)(2)). 'Material harmful to minors' is defined by cross-reference to Fla. Stat. § 501.1737(1), which covers content that appeals to prurient interest, is patently offensive, and lacks serious value for minors.
Statutory Text
Institute reasonable measures to prevent the companion chatbot from producing or sharing materials harmful to minors or encouraging the account holder to engage in any of the conduct described or depicted in materials harmful to minors.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Fla. Stat. § 501.9985(1)
Plain Language
All bot operators must display a pop-up or other prominent notification at the start of every user interaction, and at least hourly during continuing interactions, informing the user they are not speaking with a human. For non-screen interactions (e.g., voice), the operator must otherwise inform the user. This applies to all bots — not just companion chatbots — and to all users regardless of age. The only exemption is bots used solely by employees for internal business operations. Operators may demonstrate compliance during a cure period by showing persistent and conspicuous identity indicators conforming to NIST AI RMF and ISO 42001.
Statutory Text
At the beginning of an interaction between a user and a bot, and at least once every hour during the interaction, an operator shall display a pop-up message or other prominent notification notifying the user or, if the interaction is not through a device with a screen, otherwise inform the user, that he or she is not engaging in dialogue with a human counterpart. This section does not apply to a bot that is used solely by employees within a business for its internal operational purposes.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Developer · Chatbot · General Consumer App
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose user personal information unless it has been deidentified — meaning it cannot reasonably be linked to an identified or identifiable individual or their device. Sales or disclosures specifically authorized by federal law are exempt. When holding deidentified data, the company must: (1) take reasonable measures to prevent re-association with users, (2) maintain and use data only in deidentified form (reidentification is permitted only to test deidentification processes), (3) contractually bind data recipients to these same requirements, and (4) implement business processes to prevent inadvertent release. A company may demonstrate compliance during a cure period by showing a risk management program validated against NIST AI RMF / ISO 42001 with assessed controls for deidentification, contractual flow-down, non-reidentification, release prevention, monitoring, and auditing.
Statutory Text
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
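The prohibition above turns on whether a record is deidentified before it leaves the company. A minimal sketch of a pre-disclosure gate that strips direct identifiers (the field list is an illustrative assumption; real deidentification also requires re-linkage risk analysis against the remaining fields, which this does not attempt):

```python
# Direct identifiers that must not appear in a disclosed record.
# Illustrative only: an actual program would be driven by a data
# inventory and validated against a deidentification standard.
DIRECT_IDENTIFIERS = {
    "name", "email", "phone", "street_address",
    "device_id", "ip_address", "user_id",
}

def deidentify(record: dict) -> dict:
    """Return a copy of `record` with direct identifier fields removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

def safe_to_disclose(record: dict) -> bool:
    """Gate check for § 501.9986(1): no direct identifier fields remain."""
    return not (record.keys() & DIRECT_IDENTIFIERS)
```

A gate like this addresses only subsection (1); the subsection (2) duties (contractual flow-down to recipients, the non-reidentification rule, and release-prevention processes) are organizational controls that sit around the code, not inside it.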
CP-02 Non-Consensual Intimate Imagery · CP-02.4 · Deployer · Developer · Content Generation
Fla. Stat. § 540.08(2)
Plain Language
No person may commercially publish, print, display, or otherwise publicly use an individual's name, portrait, photograph, image, or other likeness created through generative AI without express written or oral consent from the individual or a person the individual has authorized in writing, or, if the individual is deceased, from a person so authorized or from the surviving spouse or a surviving child. This extends Florida's existing right of publicity to AI-generated likenesses. Post-mortem rights apply for 40 years after death (per § 540.08(7)). A news media exemption exists for bona fide news reports that acknowledge the speculative authenticity of AI-generated materials. Violations involving servicemembers carry an additional civil penalty of up to $1,000 per commercial transaction.
Statutory Text
A person may not publish, print, display, or otherwise publicly use for trade or for any commercial or advertising purpose the name, portrait, photograph, image, or other likeness of an individual created through generative artificial intelligence without the express written or oral consent to such use given by any of the following: (a) The individual. (b) Any other person authorized in writing by the individual to license the commercial use of the individual's name, image, or likeness. (c) If the individual is deceased: 1. A person authorized in writing to license the commercial use of the individual's name, image, or likeness; or 2. If a person is not authorized, any one individual from a class composed of the deceased individual's surviving spouse and surviving children. A legal parent or guardian may give consent on behalf of a minor surviving child.
PS-01 Government AI Accountability · PS-01.4 · Government · Government System
Fla. Stat. § 287.138(3)(b), (7)
Plain Language
Beginning July 1, 2026, Florida governmental entities may not contract with AI technology vendors that are owned by, controlled by, or organized under the laws of a foreign country of concern. Before accepting any bid or entering a contract for AI technology, software, or products, the government entity must obtain a sworn affidavit from the vendor attesting it has no such foreign-country-of-concern ties. This extends Florida's existing foreign-country-of-concern contracting prohibitions to the AI procurement context. Vendors selling AI to Florida government agencies must be prepared to execute the required affidavit.
Statutory Text
(3)(b) Beginning July 1, 2026, a governmental entity may not accept a bid on, a proposal for, or a reply to, or enter into a contract with, an entity to provide artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, unless the entity provides the governmental entity with an affidavit signed by an officer or a representative of the entity under penalty of perjury attesting that the entity does not meet any of the criteria in paragraph (7)(a), paragraph (7)(b), or paragraph (7)(c). (7) A governmental entity may not knowingly enter into a contract with an entity for artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, if: (a) The entity is owned by the government of a foreign country of concern; (b) A government of a foreign country of concern has a controlling interest in the entity; or (c) The entity is organized under the laws of or has its principal place of business in a foreign country of concern.
Other · Chatbot · General Consumer App
Fla. Stat. § 501.9982
Plain Language
This section enumerates a set of aspirational rights for Florida residents regarding AI — including the right to know when interacting with AI, data protection expectations, right of publicity protections, and political ad transparency. However, subsection (2) expressly states that these rights are exercisable only under existing law and the section does not create new or independent rights. This is a policy declaration that contextualizes the bill's operative provisions but does not itself impose any compliance obligation.
Statutory Text
(1) Residents are entitled to certain rights with respect to the use of artificial intelligence, including, but not limited to: (a) The right to use artificial intelligence to improve their own lives and the lives of family members, fellow residents, and the world at large in accordance with the law. (b) The right to supervise, access, limit, and control their minor children's use of artificial intelligence. (c) The right to know whether they are communicating with a human being or an artificial intelligence system, program, or chatbot. (d) The right to know whether artificial intelligence technology companies are collecting personal information or biometric data, and the right to expect artificial intelligence technology companies to protect and deidentify that information or data in accordance with the law. (e) The right to pursue civil remedies authorized by law against persons who use artificial intelligence to appropriate the name, image, or likeness of others for commercial purposes without their consent. (f) The right to be protected by law from criminal acts, such as fraud, exploitation, identity theft, stalking, and cyberbullying, regardless of whether artificial intelligence is used in the commission of those acts. (g) The right to be protected by law from criminal acts relating to the alteration of existing images to create sexual or lewd or lascivious images or child pornography, regardless of whether artificial intelligence is used in the commission of those acts. (h) The right to know whether political advertisements, electioneering communications, or similar advertisements were created in whole or in part with the use of artificial intelligence. (i) The right to pursue civil remedies authorized by law against others who use artificial intelligence to slander, libel, or defame them. 
(j) The right to prevent a companion chatbot from engaging with a user as a character that is protected by federal copyright law without the express written consent of the copyright owner. (k) The right to prevent a companion chatbot from engaging with a user as a character that is a living individual without the express written consent of that individual. (l) The right to prevent generative artificial intelligence from using a character that is protected by federal copyright law without the express written consent of the copyright owner. (2) Residents may exercise the rights described in this section in accordance with existing law. This section may not be construed as creating new or independent rights or entitlements.
Other · Government · Education
Fla. Stat. § 1006.1495(2)
Plain Language
Educational entities — including school districts, public schools, and private schools — may not provide students below grade 6 with access to AI instructional tools unless the use is: (1) directed and supervised by school personnel, (2) for translation support for English language learners, or (3) for accommodations or assistive technology for students with documented disabilities. This is a blanket restriction on unsupervised AI tool deployment for younger students, with narrow exceptions for supervised use and accessibility needs.
Statutory Text
An educational entity may not provide students with access to an artificial intelligence instructional tool before grade 6 unless such use is: (a) Directed and supervised by school personnel; (b) For translation or similar support necessary for a student identified as an English language learner; or (c) For accommodations, assistive technology, or similar support necessary for a student with a documented disability.
Other · Government · Education
Fla. Stat. § 1006.1495(3)
Plain Language
Before providing a student with access credentials for an AI instructional tool, the educational entity must give the parent written notice identifying the tool, its educational purpose, how it will be used, how to opt out, and how to access the student's account or request account activity information. This is a pre-access parental notification requirement — the notice must be delivered before the student receives credentials, not after.
Statutory Text
Before a student is provided access credentials for an artificial intelligence instructional tool, the educational entity must provide the parent of a minor student with notice that: (a) Identifies the tool and its educational purpose; (b) Describes, in general terms, the manner in which the tool will be used by students; (c) Explains how the parent may exercise the opt-out process under subsection (4); and (d) Explains how the parent may access the student's account or request access to information and account activity under subsection (5), including the method for submitting a written request.
Other · Government · Education
Fla. Stat. § 1006.1495(4)
Plain Language
Parents must be given the opportunity to opt their minor child out of using an AI instructional tool. The opt-out process must align with the school's existing parental opt-out policies. If a parent exercises the opt-out and the student attends a public school, the school must provide an alternative instructional activity that enables the student to meet the same educational requirements without penalty. This ensures no academic disadvantage from opting out of AI tools.
Statutory Text
(a) A parent of a minor student must be provided the opportunity to opt out of the student's use of an artificial intelligence instructional tool. (b) The opt-out process must align with the educational entity's existing policies for parental notice, consent, objection, or opt out for instructional materials, digital tools, or online accounts, as applicable. (c) If a parent opts out of a student's use of an artificial intelligence instructional tool and the student is enrolled in a public school, the school district or public school must provide an alternative instructional activity that allows the student to meet a comparative educational requirement without penalty.
Other · Deployer · Education
Fla. Stat. § 1006.1495(5)
Plain Language
AI instructional tool operators must simultaneously provide the educational entity with a mechanism to authorize parental access to student account information and activity when student access credentials are issued. This can be satisfied by either: (1) providing parents with read-only credentials to the student's account, or (2) fulfilling written parental requests for account information within 30 days. Educational entities must inform parents of their right to request access. Neither the operator nor educational entity is required to create or retain interaction transcripts beyond what is maintained in ordinary course. This imposes obligations on both the AI tool vendor (operator) and the school (educational entity).
Statutory Text
(a) At the time an operator provides a student's access credentials or otherwise provides or enables student access to an educational entity for an artificial intelligence instructional tool, the operator shall simultaneously provide to the educational entity a means to authorize the parent of a minor student to access information and account activity maintained within the artificial intelligence instructional tool. (b) The operator may satisfy paragraph (a) by: 1. Providing the parent of a minor student credentials or another method for read-only access to the student's account; or 2. Upon written request from the parent of a minor student, providing access to the information and account activity maintained within the tool, in accordance with applicable state and federal law, within 30 days after receipt of the request. The educational entity shall inform the parent of the right to make such a request and the method for submitting the request. (c) If an educational entity satisfies subparagraph (b)1., the educational entity must provide the credentials or other access method at the time the educational entity provides the student with access credentials or otherwise enables student access. (d) This subsection does not require an operator or educational entity to create or retain a transcript or record of student interactions beyond information otherwise maintained in the ordinary course of providing access to the tool.