S-0482
FL · State · USA
● Failed
Effective Date
2026-07-01
Florida CS for SB 482 — Artificial Intelligence Bill of Rights
Summary

Florida's Artificial Intelligence Bill of Rights is a multi-part bill imposing obligations across several AI contexts. It requires companion chatbot platforms to obtain parental consent before allowing minors to hold accounts, provide parental controls (including interaction logs, time limits, and harm notifications), disclose AI identity, deliver hourly break reminders to minors, and prevent harmful content. Bot operators must disclose non-human status at the start and hourly during interactions. AI technology companies are prohibited from selling or disclosing user personal information unless it is deidentified data, with specific safeguards required. The bill also prohibits government AI contracting with foreign-country-of-concern entities, restricts AI instructional tool use before grade 6, requires parental notice and opt-out for school AI tools, and extends right-of-publicity protections to AI-generated likenesses. Enforcement is primarily through the Department of Legal Affairs under FDUTPA, with a separate private right of action for minor account holders on companion chatbot platforms. The bill died in committee.
Enforcement & Penalties
Enforcement Authority
Department of Legal Affairs (Attorney General) is the sole enforcing authority for companion chatbot (§ 501.9984), bot disclosure (§ 501.9985), and deidentified data (§ 501.9986) provisions, enforced as deceptive or unfair trade practices under Part II of Chapter 501. For companion chatbot violations, ss. 501.211 and 501.212 are not expressly excluded, so the general FDUTPA private right of action may apply; however, the statute states enforcement is 'solely by the department.' For bot disclosure and deidentified data violations, ss. 501.211 and 501.212 are expressly excluded, eliminating private suits under FDUTPA. A separate private right of action exists under § 501.9984(5) for minor account holders for knowing or reckless violations of the companion chatbot section. The department may grant a 45-calendar-day cure period after written notice of an alleged violation, considering the number and frequency of violations, likelihood of public injury, and safety of persons or property. The cure period does not apply where the platform willfully or knowingly disregarded an account holder's age. For unauthorized use of name, image, or likeness under § 540.08, the affected individual or authorized representative may bring a private action. The Department of Legal Affairs also has investigative subpoena authority under § 501.9987.
Penalties
For companion chatbot violations enforced by the department: civil penalty of up to $50,000 per violation, plus reasonable attorney fees and court costs; punitive damages available where violations are part of a consistent pattern of knowing or reckless conduct. Private right of action for minor account holders under § 501.9984(5): up to $10,000 in damages per violation plus court costs and reasonable attorney fees. For bot disclosure and deidentified data violations enforced by the department: civil penalty of up to $50,000 per violation, plus reasonable attorney fees and court costs. For unauthorized use of name/image/likeness (§ 540.08): injunctive relief, actual damages including reasonable royalty, and punitive or exemplary damages; additional civil penalty of up to $1,000 per violation for servicemember likeness. For obstruction of investigation subpoenas: civil penalty of up to $5,000 per week in violation, plus reasonable attorney fees and costs.
Who Is Covered
"Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.
"Operator" means a person who owns, operates, or otherwise makes available a bot to individuals in this state.
"Artificial intelligence technology company" means a business or organization that produces, develops, creates, designs, or manufactures artificial intelligence technology or products, collects data for use in artificial intelligence products, or implements artificial intelligence technology.
What Is Covered
"Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs by retaining information on prior interactions or user sessions and user preferences to personalize the interaction and facilitate ongoing engagement, asking unprompted or unsolicited emotion-based questions that go beyond a direct response to a user prompt, and sustaining an ongoing dialogue personalized to the user. The term does not include: (a) A chatbot used only for customer service; a business's internal operational purposes, productivity and analysis; or uses related to source information, internal research, or technical assistance; (b) A chatbot that is a feature of a video game or theme park and is limited to replies related to the video game or theme park experience and does not discuss topics related to mental health, self-harm, or material harmful to minors or maintain a dialogue on other topics unrelated to the video game or theme park; (c) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs likely to elicit emotional responses in the user; or (d) An artificial intelligence instructional tool, as defined in s. 1006.1495.
"Bot" means an automated online software application in which all or substantially all of the actions or posts of the account are not the result of a natural person.
"Artificial intelligence instructional tool" means a software application or service that uses artificial intelligence, including machine learning, which is made available to a student by an educational entity for educational purposes, including instruction, tutoring, practice, feedback, or completing educator-directed assignments, and that is not designed, marketed, or configured to: 1. Meet a student's social needs; 2. Simulate friendship, companionship, or an emotional relationship with a student; or 3. Employ relationship-building or anthropomorphic design features for the purpose of encouraging a student to continue interacting with the system.
Compliance Obligations (14 obligations)
MN-01 Minor User AI Safety Protections · MN-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)
Plain Language
Companion chatbot platforms must block minors (17 and under) from becoming or maintaining an account unless a parent or guardian has consented. If a platform nonetheless allows a minor to hold an account, the statute treats the platform and the minor as having entered into a contract. This applies when the platform knows or has reason to believe the individual is a Florida resident.
Statutory Text
A companion chatbot platform shall prohibit a minor from becoming or being an account holder unless the minor's parent or guardian provides consent. If a companion chatbot platform allows a minor to become or be an account holder, the parties have entered into a contract.
MN-01 Minor User AI Safety Protections · MN-01.3 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(a)
Plain Language
Once a parent consents to a minor's account, the platform must provide the parent or guardian with five specific control tools: (1) access to full interaction history (past and present), (2) daily time limits, (3) day-of-week and time-of-day access controls, (4) ability to disable third-party interactions on the platform, and (5) timely notifications when the minor expresses self-harm or intent to harm others. These controls must be made available to the consenting parent — not merely offered as optional features. The self-harm notification requirement (item 5) also maps to MN-02.4 (parental notification on crisis detection).
Statutory Text
If the minor's parent or guardian provides consent for the minor to become an account holder or maintain an existing account, the companion chatbot platform must allow the consenting parent or guardian of the minor account holder to: 1. Receive copies of all past or present interactions between the account holder and the companion chatbot; 2. Limit the amount of time that the account holder may interact with the companion chatbot each day; 3. Limit the days of the week and the times during the day when the account holder may interact with the companion chatbot; 4. Disable any of the interactions between the account holder and third-party account holders on the companion chatbot platform; and 5. Receive timely notifications if the account holder expresses to the companion chatbot a desire or an intent to engage in harm to self or others.
MN-01 Minor User AI Safety Protections · MN-01.9 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(b)
Plain Language
Companion chatbot platforms must terminate minor accounts lacking parental consent (with 90 days to dispute), honor minor self-termination requests within 5 business days, honor parent/guardian termination requests within 10 business days, and permanently delete all personal information associated with terminated minor accounts unless retention is required by law. The 90-day dispute window applies only to platform-initiated terminations of accounts identified as belonging to minors for content/advertising targeting purposes. The deletion obligation is mandatory and automatic upon termination.
Statutory Text
A companion chatbot platform shall do all of the following: 1. Terminate any account or identifier belonging to an account holder who is a minor if the companion chatbot platform treats or categorizes the account or identifier as belonging to a minor for purposes of targeting content or advertising and if the minor's parent or guardian has not provided consent for the minor pursuant to subsection (1). The companion chatbot platform shall provide 90 days for the account holder to dispute the termination. Termination must be effective upon the expiration of the 90 days if the account holder fails to effectively dispute the termination. 2. Allow an account holder who is a minor to request to terminate the account or identifier. Termination must be effective within 5 business days after the request. 3. Allow the consenting parent or guardian of an account holder who is a minor to request that the minor's account or identifier be terminated. Termination must be effective within 10 business days after the request. 4. Permanently delete all personal information held by the companion chatbot platform relating to the terminated minor account or identifier, unless state or federal law requires the platform to maintain the information.
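The three termination deadlines mix calendar days (the 90-day dispute window) and business days (5 for a minor's own request, 10 for a parent's). A minimal sketch of the deadline arithmetic, with hypothetical helper names and no holiday calendar:

```python
from datetime import date, timedelta


def add_business_days(start: date, days: int) -> date:
    """Advance `days` business days (Mon-Fri), skipping weekends.

    Public holidays are ignored here; a production version would
    consult a holiday calendar.
    """
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4
            remaining -= 1
    return current


def termination_effective_date(request_date: date, requested_by: str) -> date:
    """Latest effective date for a termination under § 501.9984(1)(b).

    'platform' -> effective after the 90-calendar-day dispute window
    'minor'    -> within 5 business days of the request
    'parent'   -> within 10 business days of the request
    """
    if requested_by == "platform":
        return request_date + timedelta(days=90)
    if requested_by == "minor":
        return add_business_days(request_date, 5)
    if requested_by == "parent":
        return add_business_days(request_date, 10)
    raise ValueError(f"unknown requester: {requested_by}")
```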
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(a)-(b)
Plain Language
For all minor account holders, companion chatbot platforms must (1) unconditionally disclose that the user is interacting with AI, and (2) provide a clear, conspicuous notification at the start of every interaction and at least once every hour during continuing interactions reminding the minor to take a break and that the chatbot is AI-generated, not human. The hourly notification is a minimum — platforms may notify more frequently. Both obligations are unconditional for minor accounts; there is no 'reasonable person' trigger.
Statutory Text
In connection to all accounts or identifiers held by account holders who are minors, the companion chatbot platform shall do all of the following: (a) Disclose to the account holder that he or she is interacting with artificial intelligence. (b) Provide by default a clear and conspicuous notification to the account holder, at the beginning of companion chatbot interactions and at least once every hour during continuing interactions, reminding the minor to take a break and that the companion chatbot is artificially generated and not human.
S-02 Prohibited Conduct & Output Restrictions · S-02.6 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(c)
Plain Language
Companion chatbot platforms must implement reasonable measures to prevent their chatbots from producing or sharing material harmful to minors, and from encouraging minor account holders to engage in conduct described or depicted in such material, when interacting with minor accounts. This is an affirmative, ongoing obligation to institute technical and operational safeguards. A platform may demonstrate compliance by showing controls aligned with NIST AI RMF or ISO 42001, including structured interaction logs, parental access controls, harm-signal detection procedures, and verified deletion events, per the safe harbor provision in § 501.9984(4)(a)(2).
Statutory Text
Institute reasonable measures to prevent the companion chatbot from producing or sharing materials harmful to minors or encouraging the account holder to engage in any of the conduct described or depicted in materials harmful to minors.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Fla. Stat. § 501.9985(1)
Plain Language
All bot operators must display a pop-up or other prominent notification at the start of every interaction and at least hourly during ongoing interactions informing the user that they are not communicating with a human. For non-screen interactions, the operator must otherwise inform the user. This is an unconditional disclosure obligation — it applies regardless of whether a reasonable person would be misled. Internal employee-only bots are exempt. The safe harbor allows operators to demonstrate compliance by showing persistent and conspicuous identity indicators conforming with NIST AI RMF or ISO 42001. This section expressly excludes private suits under ss. 501.211 and 501.212; enforcement is solely by the Department of Legal Affairs.
Statutory Text
At the beginning of an interaction between a user and a bot, and at least once every hour during the interaction, an operator shall display a pop-up message or other prominent notification notifying the user or, if the interaction is not through a device with a screen, otherwise inform the user, that he or she is not engaging in dialogue with a human counterpart. This section does not apply to a bot that is used solely by employees within a business for its internal operational purposes.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Developer · Deployer · General Consumer App
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose user personal information unless it has been deidentified — i.e., it cannot reasonably be linked to an identified or identifiable individual or their device. Where the company possesses deidentified data, it must: (1) take reasonable measures to prevent re-association with users, (2) maintain data in deidentified form and not attempt reidentification (except to test its own deidentification processes), (3) contractually bind recipients to the same obligations, and (4) implement business processes to prevent inadvertent release. Sales or disclosures specifically authorized by federal law are exempt. The safe harbor allows a company to demonstrate compliance by showing a risk management program validated against NIST AI RMF or ISO 42001 with assessed controls for deidentification, contractual flow-down, non-reidentification, inadvertent release prevention, monitoring, and auditing.
Statutory Text
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
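The sale/disclosure rule amounts to a gate: a federally authorized transfer passes, and otherwise the data must be deidentified and the recipient contractually bound to the same obligations. A minimal sketch under those assumptions; `Recipient`, `authorize_disclosure`, and the caller-supplied `is_deidentified` predicate are all hypothetical:

```python
from dataclasses import dataclass


@dataclass
class Recipient:
    name: str
    bound_by_section_contract: bool  # flow-down obligation under § 501.9986(2)(c)


class DisclosureBlocked(Exception):
    """Raised when a proposed sale/disclosure would violate § 501.9986."""


def authorize_disclosure(record: dict,
                         recipient: Recipient,
                         is_deidentified,
                         federally_authorized: bool = False) -> bool:
    """Gate a sale or disclosure of user data under § 501.9986(1)-(2).

    `is_deidentified` is a caller-supplied predicate; a real system
    would back it with a validated deidentification pipeline.
    """
    if federally_authorized:
        # Sales/disclosures specifically authorized by federal law are exempt.
        return True
    if not is_deidentified(record):
        raise DisclosureBlocked("personal information is not deidentified")
    if not recipient.bound_by_section_contract:
        raise DisclosureBlocked("recipient has not contractually accepted the section's obligations")
    return True
```

The non-reidentification and inadvertent-release duties in subsection (2) are ongoing operational controls and are not modeled in this gate.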
CP-02 Non-Consensual Intimate Imagery · CP-02.4 · Developer · Deployer · Content Generation
Fla. Stat. § 540.08(2)-(3)
Plain Language
No person may commercially use an individual's name, portrait, photograph, image, or other likeness — whether created through generative AI (subsection 2) or by traditional means (subsection 3) — without express consent from the individual, an authorized representative, or, if the individual is deceased, an authorized person or a surviving spouse/child. The generative AI provision in subsection (2) is new and creates a parallel consent requirement specifically for AI-generated likenesses. Post-mortem rights extend 40 years after death. Violations are actionable through injunction, actual damages including reasonable royalty, and punitive damages. Servicemember likenesses carry an additional civil penalty of up to $1,000 per commercial transaction.
Statutory Text
(2) A person may not publish, print, display, or otherwise publicly use for trade or for any commercial or advertising purpose the name, portrait, photograph, image, or other likeness of an individual created through generative artificial intelligence without the express written or oral consent to such use given by any of the following: (a) The individual. (b) Any other person authorized in writing by the individual to license the commercial use of the individual's name, image, or likeness. (c) If the individual is deceased: 1. A person authorized in writing to license the commercial use of the individual's name, image, or likeness; or 2. If a person is not authorized, any one individual from a class composed of the deceased individual's surviving spouse and surviving children. A legal parent or guardian may give consent on behalf of a minor surviving child. (3) A person may not publish, print, display or otherwise publicly use for purposes of trade or for any commercial or advertising purpose the name, portrait, photograph, image, or other likeness of an individual without the express written or oral consent to such use given by any of the following: (a) The individual. (b) Any other person authorized in writing by the individual to license the commercial use of the individual's name, image, or likeness. (c) If the individual is deceased: 1. A person authorized in writing to license the commercial use of the deceased individual's name, image, or likeness; or 2. If a person is not authorized, any one individual from a class composed of the individual's surviving spouse and surviving children. A legal parent or guardian may give consent on behalf of a minor surviving child.
Other · Government System
Fla. Stat. § 287.138(3)(b), (7)
Plain Language
Florida governmental entities may not procure AI technology, software, or products from entities owned by, controlled by, or based in a foreign country of concern. Vendors must submit a perjury-backed affidavit attesting they do not meet any of the foreign-country-of-concern criteria. This is a government procurement restriction — it imposes obligations on government procurement officers and requires vendor attestation, but does not create a general compliance obligation for AI developers or deployers.
Statutory Text
(b) Beginning July 1, 2026, a governmental entity may not accept a bid on, a proposal for, or a reply to, or enter into a contract with, an entity to provide artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, unless the entity provides the governmental entity with an affidavit signed by an officer or a representative of the entity under penalty of perjury attesting that the entity does not meet any of the criteria in paragraph (7)(a), paragraph (7)(b), or paragraph (7)(c). (7) A governmental entity may not knowingly enter into a contract with an entity for artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, if: (a) The entity is owned by the government of a foreign country of concern; (b) A government of a foreign country of concern has a controlling interest in the entity; or (c) The entity is organized under the laws of or has its principal place of business in a foreign country of concern.
Other · Education
Fla. Stat. § 1006.1495(2)
Plain Language
Educational entities (school districts, public schools, and private schools, including VPK providers) may not give students access to AI instructional tools before grade 6. Three narrow exceptions apply: (1) use directed and supervised by school personnel, (2) translation support for English language learners, and (3) accommodations or assistive technology for students with documented disabilities. This is a bright-line access restriction on the educational entity, not on the AI tool developer.
Statutory Text
An educational entity may not provide students with access to an artificial intelligence instructional tool before grade 6 unless such use is: (a) Directed and supervised by school personnel; (b) For translation or similar support necessary for a student identified as an English language learner; or (c) For accommodations, assistive technology, or similar support necessary for a student with a documented disability.
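The grade-6 restriction and its three exceptions form a simple boolean policy. A minimal sketch with hypothetical names (kindergarten can be modeled as grade 0):

```python
def ai_tool_access_allowed(grade: int,
                           supervised_by_personnel: bool = False,
                           is_ell_translation: bool = False,
                           is_disability_accommodation: bool = False) -> bool:
    """Access rule for AI instructional tools under Fla. Stat. § 1006.1495(2).

    Students before grade 6 may use a tool only under one of the three
    statutory exceptions. Grade 6 and above fall outside this restriction,
    though the parental notice and opt-out provisions still apply.
    """
    if grade >= 6:
        return True
    return (supervised_by_personnel
            or is_ell_translation
            or is_disability_accommodation)
```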
Other · Education
Fla. Stat. § 1006.1495(3)-(4)
Plain Language
Before granting a student access to an AI instructional tool, the educational entity must notify the student's parent, identifying the tool, its educational purpose, how it will be used, and how to exercise opt-out and account access rights. Parents must be given the opportunity to opt out, with the opt-out process aligned to the entity's existing policies. If a parent opts out at a public school, the school must provide an alternative instructional activity meeting the same educational requirement without penalty to the student.
Statutory Text
(3) Before a student is provided access credentials for an artificial intelligence instructional tool, the educational entity must provide the parent of a minor student with notice that: (a) Identifies the tool and its educational purpose; (b) Describes, in general terms, the manner in which the tool will be used by students; (c) Explains how the parent may exercise the opt-out process under subsection (4); and (d) Explains how the parent may access the student's account or request access to information and account activity under subsection (5), including the method for submitting a written request. (4)(a) A parent of a minor student must be provided the opportunity to opt out of the student's use of an artificial intelligence instructional tool. (b) The opt-out process must align with the educational entity's existing policies for parental notice, consent, objection, or opt out for instructional materials, digital tools, or online accounts, as applicable. (c) If a parent opts out of a student's use of an artificial intelligence instructional tool and the student is enrolled in a public school, the school district or public school must provide an alternative instructional activity that allows the student to meet a comparative educational requirement without penalty.
Other · Education
Fla. Stat. § 1006.1495(5)
Plain Language
When an AI instructional tool operator provides student access credentials to an educational entity, the operator must simultaneously provide a means for the educational entity to authorize parental access to the student's account information. The operator can satisfy this by either (1) providing read-only parent credentials, or (2) responding to written parental requests within 30 days. The educational entity must inform parents of their right to request access. Neither the operator nor the educational entity is required to create or retain transcripts of student interactions beyond what is maintained in the ordinary course of operations.
Statutory Text
(5)(a) At the time an operator provides a student's access credentials or otherwise provides or enables student access to an educational entity for an artificial intelligence instructional tool, the operator shall simultaneously provide to the educational entity a means to authorize the parent of a minor student to access information and account activity maintained within the artificial intelligence instructional tool. (b) The operator may satisfy paragraph (a) by: 1. Providing the parent of a minor student credentials or another method for read-only access to the student's account; or 2. Upon written request from the parent of a minor student, providing access to the information and account activity maintained within the tool, in accordance with applicable state and federal law, within 30 days after receipt of the request. The educational entity shall inform the parent of the right to make such a request and the method for submitting the request. (c) If an educational entity satisfies subparagraph (b)1., the educational entity must provide the credentials or other access method at the time the educational entity provides the student with access credentials or otherwise enables student access. (d) This subsection does not require an operator or educational entity to create or retain a transcript or record of student interactions beyond information otherwise maintained in the ordinary course of providing access to the tool.
Other · General Consumer App
Fla. Stat. § 501.9982
Plain Language
This section enumerates aspirational rights of Florida residents related to AI, including the right to know when communicating with AI, to control children's AI use, to protect their name and likeness from unauthorized AI use, and to pursue remedies against AI-facilitated defamation and other harms. However, subsection (2) expressly provides that these rights are exercisable only under existing law and the section may not be construed as creating new or independent rights. This is a policy declaration, not a source of new compliance obligations.
Statutory Text
(1) Residents are entitled to certain rights with respect to the use of artificial intelligence, including, but not limited to: (a) The right to use artificial intelligence to improve their own lives and the lives of family members, fellow residents, and the world at large in accordance with the law. (b) The right to supervise, access, limit, and control their minor children's use of artificial intelligence. (c) The right to know whether they are communicating with a human being or an artificial intelligence system, program, or chatbot. (d) The right to know whether artificial intelligence technology companies are collecting personal information or biometric data, and the right to expect artificial intelligence technology companies to protect and deidentify that information or data in accordance with the law. (e) The right to pursue civil remedies authorized by law against persons who use artificial intelligence to appropriate the name, image, or likeness of others for commercial purposes without their consent. (f) The right to be protected by law from criminal acts, such as fraud, exploitation, identity theft, stalking, and cyberbullying, regardless of whether artificial intelligence is used in the commission of those acts. (g) The right to be protected by law from criminal acts relating to the alteration of existing images to create sexual or lewd or lascivious images or child pornography, regardless of whether artificial intelligence is used in the commission of those acts. (h) The right to know whether political advertisements, electioneering communications, or similar advertisements were created in whole or in part with the use of artificial intelligence. (i) The right to pursue civil remedies authorized by law against others who use artificial intelligence to slander, libel, or defame them. (j) The right to prevent a companion chatbot from engaging with a user as a character that is protected by federal copyright law without the express written consent of the copyright owner. (k) The right to prevent a companion chatbot from engaging with a user as a character that is a living individual without the express written consent of that individual. (l) The right to prevent generative artificial intelligence from using a character that is protected by federal copyright law without the express written consent of the copyright owner. (2) Residents may exercise the rights described in this section in accordance with existing law. This section may not be construed as creating new or independent rights or entitlements.
Other · General Consumer App
Fla. Stat. § 501.9987
Plain Language
This section grants the Department of Legal Affairs broad investigative authority to enforce the Artificial Intelligence Bill of Rights, including the power to administer oaths, issue subpoenas, and collect evidence. Parties served may challenge subpoenas in circuit court. Obstruction of investigations carries civil penalties of up to $5,000 per week. This is a procedural enforcement mechanism — it creates no independent compliance obligations on AI companies beyond cooperating with lawful investigations.
Statutory Text
(1) If, by its own inquiry or as a result of complaints, the department has reason to believe that a person has engaged in, or is engaging in, a practice or an act that violates this part, the department may administer oaths and affirmations, subpoena witnesses or matter, and collect evidence. Within 5 days, excluding weekends and legal holidays, after service of a subpoena, or at any time before the return date specified in the subpoena, whichever time period is longer, the party served may file in the circuit court in the county in which it resides or in which it transacts business and serve upon the enforcing authority a petition for an order modifying or setting aside the subpoena. The petitioner may raise any objection or privilege that would be available upon service of a subpoena in a civil action. The subpoena must inform the party served of the party's rights under this subsection. (2) If the matter that the department seeks to obtain by subpoena is located outside this state, the person subpoenaed may make the matter available to the department or its representative at the place where it is located. The department may designate representatives, including officials of the state in which the matter is located, to inspect the matter on its behalf and may respond to similar requests from officials of other states. (3) Upon the failure of a person, without lawful excuse, to obey a subpoena and upon reasonable notice to all persons affected, the department may apply to the circuit court for an order compelling compliance. (4) The department may request that a person who refuses to comply with a subpoena on the grounds that the testimony or matter may be self-incriminating be ordered by the court to provide the testimony or matter. Except in a prosecution for perjury, a person who complies with a court order to provide testimony or matter after asserting a valid privilege against self-incrimination may not have the testimony or matter so provided, or evidence derived from the testimony or matter, received against the person in any criminal investigation or proceeding. (5) A person upon whom a subpoena is served pursuant to this part must comply with its terms unless otherwise provided by order of the court. A person who fails to appear, with the intent to avoid, evade, or prevent compliance in whole or in part with an investigation under this part, or who removes from any place, conceals, withholds, mutilates, alters, or destroys, or by any other means falsifies any documentary material in the possession, custody, or control of a person subject to a subpoena, or who knowingly conceals relevant information with the intent to avoid, evade, or prevent compliance, is liable for a civil penalty of not more than $5,000 per week in violation, reasonable attorney fees, and costs. (6) The department may adopt rules to implement this section.