SB-482
FL · State · USA
Status: Failed
Effective Date: 2026-07-01
Florida SB 482 — Artificial Intelligence Bill of Rights
Summary

Florida's AI Bill of Rights imposes obligations across multiple domains: companion chatbot platforms must obtain parental consent before allowing minors to hold accounts, provide parental controls (including access to chat logs, time limits, and self-harm notifications), disclose AI identity to minor users, display hourly break reminders, and prevent harmful content; bot operators must disclose AI identity at the start and hourly during interactions with all users; AI technology companies may not sell or disclose user personal information unless deidentified; and no person may commercially use an individual's AI-generated name, image, or likeness without consent. Educational entities face restrictions on AI instructional tool use before grade 6 and must provide parental notice, opt-out, and account access. Government entities are prohibited from contracting for AI technology with entities owned by or organized under foreign countries of concern. Enforcement is primarily through the Department of Legal Affairs under the Florida Deceptive and Unfair Trade Practices Act (FDUTPA) framework, with civil penalties up to $50,000 per violation and a 45-day cure period. A limited private right of action exists for minor account holders against companion chatbot platforms (up to $10,000 per violation) and for unauthorized name/image/likeness use.

Enforcement & Penalties
Enforcement Authority
The Department of Legal Affairs (Attorney General) is the sole enforcing authority for §§ 501.9984–501.9986; it may investigate on its own inquiry or upon receipt of complaints.

The department may grant a 45-calendar-day cure period after written notice of an alleged violation; if the violation is cured to the department's satisfaction, the department may not bring an action for that violation. The cure period does not apply where a companion chatbot platform willfully or knowingly disregarded an account holder's age.

For § 501.9984 (companion chatbot minors), a private right of action exists: an action may be brought on behalf of a minor account holder for knowing or reckless violations. For §§ 501.9985 and 501.9986, ss. 501.211 and 501.212 (which provide consumer private remedies under FDUTPA) do not apply, so enforcement of those sections is exclusively by the department. For § 540.08 (name/image/likeness), the individual or an authorized person may bring a private action to enjoin unauthorized use and recover damages.
Penalties
Section 501.9984 (companion chatbot minors): Civil penalties up to $50,000 per violation plus reasonable attorney fees and court costs, collectible by the department; punitive damages may be assessed for consistent patterns of knowing or reckless conduct. Private right of action for minor account holders: up to $10,000 in damages plus court costs and reasonable attorney fees for knowing or reckless violations.
Section 501.9985 (bots): Civil penalty up to $50,000 per violation plus reasonable attorney fees and court costs; no private right of action.
Section 501.9986 (deidentified data): Civil penalty up to $50,000 per violation plus reasonable attorney fees and court costs; no private right of action.
Section 501.9987 (investigations): Civil penalty up to $5,000 per week for subpoena noncompliance plus reasonable attorney fees and costs.
Section 540.08 (name/image/likeness): Injunctive relief, actual damages including a reasonable royalty, and punitive or exemplary damages; an additional civil penalty up to $1,000 per violation applies for unauthorized use of a servicemember's name, image, or likeness.
Who Is Covered
"Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.
"Operator" means a person who owns, operates, or otherwise makes available a bot to individuals in this state.
"Artificial intelligence technology company" means a business or organization that produces, develops, creates, designs, or manufactures artificial intelligence technology or products, collects data for use in artificial intelligence products, or implements artificial intelligence technology.
What Is Covered
"Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs by retaining information on prior interactions or user sessions and user preferences to personalize the interaction and facilitate ongoing engagement, asking unprompted or unsolicited emotion-based questions that go beyond a direct response to a user prompt, and sustaining an ongoing dialogue personalized to the user. The term does not include: (a) A chatbot used only for customer service; a business's internal operational purposes, productivity and analysis; or uses related to source information, internal research, or technical assistance; (b) A chatbot that is a feature of a video game or theme park and is limited to replies related to the video game or theme park experience and does not discuss topics related to mental health, self-harm, or material harmful to minors or maintain a dialogue on other topics unrelated to the video game or theme park; (c) A stand-alone consumer electronic device that functions as a speaker and voice command interface, acts as a voice activated virtual assistant, and does not sustain a relationship across multiple interactions or generate outputs likely to elicit emotional responses in the user; or (d) An artificial intelligence instructional tool, as defined in s. 1006.1495.
"Bot" means an automated online software application in which all or substantially all of the actions or posts of the account are not the result of a natural person.
"Artificial intelligence instructional tool" means a software application or service that uses artificial intelligence, including machine learning, which is made available to a student by an educational entity for educational purposes, including instruction, tutoring, practice, feedback, or completing educator-directed assignments, and that is not designed, marketed, or configured to: 1. Meet a student's social needs; 2. Simulate friendship, companionship, or an emotional relationship with a student; or 3. Employ relationship-building or anthropomorphic design features for the purpose of encouraging a student to continue interacting with the system.
Compliance Obligations · 13 obligations
MN-01 Minor User AI Safety Protections · MN-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)
Plain Language
Companion chatbot platforms must block minors (17 and under) from creating or maintaining accounts unless a parent or guardian consents. If the platform does allow a minor to become an account holder, the relationship is treated as a contract. The age threshold is 17 — slightly broader than California SB 243, which relies on the platform's actual knowledge of minor status without specifying a parental consent gate as a prerequisite to account creation.
Statutory Text
A companion chatbot platform shall prohibit a minor from becoming or being an account holder unless the minor's parent or guardian provides consent. If a companion chatbot platform allows a minor to become or be an account holder, the parties have entered into a contract.
MN-01 Minor User AI Safety Protections · MN-01.3 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(a)
Plain Language
Once a parent consents to a minor's account, the platform must provide the parent with a suite of parental control tools: access to the full history of the minor's chat interactions, daily time limits, day-of-week and time-of-day access restrictions, the ability to disable third-party interactions, and timely notifications when the minor expresses self-harm or intent to harm others. The chat history access requirement (all past or present interactions) goes further than California SB 243, which does not mandate parental access to full chat logs. The self-harm notification obligation to parents is also a distinct requirement not found in California's companion chatbot law.
Statutory Text
If the minor's parent or guardian provides consent for the minor to become an account holder or maintain an existing account, the companion chatbot platform must allow the consenting parent or guardian of the minor account holder to: 1. Receive copies of all past or present interactions between the account holder and the companion chatbot; 2. Limit the amount of time that the account holder may interact with the companion chatbot each day; 3. Limit the days of the week and the times during the day when the account holder may interact with the companion chatbot; 4. Disable any of the interactions between the account holder and third-party account holders on the companion chatbot platform; and 5. Receive timely notifications if the account holder expresses to the companion chatbot a desire or an intent to engage in harm to self or others.
MN-01 Minor User AI Safety Protections · MN-01.9 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(b)
Plain Language
Platforms must terminate minor accounts lacking parental consent (with a 90-day dispute window), honor minor-initiated account termination requests within 5 business days, and honor parent-initiated termination requests within 10 business days. Upon termination, all personal information associated with the minor's account must be permanently deleted unless retention is required by law. The differentiated timelines (5 days for minor requests vs. 10 days for parental requests) and the 90-day dispute period for platform-initiated terminations are distinctive features not found in California SB 243.
Statutory Text
A companion chatbot platform shall do all of the following: 1. Terminate any account or identifier belonging to an account holder who is a minor if the companion chatbot platform treats or categorizes the account or identifier as belonging to a minor for purposes of targeting content or advertising and if the minor's parent or guardian has not provided consent for the minor pursuant to subsection (1). The companion chatbot platform shall provide 90 days for the account holder to dispute the termination. Termination must be effective upon the expiration of the 90 days if the account holder fails to effectively dispute the termination. 2. Allow an account holder who is a minor to request to terminate the account or identifier. Termination must be effective within 5 business days after the request. 3. Allow the consenting parent or guardian of an account holder who is a minor to request that the minor's account or identifier be terminated. Termination must be effective within 10 business days after the request. 4. Permanently delete all personal information held by the companion chatbot platform relating to the terminated minor account or identifier, unless state or federal law requires the platform to maintain the information.
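The three termination timelines in § 501.9984(1)(b) reduce to simple deadline arithmetic: 90 calendar days for a platform-initiated termination's dispute window, 5 business days for a minor-initiated request, and 10 business days for a parent-initiated request. The sketch below computes the latest permissible effective date under stated assumptions; the statute prescribes no implementation, the helper names are hypothetical, and "business days" is simplified to exclude weekends but not public holidays.

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Advance `days` business days, skipping Saturdays and Sundays.

    Simplified: ignores public holidays, which a real compliance
    calendar would need to account for.
    """
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Mon=0 .. Fri=4
            remaining -= 1
    return current

def termination_effective_date(request_date: date, initiator: str) -> date:
    """Hypothetical deadline calculator for Fla. Stat. 501.9984(1)(b).

    - platform-initiated: effective after the 90-calendar-day dispute window
    - minor-initiated:    effective within 5 business days of the request
    - parent-initiated:   effective within 10 business days of the request
    """
    if initiator == "platform":
        return request_date + timedelta(days=90)
    if initiator == "minor":
        return add_business_days(request_date, 5)
    if initiator == "parent":
        return add_business_days(request_date, 10)
    raise ValueError(f"unknown initiator: {initiator}")
```

For example, a minor's request received on Wednesday, July 1, 2026 must be effective by the following Wednesday, July 8.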
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(a)-(b)
Plain Language
For all minor account holders, the platform must unconditionally disclose that the user is interacting with AI, and must display a clear, conspicuous reminder at the start and at least every hour during ongoing interactions that the chatbot is AI-generated and that the minor should take a break. The hourly reminder interval is more frequent than California SB 243's every-three-hours floor, making this a stricter periodic disclosure requirement. Both obligations are unconditional — they apply regardless of whether the minor could be misled.
Statutory Text
In connection to all accounts or identifiers held by account holders who are minors, the companion chatbot platform shall do all of the following: (a) Disclose to the account holder that he or she is interacting with artificial intelligence. (b) Provide by default a clear and conspicuous notification to the account holder, at the beginning of companion chatbot interactions and at least once every hour during continuing interactions, reminding the minor to take a break and that the companion chatbot is artificially generated and not human.
S-02 Prohibited Conduct & Output Restrictions · S-02.6 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(2)(c)
Plain Language
Companion chatbot platforms must implement reasonable measures to prevent their chatbots from generating or sharing material harmful to minors and from encouraging minor users to engage in conduct depicted in such material. The standard is 'reasonable measures,' providing some flexibility. During enforcement, a platform may present evidence that its controls align with the NIST AI Risk Management Framework and ISO 42001, including structured interaction logs, parental access controls, harm-signal detection procedures, and verified deletion events, as mitigating factors under the 45-day cure process.
Statutory Text
Institute reasonable measures to prevent the companion chatbot from producing or sharing materials harmful to minors or encouraging the account holder to engage in any of the conduct described or depicted in materials harmful to minors.
T-01 AI Identity Disclosure · T-01.1 · T-01.2 · Deployer · Chatbot
Fla. Stat. § 501.9985(1)
Plain Language
All bot operators must display a pop-up or other prominent notification at the start of every user interaction — and at least once every hour during continuing interactions — informing the user they are not communicating with a human. For non-screen interactions (e.g., voice), the operator must otherwise inform the user. This applies to all bots, not just companion chatbots, making it a broad AI identity disclosure obligation. Internal-use-only bots used solely by employees for business operational purposes are exempt. The hourly reminder requirement applies to all users regardless of age, which is more expansive than California SB 243 (which imposes periodic reminders only for known minors). During enforcement, operators may present evidence of NIST AI RMF/ISO 42001-aligned identity indicators and disclosures as mitigating factors.
Statutory Text
At the beginning of an interaction between a user and a bot, and at least once every hour during the interaction, an operator shall display a pop-up message or other prominent notification notifying the user or, if the interaction is not through a device with a screen, otherwise inform the user, that he or she is not engaging in dialogue with a human counterpart. This section does not apply to a bot that is used solely by employees within a business for its internal operational purposes.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Developer · Chatbot · General Consumer App
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose users' personal information unless the information has been deidentified — meaning it cannot reasonably be linked to an identified or identifiable individual or their device. Sales authorized by federal law are excepted. Companies holding deidentified data must take reasonable measures to prevent re-association with users, maintain data in deidentified form, not attempt reidentification (except for testing deidentification processes), contractually require recipients to comply with the same rules, and implement processes to prevent inadvertent release. During enforcement, companies may present evidence of a risk management program aligned with NIST AI RMF/ISO 42001 that includes assessed controls for deidentification, contractual flow-down, non-reidentification, and auditing.
Statutory Text
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
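The gating logic of § 501.9986(1)-(2) can be expressed as a release check: a disclosure is permitted only if it is specifically authorized by federal law, or the data is deidentified and the recipient is contractually bound to the same section. The sketch below is an illustrative compliance gate only; the record type, field names, and boolean model are assumptions, and real deidentification status would rest on a documented technical assessment, not a flag.

```python
from dataclasses import dataclass

@dataclass
class DataRelease:
    """Hypothetical record describing a proposed sale or disclosure."""
    deidentified: bool                  # cannot reasonably be linked to a person or device
    federally_authorized: bool          # disclosure specifically authorized by federal law
    recipient_contract_flow_down: bool  # recipient contractually bound to 501.9986

def release_permitted(release: DataRelease) -> bool:
    """Illustrative gate for Fla. Stat. 501.9986(1)-(2)(c).

    Personal information may be sold or disclosed only if it is
    deidentified (or the disclosure is specifically authorized by
    federal law), and deidentified data may go only to recipients
    contractually obligated to comply with the same section.
    """
    if release.federally_authorized:
        return True
    if not release.deidentified:
        return False
    return release.recipient_contract_flow_down
```

The remaining subsection (2) duties, such as the no-reidentification rule and processes against inadvertent release, are ongoing operational controls rather than per-release checks and are not modeled here.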
CP-02 Non-Consensual Intimate Imagery · CP-02.4 · Deployer · Developer · Content Generation
Fla. Stat. § 540.08(2)-(3)
Plain Language
No person may commercially publish, display, or use an individual's name, portrait, image, or likeness — whether created through generative AI (subsection 2) or otherwise (subsection 3) — without express consent from the individual, an authorized representative, or (if deceased) an authorized licensee or surviving spouse/child. The AI-specific provision (subsection 2) extends existing right-of-publicity protections to AI-generated likenesses, covering post-mortem rights with a 40-year window after death. Consent may be written or oral. News media exceptions apply, though AI-generated content used in news must include a clear acknowledgment of speculation regarding authenticity. Remedies include injunctive relief, actual damages including reasonable royalty, and punitive or exemplary damages. An additional civil penalty of up to $1,000 per violation applies for unauthorized use of a servicemember's likeness.
Statutory Text
(2) A person may not publish, print, display, or otherwise publicly use for trade or for any commercial or advertising purpose the name, portrait, photograph, image, or other likeness of an individual created through generative artificial intelligence without the express written or oral consent to such use given by any of the following: (a) The individual. (b) Any other person authorized in writing by the individual to license the commercial use of the individual's name, image, or likeness. (c) If the individual is deceased: 1. A person authorized in writing to license the commercial use of the individual's name, image, or likeness; or 2. If a person is not authorized, any one individual from a class composed of the deceased individual's surviving spouse and surviving children. A legal parent or guardian may give consent on behalf of a minor surviving child. (3) A person may not publish, print, display or otherwise publicly use for purposes of trade or for any commercial or advertising purpose the name, portrait, photograph, image, or other likeness of an individual without the express written or oral consent to such use given by any of the following: (a) The individual. (b) Any other person authorized in writing by the individual to license the commercial use of the individual's name, image, or likeness. (c) If the individual is deceased: 1. A person authorized in writing to license the commercial use of the deceased individual's name, image, or likeness; or 2. If a person is not authorized, any one individual from a class composed of the individual's surviving spouse and surviving children. A legal parent or guardian may give consent on behalf of a minor surviving child.
MN-02 AI Crisis Response Protocols · MN-02.4 · Deployer · Chatbot · Minors
Fla. Stat. § 501.9984(1)(a)5.
Plain Language
When a minor account holder expresses to the companion chatbot a desire or intent to self-harm or harm others, the platform must send a timely notification to the consenting parent or guardian. This is a parental notification obligation distinct from crisis referral — it requires alerting the parent, not just referring the minor to crisis resources. The statute does not define 'timely' with a specific timeframe, leaving some implementation discretion.
Statutory Text
Receive timely notifications if the account holder expresses to the companion chatbot a desire or an intent to engage in harm to self or others.
PS-01 Government AI Accountability · PS-01.4 · Government · Government System
Fla. Stat. § 287.138(3)(b), (7)
Plain Language
Beginning July 1, 2026, Florida governmental entities may not contract for AI technology, software, or products with entities that are owned by, controlled by, or organized under the laws of a foreign country of concern. Vendors seeking AI contracts must provide a sworn affidavit attesting they do not meet any of the prohibited criteria. Existing contracts with such entities may not be extended or renewed after July 1, 2026. This is a procurement restriction, not a performance standard — the obligation falls on both the governmental entity (not to contract) and the vendor (to attest).
Statutory Text
(b) Beginning July 1, 2026, a governmental entity may not accept a bid on, a proposal for, or a reply to, or enter into a contract with, an entity to provide artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, unless the entity provides the governmental entity with an affidavit signed by an officer or a representative of the entity under penalty of perjury attesting that the entity does not meet any of the criteria in paragraph (7)(a), paragraph (7)(b), or paragraph (7)(c). (7) A governmental entity may not knowingly enter into a contract with an entity for artificial intelligence technology, software, or products, including as a portion or an option to the products or services provided under the contract, if: (a) The entity is owned by the government of a foreign country of concern; (b) A government of a foreign country of concern has a controlling interest in the entity; or (c) The entity is organized under the laws of or has its principal place of business in a foreign country of concern.
Other · Education
Fla. Stat. § 1006.1495(2)
Plain Language
Educational entities — including school districts, public schools, private schools, and VPK providers — may not provide students with access to AI instructional tools before grade 6. Three exceptions apply: (1) use directed and supervised by school personnel, (2) translation support for English language learners, and (3) accommodations or assistive technology for students with documented disabilities. This effectively bans unsupervised student access to AI instructional tools in elementary school.
Statutory Text
An educational entity may not provide students with access to an artificial intelligence instructional tool before grade 6 unless such use is: (a) Directed and supervised by school personnel; (b) For translation or similar support necessary for a student identified as an English language learner; or (c) For accommodations, assistive technology, or similar support necessary for a student with a documented disability.
Other · Education
Fla. Stat. § 1006.1495(3)-(4)
Plain Language
Before providing a student with access credentials for an AI instructional tool, the educational entity must notify the parent, identifying the tool, its purpose, how it will be used, and how to exercise opt-out and account access rights. Parents must be given the opportunity to opt out, and the opt-out process must align with the entity's existing parental notice policies. Public schools must provide a penalty-free alternative instructional activity if a parent opts out. This creates a pre-deployment parental notice and opt-out requirement for AI in education.
Statutory Text
(3) Before a student is provided access credentials for an artificial intelligence instructional tool, the educational entity must provide the parent of a minor student with notice that: (a) Identifies the tool and its educational purpose; (b) Describes, in general terms, the manner in which the tool will be used by students; (c) Explains how the parent may exercise the opt-out process under subsection (4); and (d) Explains how the parent may access the student's account or request access to information and account activity under subsection (5), including the method for submitting a written request. (4) (a) A parent of a minor student must be provided the opportunity to opt out of the student's use of an artificial intelligence instructional tool. (b) The opt-out process must align with the educational entity's existing policies for parental notice, consent, objection, or opt out for instructional materials, digital tools, or online accounts, as applicable. (c) If a parent opts out of a student's use of an artificial intelligence instructional tool and the student is enrolled in a public school, the school district or public school must provide an alternative instructional activity that allows the student to meet a comparative educational requirement without penalty.
Other · Education
Fla. Stat. § 1006.1495(5)
Plain Language
Operators of AI instructional tools must provide educational entities with a means to authorize parental access to student account information at the time student access is granted. This may be satisfied by either (1) providing read-only parental credentials simultaneously with student access, or (2) responding to written parental requests within 30 days. Operators and educational entities are not required to create or retain transcripts of student interactions beyond what they maintain in the ordinary course. The educational entity must inform parents of their right to request access.
Statutory Text
(a) At the time an operator provides a student's access credentials or otherwise provides or enables student access to an educational entity for an artificial intelligence instructional tool, the operator shall simultaneously provide to the educational entity a means to authorize the parent of a minor student to access information and account activity maintained within the artificial intelligence instructional tool. (b) The operator may satisfy paragraph (a) by: 1. Providing the parent of a minor student credentials or another method for read-only access to the student's account; or 2. Upon written request from the parent of a minor student, providing access to the information and account activity maintained within the tool, in accordance with applicable state and federal law, within 30 days after receipt of the request. The educational entity shall inform the parent of the right to make such a request and the method for submitting the request. (c) If an educational entity satisfies subparagraph (b)1., the educational entity must provide the credentials or other access method at the time the educational entity provides the student with access credentials or otherwise enables student access. (d) This subsection does not require an operator or educational entity to create or retain a transcript or record of student interactions beyond information otherwise maintained in the ordinary course of providing access to the tool.