D-01
Data Governance
Automated Processing Rights & Data Controls
Individuals have rights to know, correct, and in some jurisdictions opt out of automated processing of their personal data for consequential decisions. Organizations face restrictions on using sensitive personal attributes in AI decision-making and must minimize data collection to what is necessary for the stated purpose. AI-generated inferences and derived attributes are themselves subject to these controls.
Applies to: Developer, Deployer, Manufacturer, Professional, Government. Sectors: Employment, Financial Services, Healthcare
Bills — Enacted: 2 unique bills
Bills — Proposed: 79
Last Updated: 2026-03-29
Sub-Obligations (7)
ID
Name & Description
Enacted
Proposed
D-01.1
Right to know: Individuals have the right to know that their personal data is being used in an automated decision-making system, and in some jurisdictions to receive a description of the categories of data used.
2 enacted
21 proposed
D-01.2
Right to correct: Individuals have the right to correct inaccurate personal data used in automated decisions, and to have the correction reflected in pending and future decisions — not just in the underlying record.
1 enacted
14 proposed
D-01.3
Right to opt out: Individuals have the right to opt out of automated processing of their personal data for consequential decisions.
1 enacted
12 proposed
D-01.4
Data minimization: Data collected and generated in connection with AI systems — including behavioral data, inferences, and derived attributes — must be limited to what is necessary for the AI system's stated purpose. Secondary uses require separate justification.
1 enacted
50 proposed
D-01.5
Sensitive attribute restrictions: AI systems may not use sensitive personal attributes (race, gender, religion, health status, sexual orientation, national origin, disability) as direct inputs to consequential automated decisions except where expressly permitted. Proxy variable restrictions also apply — systems may not be designed to infer sensitive attributes from non-sensitive proxies for use in consequential decisions.
1 enacted
13 proposed
D-01.6
Age-Differentiated Parental Control and Privacy Tools: Operators must provide minor-specific and under-thirteen parental or guardian tools for managing privacy and account settings, including control over interaction data retention for personalization, use of personal data for AI training, and account deletion. Age assurance data must be minimized and immediately deleted upon determination.
0 enacted
7 proposed
D-01.8
Biometric Data Pre-Collection Consent: Entities must provide written notice and obtain affirmative opt-in consent from individuals before collecting any biometric identifier, including specific notice of identifier type and collection purpose. Consent obtained from publicly available sources is insufficient unless the individual themselves made the data publicly available.
0 enacted
18 proposed
Bills That Map This Requirement (81 bills)
Bill
Status
Sub-Obligations
Section
Pending 2026-10-01
D-01.4
Section 2(f)
Plain Language
Covered entities are restricted to collecting and storing only the minimum amount of user information needed for a legitimate purpose. The information must be sufficient for, relevant to, and the minimum needed for that purpose — a classic data minimization standard. Additionally, the collected information must not conflict with a 'trusted party's best interests,' though the statute does not define 'trusted party.' This applies to all data collection in the chatbot context, not only to minor users.
(f) Each covered entity shall collect and store only information that does not conflict with a trusted party's best interests, which must be: (1) Sufficient to fulfill a legitimate purpose of the covered entity; (2) Relevant to the legitimate purpose of the covered entity; and (3) The minimum amount of information needed for the legitimate purpose of the covered entity.
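The three-part minimization test in subsection (f) — sufficient, relevant, and minimum necessary — lends itself to a per-purpose allow-list at the point of collection. A minimal sketch, assuming hypothetical field and purpose names not drawn from the bill:

```python
# Hypothetical operationalization of a data minimization standard:
# each declared legitimate purpose maps to the minimum set of fields
# deemed sufficient and relevant for it; everything else is dropped.
PURPOSE_FIELDS = {
    "account_creation": {"email", "display_name"},
    "age_assurance": {"birth_year"},
}

def minimize(collected: dict, purpose: str) -> dict:
    """Keep only fields necessary for the stated purpose; reject
    collection for purposes that were never declared."""
    if purpose not in PURPOSE_FIELDS:
        raise ValueError(f"no declared legitimate purpose: {purpose}")
    allowed = PURPOSE_FIELDS[purpose]
    return {k: v for k, v in collected.items() if k in allowed}

raw = {"email": "a@example.com", "display_name": "A", "location": "Phoenix"}
trimmed = minimize(raw, "account_creation")  # location is dropped
```

The allow-list is a policy artifact the covered entity would have to justify; the code only enforces it.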
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(1)
Plain Language
Chatbot providers may not use personal data to generate chatbot outputs unless doing so is necessary to fulfill a specific user request and the user has given affirmative consent through a robust consent mechanism. Consent cannot be obtained through broad terms of use, dark patterns, or user inaction. The consent request must be a standalone disclosure, written in plain language, accessible to users with disabilities, and available in the chatbot's operating language. The option to decline must be at least as prominent as the option to accept.
A chatbot provider may not: 1. Process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using chat logs — meaning both user inputs and chatbot outputs — for any advertising purpose. This includes determining whether to show ads, selecting which product or service to advertise, and customizing ad content for individual users. There is no consent carve-out for this prohibition — even with user consent, chat logs may not be used for advertising targeting or customization.
A chatbot provider may not: 2. Process a user's chat log: (a) To determine whether to display an advertisement for a product or service to a user. (b) To determine a product or service or category of a product or service to advertise to a user. (c) To customize an advertisement for presentation to a user.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(3)(a)-(b)
Plain Language
When a chatbot provider knows or reasonably should know that a user is a minor — based on objective circumstances — the provider may not process the minor's chat logs or personal data at all without affirmative parental or guardian consent. This includes a separate prohibition on using minor users' data for model training without parental consent. The knowledge standard is constructive — 'reasonably should have known based on knowledge of objective circumstances' — meaning providers cannot ignore obvious indicators of minor status. Safety testing and legally required compliance actions are carved out of the definition of 'training.'
A chatbot provider may not: 3. Process a user's chat log and personal data: (a) If the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent. (b) For training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(3)(c)
Plain Language
Chatbot providers may not use adult users' chat logs or personal data for model training without first obtaining affirmative consent. This requirement applies regardless of context — any use of user interaction data to adjust or modify the underlying model requires opt-in consent. Safety testing and legally required actions are carved out of the definition of 'training' and do not require separate consent.
A chatbot provider may not: 3. Process a user's chat log and personal data: (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(3)(d), (A)(4)
Plain Language
Chatbot providers may not use chat logs, personal data, or input data to profile users — meaning to classify or infer personality traits and behavioral characteristics — beyond what is strictly necessary to fulfill a specific user request. This is a double prohibition: one on using chat logs and personal data for profiling beyond necessity, and a second standalone prohibition on profiling based on personality or behavioral classifications beyond necessity. Processing chat logs for user safety or legal compliance is excluded from the definition of profiling.
A chatbot provider may not: 3. Process a user's chat log and personal data: (d) To engage in profiling beyond what is necessary to fulfill an express request. 4. Profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user.
Pending 2026-01-01
D-01.1, D-01.2
A.R.S. § 44-1383.01(B)
Plain Language
Users have a right to access their own chat logs at any time, and chatbot providers must produce them upon request in a downloadable, easy-to-read format. Providers may not retaliate against users who exercise this access right. This is an on-demand data access right — there is no waiting period or limitation on frequency of requests.
A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user pursuant to subsection A paragraph 7 of this section that requests the user's chat.
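The "downloadable and easy to read format" requirement could be met by serializing the user's own chat log on request; a sketch under the assumption that logs are stored as role/text records (the statute does not prescribe a structure):

```python
import json

# Illustrative on-demand access export: the user's own chat log,
# pretty-printed for download. Record shape is an assumption.
def export_chat_log(chat_log: list[dict]) -> str:
    """Serialize a chat log as human-readable JSON."""
    return json.dumps(chat_log, indent=2, ensure_ascii=False)

log = [
    {"role": "user", "text": "What are your data practices?"},
    {"role": "chatbot", "text": "Here is a summary."},
]
payload = export_chat_log(log)
```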
Failed 2026-01-01
D-01.4
Lab. Code § 1524(b)
Plain Language
Employers may not use an ADS to collect worker data for any purpose beyond what was disclosed in the pre-use notice required by § 1522. This is a purpose limitation — data collection through the ADS is constrained to what was described in the mandatory notice. Because 'worker data' includes inferred and derived information, this restriction extends beyond directly collected data to encompass any data the ADS generates about workers. If an employer wants to collect new categories of worker data via the ADS, they must first update and re-issue the required notice.
(b) An employer shall not use an ADS to collect worker data for a purpose that is not disclosed pursuant to the notice requirements in Chapter 2 (commencing with Section 1522).
Failed 2026-01-01
D-01.1, D-01.2
Lab. Code § 1524(e)-(f)
Plain Language
Workers have the right to request a copy of their own data from the most recent 12 months that was primarily used by an ADS to make a discipline, termination, or deactivation decision. This right may be exercised once per 12-month period. When providing worker data, the employer must anonymize any personal information belonging to customers, other workers, or other individuals to protect third-party privacy. The right to access data is tied to the pre-use notice obligation in § 1522(e)(6), which requires employers to describe workers' access and correction rights. This is an on-demand access right, not a proactive disclosure.
(e) A worker shall have the right to request, and an employer shall provide, a copy of the most recent 12 months of the worker's own data primarily used by an ADS to make a discipline, termination, or deactivation decision. A worker is limited to one request every 12 months for a copy of their own data used by an ADS to make a discipline, termination, or deactivation decision. (f) For purposes of safeguarding the privacy rights of consumers, workers, and individuals, when an employer is required to provide worker data pursuant to this part, that worker data shall be provided in a manner that anonymizes the customer's, other worker's, or individual's personal information.
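Subsection (f)'s anonymization duty means the worker receives their own ADS records with other individuals' identifiers masked. A sketch, with record fields invented for illustration:

```python
# Hypothetical third-party anonymization pass over a worker's ADS
# records before fulfilling an access request: customer and co-worker
# identifiers are masked; the worker's own data is left intact.
THIRD_PARTY_FIELDS = ("customer_name", "other_worker_name")

def anonymize_third_parties(records: list[dict]) -> list[dict]:
    """Return copies of the records with third-party personal
    information replaced by a placeholder."""
    out = []
    for rec in records:
        rec = dict(rec)  # avoid mutating the stored record
        for field in THIRD_PARTY_FIELDS:
            if field in rec:
                rec[field] = "[anonymized]"
        out.append(rec)
    return out
```

A real implementation would also have to catch third-party data embedded in free text, which a field-level mask cannot.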
Pending 2027-01-01
D-01.5
Lab. Code § 1522(a)(5)
Plain Language
Employers may not use individualized worker data as ADS inputs or outputs to inform compensation decisions unless they can affirmatively demonstrate that any resulting compensation differences for substantially similar work are based on cost differentials in performing the task or that the data was directly related to the tasks the worker was hired to perform. The burden of justification falls on the employer. 'Individualized' is defined broadly to include data specific to groups or tiers of individuals with particular personal characteristics, behaviors, or biometrics — not just individual-level data.
(5) Use or rely upon individualized worker data as inputs or outputs to inform compensation unless the employer can clearly demonstrate that any differences in compensation for substantially similar or comparable work assignments are based upon cost differentials in performing the task involved, or that the data was directly related to the tasks that the worker was hired to perform.
Pending 2027-01-01
D-01.1, D-01.2
Lab. Code § 1522(e)-(f)
Plain Language
Workers have the right to request and receive a copy of the most recent 12 months of their own data that was primarily used by an ADS to make a disciplinary, termination, or deactivation decision. This right is limited to one request per 12-month period. When providing the data, employers must anonymize any personal information belonging to customers, other workers, or other individuals — the worker receives their own data but not anyone else's identifiable information. This is a data access right specific to ADS-related employment decisions, not a general right to all data the employer holds.
(e) A worker shall have the right to request, and an employer shall provide, a copy of the most recent 12 months of the worker's own data primarily used by an ADS to make a disciplinary, termination, or deactivation decision. A worker is limited to one request every 12 months for a copy of their own data used by an ADS to make a disciplinary, termination, or deactivation decision.
(f) For purposes of safeguarding the privacy rights of consumers, workers, and individuals, when an employer is required to provide worker data pursuant to this part, that worker data shall be provided in a manner that anonymizes the customer's, other worker's, or individual's personal information.
Pending 2027-01-01
D-01.6
C.R.S. § 6-1-1708(1)(f)
Plain Language
Operators must provide minor users with tools to manage their privacy and account settings, including controls over (1) whether the AI retains substantive interaction data for personalization and (2) whether the minor's personal data is used to train the AI. For minors under 13, parental or guardian tools must be provided. For minors 13 and older, parental tools must also be offered, but on a risk-appropriate basis — giving operators some discretion in what parental controls to offer for older minors. All three sub-provisions apply simultaneously when the user is under 13.
On and after January 1, 2027, if an operator knows or has reasonable certainty that a user of a conversational artificial intelligence service is a minor, the operator shall: (f) (I) Offer tools for the minor user to manage the minor user's privacy and account settings, including the ability to control whether the conversational artificial intelligence service retains substantive information from each interaction with the conversational artificial intelligence service for the purpose of personalizing the content of future interactions and whether the minor user's personal data is used for the purposes of training the conversational artificial intelligence service; (II) For a minor user who is under thirteen years old, offer tools for a parent or guardian of the minor user to manage the minor user's privacy and account settings; and (III) For a minor user who is thirteen years old or older, offer tools for a parent or guardian of the minor user to manage the minor user's privacy and account settings as appropriate, based on relevant risks.
Pending 2027-01-01
D-01.2
C.R.S. § 6-1-1705(1)(a)(I), (1)(b), (1)(c)
Plain Language
When a consumer experiences an adverse outcome from a consequential decision materially influenced by a covered ADMT, the consumer may request instructions for accessing their personal data and correcting factually incorrect or materially inaccurate personal data used in the decision, consistent with the Colorado Privacy Act (§ 6-1-1306). The data correction right applies broadly — certain CPA consumer definition exceptions and processing exceptions do not limit this right. However, the correction right does not extend to opinions, predictions, scores, or protected evaluations. The attorney general must adopt implementing rules by January 1, 2027.
(1) (a) WHEN A CONSUMER EXPERIENCES AN ADVERSE OUTCOME RESULTING FROM A CONSEQUENTIAL DECISION IN WHICH A COVERED ADMT MATERIALLY INFLUENCES THE CONSEQUENTIAL DECISION, THE CONSUMER MAY REQUEST AND THE DEPLOYER SHALL PROVIDE IN RESPONSE TO THE REQUEST: (I) INSTRUCTIONS FOR REQUESTING PERSONAL DATA AND CORRECTING FACTUALLY INCORRECT OR MATERIALLY INACCURATE PERSONAL DATA USED IN A CONSEQUENTIAL DECISION THAT USED A COVERED ADMT CONSISTENT WITH SECTION 6-1-1306; ... (b) FOR THE PURPOSES OF THIS SUBSECTION (1), THE EXCEPTIONS TO THE DEFINITION OF "CONSUMER" IN SECTION 6-1-1303 (6)(b) AND THE EXCEPTIONS IN SECTION 6-1-1304 (2)(k), (2)(n), AND (2)(o) DO NOT APPLY TO THE RIGHT TO REQUEST CORRECTION OF FACTUALLY INCORRECT OR MATERIALLY INACCURATE PERSONAL DATA PURSUANT TO THIS SUBSECTION (1). (c) SUBSECTION (1)(a) OF THIS SECTION DOES NOT REQUIRE CORRECTION OF OPINIONS, PREDICTIONS, SCORES, OR PROTECTED EVALUATIONS.
Enacted 2023-07-01
D-01.3
C.R.S. § 6-1-1306(1)(a)
Plain Language
Colorado consumers have the right to opt out of three categories of data processing: targeted advertising, the sale of personal data, and profiling used to make consequential decisions (covering financial services, housing, insurance, education, employment, healthcare, and essential goods/services). Controllers that engage in targeted advertising or data sales must provide a clear and conspicuous opt-out method both in and outside the privacy notice. Starting July 1, 2024, controllers must also honor a user-selected universal opt-out mechanism meeting AG specifications. However, a controller may obtain specific, informed consent that overrides the universal opt-out — but only after providing clear notice of available choices and enabling equally easy revocation of that consent. Authorized agents may exercise opt-out rights on behalf of consumers.
(a) Right to opt out. (I) A consumer has the right to opt out of the processing of personal data concerning the consumer for purposes of: (A) Targeted advertising; (B) The sale of personal data; or (C) Profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer. (II) A consumer may authorize another person, acting on the consumer's behalf, to opt out of the processing of the consumer's personal data for one or more of the purposes specified in subsection (1)(a)(I) of this section, including through a technology indicating the consumer's intent to opt out such as a web link indicating a preference or browser setting, browser extension, or global device setting. A controller shall comply with an opt-out request received from a person authorized by the consumer to act on the consumer's behalf if the controller is able to authenticate, with commercially reasonable effort, the identity of the consumer and the authorized agent's authority to act on the consumer's behalf. (III) A controller that processes personal data for purposes of targeted advertising or the sale of personal data shall provide a clear and conspicuous method to exercise the right to opt out of the processing of personal data concerning the consumer pursuant to subsection (1)(a)(I) of this section. The controller shall provide the opt-out method clearly and conspicuously in any privacy notice required to be provided to consumers under this part 13, and in a clear, conspicuous, and readily accessible location outside the privacy notice. 
(IV) (A) A controller that processes personal data for purposes of targeted advertising or the sale of personal data may allow consumers to exercise the right to opt out of the processing of personal data concerning the consumer for purposes of targeted advertising or the sale of personal data pursuant to subsections (1)(a)(I)(A) and (1)(a)(I)(B) of this section by controllers through a user-selected universal opt-out mechanism that meets the technical specifications established by the attorney general pursuant to section 6-1-1313. This subsection (1)(a)(IV)(A) is repealed, effective July 1, 2024. (B) Effective July 1, 2024, a controller that processes personal data for purposes of targeted advertising or the sale of personal data shall allow consumers to exercise the right to opt out of the processing of personal data concerning the consumer for purposes of targeted advertising or the sale of personal data pursuant to subsections (1)(a)(I)(A) and (1)(a)(I)(B) of this section by controllers through a user-selected universal opt-out mechanism that meets the technical specifications established by the attorney general pursuant to section 6-1-1313. (C) Notwithstanding a consumer's decision to exercise the right to opt out of the processing of personal data through a universal opt-out mechanism pursuant to subsection (1)(a)(IV)(B) of this section, a controller may enable the consumer to consent, through a web page, application, or a similar method, to the processing of the consumer's personal data for purposes of targeted advertising or the sale of personal data, and the consent takes precedence over any choice reflected through the universal opt-out mechanism. 
Before obtaining a consumer's consent to process personal data for purposes of targeted advertising or the sale of personal data pursuant to this subsection (1)(a)(IV)(C), a controller shall provide the consumer with a clear and conspicuous notice informing the consumer about the choices available under this section, describing the categories of personal data to be processed and the purposes for which they will be processed, and explaining how and where the consumer may withdraw consent. The web page, application, or other means by which a controller obtains a consumer's consent to process personal data for purposes of targeted advertising or the sale of personal data must also allow the consumer to revoke the consent as easily as it is affirmatively provided.
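The opt-out and consent-precedence logic above can be sketched as a request-time check. The `Sec-GPC: 1` header (Global Privacy Control) is one commonly cited universal opt-out signal; the consent flag and header-dict request shape are assumptions for illustration:

```python
# Sketch of § 6-1-1306(1)(a)(IV) logic: honor a universal opt-out
# signal for targeted advertising / data sales, except where specific,
# informed consent is on file, which takes precedence.
def may_process_for_targeted_ads(headers: dict, consent_on_file: bool) -> bool:
    """True if processing for targeted advertising is permitted."""
    universal_opt_out = headers.get("Sec-GPC") == "1"
    if universal_opt_out and not consent_on_file:
        return False  # the universal signal must be honored
    return True  # no signal, or specific consent overrides it
```

The statute also conditions the consent override on notice and equally easy revocation, which are process requirements outside this check.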
Enacted 2023-07-01
D-01.1, D-01.2
C.R.S. § 6-1-1306(1)(b)-(e)
Plain Language
Consumers have four core data rights: (1) the right to confirm whether their data is being processed and to access it; (2) the right to correct inaccuracies; (3) the right to delete their personal data; and (4) the right to data portability — receiving their data in a portable, usable format to transmit to another entity (limited to twice per calendar year, with a trade secret carve-out). These rights must be exercised through the methods described in the controller's privacy notice.
(b) Right of access. A consumer has the right to confirm whether a controller is processing personal data concerning the consumer and to access the consumer's personal data. (c) Right to correction. A consumer has the right to correct inaccuracies in the consumer's personal data, taking into account the nature of the personal data and the purposes of the processing of the consumer's personal data. (d) Right to deletion. A consumer has the right to delete personal data concerning the consumer. (e) Right to data portability. When exercising the right to access personal data pursuant to subsection (1)(b) of this section, a consumer has the right to obtain the personal data in a portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data to another entity without hindrance. A consumer may exercise this right no more than two times per calendar year. Nothing in this subsection (1)(e) requires a controller to provide the data to the consumer in a manner that would disclose the controller's trade secrets.
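The portability right's twice-per-calendar-year cap in subsection (1)(e) is a concrete, testable rule; a sketch assuming the controller keeps a history of prior request dates:

```python
import datetime

# Sketch of the (1)(e) frequency limit: a consumer may exercise the
# portability right no more than two times per calendar year.
def portability_allowed(prior_requests: list[datetime.date],
                        today: datetime.date) -> bool:
    """True if a new portability request may be fulfilled."""
    this_year = [d for d in prior_requests if d.year == today.year]
    return len(this_year) < 2
```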
Enacted 2023-07-01
D-01.4
C.R.S. § 6-1-1308(3)-(4)
Plain Language
Controllers must limit their data collection to what is adequate, relevant, and reasonably necessary for the stated processing purposes. They may not repurpose personal data for secondary uses that are incompatible with the original stated purposes unless the consumer provides fresh consent. These are foundational data minimization and purpose limitation obligations that apply across all data processing activities.
(3) Duty of data minimization. A controller's collection of personal data must be adequate, relevant, and limited to what is reasonably necessary in relation to the specified purposes for which the data are processed. (4) Duty to avoid secondary use. A controller shall not process personal data for purposes that are not reasonably necessary to or compatible with the specified purposes for which the personal data are processed, unless the controller first obtains the consumer's consent.
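The secondary-use duty in subsection (4) can be modeled as a purpose-compatibility gate with a consent fallback. The compatibility map is a policy input, not statutory text:

```python
# Hypothetical § 6-1-1308(4) gate: data collected for one purpose may
# not be processed for an incompatible purpose without fresh consent.
COMPATIBLE = {
    "order_fulfillment": {"order_fulfillment", "fraud_prevention"},
}

def secondary_use_permitted(collected_for: str, proposed_use: str,
                            fresh_consent: bool) -> bool:
    """True if the proposed use is compatible with the original
    purpose, or the consumer has given new consent."""
    if proposed_use in COMPATIBLE.get(collected_for, {collected_for}):
        return True
    return fresh_consent  # incompatible purposes need new consent
```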
Enacted 2023-07-01
D-01.5
C.R.S. § 6-1-1308(7)
Plain Language
Controllers must obtain affirmative, informed consent before processing any sensitive data — including data revealing race, ethnicity, religion, health conditions, sex life, sexual orientation, citizenship, genetic data, biometric data, or data from known children. For children's data, consent must come from a parent or lawful guardian. Consent must be freely given, specific, informed, and unambiguous; bundled terms-of-service acceptance, passive interactions, and dark patterns do not qualify.
(7) Duty regarding sensitive data. A controller shall not process a consumer's sensitive data without first obtaining the consumer's consent or, in the case of the processing of personal data concerning a known child, without first obtaining consent from the child's parent or lawful guardian.
Pending 2026-10-01
D-01.1
Sec. 4
Plain Language
Before collecting any personal data from an applicant or employee for use in an automated employment-related decision process, the deployer must provide written notice covering: the purpose of collection, the categories of data to be collected, the retention period, who will have access to the data, and information about the right to opt out of personal data processing under Connecticut's existing data privacy law (§ 42-518). This is a pre-collection notice obligation — it must be provided before data collection begins, not at the time of the decision.
Except as provided in subsection (b) of section 2 of this act, prior to collecting any personal data of an applicant for employment or employee in the state for processing in an automated employment-related decision process, a deployer shall provide to such applicant or employee a written notice disclosing: (1) The purpose of such data collection; (2) The categories of personal data that will be collected for processing in such automated employment-related decision process; (3) The retention period for any personal data collected; (4) The categories of persons who will have access to such personal data; and (5) Information concerning the right, under subparagraph (C) of subdivision (5) of subsection (a) of section 42-518 of the general statutes, to opt out of the processing of personal data for the purposes set forth in said subparagraph.
Enacted 2023-01-01
D-01.1
N.Y.C. Admin. Code § 20-871(b)(3)
Plain Language
Employers and employment agencies must make available to NYC-resident candidates and employees, upon written request, information about the type of data collected by the AEDT, the source of that data, and the employer's data retention policy. This information must be provided within 30 days of a written request. Per the implementing rules, employers may satisfy this by posting the information on the employment section of their website along with instructions for how to make a written request. A carve-out applies where disclosure would violate local, state, or federal law or interfere with a law enforcement investigation — in that case, the employer must explain why disclosure cannot be made.
3. If not disclosed on the employer or employment agency's website, information about the type of data collected for the automated employment decision tool, the source of such data and the employer or employment agency's data retention policy shall be available upon written request by a candidate or employee. Such information shall be provided within 30 days of the written request. Information pursuant to this section shall not be disclosed where such disclosure would violate local, state, or federal law, or interfere with a law enforcement investigation.
Failed 2026-07-01
D-01.4
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose user personal information unless it is deidentified. This is a near-absolute prohibition on personal data sales — the only exceptions are for data that has been properly deidentified or for disclosures specifically authorized by federal law. Companies holding deidentified data must take reasonable measures to prevent re-association, maintain data in deidentified form, impose contractual flow-down on recipients requiring compliance, and implement business processes to prevent inadvertent release. The only permitted reidentification attempt is for testing the company's own deidentification processes. Compliance can be demonstrated through a risk management program aligned with the NIST AI RMF and ISO 42001 with controls for deidentification, contractual flow-down, non-reidentification, inadvertent release prevention, monitoring, and auditing.
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
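A minimal sketch of the deidentification duties in subsection (2), assuming illustrative field names: direct identifiers are stripped and the user key is replaced with a salted one-way token. The contractual flow-down, monitoring, and process controls the section also requires cannot be supplied by code:

```python
import hashlib

# Hypothetical deidentification pass before any sale or disclosure:
# drop direct identifiers, tokenize the user key so records cannot
# be trivially re-associated with a user.
DIRECT_IDENTIFIERS = ("name", "email", "phone", "address")

def deidentify(record: dict, salt: bytes) -> dict:
    """Return a copy with direct identifiers removed and the
    user key replaced by a salted SHA-256 token."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "user_id" in out:
        digest = hashlib.sha256(salt + str(out["user_id"]).encode())
        out["user_id"] = digest.hexdigest()[:16]
    return out
```

Note the statute treats reidentification testing of the company's own process as the only permitted reidentification attempt; this sketch does not model that carve-out.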
Failed 2026-07-01
D-01.3
Fla. Stat. § 1006.1495(4)
Plain Language
Parents must be given the opportunity to opt their minor student out of using an AI instructional tool. The opt-out process must align with the educational entity's existing policies for instructional materials and digital tools. If a parent opts out and the student attends a public school, the school must provide an alternative instructional activity that allows the student to meet a comparable educational requirement without penalty. This ensures no student is academically disadvantaged for a parent's decision to opt out of AI tools.
(a) A parent of a minor student must be provided the opportunity to opt out of the student's use of an artificial intelligence instructional tool. (b) The opt-out process must align with the educational entity's existing policies for parental notice, consent, objection, or opt out for instructional materials, digital tools, or online accounts, as applicable. (c) If a parent opts out of a student's use of an artificial intelligence instructional tool and the student is enrolled in a public school, the school district or public school must provide an alternative instructional activity that allows the student to meet a comparative educational requirement without penalty.
Failed 2026-07-01
D-01.4
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose user personal information unless it has been deidentified — i.e., it cannot reasonably be linked to an identified or identifiable individual or their device. Where the company possesses deidentified data, it must: (1) take reasonable measures to prevent re-association with users, (2) maintain data in deidentified form and not attempt reidentification (except to test its own deidentification processes), (3) contractually bind recipients to the same obligations, and (4) implement business processes to prevent inadvertent release. Sales or disclosures specifically authorized by federal law are exempt. The safe harbor allows a company to demonstrate compliance by showing a risk management program validated against NIST AI RMF or ISO 42001 with assessed controls for deidentification, contractual flow-down, non-reidentification, inadvertent release prevention, monitoring, and auditing.
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
Failed 2026-07-01
D-01.6
Fla. Stat. § 501.1739(6)
Plain Language
Operators must protect the confidentiality of all age verification information provided by users, in accordance with the requirements of section 501.1738 (Florida's existing age verification confidentiality framework). This incorporates by reference the data protection standards of that statute, which generally require minimization of age verification data and prohibit secondary uses.
(6) An operator shall protect the confidentiality of age information provided by a user for age verification in accordance with s. 501.1738.
Failed 2026-07-01
D-01.4
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose users' personal information unless the information has been deidentified — meaning it cannot reasonably be linked to an identified or identifiable individual or their device. Sales authorized by federal law are excepted. Companies holding deidentified data must take reasonable measures to prevent re-association with users, maintain data in deidentified form, not attempt reidentification (except for testing deidentification processes), contractually require recipients to comply with the same rules, and implement processes to prevent inadvertent release. During enforcement, companies may present evidence of a risk management program aligned with NIST AI RMF/ISO 42001 that includes assessed controls for deidentification, contractual flow-down, non-reidentification, and auditing.
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
Pending 2028-07-01
D-01.3
HRS § 321-__ (Consequential decisions; notice; statement; opt-out; corrections; appeal)(a)(4)
Plain Language
Patients must be given the right to opt out of having their individually identifiable health information or other personal data processed for profiling purposes when those profiling outputs are used to further decisions with legal or similarly significant effects. This opt-out must be offered as part of the pre-decision written notice. The opt-out right is specifically scoped to profiling — automated processing that evaluates, analyzes, or predicts personal aspects — not to all AI processing in general.
(4) Allows the patient to opt out of the processing of the patient's individually identifiable health information or other personal data for purposes of profiling in furtherance of decisions that have legal or similarly significant effects concerning the patient.
Pending 2025-07-01
D-01.4
§ 554J.2(2)
Plain Language
Deployers must minimize the collection and storage of user information gathered by the chatbot to only what is necessary for the deployer's stated purpose in making the chatbot publicly available. This is a data minimization obligation — it prohibits collecting data beyond what is functionally necessary. The standard is tied to the deployer's purpose, which creates some ambiguity about how broadly or narrowly that purpose may be defined.
A deployer of a chatbot shall do all of the following: ... 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
Pending
D-01.4
§ 554J.2(1)(b)
Plain Language
Deployers must minimize the collection and storage of user information gathered by their public-facing chatbot to only what is necessary to fulfill the deployer's stated purpose for making the chatbot publicly available. This is a data minimization obligation — deployers may not collect or retain user data beyond what the chatbot's stated purpose requires.
b. Limit the collection and storage of user information collected by the public-facing chatbot to what is necessary to fulfill the deployer's purpose for making the public-facing chatbot publicly available.
Pending
D-01.4
§ 554J.2(2)
Plain Language
Deployers must apply data minimization principles to all user information collected by the chatbot. Collection and storage must be limited to what is necessary to fulfill the deployer's stated purpose for making the chatbot publicly available. This prohibits collecting user data for purposes beyond the chatbot's core function — secondary uses such as training other models, cross-product profiling, or advertising would need to be justified as necessary to the stated purpose. The statute does not define 'necessary' or specify retention periods.
A deployer of a chatbot shall do all of the following: 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
Pending 2026-07-01
D-01.1, D-01.2
Iowa Code § 91F.5
Plain Language
Employees have the right to request a copy of the most recent 12 months of their own data that was primarily used by an ADS for discipline, termination, or deactivation decisions. The employer must provide the copy upon request. This right is limited to one request per 12-month period. This is a data access right — it allows employees to see what information drove adverse automated decisions about them, which is a prerequisite to challenging or correcting that data.
An employee has the right to request a copy of the most recent twelve months of the employee's own data primarily used by an automated decision system to make a discipline, termination, or deactivation decision. An employer shall provide a copy upon request. An employee is limited to one such request every twelve months.
Pending 2026-07-01
D-01.4
Iowa Code § 91F.6
Plain Language
When an employer provides employee data under this chapter (e.g., in response to a data access request), the employer must anonymize the personal information of any customer, other employee, or other individual contained in that data. This is a privacy safeguard that ensures data disclosures under this chapter do not expose third parties' personal information.
For purposes of safeguarding the privacy rights of consumers, employees, and individuals, when an employer is required to provide employee data pursuant to this chapter, the employer shall provide the data in a manner that anonymizes the personal information of any customer, employee, or other individual.
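As an illustration only (not drawn from the bill), the required anonymization of third parties' information before an employee data disclosure could look like the sketch below; the record shape and field names are assumptions:

```python
def anonymize_third_party_info(records, requester_id):
    """Hypothetical redaction pass: before providing an employee a copy of
    their ADS data, blank out values identifying anyone other than the
    requesting employee (customers, coworkers, other individuals)."""
    THIRD_PARTY_FIELDS = {"customer_name", "customer_email", "coworker_name"}  # assumed schema
    out = []
    for rec in records:
        clean = dict(rec)
        for field in THIRD_PARTY_FIELDS & clean.keys():
            clean[field] = "[REDACTED]"
        # Identifiers of employees other than the requester are third-party data too.
        if clean.get("employee_id") not in (None, requester_id):
            clean["employee_id"] = "[REDACTED]"
        out.append(clean)
    return out
```

The key design point is that the requester's own identifier survives the pass, so the disclosure remains usable by the employee while everyone else's personal information is masked.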
Pending
D-01.4
§ 554J.2(1)(a)-(c)
Plain Language
Any private entity that possesses biometric data must create a written retention and destruction policy specifying how long it will retain biometric data, make that policy publicly available, and destroy biometric data no later than three years after the individual's last interaction with the entity or when the collection purpose is accomplished, whichever is longer. This combines a data minimization and retention limit obligation with a public transparency requirement for the retention policy itself.
1. a. A private entity in possession of biometric data shall develop a written policy to establish a schedule for how long the private entity will retain biometric data before the private entity destroys the biometric data. b. A written policy shall be available to the public. c. A private entity shall not retain biometric data for more than three years after the subject of the biometric data last interacts with the private entity or until the purposes for which the biometric data was collected have been accomplished, whichever is longer.
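One reading of the "whichever is longer" retention rule can be sketched as a deadline computation. This is an interpretive illustration only, with "three years" approximated as 1095 days:

```python
from datetime import date, timedelta

def destruction_deadline(last_interaction, purpose_accomplished):
    """Destruction is due at the later of (a) three years after the
    individual's last interaction and (b) the date the collection purpose
    was accomplished. While the purpose is still outstanding, the longer
    of the two periods is not yet fixed, so no deadline is returned."""
    three_year_limit = last_interaction + timedelta(days=3 * 365)  # approximation
    if purpose_accomplished is None:
        return None
    return max(three_year_limit, purpose_accomplished)
```

For example, if the purpose was accomplished well after the three-year mark, the purpose date controls; if it was accomplished early, the three-year clock controls.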
Pending
D-01.8
§ 554J.2(2)(a)-(b)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any individual's biometric data, a private entity must provide written notice to the individual (or their legal representative) of two things: (1) that the entity intends to collect the individual's biometric data, and (2) the specific purposes for and length of time the entity intends to retain that data. This is a pre-collection written notice requirement — collection cannot occur until the notice has been given. The bill does not require affirmative opt-in consent; written notice alone satisfies the obligation.
2. A private entity shall not collect, capture, purchase, or otherwise obtain an individual's biometric data unless, prior to receiving the biometric data, the private entity does all of the following: a. Informs the subject of the biometric data, or the subject's legal representative, in writing, that the private entity intends to collect the subject's biometric data. b. Informs the subject of the biometric data, or the subject's legal representative, in writing, of the purposes and length of time for which the private entity intends to retain the biometric data.
Pending
§ 554J.2(3)
Plain Language
Private entities are categorically prohibited from selling, leasing, trading, or otherwise profiting from any individual's biometric data. This is an absolute prohibition with no exceptions — there is no consent mechanism that would allow monetization of biometric data.
3. A private entity shall not sell, lease, trade, or otherwise profit from an individual's biometric data.
Pending 2026-07-01
D-01.8
Idaho Code § 48-2101(2)(a)-(b)
Plain Language
Before capturing any biometric identifier for a commercial purpose, a person must inform the individual and obtain the individual's consent. Consent cannot be inferred from the mere existence of an image or media containing the individual's biometric identifiers on the internet or other publicly available sources — the individual themselves must have made that data publicly available for it to count. This is an affirmative opt-in consent requirement with a specific carve-out prohibiting constructive consent from scraped public data.
(2)(a) A person may not capture a biometric identifier of an individual for a commercial purpose unless the person: (i) Informs the individual before capturing the biometric identifier; and (ii) Receives the individual's consent to capture the biometric identifier. (b) For the purposes of this subsection, an individual has not been informed of and has not provided consent for the capture or storage of a biometric identifier for a commercial purpose based solely on the existence of an image or other media containing one (1) or more biometric identifiers of the individual on the internet or other publicly available source unless the image or other media was made publicly available by the individual to whom the biometric identifiers relate.
Pending 2026-07-01
D-01.3
Idaho Code § 48-2101(3)(d)
Plain Language
Entities must provide a mechanism for individuals to revoke consent to biometric identifier storage and transmission at any time. Upon receiving a revocation, the entity must immediately destroy the biometric identifier — unless maintaining it is required by another law. This is an ongoing opt-out right, not a one-time election. The destruction obligation upon revocation is immediate, with no cure period or grace window.
(d) Shall provide a method for an individual to revoke consent to the storage and transmission of a biometric identifier at any time and shall immediately destroy the biometric identifier upon receiving a revocation of consent unless maintaining the biometric identifier is required by another law.
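The revocation-then-immediate-destruction flow could be sketched as follows; the storage model and names are hypothetical, not from the statute:

```python
def handle_revocation(store, individual_id, legal_holds):
    """Sketch of the consent-revocation flow: on revocation, destroy the
    stored biometric identifier immediately, unless another law requires
    that it be maintained.
    `store` maps individual_id -> biometric identifier (assumed);
    `legal_holds` is the set of ids whose data another law requires keeping."""
    if individual_id in legal_holds:
        # Retention required by another law: keep the identifier, but the
        # caller should still record that consent was revoked.
        return "retained_under_other_law"
    store.pop(individual_id, None)  # immediate destruction; no grace window
    return "destroyed"
```

Note that the statute's only carve-out is the legal-hold branch; there is no pending-request queue or cure period between revocation and destruction.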
Pending 2026-01-01
D-01.1
Section 10(f)
Plain Language
When an ADMS collects data about employees, those employees and their exclusive bargaining representatives have the right to view the collected data. This is a data access right — the employer must make the collected data available upon request. This goes beyond mere notice of data use and requires actual disclosure of the data itself.
(f) If an automated decision-making system is collecting employee data, employees and their exclusive bargaining representatives have a right to view the data collected by the automated decision-making system.
Pending 2027-01-01
D-01.4
Section 15(b)(2)
Plain Language
Large online platforms are prohibited from retaining any personal provenance data extracted from content shared on the platform. Personal provenance data includes both personal information and device/system/service identifiers reasonably capable of being associated with a particular user. This is an absolute prohibition — there is no exception for operational necessity or user consent.
(b) A large online platform shall not: (2) retain any personal provenance data from content shared on the large online platform.
Pending 2026-01-01
D-01.3
Student Educational Technologies Rights Act § 15(a)(1), (3), (b)
Plain Language
Students and parents have the right to opt out of school-issued electronic devices, electronic textbooks, electronic reading materials, and electronic or online assignments, and to opt out of predictive analytics systems without academic penalty. When an opt-out right is exercised, the school must provide a comparable analog alternative, such as paper assignments, physical textbooks, or printed reading materials. The express no-penalty language is attached specifically to the predictive analytics opt-out, making clear that exercising it cannot affect a student's academic standing. Schools must be prepared to offer non-digital equivalents for all digital educational resources.
(a) It is the policy of this State that a student and the student's parent have the right to: (1) opt out of school-issued personal electronic devices, electronic textbooks, electronic required reading, or electronic or online assignments; (3) opt out of predictive analytics systems without academic penalty. (b) If a student or a student's parent exercises the right outlined in subsection (a), the school shall provide the student with a comparable analog version of what the educational technology provides. As used in this subsection, "comparable analog version" includes, but is not limited to, providing the assignment on physical paper, a physical copy of the required reading, or the option of a physical paper textbook.
Pending 2026-01-01
D-01.4
105 ILCS 85/10(3), (3.5)
Plain Language
Ed-tech operators may not sell or rent student information or data — including covered information — or any other person's information collected for K-12 school purposes. A narrow exception exists for corporate acquisitions if the successor complies with the Act. Separately, operators may not permit AI to train on covered information unless the training is for K-12 school purposes or to improve operability and functionality of the operator's service. This creates a strict purpose-limitation regime: student data collected in the K-12 context is locked to educational purposes, and AI training on that data is similarly restricted. The definition of 'covered information' now expressly includes data gathered through AI and digital replicas.
(3) Sell or rent a student's information or data, including covered information or any other person's information collected by the operator for K through 12 school purposes. This subdivision (3) does not apply to the purchase, merger, or other type of acquisition of an operator by another entity if the operator or successor entity complies with this Act regarding previously acquired student information. (3.5) Permit artificial intelligence to train on covered information unless for K through 12 school purposes or in furtherance of improving operability and functionality of the operator's service.
Pending 2026-01-01
D-01.4
105 ILCS 85/10(4)(A)
Plain Language
Operators may disclose covered information only for enumerated permitted purposes, including furtherance of K-12 school purposes. The new language clarifies that 'improving operability' — which is an exception allowing further disclosure — does not extend to disclosing covered information to third parties for AI training that is not for K-12 school purposes. This closes a potential loophole where operators could share student data with third-party AI companies under the guise of improving the platform's operability. Recipients of disclosed covered information are further prohibited from re-disclosing it except for operability improvements.
(4) Except as otherwise provided in Section 20 of this Act, disclose covered information, unless the disclosure is made for the following purposes: (A) In furtherance of the K through 12 school purposes of the site, service, application, or model if the recipient of the covered information disclosed under this clause (A) does not further disclose the information, unless done to allow or improve operability and functionality of the operator's site, service, or application. Improving operability does not include disclosing covered information to any third party to train artificial intelligence that is not for K through 12 school purposes.
Pending 2026-01-01
D-01.8
105 ILCS 85/15(2)
Plain Language
Before an operator's AI model may train on a student's covered information and retain that training data indefinitely, the operator must: (1) provide written notice to the student or parent that the AI model will retain training data indefinitely, and (2) obtain written consent from the student or parent. Without both written notice and written consent, the AI model may not train on covered information with indefinite retention. This imposes an opt-in consent regime specifically for the combination of AI training and indefinite data retention — not for AI training alone (which is separately restricted to K-12 purposes under Section 10(3.5)).
An operator's artificial intelligence model shall not train on a student's covered information and retain the training data indefinitely, unless it first: (A) informs the student or his or her parent in writing that the operator's artificial intelligence model will retain training data indefinitely; and (B) receives a written consent from the student or his or her parent.
Pending 2026-01-01
D-01.4
105 ILCS 85/10(2)
Plain Language
Operators may not use information gathered through K-12 platforms — including AI models — to build profiles about students except for K-12 school purposes. This is a purpose-limitation restriction on student profiling. Account information that remains under the student's, parent's, or school's control is excluded from the profiling prohibition. The new language extends this existing prohibition to AI models operated for K-12 school purposes.
(2) Use information, including persistent unique identifiers, created or gathered by the operator's site, service, application, or model to amass a profile about a student, except in furtherance of K through 12 school purposes. "Amass a profile" does not include the collection and retention of account information that remains under the control of the student, the student's parent, or the school.
Pending 2026-07-01
D-01.3
IC 22-5-10.4-13
Plain Language
When an employer uses an automated decision system to manage a covered individual on an ongoing basis (as distinct from a one-time employment decision), the individual has the right to opt out entirely and be managed by a human manager who has authority to make employment decisions. This is an unconditional opt-out right — there are no stated exceptions. The employer must provide a human alternative, not merely disable certain ADS features.
Sec. 13. An employer that manages a covered individual through an automated decision system shall allow the covered individual to: (1) opt out of the management through the automated decision system; and (2) be managed through a human manager who is able to make employment related decisions with respect to the covered individual.
Pending 2026-07-01
D-01.4
Sec. 3(d)
Plain Language
Covered entities must minimize age verification data by limiting its collection, processing, use, and storage to what is strictly necessary for three purposes: verifying user age, obtaining verifiable parental consent, and maintaining compliance records. This is a data minimization obligation specific to age information — it does not permit secondary use of age data for advertising, profiling, or any purpose beyond the enumerated three.
(d) A covered entity shall protect the confidentiality of age information provided by a user for age verification by limiting the collection, processing, use and storage of such information to what is strictly necessary to verify a user's age, obtain verifiable parental consent or maintain compliance records.
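The three-purpose limitation amounts to a strict allowlist. A minimal sketch (purpose labels are ours, not statutory terms):

```python
# The three purposes enumerated in subsection (d); any other use of age
# verification data (advertising, profiling, analytics, ...) is out of scope.
ALLOWED_PURPOSES = frozenset({
    "verify_age",
    "obtain_verifiable_parental_consent",
    "maintain_compliance_records",
})

def age_data_use_permitted(purpose):
    """Allowlist check: processing of age verification data is permitted
    only for an enumerated purpose, never by default."""
    return purpose in ALLOWED_PURPOSES
```

The design choice mirrors the statute's structure: permitted uses are enumerated and closed, so new processing purposes fail closed rather than open.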
Passed 2025-03-13
D-01.4
Section 3(3)(a)-(c)
Plain Language
The Commonwealth Office of Technology must ensure that all state agencies limit AI system data use to what is necessary, prohibit unrestricted access to Commonwealth-controlled personal data, secure all data, and implement data retention timeframes. This is a data minimization, access control, and retention obligation applied to all state agency AI systems.
(3) The Commonwealth Office of Technology shall prioritize personal privacy and the protection of the data of individuals and businesses as the state develops, implements, employs, and procures artificial intelligence systems, generative artificial intelligence systems, and high-risk artificial intelligence systems by ensuring all departments, agencies, and administrative bodies: (a) Allow only the use of necessary data in artificial intelligence systems; (b) Do not allow unrestricted access to personal data controlled by the Commonwealth; and (c) Secure all data and implement a timeframe for data retention.
Pending 2026-08-01
D-01.4
R.S. 23:973(B)
Plain Language
Employers may not collect worker data through an ADS for any purpose that was not disclosed in the pre-use written notice required under § 972. This is a purpose limitation — data collection by the ADS is restricted to the purposes the employer communicated to workers. Any new collection purpose requires updated notice before the data is collected.
B. An employer shall not use an ADS to collect worker data for a purpose that is not disclosed pursuant to the notice requirements as provided in R.S. 23:972.
Pending 2026-08-01
D-01.1, D-01.2
R.S. 23:973(C)(4)(a)-(b)
Plain Language
Employers must allow workers to access all worker data collected, used, or produced by an ADS — including both input data and output data — and to correct errors in that data, whether the data was used by the ADS itself or as corroborating evidence by a human reviewer. Workers may designate an authorized representative (who cannot be the employer) to exercise this access right on their behalf. This is a broad data access and correction right covering the full lifecycle of ADS data use.
(4)(a) An employer shall allow a worker to access worker data collected, used by, or produced by an ADS and correct errors in any input or output data used by or produced by the ADS or used as corroborating evidence by a human reviewer. (b) An affected worker shall be allowed to choose an authorized representative to request access to the worker's data on his behalf.
Pending 2026-08-01
D-01.1
R.S. 23:973(F)-(G)
Plain Language
Workers have the right to request and receive a copy of their own data from the most recent 12 months that was primarily used by an ADS for a discipline, termination, or deactivation decision. This right may be exercised once per 12-month period. When providing data, the employer must anonymize any information about customers, other workers, or other individuals to protect their privacy. The anonymization obligation applies to all worker data disclosures under this Part, not just this specific right.
F. A worker has the right to request, and an employer shall provide, a copy of the most recent twelve months of the worker's own data primarily used by an ADS to make a discipline, termination, or deactivation decision. A worker shall be limited to one request every twelve months for a copy of his own data used by an ADS to make a discipline, termination, or deactivation decision. G. For purposes of safeguarding the privacy rights of consumers, workers, and individuals, when an employer is required to provide worker data pursuant to this Part, the worker data shall be provided in a manner that provides anonymity regarding the customer's, other worker's, or individual's personal information.
Pending 2026-01-01
D-01.4
R.S. 28:16(D)(1)-(2)
Plain Language
Operators may not sell or share with third parties any individually identifiable health information of a user or the user's input. Three narrow exceptions apply: (1) health information requested by a healthcare provider with the user's consent, (2) information provided to the user's health plan at the user's request, and (3) information shared with a contracted party to ensure the chatbot functions effectively. When sharing under any exception, both the operator and the receiving entity must comply with HIPAA privacy and security rules (45 CFR Parts 160 and 164, Subparts A and E) as if the operator were a HIPAA covered entity and the receiving party were a business associate. This effectively extends HIPAA-like obligations to mental health chatbot operators who would not otherwise be covered entities.
D.(1) An operator of a mental health chatbot may not sell to or share with any third party any individually identifiable health information of a user or the user's input. This Subsection shall not apply to individually identifiable health information that is requested by a healthcare provider with the consent of the user, provided to a health plan of a user upon request of the user, or shared to ensure the effective functionality of the mental health chatbot with another party with which the operator has a contract related to such functionality. (2) When sharing information pursuant to this Subsection, the operator and the other entity shall comply with all applicable privacy and security provisions of 45 CFR Part 160 and 45 CFR Part 164, Subparts A and E, as if the operator were a covered entity and the other entity were a business associate, as such terms are defined in 45 CFR 160.103.
Pending 2025-01-17
D-01.8
Ch. 110I, § 2(c)(i)
Plain Language
Covered entities may not process or transfer biometric data in any manner the end user has not consented to. Consent must be freely given, specific, informed, and unambiguous — general terms of service that bundle biometric data processing with unrelated information do not qualify. Passive actions (hovering, muting, pausing, closing content) and consent obtained through abusive trade practices are also insufficient. This is an affirmative opt-in consent requirement before any biometric data processing can occur.
(c) A covered entity shall not: (i) process or transfer biometric data in any manner not consented to by the end user;
Pending 2025-01-17
D-01.4
Ch. 110I, § 2(a)
Plain Language
Covered entities owe a duty of loyalty to end users: they may not process biometric data or design biometric recognition technology in ways that conflict with the end user's best interests. This is a broad fiduciary-style obligation that goes beyond data minimization — it requires affirmatively evaluating whether each processing activity serves the end user's interests. The Attorney General may promulgate rules and regulations interpreting this provision.
(a) A covered entity shall be prohibited from taking any actions with respect to processing biometric data or designing biometric recognition technologies that conflict with an end user's best interests.
Pending 2025-01-17
D-01.5
Ch. 110I, § 2(c)(ii)-(iv)
Plain Language
Covered entities face three interrelated restrictions on biometric data transfers: (1) an absolute ban on selling biometric data to third parties; (2) a prohibition on disclosing biometric data to anyone except consistent with the duties of loyalty, care, and confidentiality; and (3) a requirement that any permitted disclosure be governed by a contract imposing on the recipient the same fiduciary duties the covered entity owes to the end user. In practice, this means any third-party data sharing requires both a lawful basis consistent with end-user interests and a downstream contractual pass-through of the full duty framework.
(c) A covered entity shall not: (ii) engage in the sale of biometric data to a third party; (iii) disclose biometric data with any other person or entity except as consistent with the duties of loyalty, care, and confidentiality under subsections 2(a), 2(b) and 2(c)(i) and 2(c)(ii), respectively; or (iv) disclose or share biometric data with any other person unless that person enters into a contract with the covered entity that imposes on the person the same duties of care, loyalty, and confidentiality toward the end user as are imposed on the covered entity under this subsection.
Pending 2025-01-17
D-01.3
Ch. 110I, § 2(e)
Plain Language
Covered entities may not retaliate against end users who refuse to consent to biometric data processing. Retaliation includes denying goods or services, differential pricing, degraded service quality, or even suggesting that the user will receive worse terms. This makes the consent requirement meaningful: users cannot be economically coerced into consenting to biometric data processing.
(e) A covered entity shall not discriminate against a consumer because of the withheld consent under this title, including, but not limited to: (i) denying goods or services to the end user; (ii) charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties; (iii) providing a different level or quality of goods or services to the end user; (iv) suggesting that the end user will receive a different price or rate for goods or services or a different level or quality of goods or services.
Pending 2025-01-14
D-01.4
Ch. 149B § 2(a)(i)-(iv)
Plain Language
Employers may only use electronic monitoring tools to collect employee information for six enumerated legitimate purposes (facilitating essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, and wage/benefit administration). The monitoring tool must be narrowly tailored to the stated purpose, implemented in the least invasive manner possible, limited to the smallest number of workers and least amount of data necessary, and data must be deleted once the purpose is achieved. The Department of Labor Standards may add further exceptions by rulemaking.
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (i) the electronic monitoring tool is primarily used to accomplish any of the following purposes: (A) allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (B) ensuring the quality of goods and services; (C) conducting periodic assessment of worker performance; (D) ensuring or facilitating compliance with employment, labor, or other relevant laws; (E) protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (F) administering wages and benefits. The department of labor standards may establish additional exceptions under clause (i) through notice and comment rulemaking in compliance with chapter 30A. (ii) the specific type and activated capabilities of an electronic monitoring tool must be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under (i). (iii) the electronic monitoring tool may only be used to accomplish the employer's intended, legitimate purpose specified in (i), and must be customized and implemented in a manner ensuring that the execution of its duties undertaken in the manner least invasive to employees of the employer while accomplishing the employer's legitimate purposes as defined by (i); (iv) the specific form of electronic monitoring is limited to the smallest number of workers, collects the least amount of data and is collected no more frequently than is necessary to accomplish the purpose, and the data collected is deleted once the purpose has been achieved.
Pending 2025-01-14
D-01.4
Ch. 149B § 2(a)(v)-(vii)
Plain Language
Employers must ensure that unnecessary employee data collected by monitoring tools is not disclosed to the employer and is promptly disposed of by the vendor. Employee data must never be collected while the employee is off duty. Necessary data must be stored consistent with Massachusetts data privacy and cybersecurity laws, promptly deleted when no longer needed, and not used by the employer, the vendor, or any third party for any purpose other than those specifically allowed under the data retention and impact assessment provisions of this chapter.
(v) the employer must ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's intended, legitimate purpose is not disclosed to the employer and is promptly disposed of by the vendor; (vi) the employer must ensure that employee data is not collected when the employee is off-duty; and (vii) the employer must ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the commonwealth's data- and cyber- privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except as provided in section 2(c) and section 3(c) of this chapter.
Pending 2025-01-14
D-01.1
Ch. 149B § 2(b)
Plain Language
Before using any electronic monitoring tool, employers must provide detailed written notice to and obtain written consent from all affected employees and candidates, and must conspicuously post the notice. The notice must include eleven enumerated categories of information: the monitoring purpose, specific data collected and collection schedule, monitoring dates/times/frequency, whether data feeds into an ADS, whether data will be used for employment decisions, how data may be used in discipline or litigation, whether data will set productivity standards, data storage location and retention period, why the monitoring method is the least invasive available, a statement of the employee's right to refuse data sale/transfer, and instructions for exercising rights under the chapter.
(b) Any employer that uses an electronic monitoring tool shall give prior written notice and must obtain written consent from all candidates and employees subject to electronic monitoring and must also post said notice in a conspicuous place which is readily available for viewing by candidates and employees, pursuant to sections 19B, 52C, and 190(i) of chapter 149 and section 99 of chapter 272. Such notice shall include, at a minimum, the following: (i) a description of the purpose for which the electronic monitoring tool will be used, as specified in subparagraph (i) of paragraph (a) of this subdivision; (ii) a description of the specific employee data to be collected, stored, secured, and disposed of (and the schedule therefore), and the activities, locations, communications, and job roles that will be electronically monitored by the tool; (iii) a description of the dates, times, and frequency that electronic monitoring will occur; (iv) whether and how any employee data collected by the electronic monitoring tool will be used as an input in an automated employment decision tool; (v) whether and how any employee data collected by the electronic monitoring tool will alone or in conjunction with an automated employment decision tool be used to make an employment decision by the employer or employment agency; (vi) whether and how any employee data collected by the electronic monitoring tool may be stored and utilized in discipline, in internal policy compliance, in administrative agency adjudications, and in litigation (whether or not it involves the employee as a party); (vii) whether any employee data collected by the electronic monitoring tool will be used to assess employees' productivity performance or to set productivity standards, and if so, how; (viii) a description of where any employee data collected by the electronic monitoring tool will be stored and the length of time it will be retained; (ix) an explanation for how the specific electronic monitoring practice is the least invasive means available to accomplish the monitoring purpose; (x) a statement that an employee is entitled to notice and maintains the right to refuse the sale, transfer, or disclosure of the employee's employee data subject to the provisions of section 2(f); and (xi) a clear and reasonably understandable description of how an employee can exercise the rights described in this chapter.
Pending 2025-01-14
D-01.4
Ch. 149B § 2(e)-(f)
Plain Language
Employers are prohibited from using electronically monitored employee data for any purpose beyond what was disclosed in the required notice. Employers also may not sell, transfer, or disclose such data to any third party unless required by law or necessary to comply with an impact assessment of an automated employment decision tool. These are strict purpose-limitation and data-sharing restrictions that go beyond typical data minimization.
(e) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to paragraph (c) of subdivision one of this section. (f) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the commonwealth, or necessary to do so to comply with an impact assessment of an automated employment decision tool pursuant to section one thousand twelve of this article.
Pending 2025-01-14
D-01.4
Ch. 149B § 3(d)
Plain Language
When employee data must be collected for a bias impact assessment, that data must be handled with full privacy protections and comply with commissioner-specified data retention and security requirements. Critically, employee data provided to auditors for the assessment must not be shared back with the employer — creating a firewall between the audit data and the employer. The data may only be shared with third parties to the extent strictly necessary to complete the assessment.
(d) If an initial or subsequent impact assessment requires the collection of employee data to assess a tool's disparate impact on employees, such data shall be collected, processed, stored, retained, and disposed of in such a manner as to protect the privacy of employees, and shall comply with any data retention and security requirements specified by the commissioner. Employee data provided to auditors for the purpose of an impact assessment shall not be shared with the employer, nor shall it be shared with any person, business entity, or other organization unless strictly necessary for the completion of the impact assessment.
Pre-filed 2025-01-16
D-01.8
Chapter 110I, § 2(c)(i)-(ii)
Plain Language
Covered entities may not process or transfer biometric data in any manner not consented to by the end user. Sale of biometric data to third parties is categorically prohibited. Disclosure to third parties is permitted only if consistent with the duties of loyalty, care, and confidentiality, and only if the recipient enters a contract imposing the same duties toward the end user. Consent must be freely given, specific, informed, and unambiguous — bundled consent in general terms of use is expressly insufficient, and consent obtained through abusive trade practices is void.
(c) A covered entity shall not: (i) process or transfer biometric data in any manner not consented to by the end user; (ii) engage in the sale of biometric data to a third party;  (iii) disclose biometric data with any other person or entity except as consistent with the duties of loyalty, care, and confidentiality under subsections 2(a), 2(b) and 2(c)(i) and 2(c)(ii), respectively; or (iv) disclose or share biometric data with any other person unless that person enters into a contract with the covered entity that imposes on the person the same duties of care, loyalty, and confidentiality toward the end user as are imposed on the covered entity under this subsection.
Pre-filed 2025-01-16
D-01.5
Chapter 110I, § 4(a)
Plain Language
Covered entities are categorically prohibited from using biometric data to inform or contribute to any decision that produces legal effects or similarly significant effects on end users. The bill provides a non-exhaustive list of covered decisions: financial/lending services, housing, insurance, educational enrollment, criminal justice, employment, healthcare, and access to basic necessities. This is not a 'use with safeguards' provision — it is a flat prohibition on using biometric data in consequential decision-making, with no exceptions for consent, bias mitigation, or human oversight.
(a) Covered entities shall not use biometric data to help make decisions that produce legal effects or similarly significant effects concerning end users. Decisions that include legal effects or similarly significant effects concerning end users include, without limitation, denial or degradation of consequential services or support, such as financial or lending services, housing, insurance, educational enrollment, criminal justice, employment opportunities, health care services, and access to basic necessities, such as food and water.
Pre-filed 2025-01-17
D-01.8
Chapter 93M, § 2(b)(1)-(3)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any biometric identifier or biometric information, a private entity must provide the individual (or their authorized representative) with written notice that biometric data is being collected or stored, written notice of the specific purpose and retention period, and obtain the individual's informed written consent. Consent may be obtained electronically. This is a pre-collection requirement — all three elements must be satisfied before any biometric data is obtained. Broad carve-outs exist for HIPAA-covered healthcare data, medical imaging, organ transplant data, and demographic data.
(b) No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (1) informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (2) informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (3) receives written consent executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative. Written consent may be obtained by electronic means.
Pre-filed 2025-01-17
D-01.4
Chapter 93M, § 2(a)
Plain Language
Any private entity that possesses biometric identifiers or biometric information must develop and make available a written policy establishing a retention schedule and guidelines for permanently destroying the data. Destruction must occur when the original collection purpose has been satisfied or within one year of the individual's last interaction with the entity, whichever comes first. The entity must comply with its own retention and destruction schedule unless a valid court order, warrant, or subpoena requires otherwise. This creates both a documentation obligation (written policy) and a data minimization obligation (mandatory destruction on schedule).
(a) A private entity in possession of biometric identifiers or biometric information must develop a written policy, made available to the person from whom biometric information is to be collected or was collected, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information when the initial purpose for collecting or obtaining such identifiers or information has been satisfied or within 1 year of the individual's last interaction with the private entity, whichever occurs first. Absent a valid order, warrant, or subpoena issued by a court of competent jurisdiction or a local or federal governmental agency, a private entity in possession of biometric identifiers or biometric information must comply with its established retention schedule and destruction guidelines.
Pre-filed 2025-01-17
D-01.5
Chapter 93M, § 2(c)
Plain Language
Private entities are categorically prohibited from selling, leasing, trading, or otherwise profiting from any person's biometric identifier or biometric information. There are no exceptions — this is an absolute prohibition. Unlike the disclosure restriction in § 2(d), which allows disclosure with consent or under legal compulsion, the commercialization prohibition has no carve-outs whatsoever.
(c) No private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information.
Pre-filed 2025-01-17
D-01.8
Chapter 93M, § 2(d)(1)-(4)
Plain Language
Private entities may not disclose, redisclose, or otherwise disseminate biometric identifiers or biometric information unless one of four enumerated exceptions applies: (1) the individual or their authorized representative provides written consent; (2) the disclosure completes a financial transaction the individual requested or authorized; (3) the disclosure is required by law; or (4) the disclosure is required by a valid warrant or subpoena. Outside these four exceptions, any disclosure is prohibited. Note that the consent exception requires written consent specifically, consistent with the chapter's overall consent standard.
(d) No private entity in possession of a biometric identifier or biometric information may disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless: (1) the subject of the biometric identifier or biometric information or the subject's legally authorized representative provides written consent to the disclosure or redisclosure; (2) the disclosure or redisclosure completes a financial transaction requested or authorized by the subject of the biometric identifier or the biometric information or the subject's legally authorized representative; (3) the disclosure or redisclosure is required by state or federal law or municipal ordinance; or (4) the disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
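Read as decision logic, § 2(d)'s structure reduces to a single disjunction over the four exceptions: disclosure is permitted only if at least one applies. The sketch below is illustrative only; the class and field names are invented for this example and do not appear in the bill.

```python
from dataclasses import dataclass

@dataclass
class DisclosureRequest:
    """Illustrative model of a proposed biometric disclosure under Ch. 93M, § 2(d)."""
    has_written_consent: bool       # (d)(1) written consent from subject or representative
    completes_authorized_txn: bool  # (d)(2) completes a transaction the subject requested
    required_by_law: bool           # (d)(3) required by state/federal law or ordinance
    required_by_warrant: bool       # (d)(4) valid warrant or subpoena

def disclosure_permitted(req: DisclosureRequest) -> bool:
    # Outside the four enumerated exceptions, any disclosure is prohibited.
    return any([
        req.has_written_consent,
        req.completes_authorized_txn,
        req.required_by_law,
        req.required_by_warrant,
    ])
```

Note that this models only § 2(d); a real compliance check would layer on the separate § 2(c) ban on selling, leasing, or otherwise profiting from the data, which has no exceptions at all.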
Pending 2026-10-01
D-01.4
Insurance Article § 15–10B–05.1(c)(10)
Plain Language
Patient data used by AI tools in utilization review must not be used beyond its intended and stated purpose, consistent with HIPAA. This is a data minimization and purpose limitation obligation applied specifically to AI-driven healthcare coverage decisions. This is existing law reenacted without amendment.
(10) patient data is not used beyond its intended and stated purpose, consistent with the federal Health Insurance Portability and Accountability Act of 1996, as applicable;
Pending 2026-10-01
D-01.4
Commercial Law § 14–1330(F)(1)
Plain Language
Controllers (a term the bill uses without defining in this section; presumably the entity that controls data collection, which would typically be the operator) must limit the collection of personal data to what is reasonably necessary and proportionate to satisfy the requirements of this subtitle. This is a data minimization obligation: operators cannot collect more personal data than needed to comply with the companion chatbot obligations. De-identified data and publicly available information are excluded from the definition of personal data.
(F) (1) A CONTROLLER SHALL LIMIT THE COLLECTION OF PERSONAL DATA TO WHAT IS REASONABLY NECESSARY AND PROPORTIONATE TO SATISFY THE REQUIREMENTS OF THIS SUBTITLE.
Failed 2026-01-01
D-01.4
24-A MRSA §4304(8)(A) (final paragraph)
Plain Language
Data used by AI in utilization review determinations is subject to a strict purpose limitation: it may not be used beyond its intended and stated purpose. Additionally, such data must be affirmatively protected from risks that could directly or indirectly harm enrollees. This is a data governance obligation that constrains secondary use of enrollee data and requires affirmative data protection measures. It aligns with HC-01.5 (patient data purpose limitation) as well as D-01.4 (data minimization and purpose limitation for AI systems).
Data under this paragraph may not be used beyond its intended and stated purpose. Data under this paragraph must be protected from risk that may directly or indirectly cause harm to the enrollee.
Passed 2026-01-01
D-01.4
22 MRSA § 1730-B(5)
Plain Language
All records maintained by the licensed professional and all communications between the professional and any individual seeking or receiving therapy — including AI-generated or AI-processed data — are confidential. Disclosure is prohibited except as otherwise required by law. This operates as a data use limitation: session data collected by AI tools in the course of therapy may not be disclosed or repurposed beyond what law permits, reinforcing the purpose limitation in subsection 3(A)(3) regarding data storage, training use, and deletion.
5. Disclosure of records and communications. All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional or between a client and a licensed professional are confidential and may not be disclosed except as required under law.
Failed 2026-06-15
D-01.4
10 MRSA § 1500-SS(2)
Plain Language
Deployers face a two-part data minimization obligation. First, they may not collect or store any information that conflicts with a user's safety and well-being — this is an absolute prohibition regardless of business purpose. Second, even for legitimate purposes, data collection and storage must be limited to information that is both relevant to the purpose and the minimum amount necessary. This is a purpose-limitation and data-minimization standard similar to GDPR's data minimization principle, applied specifically to chatbot and social AI companion deployers for all users, not just minors.
2. User information collection and storage. A deployer shall collect and store only information that does not conflict with a user's safety and well-being. A deployer may not collect and store information except to fulfill a legitimate purpose of the deployer. A deployer may collect and store information that is adequate to fulfill a legitimate purpose of the deployer, but only to the extent that the information: A. Is relevant to that legitimate purpose; and B. Is the minimum amount of information necessary to fulfill that legitimate purpose.
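The two-part test can be read as a short predicate: an absolute safety bar first, then the legitimate-purpose, relevance, and minimum-necessity checks. The names below are illustrative for this sketch and do not come from the bill.

```python
from dataclasses import dataclass

@dataclass
class DataField:
    """Illustrative description of one piece of user information a deployer might store."""
    conflicts_with_user_safety: bool  # barred outright by the first sentence of subsec. 2
    relevant_to_purpose: bool         # § 1500-SS(2)(A)
    minimum_necessary: bool           # § 1500-SS(2)(B)

def may_collect(field: DataField, has_legitimate_purpose: bool) -> bool:
    # The safety bar applies regardless of any business purpose.
    if field.conflicts_with_user_safety:
        return False
    # Even legitimate-purpose collection must be both relevant and minimal.
    return (has_legitimate_purpose
            and field.relevant_to_purpose
            and field.minimum_necessary)
```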
Pending 2026-02-24
D-01.4
Sec. 5(1)-(2)
Plain Language
Employers are prohibited from using electronic monitoring tools or automated decision tools to collect covered individuals' data except for enumerated permissible purposes. Electronic monitoring is permitted only for facilitating essential job functions, monitoring production quality, periodic performance assessment, labor/employment law compliance, protecting covered individuals' health/safety/security, administering wages and benefits (limited to geographic cost-of-living data), and other purposes the Department determines enable business operations. This is a purpose-limitation requirement: data collection through these tools is banned unless it fits one of the specified categories.
Sec. 5. (1) Except as provided in this act, an employer shall not use an electronic monitoring tool or automated decisions tool to collect a covered individual's data. (2) An employer may use an electronic monitoring tool for only the following purposes: (a) To allow an employee to accomplish or facilitate an essential job function. (b) To monitor production processes or quality. (c) To periodically assess an employee's performance. (d) To ensure or facilitate compliance with state or federal labor or employment law. (e) To protect the health, safety, or security of covered individuals. (f) To administer wages and benefits, if it can be determined that the electronic monitoring system uses only data regarding the city where the covered individual works and the costs of living in that area. (g) To accomplish any other purpose that enables business operations as determined by the department.
Pending 2026-02-24
D-01.1D-01.2
Sec. 5(3)(a)-(d)
Plain Language
Before using electronic monitoring or automated decision tools, employers must (1) provide written notice to all covered individuals subject to the tool, (2) obtain written consent from each covered individual, (3) ensure the collected data remains accurate and current, and (4) provide covered individuals a mechanism to correct inaccurate data. The notice and consent requirements are prerequisites to lawful use. The data accuracy and correction rights are ongoing obligations that persist throughout the tool's use.
(3) An employer that uses an electronic monitoring tool or automated decisions tool must do all of the following: (a) Provide written notice that the employer is using an electronic monitoring tool or automated decisions tool to all covered individuals who are subject to the tool. (b) Obtain written consent from each covered individual to electronically monitor or use an automated decisions tool on the covered individual in accordance with this act. (c) Ensure that data collected through the electronic monitoring tool or automated decisions tool is accurate and up to date. (d) Allow a covered individual to correct inaccurate data about that covered individual.
Pending 2026-02-24
D-01.4
Sec. 5(3)(e)-(h)
Plain Language
Employers must apply data minimization principles to all electronic monitoring and automated decision tool use: the tool must be narrowly tailored to its permitted purpose, deployed through the least invasive means, applied to the fewest covered individuals necessary, collect the minimum data required, and operate no more frequently than necessary. The tool must never collect data from off-duty employees. These are ongoing operational constraints — not one-time configuration requirements.
(e) Use the tool in a narrowly tailored manner to accomplish a purpose described in subsection (2) or section 4(2). (f) Use the tool through the least invasive means possible for the covered individual whom the tool monitors. (g) Ensure the tool applies to the smallest number of covered individuals, collects the least amount of data, and is used no more frequently than necessary to accomplish a purpose described in subsection (2) or section 4(2). (h) Ensure that the tool does not collect any data of an employee when the employee is off duty.
Pending 2026-02-24
D-01.5
Sec. 5(4)(a)-(c)
Plain Language
Even where electronic monitoring or automated decision tools are otherwise permitted, employers are prohibited from collecting specific categories of sensitive data: health/medical/wellness information, qualified characteristics (race, gender, disability, etc.), and a broad range of workplace activity data including HR files, productivity data, workplace communications, device usage, geolocation, audio-video/sensor data, ADT inputs/outputs linked to individuals, and online activity. Employers may not use these tools to identify, punish, or collect data about individuals engaged in legally protected labor activities. Monitoring is categorically prohibited in private areas (bathrooms, locker rooms, breakrooms, prayer areas, etc.) and in employees' homes, personal vehicles, or personal property.
(4) An employer that uses an electronic monitoring tool for a purpose described in subsection (2) or an automated decisions tool for a purpose described in section 4(2) shall not do any of the following: (a) Collect any of the following data of a covered individual: (i) Health, medical, lifestyle, and wellness information, including, but not limited to, the covered individual's medical history, physical or mental condition, diet or physical activity patterns, heart rate, medical treatment or diagnosis by a health care professional, health insurance policy number, subscriber identification number, or other unique identifier used to identify the covered individual. (ii) A qualified characteristic. (iii) Information related to workplace activities, including, but not limited, all of the following: (A) Human resources information, including contents of a covered individual's personnel file or performance evaluations. (B) Work process information, such as productivity and efficiency information. (C) Information that captures workplace communications and interactions, including emails, texts, internal message boards, and customer interaction and ratings. (D) Device usage, including calls placed or geolocation information. (E) Audio-video information and other information collected from sensors, including movement tracking, thermal sensors, voiceprints, or facial, emotion, and gait recognition. (F) Inputs of or outputs generated by an automated decisions tool that are linked to a covered individual. (G) Online information, including a covered individual's internet protocol address, private social media activity, or other digital sources or unique identifiers associated with a covered individual. (b) Identify, punish, or obtain data about a covered individual who engages in an activity that is protected under state or federal labor or employment law. 
(c) Monitor bathrooms or other similar private areas, including, but not limited to, locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, areas designated to express breast milk, or areas designated for prayer or other religious activity. The prohibition under this subdivision includes data collection on the frequency of use of those private areas and conducting audio or visual monitoring of a workplace in an employee's residence, an employee's personal vehicle, or property owned or leased by an employee.
Pending 2026-08-01
D-01.8
Minn. Stat. § 325M.40, subd. 2
Plain Language
Any person must obtain an individual's consent before collecting any biometric data from that individual. Biometric data includes facial images, retinal/iris scans, fingerprints, voiceprints, and hand/face geometry usable for identification. The consent must be received before the collection occurs — retroactive consent is not sufficient. The statute does not specify the form of consent (written vs. oral), nor does it require disclosure of the specific purpose or type of biometric identifier being collected, unlike Illinois BIPA. Voiceprint data retained by financial institutions (as defined under 15 U.S.C. § 6809) is exempt from this requirement.
A person is prohibited from collecting biometric data from an individual unless the person receives the individual's consent to collect the biometric data before the collection occurs.
Pending 2026-08-01
Minn. Stat. § 325M.40, subd. 3(1)
Plain Language
Once a person has obtained biometric data, they are prohibited from selling, leasing, or otherwise disclosing it to any third party unless one of four narrow exceptions applies: (1) the individual consented to disclosure for identification in case of disappearance or death; (2) the disclosure completes a financial transaction the individual requested or authorized; (3) the disclosure is required or permitted by federal or state law; or (4) the disclosure is to or by law enforcement pursuant to a warrant. This is effectively a near-total ban on commercial sale of biometric data. Voiceprint data retained by financial institutions is exempt.
A person who obtains biometric data: (1) must not sell, lease, or otherwise disclose the biometric data to another person unless: (i) the individual consents to the disclosure for identification purposes in the event of the individual's disappearance or death; (ii) the disclosure completes a financial transaction that the individual requested or authorized; (iii) the disclosure is required or permitted by a federal or state law; or (iv) the disclosure is made by or to a law enforcement agency for a law enforcement purpose in response to a warrant;
Pending 2026-08-01
Minn. Stat. § 325M.40, subd. 3(2)
Plain Language
Any person who holds biometric data must store, transmit, and protect it using reasonable care, at a level at least as protective as how the person handles its other confidential information. This establishes a relative security floor — if you already protect trade secrets or financial data at a high level, your biometric data security must match or exceed that standard. The statute does not prescribe specific technical measures such as encryption, but the reasonable care standard combined with the comparative benchmark creates an enforceable obligation.
(2) must store, transmit, and protect from disclosure the biometric data using reasonable care and in a manner that is at least as or more protective than the manner in which the person stores, transmits, and protects other confidential information the person possesses;
Pending 2026-08-01
D-01.4
Minn. Stat. § 325M.40, subd. 3(3)
Plain Language
Biometric data must be deleted and destroyed within a reasonable time, and in no event later than one year after the purpose for collection expires. If a federal or state law requires longer retention, the data must be destroyed within one year after that legal retention period expires. For employers collecting employee biometric data for security purposes, the purpose automatically expires when the employment relationship terminates — meaning the data must be destroyed within one year of the employee's departure. This creates a hard outer deadline (one year after purpose expiration) with an expectation of faster deletion where reasonable. Voiceprint data retained by financial institutions is exempt.
(3) must delete and destroy the biometric data within a reasonable time, but no later than one year from the date the purpose for collecting the data expires, unless the data is maintained pursuant to a federal or state law that requires a longer retention period, in which case the biometric data must be destroyed within a reasonable time frame but no later than one year from the date that the state or federal law retention period expires. If an employer collects an employee's biometric data for security purposes, the purpose for collecting the data expires upon termination of the employment relationship.
Pending 2026-08-01
D-01.1, D-01.2
Minn. Stat. § 181.9923, subd. 2(a)-(b), subd. 3(a)-(d)
Plain Language
Workers have the right to request copies of all their data collected, used, or produced by an automated decision system — including inputs, outputs, and corroborating evidence used by human reviewers — and employers must respond within seven days. Workers also have the right to request corrections to any such data. Upon receiving a correction request, the employer must investigate; if the data is inaccurate, the employer must correct it, review and adjust any employment decisions based on the inaccurate data, and notify third parties who shared or provided the data. If the employer determines the data is accurate, it must explain its decision, verification steps, and supporting evidence to the worker. Notably, corrections must cascade to pending and future decisions — not just the underlying record.
Subd. 2. Record requests. (a) A worker has the right to request a copy of: (1) any of the worker's data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) corroborating evidence used by a human reviewer. (b) The employer must provide copies of the data requested within seven days of receiving a worker's request. Subd. 3. Record corrections. (a) A worker has the right to request corrections to: (1) any worker data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) any corroborating evidence used by a human reviewer. (b) An employer that receives a request to correct any of the information listed in paragraph (a) must investigate and determine whether the disputed data is inaccurate. (c) If an employer determines that the disputed data is inaccurate, the employer must: (1) promptly correct the disputed data and inform the worker of the employer's decision and action; (2) review and adjust any employment-related decisions that were partially or solely based on the inaccurate data and inform the worker of the adjustment; and (3) inform any third parties with which the employer shared the inaccurate data, or from which the employer received the inaccurate data, of the error and direct those third parties to correct the data. (d) If an employer, upon investigation, determines that the disputed data is accurate, the employer must inform the worker of: (1) the decision not to amend the disputed data; (2) the steps taken to verify the accuracy of the data; and (3) the evidence supporting the decision not to amend the disputed data.
Pending
D-01.8
Minn. Stat. § 325M.40, subd. 2
Plain Language
Before collecting any biometric data from an individual, a person must first obtain the individual's consent. Biometric data is defined broadly to include images, descriptions, or recordings of facial features, retinas, irises, fingerprints, voiceprints, hand geometry, or face geometry usable to identify an individual. The bill does not specify the form of consent (written vs. oral) or require specific disclosures about the type of biometric data being collected or the purpose of collection, unlike Illinois BIPA. Voiceprint data retained by financial institutions or their affiliates (as defined by 15 U.S.C. § 6809) is exempt from this requirement.
A person is prohibited from collecting biometric data from an individual unless the person receives the individual's consent to collect the biometric data before the collection occurs.
Pending
D-01.4
Minn. Stat. § 325M.40, subd. 3(1)-(3)
Plain Language
Once biometric data is collected, the collector faces three ongoing obligations. First, the data cannot be sold, leased, or disclosed except in four narrow circumstances: individual consent for disappearance/death identification, completing an individual-authorized financial transaction, disclosure required or permitted by law, or law enforcement disclosure under a warrant. Second, the data must be stored, transmitted, and protected with at least the same level of care applied to the collector's other confidential information. Third, the data must be deleted within a reasonable time but no later than one year after the collection purpose expires. For employers collecting employee biometric data for security, the purpose expires upon employment termination. If a federal or state law requires longer retention, the one-year clock starts when that retention period ends. Voiceprint data held by financial institutions or their affiliates is exempt.
A person who obtains biometric data: (1) must not sell, lease, or otherwise disclose the biometric data to another person unless: (i) the individual consents to the disclosure for identification purposes in the event of the individual's disappearance or death; (ii) the disclosure completes a financial transaction that the individual requested or authorized; (iii) the disclosure is required or permitted by a federal or state law; or (iv) the disclosure is made by or to a law enforcement agency for a law enforcement purpose in response to a warrant; (2) must store, transmit, and protect from disclosure the biometric data using reasonable care and in a manner that is at least as or more protective than the manner in which the person stores, transmits, and protects other confidential information the person possesses; and (3) must delete and destroy the biometric data within a reasonable time, but no later than one year from the date the purpose for collecting the data expires, unless the data is maintained pursuant to a federal or state law that requires a longer retention period, in which case the biometric data must be destroyed within a reasonable time frame but no later than one year from the date that the state or federal law retention period expires. If an employer collects an employee's biometric data for security purposes, the purpose for collecting the data expires upon termination of the employment relationship.
Pending 2026-09-01
D-01.1, D-01.2
§ 181.9923, Subd. 2(a)-(b); Subd. 3(a)-(d)
Plain Language
Workers have the right to request copies of all their data collected, used, or produced by an automated decision system — including input/output data and human reviewer corroborating evidence — and the employer must respond within seven days. Workers also have the right to request corrections. If the employer determines data is inaccurate, it must promptly correct it, review and adjust any employment decisions based on the inaccurate data, and notify third parties who received or supplied the data. If the employer finds the data accurate, it must explain its decision not to amend, the verification steps taken, and the supporting evidence. Notably, the correction obligation cascades to affected employment decisions and third-party data recipients.
Subd. 2. Record requests. (a) A worker has the right to request a copy of: (1) any of the worker's data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) corroborating evidence used by a human reviewer. (b) The employer must provide copies of the data requested within seven days of receiving a worker's request. Subd. 3. Record corrections. (a) A worker has the right to request corrections to: (1) any worker data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) any corroborating evidence used by a human reviewer. (b) An employer that receives a request to correct any of the information listed in paragraph (a) must investigate and determine whether the disputed data is inaccurate. (c) If an employer determines that the disputed data is inaccurate, the employer must: (1) promptly correct the disputed data and inform the worker of the employer's decision and action; (2) review and adjust any employment-related decisions that were partially or solely based on the inaccurate data and inform the worker of the adjustment; and (3) inform any third parties with which the employer shared the inaccurate data, or from which the employer received the inaccurate data, of the error and direct those third parties to correct the data. (d) If an employer, upon investigation, determines that the disputed data is accurate, the employer must inform the worker of: (1) the decision not to amend the disputed data; (2) the steps taken to verify the accuracy of the data; and (3) the evidence supporting the decision not to amend the disputed data.
Pending 2026-09-01
D-01.5
§ 181.9924, Subd. 1(b)
Plain Language
Employers may not use automated decision systems that rely on individualized worker data to set compensation unless three conditions are all met: the input data must be directly related to the worker's ability to perform the task (e.g., education, training, experience, seniority); the inputs must be clearly communicated to the worker so they understand what drives their compensation; and the system must be used no more than once every six months per worker or only when there is a meaningful change in work duties like hiring or promotion. All three conditions must be satisfied — failure on any one means the use is prohibited.
(b) An employer must not use an automated decision system that uses individualized worker data as inputs or outputs to set compensation, unless the employer can demonstrate that: (1) the input data is directly related to the ability of the worker to complete the task, such as education, training, experience, or seniority; (2) the inputs used are clearly communicated to the worker such that the worker knows their compensation is a function of the identified attributes; and (3) the employer uses the automated decision system either: (i) not more than once per six-month period per worker; or (ii) only in conjunction with a meaningful change in work duties, such as hiring or promotion.
Pending 2026-08-28
D-01.8
RSMo § 1.566(2)(1)-(3)
Plain Language
Before collecting any biometric identifier or biometric information, a private entity must: (1) provide written notice that biometric data is being collected or stored, (2) disclose the specific purpose and duration of collection, storage, and use, and (3) obtain a written release from the individual or their legally authorized representative. A general release or user agreement is insufficient — the consent must be specific. In the employment context, consent is further limited to access-control and timekeeping purposes and may not be used for location tracking or tracking time spent on applications. Employers may require consent as a condition of employment.
2. No private entity shall collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information unless it first: (1) Informs the person or customer, or the person's or customer's legally authorized representative, in writing that a biometric identifier or biometric information is being collected or stored; (2) Informs the person or customer, or the person's or customer's legally authorized representative, of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (3) Receives a written release executed by the person or customer, or the person's or customer's legally authorized representative.
Pending 2026-08-28
D-01.5
RSMo § 1.566(3)(2)
Plain Language
Private entities are categorically prohibited from selling, leasing, or trading any person's biometric identifier or biometric information. There are no exceptions — this is an absolute prohibition on commercial transfer of biometric data.
(2) No private entity in possession of a biometric identifier or biometric information shall sell, lease, or trade a person's or a customer's biometric identifier or biometric information.
Pending 2026-08-28
D-01.4
RSMo § 1.566(4)(1)-(4)
Plain Language
Private entities may not disclose, redisclose, or disseminate a person's biometric identifier or biometric information except in four narrow circumstances: (1) the individual provides a written release, (2) the disclosure completes a financial transaction the individual requested or authorized, (3) the disclosure is required by law or ordinance, or (4) the disclosure is required by a valid warrant or subpoena. All other disclosures are prohibited. This is a purpose limitation and disclosure restriction — biometric data may only be shared beyond the collecting entity under these enumerated exceptions.
4. No private entity in possession of a biometric identifier or biometric information shall disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless: (1) The person or customer, or the person's or customer's legally authorized representative, provides written release to the disclosure or redisclosure; (2) The disclosure or redisclosure completes a financial transaction requested or authorized by the person or customer, or the person's or customer's legally authorized representative; (3) The disclosure or redisclosure is required by state law, federal law, or municipal ordinance; or (4) The disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
Pending 2026-08-28
D-01.6
§ 1.2058(5)(2)(e)
Plain Language
Covered entities must implement comprehensive data governance for age verification data: collection must be minimized to what is strictly necessary for age verification or compliance; data must be protected against unauthorized access using industry-standard encryption; retention must be limited to what is reasonably necessary; and the data must never be shared with, transferred to, or sold to any other entity. This creates a standalone data minimization and security framework specifically for age verification data that goes beyond general data protection requirements.
(e) A covered entity shall: a. Establish, implement, and maintain reasonable data security to: (i) Limit collection of personal data to that which is minimally necessary to verify a user's age or maintain compliance with this section; and (ii) Protect such age verification data against unauthorized access; b. Protect such age verification data against unauthorized access; c. Protect the integrity and confidentiality of such data by only transmitting such data using industry-standard encryption protocols; d. Retain such data for no longer than is reasonably necessary to verify a user's age or maintain compliance with this section; and e. Not share with, transfer to, or sell to any other entity such data.
Pending 2026-08-28
D-01.4
RSMo § 1.2058(5)(2)(e)
Plain Language
Covered entities must establish and maintain reasonable data security for age verification data, including: limiting collection to what is minimally necessary for age verification or statutory compliance; protecting the data against unauthorized access; transmitting data only using industry-standard encryption; retaining data only as long as reasonably necessary; and never sharing, transferring, or selling the data to any other entity. This is a comprehensive data minimization and security obligation specific to age verification data — it goes beyond general data governance to impose specific technical requirements (encryption) and an absolute prohibition on third-party data sharing.
(e) A covered entity shall: a. Establish, implement, and maintain reasonable data security to: (i) Limit collection of personal data to that which is minimally necessary to verify a user's age or maintain compliance with this section; and (ii) Protect such age verification data against unauthorized access; b. Protect such age verification data against unauthorized access; c. Protect the integrity and confidentiality of such data by only transmitting such data using industry-standard encryption protocols; d. Retain such data for no longer than is reasonably necessary to verify a user's age or maintain compliance with this section; and e. Not share with, transfer to, or sell to any other entity such data.
Pending 2026-01-01
D-01.1, D-01.8
G.S. 114B-4(b)(3)-(5)
Plain Language
Licensed health information chatbot operators must obtain explicit user consent before collecting and using data, provide users with access to their personal data, and allow users to delete their data upon request. These are individual data rights that must be operationalized — consent cannot be implied, and access and deletion requests must be honored on demand.
(3) Obtain explicit user consent for data collection and use. (4) Provide users with access to their personal data. (5) Provide users with the ability to delete their data upon request.
Pending 2026-01-01
D-01.4
G.S. 170-3(b)(5)
Plain Language
Covered platforms may only collect and store user data that does not conflict with users' best interests. All data collected must satisfy a three-part test: it must be (1) adequate — sufficient to fulfill a legitimate platform purpose, (2) relevant — linked to that legitimate purpose, and (3) necessary — the minimum amount needed for that purpose. This is a strict data minimization obligation framed through a loyalty lens — the platform must not only minimize data but ensure collection doesn't conflict with user interests.
(5) Duty of loyalty in collection. — A covered platform shall collect and store only that information that does not conflict with a trusting party's best interests. Such information must be (i) adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the platform; (ii) relevant, in the sense that the information has a relevant link to that legitimate purpose, and (iii) necessary, in the sense that it is the minimum amount of information which is needed for that legitimate purpose.
Pending 2026-01-01
D-01.5
G.S. 170-3(b)(7)
Plain Language
Covered platforms must act as loyal gatekeepers of user personal information, meaning they must avoid conflicts with user interests when granting governments or other third parties access to user data. Even otherwise lawful sharing is constrained: a platform cannot share user data with governments or third parties in ways that undermine the user's interests.
(7) Duty of loyalty in gatekeeping. — A covered platform shall be a loyal gatekeeper of personal information from a trusted party, including avoiding conflicts to the best interests of trusting parties when allowing government or other third-party access to trusting parties and their data.
Pending 2026-01-01
D-01.4, D-01.5
G.S. 170-6(a)-(d)
Plain Language
Covered platforms must implement four data privacy requirements: (1) all user-related data from chatbot conversations or third-party cookies must be de-identified before storage and analysis; (2) sensitive personal information from chatbot use must not be incorporated into aggregate training datasets for any chatbot or generative AI system; (3) non-sensitive chatbot conversations must be stored for at least 60 days; and (4) all messages between users and chatbots must use transport encryption. Additionally, chatbots deployed in healthcare, financial services, legal, government, mental health, education, or any other domain primarily processing sensitive personal information must use self-destructing messages that automatically and irreversibly delete data 30 days after acquisition. The training data prohibition is notable — it categorically blocks use of sensitive personal information derived from user interactions in model training.
(a) A covered platform must do each of the following: (1) Ensure that all user-related data disclosed collected through conversations between users and chatbots or through third-party cookies, undergoes a process of de-identification prior to storage and analysis; (2) Take reasonable care to prohibit the incorporation or inclusion of any sensitive personal information derived from a user during the use of a chatbot into an aggregate dataset used to train any chatbot or generative artificial intelligence system. (3) Store all chatbot conversations which does not include sensitive personal information for at least sixty (60) days. (b) Each covered platform that meets the standard set forth in subsection (a) of this section shall utilize self-destructing messages with a predetermined destruction period of thirty (30) days after the data has been acquired. (c) The requirements of subsection (b) of this section shall apply to all chatbots which are employed in: healthcare, financial services, the legal field, government services, mental health support, and education. In general, this applies to any domain, beyond those specifically listed, where chatbots are employed primarily for the processing or storage of sensitive personal information. (d) All covered platforms shall utilize transport encryption for all messages between a user and a chatbot.
Pending 2027-01-01
G.S. § 114B-4(b)(3)-(5)
Plain Language
Licensees must obtain explicit user consent before collecting or using data, provide users with access to their personal data held by the platform, and honor user requests to delete their data. These are standard data subject rights — consent, access, and deletion — applied specifically to licensed health-information chatbot operators.
A licensee shall do all of the following: (3) Obtain explicit user consent for data collection and use. (4) Provide users with access to their personal data. (5) Provide users with the ability to delete their data upon request.
Pending 2027-01-01
D-01.4
G.S. § 170-3(b)(5)
Plain Language
Covered platforms must limit data collection and storage to information that does not conflict with users' best interests and that meets a three-part test: the data must be adequate (sufficient for a legitimate platform purpose), relevant (linked to that purpose), and necessary (the minimum amount needed). This is a data minimization obligation framed through a fiduciary lens — the 'best interests' overlay means that even data meeting the adequacy/relevance/necessity test may be prohibited if collection itself conflicts with user interests.
Duty of loyalty in collection. – A covered platform shall collect and store only that information that does not conflict with a trusting party's best interests. Such information must be (i) adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the platform, (ii) relevant, in the sense that the information has a relevant link to that legitimate purpose, and (iii) necessary, in the sense that it is the minimum amount of information which is needed for that legitimate purpose.
Pending 2027-01-01
G.S. § 170-3(b)(7)
Plain Language
Covered platforms must act as loyal gatekeepers of user personal information, particularly when granting government or third-party access to user data. The platform must avoid conflicts with users' best interests when sharing data with external parties. This creates a fiduciary-style data stewardship obligation that restricts how platforms share user data with third parties, with heightened scrutiny for government data requests.
Duty of loyalty in gatekeeping. – A covered platform shall be a loyal gatekeeper of personal information from a trusted party, including avoiding conflicts to the best interests of trusting parties when allowing government or other third-party access to trusting parties and their data.
Pending 2027-01-01
D-01.4
G.S. § 170-6(a)(1)-(3)
Plain Language
Covered platforms must de-identify all user-related data collected through chatbot conversations or third-party cookies before storing or analyzing it. De-identification requires replacing identifiable information with pseudonyms, aggregating data to make re-identification statistically improbable, and removing traceable context and metadata. Platforms must also take reasonable care to prevent sensitive personal information derived from chatbot use from being incorporated into training datasets for any chatbot or generative AI system. Non-sensitive chatbot conversations must be stored for at least 60 days. This creates a tension: data must be de-identified before storage but non-sensitive conversations must be retained for 60 days — in practice, the 60-day retention applies to de-identified conversation data.
A covered platform must do all of the following: (1) Ensure that all user-related data disclosed collected through conversations between users and chatbots or through third-party cookies undergoes a process of de-identification prior to storage and analysis. (2) Take reasonable care to prohibit the incorporation or inclusion of any sensitive personal information derived from a user during the use of a chatbot into an aggregate dataset used to train any chatbot or generative artificial intelligence system. (3) Store all chatbot conversations which does not include sensitive personal information for at least 60 days.
Pending 2027-01-01
G.S. § 170-6(b)-(c)
Plain Language
Covered platforms must implement self-destructing messages that automatically and irreversibly delete chatbot conversation data 30 days after acquisition. This requirement applies to chatbots employed in healthcare, financial services, legal services, government services, mental health support, education, and any other domain where chatbots primarily process or store sensitive personal information. The 30-day auto-deletion creates a hard ceiling on data retention for these sensitive-domain chatbots — data must be programmed to become permanently inaccessible to both parties after 30 days.
(b) Each covered platform that meets the standard set forth in subsection (a) of this section shall utilize self-destructing messages with a predetermined destruction period of 30 days after the data has been acquired. (c) The requirements of subsection (b) of this section shall apply to all chatbots which are employed in healthcare, financial services, the legal field, government services, mental health support, and education. In general, this applies to any domain, beyond those specifically listed, where chatbots are employed primarily for the processing or storage of sensitive personal information.
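The retention scheme above splits into two regimes: a 30-day self-destruct ceiling for sensitive-domain chatbots, and a 60-day retention floor for de-identified, non-sensitive conversations. A minimal sketch, assuming the enumerated domains are represented as simple string labels (the labels themselves are illustrative, not statutory terms):

```python
# Domains enumerated in G.S. § 170-6(c); labels are illustrative.
SENSITIVE_DOMAINS = {
    "healthcare", "financial services", "legal", "government",
    "mental health", "education",
}


def retention_rule(domain: str, processes_sensitive_pi: bool) -> str:
    """Sketch of the two retention regimes: § 170-6(b)-(c) imposes a
    30-day self-destruct ceiling in sensitive domains (or wherever the
    chatbot primarily handles sensitive personal information), while
    § 170-6(a)(3) imposes a 60-day retention floor for de-identified,
    non-sensitive conversations."""
    if domain in SENSITIVE_DOMAINS or processes_sensitive_pi:
        return "auto-delete 30 days after acquisition (irreversible)"
    return "retain de-identified conversation for at least 60 days"
```

The catch-all clause means the domain list is not exhaustive: a platform outside the enumerated sectors still falls under the 30-day rule if its chatbot primarily processes sensitive personal information.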
Passed
D-01.8
Sec. 4(1); Sec. 5(1)(a)
Plain Language
Controllers must obtain written consent from the data owner before processing any agricultural data. The consent must be affirmative and in writing — no processing may occur without it. Data owners may rescind consent at any time by providing written notice to the controller. This functions similarly to an opt-in consent requirement for data collection, applied specifically to agricultural data categories.
Sec. 4. (1) A person may provide written consent to any potential controller of such person's agricultural data that authorizes: (a) The potential controller to process such person's agricultural data; or (b) A third party to process such person's agricultural data on behalf of the potential controller. (2) A person that has provided written consent under this section may rescind such consent by providing a written notice of such rescission to the controller of the agricultural data.

Sec. 5. (1) A controller shall not: (a) Require any person to submit to any processing of such person's agricultural data without the written consent of such person;
Passed
Sec. 5(1)(b)
Plain Language
Controllers may not discriminate against individuals who decline to consent to agricultural data collection. A person who refuses consent must receive the same services, goods, benefits, and rewards as a person who consents. This anti-retaliation provision ensures that consent is genuinely voluntary: exercising the right to refuse cannot carry a penalty.
A controller shall not: (b) Provide any difference in any service, good, benefit, or reward provided to any person who does not consent to the collection or possession of agricultural data;
Passed
Sec. 5(1)(c)
Plain Language
Controllers are prohibited from selling, providing to third parties, or using agricultural data without the data owner's authorization. This goes beyond the consent-to-process requirement in Section 4 — it separately prohibits any sale, transfer, or use without authorization, creating an independent violation for unauthorized secondary use or sale even if initial collection was authorized.
A controller shall not: (c) Sell, provide, or use the agricultural data of any person without such person's authorization.
Plain Language
When a data owner rescinds their written consent, the controller must delete all agricultural data relating to that person within 30 days of receiving the written rescission notice. This is a mandatory deletion obligation with a fixed compliance timeline — not merely a right to request deletion but an automatic trigger upon receipt of rescission.
A controller shall delete the agricultural data relating to a person that has provided a written notice rescinding the authorization pursuant to section 4 of this act within thirty days after receiving such written notice.
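Because deletion is triggered automatically by receipt of the rescission notice rather than by a separate request, the compliance deadline is a fixed offset. A trivial sketch (function name invented for the example):

```python
from datetime import date, timedelta


def agricultural_data_deletion_deadline(notice_received: date) -> date:
    """Deletion is mandatory within thirty days of receiving the
    written rescission notice — the notice itself starts the clock."""
    return notice_received + timedelta(days=30)


print(agricultural_data_deletion_deadline(date(2026, 5, 1)))  # 2026-05-31
```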
Plain Language
Processors face the same prohibitions as controllers: they may not process, sell, provide, or use agricultural data unless the data owner has provided written consent to the controller authorizing such processing. This means processors cannot rely on the controller's representation alone — the underlying written consent from the data owner must exist. Processors who act without valid underlying consent face independent liability.
A processor shall not process, sell to any person, provide to any person, or use the agricultural data of a person without such person providing written consent that authorizes such processing to the controller of the agricultural data under section 4 of this act.
Failed 2026-02-01
D-01.3
Sec. 4(4)(a)(iii)
Plain Language
Where applicable, deployers must inform consumers of their existing right under Nebraska's data privacy law (Section 87-1107) to opt out of the processing of personal data for profiling in furtherance of consequential decisions. This is a cross-reference disclosure obligation — it does not create a new opt-out right but requires deployers to inform consumers of their existing one at the point of AI-driven consequential decision-making.
(iii) If applicable, provide information to the consumer regarding the consumer's right to opt out of the processing of personal data concerning the consumer for any purpose of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer under subdivision (2)(e)(iii) of section 87-1107.
Pending
D-01.4
Section 3(a)-(b)
Plain Language
Business entities are categorically prohibited from selling, leasing, trading, sharing, or otherwise profiting from any information obtained through the use of a biometric surveillance system on a consumer. There are no exceptions or consent-based carve-outs — this is a flat ban on commercialization of biometric surveillance data. Violations are unlawful practices under the Consumer Fraud Act. This prohibition applies regardless of whether the business complied with the notice requirement for using the biometric surveillance system in the first place.
a. A business entity shall not sell, lease, trade, share, or otherwise profit from information obtained through the business entity's use of a biometric surveillance system on a consumer. b. A violation of this section shall be an unlawful practice and a violation of P.L.1960, c.39 (C.56:8-1 et seq.).
Plain Language
Employers and public entities face three prohibitions related to biometric, health, and wellness data: (1) they may not transfer or disclose such data to third parties or government entities except when required by law; (2) they may not use such data in making employment or public benefits decisions; and (3) they may not retain such data for applicants who were not hired, for former employees after employment ends, or for service beneficiaries who no longer receive services. These restrictions apply regardless of how the data was obtained.
No employer or public entity, vendor, or contractor acting on behalf of the employer or public entity shall: k. Transfer or otherwise disclose biometric, health, or wellness data or information, however obtained, to any third party or government entity unless required to do so under State or federal law; use biometric, health, or wellness data or information in making an employment-related decision or decision regarding public benefits or services; or retain biometric, health, or wellness data or information of an applicant for employment who has not been hired or a former employee after employment ends, or after the service beneficiary no longer receives services or benefits;
Pending
D-01.2D-01.4
Section 5(a)
Plain Language
Employers, public entities, and vendors may not sell, license, transfer, disclose, or share employee, applicant, or service beneficiary data with third parties without uncoerced written consent; the only exceptions are providing data to the individual or their authorized representative, or to a law enforcement authority or court when required by law. Applicant data (including video) must be destroyed at the applicant's request, and vendors must return and delete all data when contracts terminate. All data must be kept accurate and up to date. When significant data changes occur, the employer must notify the individual and inform them of their right to access, correct, or remove inaccurate or improperly retained data. Even without a change notification, individuals have an annual right to access their data and seek corrections. If a correction request is denied, the employer must provide a written explanation, which must be retained for potential appeals.
a. (1) An employer, public entity, vendor, or contractor shall ensure that no data or information about an employee or applicant, or service beneficiary, or applicant for employment collected by an EMT or other surveillance, and no output of an AEDS, or data or information used to produce that output, is sold, licensed, transferred, disclosed, or shared to or with any third party by the employer, public entity, or vendor, without the uncoerced written consent of the employee, service beneficiary, or applicant for employment, except that the data or information may be provided to the applicant, service beneficiary, employee, or an authorized representative, or to a law enforcement authority or a court when required by law. All information about an applicant for employment or public benefits or services, including any applicant video, shall be destroyed at the request of the applicant. A vendor shall return to the employer or public entity and delete all employee, applicant, and service beneficiary information once the contract between the vendor and the employer or public entity is terminated. (2) An employer, public entity, or vendor acting on behalf of an employer shall ensure that all information and data and information about an employee or service beneficiary, held by the employer, public entity, or vendor is accurate and up to date. An employer or public entity shall notify an employee or service beneficiary of any significant change in the data or information held by the employer or public entity or vendor. 
The notification shall inform the employee or service beneficiary of the change and the right of the employee, service beneficiary, or a designated representative, to access to any data or information about the employee or service beneficiary held by the employer, public entity, or vendor and make a written request to correct inaccurate information or remove information being retained or used in a manner that violates the provisions of this act, and, in addition, the employee or service beneficiary, even if not notified of any change, shall, at least one time per year, have the right to have access to the data and information and seek any needed corrections or removals. If the employer or public entity does not change or remove the information as requested, the employer or public entity shall provide a written explanation of the reason for that decision, and retain copies of the request and the written explanation, to be available for consideration in any appeal of an adverse decision made pursuant to section 8 of this act.
Passed 2026-01-01
D-01.8
Section 2(a)-(b)
Plain Language
Business entities are categorically prohibited from using biometric surveillance systems on consumers at their physical premises unless two conditions are met: (1) the business provides clear and conspicuous notice, which can be satisfied by posting a sign at the perimeter of the surveilled area, and (2) the system is used for a lawful purpose. This is a notice-and-lawful-purpose framework rather than an opt-in consent requirement — unlike Illinois BIPA, which requires written informed consent before any biometric identifier collection, New Jersey permits use with posted signage notice alone. The definition of facial recognition is notably broad, covering not only identification but also emotion inference, association tracking, and location inference from face, head, or body characteristics.
a. It shall be an unlawful practice and a violation of P.L.1960, c.39 (C.56:8-1 et seq.) for a business entity to use any biometric surveillance system on a consumer at the physical premises of the business entity, except as provided in subsection c. of this section. b. A business entity may use a biometric surveillance system on a consumer at the physical premises of the business entity, if: (1) the business entity provides clear and conspicuous notice to the consumer regarding its use of a biometric surveillance system; and (2) the biometric surveillance system is used for a lawful purpose. The business entity may satisfy the notice requirement of paragraph (1) of this section by posting a sign in a conspicuous location at the perimeter of any area where a biometric surveillance system is being used.
Passed 2026-01-01
D-01.4
Section 3(a)-(b)
Plain Language
Business entities are categorically prohibited from selling, leasing, trading, sharing, or otherwise profiting from any information obtained through biometric surveillance of consumers. This is a blanket prohibition on secondary use and monetization of biometric surveillance data — there is no consent exception. Unlike the notice-and-lawful-purpose framework in Section 2 that permits use with posted signage, this data commercialization prohibition is absolute. This is comparable to Illinois BIPA's prohibition on profiting from biometric identifiers, though BIPA structures it as a consent requirement rather than an outright ban.
a. A business entity shall not sell, lease, trade, share, or otherwise profit from information obtained through the business entity's use of a biometric surveillance system on a consumer. b. A violation of this section shall be an unlawful practice and a violation of P.L.1960, c.39 (C.56:8-1 et seq.).
Pending 2026-02-02
D-01.4
Section 1.b.
Plain Language
Employers may only share applicant video recordings with service providers whose expertise or technology is necessary for evaluating the applicant's fitness for the position. This is a purpose limitation — the video cannot be shared for any reason unrelated to the applicant evaluation. The restriction applies to all recipients, not just AI vendors, encompassing any person who might otherwise receive the video.
An employer shall not share an applicant's video except with a service provider whose expertise or technology is necessary to evaluate the applicant's fitness for a position.
Pending 2026-02-02
Section 1.c.
Plain Language
Applicants have the right to request deletion of their video interviews. Upon receiving such a request, the employer must delete the videos within 30 days and instruct all third parties who received copies — including service providers — to also delete all copies, including electronic backups. Third-party recipients must comply with the employer's deletion instructions. This creates both an individual deletion right for applicants and a downstream deletion cascade obligation for employers and their vendors.
Upon request from the applicant, an employer, within 30 days after receipt of the request, shall delete an applicant's interviews and instruct any other persons who received copies of the applicant's video interviews to also delete the videos, including all electronically generated backup copies. Any other person or service provider shall comply with the employer's instructions.
Pending 2026-01-01
D-01.1
Labor Law § 203-g(2)(a)(iii)
Plain Language
Employers and employment agencies must disclose to each candidate, as part of the required pre-use notice, the type of data collected for the automated tool, where that data comes from, and the employer's data retention policy. This is a data transparency obligation — candidates must understand what data inputs feed the automated screening process and how long their data is kept.
(iii) Information about the type of data collected for such automated employment decision tool, the source of such data, and the employer or employment agency's data retention policy.
Pending 2026-01-01
D-01.3
Labor Law § 203-g(2)(b)
Plain Language
The notice must be delivered at least ten business days before the automated tool is used and must include a mechanism for the candidate to request an alternative (non-automated) selection process or an accommodation. This effectively creates an opt-out right — candidates who do not wish to be screened by an automated tool must be given the opportunity to request an alternative path. The ten-business-day lead time ensures candidates have a meaningful opportunity to exercise this right before the tool is applied.
(b) The notice required by paragraph (a) of this subdivision shall be made no less than ten business days before the use of such automated employment decision tool and shall allow such candidate to request an alternative selection process or accommodation.
Pending 2025-04-27
D-01.4
State Tech. Law § 506(1)-(2)
Plain Language
Automated systems must incorporate privacy protections by default. Data collection must conform to reasonable expectations and must be limited to what is strictly necessary for the specific context — a data minimization requirement. This is a design-level obligation requiring privacy-by-design architecture, not merely a policy commitment.
1. New York residents shall be protected from abusive data practices via built-in protections and shall maintain agency over the use of their personal data.
2. Privacy violations shall be mitigated through design choices that include privacy protections by default, ensuring that data collection conforms to reasonable expectations and that only strictly necessary data for the specific context is collected.
Pending 2025-04-27
D-01.3
State Tech. Law § 506(3)-(6)
Plain Language
Designers, developers, and deployers must respect residents' decisions regarding collection, use, access, transfer, and deletion of their data. Where honoring those decisions is not possible, alternative privacy-by-design safeguards must be used. Systems may not use dark patterns or privacy-invasive defaults. Consent may only justify data collection where it can be meaningfully given, and consent requests must be brief, in plain language, and context-specific. Existing complex notice-and-choice practices must be simplified. This effectively creates a right to opt out of data collection and requires affirmative, meaningful consent practices.
3. Designers, developers, and deployers of automated systems must seek and respect the decisions of New York residents regarding the collection, use, access, transfer, and deletion of their data in all appropriate ways and to the fullest extent possible. Where not possible, alternative privacy by design safeguards must be implemented.
4. Automated systems shall not employ user experience or design decisions that obscure user choice or burden users with default settings that are privacy-invasive.
5. Consent shall be used to justify the collection of data only in instances where it can be appropriately and meaningfully given. Any consent requests shall be brief, understandable in plain language, and provide New York residents with agency over data collection and its specific context of use.
6. Any existing practice of complex notice-and-choice for broad data use shall be transformed, emphasizing clarity and user comprehension.
Pending 2025-04-27
D-01.4D-01.5
State Tech. Law § 506(7)
Plain Language
In sensitive domains — broadly defined as areas where activities can cause material harms to human rights, autonomy, dignity, or civil liberties — individual data and related inferences may only be used for necessary functions. These uses must be safeguarded by ethical review and subject to use prohibitions. The definition of 'sensitive data' is extremely broad, encompassing data generated by minors, biometric data, genomic data, behavioral data, geolocation data, criminal justice data, and data with reasonable potential to cause harm. This effectively imposes a strict-necessity standard for data use in sensitive contexts.
7. Enhanced protections and restrictions shall be established for data and inferences related to sensitive domains. In sensitive domains, individual data and related inferences may only be used for necessary functions, safeguarded by ethical review and use prohibitions.
Pending 2025-04-27
D-01.1
State Tech. Law § 506(10)
Plain Language
Residents should have access to reporting that confirms their data preferences are being honored and that assesses the impact of surveillance technologies on their rights and access. The 'whenever possible' qualifier creates ambiguity about when this right is actually enforceable.
10. Whenever possible, New York residents shall have access to reporting that confirms respect for their data decisions and provides an assessment of the potential impact of surveillance technologies on their rights, opportunities, or access.
Pending 2025-07-26
D-01.5
State Tech. Law § 522(1)-(3)
Plain Language
Licensees may share information and source code with third parties, but when shared information includes biometric data (faceprints, voiceprints, fingerprints, gaitprints, irisprints, psychological profiles, or other identifying body/mind data), the receiving third party becomes jointly liable with the licensee for any harm or violations under the article. The Secretary may prohibit specific persons from accessing a licensee's information or source code, with written justification required. This applies only to information received or generated by the licensee and source code the licensee created — not to third-party integrations.
1. Licensees shall be permitted to share information and source code with any third party, provided however, that where information is biometric information such party shall be jointly liable for any harm or violations under this article with the licensee. The secretary may, in their discretion, prohibit any person from accessing the information or source code of a licensee provided however that the secretary shall provide a written justification for such a prohibition. 2. For purposes of this section, "biometric information" shall include a person's: (a) faceprint; (b) voiceprint; (c) fingerprint; (d) gaitprint; (e) irisprint; (f) psychological profile; or (g) any other data related to a person's body or mind that can be used to identify a person. 3. This section shall only apply to the sharing of information received or generated by the licensee or source code created by the licensee and shall not apply to a third party integrating their systems with the licensee.
Pending
D-01.8
Gen. Bus. Law § 676-b(2)(a)-(c)
Plain Language
Before collecting any biometric identifier or biometric information, a private entity must provide written notice to the individual (or their legal representative) that biometric data is being collected, disclose the specific purpose and duration of collection/storage/use, and obtain a written release (informed written consent) from the individual. All three steps must be completed before any collection occurs. In the employment context, the written release may be executed as a condition of employment. This is a strict pre-collection consent requirement — there is no exception for publicly available data or implied consent.
2. No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (a) informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (b) informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (c) receives a written release executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative.
Pending 2025-09-05
D-01.8
Gen. Bus. Law § 1155(1)
Plain Language
News media employers may not train (or authorize a third party to train) a generative AI system on a news media worker's work product without providing notice, obtaining consent, and giving the worker an opportunity to bargain over appropriate compensation. Workers who decline consent may not be penalized. This creates a three-part prerequisite — notice, consent, and bargaining opportunity — before any AI training use of employee-created content. The anti-retaliation provision ensures the consent is genuinely voluntary.
News media employers shall not directly or through a third party authorize the training of a generative artificial intelligence system on the work product of a news media worker without notice, consent and an opportunity to bargain over appropriate remuneration. A news media employer shall not penalize a news media worker for declining to consent to allow their work product to be used to train a generative artificial intelligence system.
Pending 2026-07-22
D-01.5
Exec. Law § 296(23)(a)
Plain Language
The provision explicitly prohibits the use of zip codes as a proxy for protected classes in AI-driven employment decisions. This is a direct proxy variable restriction — employers may not design or use AI systems that infer protected characteristics from non-sensitive geographic proxies to make employment decisions. This maps to D-01.5's prohibition on using proxy variables to circumvent restrictions on sensitive attribute use in consequential automated decisions.
(a) It shall be an unlawful discriminatory practice for an employer to use artificial intelligence for recruitment, hiring, promotion, renewal of employment, selection for training or apprenticeship, discharge, discipline, tenure, or the terms, privileges, or conditions of employment that has the effect of subjecting employees to discrimination on the basis of age, race, creed, color, national origin, citizenship or immigration status, sexual orientation, gender identity or expression, military status, sex, disability, predisposing genetic characteristics, familial status, marital status, or status as a victim of domestic violence or to use zip codes as a proxy for such protected classes.
Pending 2026-11-01
D-01.4
75A O.S. § 702(B)(1)-(3)
Plain Language
Deployers must limit their data collection and storage to information that does not conflict with the trusting party's (i.e., the user's) best interests. Collected information must satisfy all three tests: it must be adequate (sufficient for a legitimate purpose), relevant (linked to that purpose), and necessary (the minimum amount needed). This is a data minimization obligation with a fiduciary-like framing — the reference to 'trusting party's best interests' implies a duty of loyalty in data handling. The term 'trusting party' is not defined in the statute.
B. Deployers shall collect and store only that information that does not conflict with a trusting party's best interests. Such information must be: 1. Adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the deployer; 2. Relevant, in the sense that the information has a relevant link to that legitimate purpose; and 3. Necessary, in the sense that it is the minimum amount of information which is needed for that legitimate purpose.
Pending 2026-10-06
D-01.4
35 Pa.C.S. § 3503(b)(6)
Plain Language
Patient data used by facility AI systems must not be repurposed beyond the intended and stated purpose of the AI-based algorithms. This data minimization/purpose limitation obligation is layered on top of existing state law and HIPAA requirements. Secondary uses of patient data generated or collected through AI systems require separate justification.
(6) Patient data must not be used beyond the intended and stated purpose of the artificial intelligence-based algorithms, consistent with the laws of this Commonwealth and 42 U.S.C. Ch. 7 Subch. XI Part C (relating to administrative simplification), as applicable.
Pending 2026-10-06
D-01.4
40 Pa.C.S. § 5203(b)(8)
Plain Language
Covered person data used by insurer AI systems in utilization review must not be repurposed beyond the AI algorithms' intended and stated purpose. This purpose-limitation obligation is layered on top of HIPAA and state law. Insurers must ensure that data collected or processed through AI tools is not used for secondary purposes without separate justification.
(8) The data of the covered person must not be used beyond the intended and stated purpose of the artificial intelligence-based algorithms, consistent with Commonwealth law and 42 U.S.C. Ch. 7, Subch. XI Part C (relating to administrative simplification), as applicable.
Pending 2026-10-06
D-01.4
40 Pa.C.S. § 5303(b)(8)
Plain Language
MA/CHIP managed care plans must not use enrollee data beyond the stated purpose of their AI algorithms. This purpose-limitation obligation is layered on top of HIPAA and state law, preventing secondary use of enrollee data collected through AI utilization review processes.
(8) The data of the covered person or enrollees must not be used beyond the intended and stated purpose of the artificial intelligence-based algorithms, consistent with the laws of this Commonwealth and the Health Insurance Portability and Accountability Act of 1996 (Public Law 104-191, 110 Stat. 1936), as applicable.
Pending 2026-04-01
D-01.4
12 Pa.C.S. § 7103(a)-(d)
Plain Language
Suppliers may not sell or share with third parties either a consumer's individually identifiable health information or the content the consumer provides to the chatbot ('consumer input'). Three narrow exceptions exist: (1) a health care provider requests the health information and the consumer gives written consent; (2) the consumer requests that a health plan receive the information and consents in writing; or (3) the sharing is necessary for chatbot functionality with a contractually bound third party and the consumer consents in writing. When sharing under the functionality exception, both the supplier and the third party must comply with HIPAA privacy and security rules as if they were a covered entity and business associate, respectively. Written consent may be obtained via signature, checkbox, electronic signature, or button click.
(a) Prohibition.--Except as provided under subsections (b) and (c), a supplier may not sell to or share with a third party the following: (1) Individually identifiable health information of a consumer. (2) Consumer input. (b) Applicability.--The prohibition under subsection (a) shall not apply if: (1) Either: (i) A health care provider requests access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (ii) The consumer requests that a health plan be provided access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (2) The individually identifiable health information is shared in accordance with subsection (c). (c) Sharing information.-- (1) A supplier may share a consumer's individually identifiable health information if: (i) the sharing of the information is necessary to ensure the effective functionality of the chatbot with a third party with which the supplier has a contract related to the functionality; and (ii) the consumer consents to the sharing of the information in accordance with subsection (d). (2) When sharing information in accordance with this subsection, the supplier and the third party shall comply with all applicable privacy and security provisions of 45 CFR Pts. 160 (relating to general administrative requirements) and 164 (relating to security and privacy), as if the supplier were a covered entity and the third party were a business associate. (d) Consent.-- (1) A consumer may consent to access to individually identifiable health information of the consumer by a health care provider or health plan in accordance with this section. (2) To be effective, the consent under this subsection must: (i) Be in writing. 
(ii) Acknowledge that the consumer understands and agrees to the access of the individually identifiable health information of the consumer by a health care provider or health plan. (3) The consent under this subsection may involve the consumer initialing or signing the acknowledgment described in paragraph (2)(ii), checking a box, providing an electronic signature or hitting a button.
Pending 2027-01-09
D-01.4
35 Pa.C.S. § 3503(b)(6)
Plain Language
Patient data collected and used in connection with AI-based algorithms must not be used beyond the intended and stated purpose of those algorithms. This data minimization and purpose limitation requirement must be consistent with Pennsylvania state law and HIPAA. Facilities must clearly define and document the intended purpose of their AI algorithms and restrict data use accordingly.
(6) Patient data must not be used beyond the intended and stated purpose of the artificial-intelligence-based algorithms, consistent with the laws of this Commonwealth and 42 U.S.C. Ch. 7 Subch. XI Part C (relating to administrative simplification), as applicable.
Pending 2027-01-09
D-01.4
40 Pa.C.S. § 5203(b)(8)
Plain Language
Covered person data used in connection with insurer AI-based algorithms must not be used beyond the algorithms' intended and stated purpose, consistent with Pennsylvania law and HIPAA. This is a purpose limitation requirement for insurance utilization review AI.
(8) The data of the covered person must not be used beyond the intended and stated purpose of the artificial-intelligence-based algorithms, consistent with Commonwealth law and 42 U.S.C. Ch. 7, Subch. XI Part C (relating to administrative simplification), as applicable.
Pending 2027-01-09
D-01.4
40 Pa.C.S. § 5303(b)(8)
Plain Language
Enrollee data used in connection with MA or CHIP managed care plan AI-based algorithms must not be used beyond the algorithms' intended and stated purpose, consistent with Pennsylvania law and HIPAA.
(8) The data of the covered person or enrollees must not be used beyond the intended and stated purpose of the artificial-intelligence-based algorithms, consistent with the laws of this Commonwealth and the Health Insurance Portability and Accountability Act of 1996 (Public Law 104-191, 110 Stat. 1936), as applicable.
Pending 2026-01-28
R.I. Gen. Laws § 40.1-5.5-4
Plain Language
All records maintained by a licensed professional and all communications between a therapy-seeking individual and a licensed professional must be kept confidential. Disclosure is permitted only under the exceptions in existing R.I. Gen. Laws § 40.1-5-26. This applies to any records associated with AI use in the therapeutic context, reinforcing that AI-processed data (e.g., transcriptions, notes) carries the same confidentiality protections as traditional therapy records. Violations are subject to existing penalties under § 5-37.3-9.
All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional shall be confidential and shall not be disclosed except as provided pursuant to the provisions of § 40.1-5-26.
Pending
D-01.4
§ 28-5.2-2(a)-(b)
Plain Language
Employers may only use electronic monitoring tools to collect employee information if the tool is primarily used for one of six enumerated legitimate purposes (facilitating essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, or wage/benefit administration). Beyond meeting a legitimate purpose, the employer must narrowly tailor the tool's type and capabilities to that purpose, implement it in the least invasive manner possible, limit monitoring to the fewest workers and least data necessary, prohibit collection when employees are off-duty, ensure unnecessary data is never disclosed to the employer, and delete collected data once the purpose is achieved. This is a comprehensive data minimization and purpose limitation regime for workplace monitoring.
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (1) The electronic monitoring tool is primarily used to accomplish any of the following legitimate purposes: (i) Allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (ii) Ensuring the quality of goods and services; (iii) Conducting periodic assessment of worker performance; (iv) Ensuring or facilitating compliance with employment, labor, or other relevant laws; (v) Protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (vi) Administering wages and benefits. (2) The department of labor and training standards may establish additional exceptions under this subsection, pursuant to chapter 35 of title 42 ("administrative procedures act.") (b)(1) The specific type and activated capabilities of an electronic monitoring tool shall be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under subsection (a)(1) of this section; (2) The electronic monitoring tool shall only be used to accomplish the employer's intended, legitimate purpose specified in subsection (a)(1) of this section, and shall be customized and implemented in a manner ensuring that the execution of its duties are undertaken in the manner least invasive to employees of the employer, while still accomplishing the employer's legitimate purposes as defined by subsection (a)(1) of this section; (3) The specific form of electronic monitoring is limited to the smallest number of workers, collection of the least amount of data which shall be collected no more frequently than is necessary to accomplish the purpose, and the data collected, shall be deleted once the purpose has been achieved; (4) The employer shall ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's 
intended, legitimate purpose shall not be disclosed to the employer and shall be promptly disposed of by the vendor; (5) The employer shall ensure that employee data is not collected when the employee is off-duty; and (6) The employer shall ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the state's data and cyber privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except, as provided in subsection (c) of this section.
Pending
D-01.1
§ 28-5.2-2(c)
Plain Language
Before using any electronic monitoring tool, employers must provide prior written notice to all affected employees and candidates and obtain written acknowledgment. The notice must also be posted conspicuously where candidates and employees can see it. The notice must cover eleven specific categories of information: the monitoring purpose, what data is collected and how it is stored and disposed of, monitoring schedule, whether data feeds into an ADS, whether data informs employment decisions, downstream uses of data (discipline, litigation, etc.), whether it sets productivity standards, data storage location and retention period, why this is the least invasive approach, employee rights to refuse data sale/transfer, and how to exercise rights under the chapter. This is a comprehensive transparency obligation that must be satisfied before monitoring begins.
(c) Any employer that uses an electronic monitoring tool shall give prior written notice and shall obtain written acknowledgment from all candidates and employees subject to electronic monitoring and shall also post said notice in a conspicuous place which is readily available for viewing by candidates for employment and employees. Such notice shall include, at a minimum, the following: (1) A description of the purpose for which the electronic monitoring tool will be used, as specified in subsection (a)(1) of this section; (2) A description of the specific employee data to be collected, stored, secured, and disposed of (and the schedule therefor), and the activities, locations, communications, and job roles that will be electronically monitored by the tool; (3) A description of the dates, times, and frequency that electronic monitoring will occur; (4) Whether and how any employee data collected by the electronic monitoring tool will be used as an input in an automated decision system; (5) Whether and how any employee data collected by the electronic monitoring tool will alone or in conjunction with an automated decision system be used to make an employment decision by the employer or employment agency; (6) Whether and how any employee data collected by the electronic monitoring tool may be stored and utilized in discipline, in internal policy compliance, in administrative agency adjudications, in litigation (whether or not it involves the employee or not as a party); (7) Whether any employee data collected by the electronic monitoring tool will be used to assess employees' productivity performance or to set productivity standards, and if so, how; (8) A description of where any employee data collected by the electronic monitoring tool will be stored and the length of time it will be retained; (9) An explanation for how the specific electronic monitoring practice is the least invasive means available to accomplish the monitoring purpose; (10) That an employee is 
entitled to notice and maintains the right to refuse the sale, transfer, or disclosure of their employee data, subject to the provisions of subsection (g) of this section; and (11) A clear and reasonably understandable description of how an employee can exercise the rights described in this chapter.
Pending
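The eleven notice categories above map naturally onto a structured record that a compliance system could validate before monitoring begins. The sketch below is hypothetical — the class, field names, and `notice_is_complete` helper are illustrative, not drawn from the bill — but it shows one way to ensure every required category is populated.

```python
from dataclasses import dataclass, fields

@dataclass
class MonitoringNotice:
    """One field per category required by the notice provision, (1)-(11)."""
    purpose: str                   # (1) legitimate purpose under subsection (a)(1)
    data_collected: str            # (2) data collected, stored, secured, disposed of
    schedule: str                  # (3) dates, times, and frequency of monitoring
    ads_input_use: str             # (4) use as input to an automated decision system
    employment_decision_use: str   # (5) use in employment decisions
    downstream_uses: str           # (6) discipline, compliance, adjudication, litigation
    productivity_use: str          # (7) productivity assessment or standards
    storage_and_retention: str     # (8) storage location and retention period
    least_invasive_rationale: str  # (9) why this is the least invasive means
    refusal_rights: str            # (10) right to refuse sale/transfer/disclosure
    rights_exercise_process: str   # (11) how to exercise rights under the chapter

def notice_is_complete(notice: MonitoringNotice) -> bool:
    """Every category must be non-empty before monitoring may begin."""
    return all(getattr(notice, f.name).strip() for f in fields(notice))
```

A deployment checklist could refuse to activate a monitoring tool until `notice_is_complete` returns true and written acknowledgment is on file.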
D-01.4
§ 28-5.2-2(f)-(g)
Plain Language
Employers face two related restrictions on monitored employee data: (1) a strict purpose limitation — data may only be used for the purposes described in the notice given to employees, and (2) a near-total ban on selling, transferring, or disclosing employee data to other entities, with exceptions only for legal requirements or compliance with an ADS impact assessment. These provisions together create a closed-loop data governance regime where employee monitoring data stays within the noticed scope and does not leave the employer absent legal compulsion.
(f) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to subsection (c) of this section. (g) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the state, or necessary to do so to comply with an impact assessment of an automated decision system used pursuant to this section.
Pending 2026-01-23
R.I. Gen. Laws § 40.1-5.5-4
Plain Language
All records maintained by a licensed professional in connection with therapy or psychotherapy — and all communications between clients and the professional — are confidential and may only be disclosed under existing exceptions in R.I. Gen. Laws § 40.1-5-26. This extends existing mental health confidentiality requirements to encompass records generated or maintained with AI assistance. The practical effect is that any AI-generated notes, transcripts, anonymized analyses, or other records fall under the same confidentiality protections as traditional clinical records.
All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional shall be confidential and shall not be disclosed except as provided pursuant to the provisions of § 40.1-5-26.
Pending 2026-02-06
D-01.4
§ 28-5.2-2(a)-(b)
Plain Language
Employers may only use electronic monitoring tools to collect employee data if the tool is primarily used for one of six enumerated legitimate purposes (essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, or wage/benefit administration). Beyond having a qualifying purpose, the monitoring must be narrowly tailored and least invasive, limited to the smallest number of workers and least amount of data necessary, and all data must be deleted once the purpose is achieved. Data unnecessary to the purpose must not be disclosed to the employer and must be disposed of by the vendor. Off-duty data collection is prohibited. All necessary data must comply with state data privacy and cybersecurity laws and may not be used for secondary purposes.
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (1) The electronic monitoring tool is primarily used to accomplish any of the following legitimate purposes: (i) Allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (ii) Ensuring the quality of goods and services; (iii) Conducting periodic assessment of worker performance; (iv) Ensuring or facilitating compliance with employment, labor, or other relevant laws; (v) Protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (vi) Administering wages and benefits. (2) The department of labor and training standards may establish additional exceptions under this subsection, pursuant to chapter 35 of title 42 ("administrative procedures act.") (b)(1) The specific type and activated capabilities of an electronic monitoring tool shall be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under subsection (a)(1) of this section; (2) The electronic monitoring tool shall only be used to accomplish the employer's intended, legitimate purpose specified in subsection (a)(1) of this section, and shall be customized and implemented in a manner ensuring that the execution of its duties are undertaken in the manner least invasive to employees of the employer, while still accomplishing the employer's legitimate purposes as defined by subsection (a)(1) of this section; (3) The specific form of electronic monitoring is limited to the smallest number of workers, collection of the least amount of data which shall be collected no more frequently than is necessary to accomplish the purpose, and the data collected, shall be deleted once the purpose has been achieved; (4) The employer shall ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's 
intended, legitimate purpose shall not be disclosed to the employer and shall be promptly disposed of by the vendor; (5) The employer shall ensure that employee data is not collected when the employee is off-duty; and (6) The employer shall ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the state's data and cyber privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except, as provided in subsection (c) of this section.
Pending 2026-02-06
D-01.1
§ 28-5.2-2(c)
Plain Language
Before deploying any electronic monitoring tool, employers must provide detailed prior written notice to all affected employees and candidates, obtain written acknowledgment, and post the notice conspicuously in the workplace. The notice must cover eleven specified categories of information: the monitoring purpose, specific data collected and retention schedule, monitoring dates/times/frequency, whether data feeds into an ADS or employment decisions, storage and use in discipline or litigation, productivity assessment use, data storage location and retention period, explanation of the least-invasive justification, employee rights to refuse data sale/transfer/disclosure, and how to exercise chapter rights. This is an extremely granular notice requirement — not a generic 'we use AI' disclosure.
(c) Any employer that uses an electronic monitoring tool shall give prior written notice and shall obtain written acknowledgment from all candidates and employees subject to electronic monitoring and shall also post said notice in a conspicuous place which is readily available for viewing by candidates for employment and employees. Such notice shall include, at a minimum, the following: (1) A description of the purpose for which the electronic monitoring tool will be used, as specified in subsection (a)(1) of this section; (2) A description of the specific employee data to be collected, stored, secured, and disposed of (and the schedule therefor), and the activities, locations, communications, and job roles that will be electronically monitored by the tool; (3) A description of the dates, times, and frequency that electronic monitoring will occur; (4) Whether and how any employee data collected by the electronic monitoring tool will be used as an input in an automated decision system; (5) Whether and how any employee data collected by the electronic monitoring tool will alone or in conjunction with an automated decision system be used to make an employment decision by the employer or employment agency; (6) Whether and how any employee data collected by the electronic monitoring tool may be stored and utilized in discipline, in internal policy compliance, in administrative agency adjudications, in litigation (whether or not it involves the employee or not as a party); (7) Whether any employee data collected by the electronic monitoring tool will be used to assess employees' productivity performance or to set productivity standards, and if so, how; (8) A description of where any employee data collected by the electronic monitoring tool will be stored and the length of time it will be retained; (9) An explanation for how the specific electronic monitoring practice is the least invasive means available to accomplish the monitoring purpose; (10) That an employee is 
entitled to notice and maintains the right to refuse the sale, transfer, or disclosure of their employee data, subject to the provisions of subsection (g) of this section; and (11) A clear and reasonably understandable description of how an employee can exercise the rights described in this chapter.
Pending 2026-02-06
D-01.4
§ 28-5.2-2(f)-(g)
Plain Language
Employee monitoring data is strictly purpose-limited: it may only be used for the purposes disclosed in the notice to employees. Employers may not sell, transfer, or disclose monitoring data to any other entity, with only two narrow exceptions: where required by federal or state law, or where necessary to comply with an impact assessment of an automated decision system. This effectively prohibits secondary monetization, data broker transfers, and sharing with affiliates or third-party vendors beyond what the impact assessment requires.
(f) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to subsection (c) of this section. (g) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the state, or necessary to do so to comply with an impact assessment of an automated decision system used pursuant to this section.
Pending
D-01.4
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not use personal data to generate chatbot outputs unless the processing is strictly necessary to fulfill a specific user request and the user has given affirmative consent. Affirmative consent requires a clear, standalone disclosure with an equally prominent option to decline — consent cannot be inferred from inaction, buried in terms of service, or obtained through dark patterns. This effectively creates a necessity-plus-consent dual requirement: even with consent, the processing must be necessary for an express user request.
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
Pending
D-01.4
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs — meaning the user's inputs and the chatbot's outputs — for any advertising purpose. This includes deciding whether to show an ad, selecting which product or service to advertise, and customizing ad content. Unlike the personal data processing restriction in § 39-80-20(A)(1), this is an absolute prohibition with no consent exception.
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
Pending
D-01.4D-01.6
S.C. Code § 39-80-20(A)(3)
Plain Language
Chatbot providers face a tiered consent framework for processing chat logs and personal data. For minors (when the provider knows or should know the user is a minor): no processing of chat logs or personal data is permitted without parental or guardian affirmative consent, including for training. For adults: training use of chat logs and personal data requires the adult user's affirmative consent. For all users: profiling beyond what is necessary to fulfill an express request is prohibited. Training has a narrow definition that excludes safety testing, safety-related modifications, and compliance actions. Profiling also excludes safety-related processing.
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
Pending
D-01.4
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not build personality or behavioral profiles of users beyond what is strictly necessary to fulfill a user's express request. This is a necessity limitation on profiling — any profiling that goes beyond the immediate request is prohibited. Processing chat logs for safety or regulatory compliance does not count as profiling under this provision.
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
Pending
S.C. Code § 39-80-20(A)(5)-(6)
Plain Language
Chatbot providers are absolutely prohibited from selling chat logs — meaning they may not exchange user input/output data for monetary or other valuable consideration or make it available to a third party for such consideration. Narrow carve-outs exist for processor disclosures, user-directed disclosures with affirmative consent, and information the user intentionally made public. Separately, chat logs may not be retained for more than ten years, except where retention is necessary for compliance with this chapter or other law.
(A) A chatbot provider may not: (5) sell a user's chat logs; (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
Pending
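The ten-year retention ceiling implies a periodic purge of expired chat logs, with carve-outs for records whose retention is required by law. The following is a minimal sketch under assumed data shapes — the `purge_expired_chat_logs` function, the dict keys, and the `legal_hold_ids` mechanism are all illustrative, and the ten-year window is approximated in days rather than exact calendar arithmetic.

```python
from datetime import datetime, timedelta, timezone

# Simplified ten-year window; a production policy would use exact calendar math.
TEN_YEARS = timedelta(days=365 * 10)

def purge_expired_chat_logs(logs, now=None, legal_hold_ids=frozenset()):
    """Keep only chat logs within the retention window, plus any under a legal hold
    (i.e., retention necessary to comply with this chapter or other law)."""
    now = now or datetime.now(timezone.utc)
    return [
        log for log in logs
        if log["id"] in legal_hold_ids or now - log["created_at"] <= TEN_YEARS
    ]
```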
D-01.3
S.C. Code § 39-80-20(A)(7)
Plain Language
Chatbot providers may not punish users who decline to consent to the use of their chat logs or personal data for training purposes. Prohibited retaliation includes service denial, differential pricing, and reduced service quality. This protects the meaningfulness of the affirmative consent requirement by preventing providers from making refusal commercially disadvantageous.
(A) A chatbot provider may not: (7) discriminate or retaliate against a user, including: (a) denying products or services to the user; (b) charging different prices or rates for products or services to the user; or (c) providing lower quality products or services to the user for refusing to consent to the use of chat logs or personal data for training purposes.
Pending
D-01.1D-01.2
S.C. Code § 39-80-20(B)
Plain Language
Users have an unconditional right to access their own chat logs at any time. Upon request, the chatbot provider must deliver the logs in a downloadable, easy-to-read format. Providers may not discriminate or retaliate against users who exercise this right. This is a straightforward data portability and access right — there is no limit on frequency of requests and no exception for trade secrets or proprietary information.
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
Pending
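The access right above requires delivery of the full conversation record in a "downloadable and easy to read format." One plausible implementation is serializing both user inputs and chatbot outputs to indented JSON; the function and message schema below are assumptions for illustration, not anything the bill prescribes.

```python
import json

def export_chat_log(messages):
    """Serialize a user's full chat log (inputs and outputs) to a readable,
    downloadable JSON string."""
    return json.dumps(
        [
            {"role": m["role"], "timestamp": m["timestamp"], "text": m["text"]}
            for m in messages
        ],
        indent=2,
    )
```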
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical safeguards to ensure that deidentified data cannot be reidentified. All processing, retention, and transfer of deidentified data must be conducted without any reasonable means of reidentification. This creates an ongoing obligation to maintain the irreversibility of deidentification.
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
Pending
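As a first technical step toward the safeguard above, a provider might strip direct identifiers before a record ever enters a deidentified dataset. This sketch is deliberately minimal and the field list is hypothetical; preventing reidentification in practice requires far more (quasi-identifier analysis, access controls, contractual limits), which a key filter alone cannot guarantee.

```python
# Hypothetical list of direct-identifier fields; a real inventory would be
# maintained per data schema.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address", "user_id"}

def strip_direct_identifiers(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
```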
S.C. Code § 40-1-730
Plain Language
All records maintained by a licensed professional — including AI-generated records — and all communications between a client and a licensed professional are confidential. Disclosure is permitted only as required under South Carolina's existing mental health confidentiality statute (§ 44-22-100). While this reinforces existing confidentiality law, it explicitly extends the obligation to records created or maintained with AI assistance, giving practitioners a data governance obligation in the AI context.
All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional shall be confidential and shall not be disclosed except as required pursuant to Section 44-22-100.
Pending
D-01.4
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not process personal data to shape chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill a specific, express user request, and (2) the user has given affirmative consent. The affirmative consent standard is stringent — it requires a standalone disclosure in plain language, accessible to disabled users, in the chatbot's language, with the decline option at least as prominent as the consent option. Consent cannot be inferred from inaction, continued use, or general terms of service. This effectively creates a purpose limitation and opt-in consent requirement for all personal data used in chatbot responses.
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
Pending
D-01.4
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using chat logs for any advertising purpose — whether to decide whether to show an ad, what type of ad to show, or how to customize an ad for a particular user. This is an absolute prohibition with no consent exception. Chat logs include both user inputs and chatbot outputs from the interaction.
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
Pending
D-01.4
S.C. Code § 39-80-20(A)(3)(c)-(d)
Plain Language
Adult users' chat logs and personal data may not be used for model training without affirmative consent. Additionally, chat logs and personal data may not be used for profiling beyond what is necessary to fulfill an express user request. Training is defined broadly as adjusting or modifying a model using input data, but excludes safety testing and harm-mitigation adjustments. Profiling is defined as classifying personality traits and behavioral characteristics, but excludes safety-related processing. These provisions together create a consent-gated training restriction and a necessity-limited profiling restriction for adult users.
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
Pending
D-01.4
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not classify users by personality traits or behavioral characteristics beyond what is strictly necessary to fulfill the user's express request. This is a standalone profiling restriction that applies to all users regardless of age. Safety-related processing is carved out from the definition of profiling and is therefore exempt.
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
Pending
D-01.1D-01.2
S.C. Code § 39-80-20(B)
Plain Language
Users have an unconditional right to access their own chat logs at any time. On request, the chatbot provider must deliver the chat log in a downloadable, easy-to-read format. Providers may not discriminate or retaliate against users who exercise this access right. This is broader than a typical data access right because it covers both user input data and chatbot output data (per the chat log definition), giving users access to the full conversation record.
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
Passed 2025-09-01
D-01.4
Health & Safety Code § 183.004
Plain Language
Covered entities are categorically prohibited from including credit score or voter registration data in any individual's EHR. This applies to collection, storage, and sharing — entities may not add this data to the record at any stage. This is a data minimization requirement specific to EHRs, prohibiting inclusion of data types that have no legitimate healthcare purpose.
A covered entity may not collect, store, or share any information regarding an individual's credit score or voter registration status in the individual's electronic health record.
Failed 2026-07-01
D-01.3
Va. Code § 2.2-1202.2(B)(3)
Plain Language
State agencies must provide all individuals the right to opt out of having an automated decision system used in employment decisions affecting them. Additionally, agencies must provide a separate accommodation process for individuals with disabilities to seek accommodations related to the automated decision system. The opt-out right is unconditional — individuals need not demonstrate a specific reason for opting out.
The Director shall require any state agency that uses an automated decision system as a substantial factor in any employment decision to: 3. Provide to all individuals the right to opt out of the use of the automated decision system for employment decisions and a process by which individuals with disabilities may seek accommodations for the automated decision system;
Failed 2026-07-01
D-01.3
Va. Code § 15.2-1500.2(B)(3)
Plain Language
Local government entities must provide all individuals the right to opt out of having an automated decision system used in employment decisions affecting them, and must establish a process for individuals with disabilities to seek accommodations. Mirrors the state agency obligation under § 2.2-1202.2(B)(3) but applies to local government instrumentalities.
Any department, office, board, commission, agency, or instrumentality of local government that uses an automated decision system as a substantial factor in any employment decision shall: 3. Provide to all individuals the right to opt out of the use of the automated decision system for employment decisions and a process by which individuals with disabilities may seek accommodations for the automated decision system;
Pending 2027-01-01
D-01.6
§ 59.1-618
Plain Language
Operators may not use a minor's inputs to train the companion chatbot's underlying model without first obtaining affirmative written consent from the minor's parent or guardian. The consent must be specific to the purpose of using the minor's personal information for model training — a general terms-of-service consent would not satisfy this requirement. This applies regardless of whether the training occurs in real time or in batch. Operators should implement consent flows that clearly identify model training as a distinct purpose and require a separate written affirmation.
An operator shall not train the underlying model of a companion chatbot with the inputs of a minor unless the minor's parent or guardian has affirmatively provided written consent to the operator to use the minor's personal information for that specific purpose.
Failed 2026-07-01
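A training pipeline subject to a rule like the one above would need to gate each sample on consent status before ingestion. The sketch below is a hypothetical filter — the field names and helper functions are mine — showing a minor's inputs excluded unless parental written consent for the training purpose is on file.

```python
def training_eligible(sample: dict) -> bool:
    """A minor's input may be used for training only with parental or guardian
    written consent recorded for that specific purpose."""
    if sample["is_minor"]:
        return sample.get("parental_written_consent", False)
    # Adult inputs are governed by separate consent rules, not modeled here.
    return True

def filter_training_set(samples):
    """Keep only samples eligible for model training."""
    return [s for s in samples if training_eligible(s)]
```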
D-01.4
§ 59.1-615(C)
Plain Language
Deployers must limit the data they collect and store to the minimum amount that is adequate, relevant, and necessary for a legitimate purpose, and must not collect or store information that conflicts with a user's best interests. This is a data minimization obligation applying to all users — not just minors — and covers all data associated with chatbot interactions. The 'user's best interests' standard is unusual and goes beyond typical necessity-based data minimization by adding a separate affirmative duty not to collect data that conflicts with the user's interests, even if the data might serve a legitimate deployer purpose.
A deployer shall collect and store only such information as does not conflict with a user's best interests. Such information shall be (i) adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the deployer; (ii) relevant, in the sense that the information has a relevant link to such legitimate purpose; and (iii) necessary, in the sense that it is the minimum amount of information that is needed for such legitimate purpose.
Pending 2025-07-01
D-01.4
21 V.S.A. § 495q(b)
Plain Language
Employers may not electronically monitor employees unless five conditions are all met: (1) the monitoring serves one of seven enumerated legitimate purposes (e.g., essential job functions, safety, compliance, periodic performance assessment); (2) the specific form of monitoring is necessary for and exclusively used for that purpose; (3) it is the least invasive means available; (4) it uses the smallest scope of employees, data, and frequency necessary; and (5) access is limited to authorized persons and use is limited to the purpose and duration disclosed to the employee. This imposes a strict data minimization and purpose-limitation framework on all workplace electronic monitoring.
(b) Employee monitoring restricted. An employer shall not engage in electronic monitoring of an employee unless all of the following requirements are met: (1) the employer's purpose in utilizing the electronic monitoring is to: (A) assist or allow the employee to accomplish an essential job function; (B) monitor production processes or quality; (C) ensure compliance with applicable employment or labor laws; (D) protect the health, safety, or security of the employee, clients, or the public; (E) secure the employer's physical or digital property; (F) conduct periodic assessment of employee performance; or (G) track time worked or production output for purposes of determining the employee's compensation; (2) the specific form of electronic monitoring is necessary to accomplish the purpose identified pursuant to subdivision (1) of this subsection and is used exclusively to accomplish that purpose; (3) the specific form of electronic monitoring is the least invasive means, with respect to the employee, of accomplishing the purpose identified pursuant to subdivision (1) of this subsection; (4) the specific form of electronic monitoring is used with the smallest number of employees, collects the smallest amount of data necessary to accomplish the purpose identified pursuant to subdivision (1) of this subsection, and is collected not more frequently than necessary to accomplish that purpose; and (5) the employer ensures that only authorized persons have access to any data produced through the electronic monitoring and that the data is only used for the purpose and duration that the employee has been notified of pursuant to subsection (c) of this section.
Pending 2025-07-01
D-01.1
21 V.S.A. § 495q(c)(1)-(3)
Plain Language
At least 15 calendar days before beginning any electronic monitoring, employers must provide each affected employee with a detailed written notice in plain language and in the employee's primary language. The notice must cover 14 specific items including the form of monitoring, its purpose and necessity, data use, technologies used, activities monitored, third-party access, data retention and destruction timelines, employee data access and correction rights, a cover sheet summary, employee rights, and complaint instructions. If monitoring tracks productivity or performance, additional disclosures about standards, measurement methods, and consequences are required. The notice must be updated if the employer materially changes monitoring practices. A narrow exception to prior notice exists when the employer has reasonable grounds to believe an employee is engaged in illegal conduct, is violating the rights of others, or is creating a hostile work environment.
(c) Required notice for employee monitoring. (1) At least 15 calendar days prior to commencing any form of electronic monitoring, an employer shall provide notice of the electronic monitoring to each employee who will be subject to it. The notice shall, at a minimum, include the following information: (A) the specific form of electronic monitoring; (B) a description of the intended purpose of the electronic monitoring and why the electronic monitoring is necessary to accomplish that purpose; (C) a description of how any data generated by the electronic monitoring will be used, including whether and how the data generated by the electronic monitoring will be used to inform employment-related decisions; (D) a description of the technologies that will be used to conduct the electronic monitoring; (E) a description of the specific activities, locations, communications, and job roles that will be electronically monitored; (F) the name of any person conducting electronic monitoring on the employer's behalf and any associated contract language related to the monitoring; (G) the name of any person, apart from the employer, who will have access to any data generated by the electronic monitoring and the reason why the person will have access to the data; (H) the positions within the employer that will have access to any data generated by the electronic monitoring; (I) when, where, and how frequently monitoring will occur; (J) the period of time for which any data generated by the electronic monitoring will be retained by the employer or another person and when that data will be destroyed; (K) notice of how an employee may access the data generated by the electronic monitoring and the process to correct any errors in the data; (L) a cover sheet that concisely summarizes the details contained in the notice; (M) notice of an employee's rights pursuant to this section and the judicial and administrative remedies available for redressing the wrongful use of electronic monitoring; 
and (N) instructions on how an employee can file a complaint against an employer for violations of this section. (2) If an employer uses electronic monitoring to track employee productivity or performance, the employer shall include the following information in the notice required by subdivision (1) of this subsection: (A) the performance or productivity standards by which employees will be assessed and how employees will be measured against those standards; (B) how performance or productivity data will be monitored and collected, including the identity of the employees subject to such monitoring and when, where, and how the monitoring and data collection will occur; and (C) any adverse consequences for failing to meet a performance or productivity standard and whether there is any bonus or incentive program associated with meeting or exceeding each standard. (3)(A) Notice of electronic monitoring provided pursuant to this section shall be written in plain, clear, and concise language and provided to each employee in the employee's primary language. (B) An employer shall provide a new, updated notice to employees if it makes any significant changes to the manner of electronic monitoring or to the way that the employer utilizes the electronic monitoring or any data generated by it.
Pending 2025-07-01
D-01.5
21 V.S.A. § 495q(f)(3)
Plain Language
Employers are categorically prohibited from using any automated decision system (ADS) outputs about an employee's physical or mental health as a factor in any employment-related decision. This is an absolute prohibition — there is no exception for health-related jobs or reasonable accommodation determinations. It applies regardless of how the health information was derived (directly collected or inferred by the system).
(3) An employer shall not use any automated decision system outputs regarding an employee's physical or mental health in relation to an employment-related decision.
Pending 2025-07-01
D-01.1D-01.2
21 V.S.A. § 495q(j)
Plain Language
Employees have a right to access all data related to them that was produced or used by electronic monitoring or ADS, and employers must provide this access within seven days of request. Employees also have a right to request correction of errors, and within seven days the employer must either: (A) correct the error and provide a plain-language notice explaining the steps taken; or (B) provide a notice explaining why the data was not corrected and describing verification steps taken. Both the access and correction rights have a strict seven-day response window. Correction requests do not guarantee correction — the employer may decline but must explain its verification process.
(j) Employee right to access and correct data. (1) Within seven days of receiving a request, an employer shall provide an employee with access to any data that relates to the employee that was produced or utilized by electronic monitoring or an automated decision system used by the employer. (2) Within seven days of receiving a request to correct potential errors identified by an employee, an employer shall: (A) correct the erroneous information or data and provide the employee with a notice that complies with subdivision (c)(3)(A) of this section, explaining the steps taken by the employer; or (B) provide the employee with a notice explaining that the employer has not corrected the information or data and describing the steps the employer has taken to verify the accuracy of the disputed information or data.
Pending 2025-07-01
D-01.1
21 V.S.A. § 495q(c)(5)
Plain Language
Employers must annually provide each employee with a list of all electronic monitoring systems currently in use in relation to that employee, in the employee's primary language. 'Currently in use' is broadly defined to include systems the employer is currently using, used within the past 90 days, or intends to use within the next 30 days. This is a recurring annual disclosure obligation — separate from the initial 15-day pre-monitoring notice — ensuring employees have ongoing awareness of all monitoring systems applied to them.
(5)(A) An employer that utilizes electronic monitoring shall annually provide each of its employees with a list of all electronic monitoring systems currently in use by the employer in relation to that employee. The list shall be provided in the primary language of the employee. (B) As used in this subdivision (5), "currently in use" means that the employer: (i) is currently using the system in relation to the employee; (ii) used the electronic monitoring system in relation to the employee within the past 90 days; or (iii) intends to use the electronic monitoring system in relation to the employee within the next 30 days.
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(1)
Plain Language
Chatbot providers may not use personal data — beyond the user's own input data — to inform chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill an express user request, and (2) the user has given affirmative consent. This effectively creates a default prohibition on enriching chatbot responses with personal data from external sources, behavioral profiles, or cross-session data unless the user specifically asks for it and consents. The affirmative consent standard is stringent — it cannot be bundled into general terms of service, must be presented as a standalone disclosure, and the refuse option must be at least as prominent as the accept option.
(1) process personal data other than input data to inform chatbot outputs unless the processing of personal data is necessary to fulfill an express request made by a user and that user has provided affirmative consent;
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising purpose — whether to decide whether to show an ad, what to advertise, or how to customize or present an ad. This is an absolute prohibition with no consent exception. Chat logs include both user inputs and chatbot outputs, so providers cannot mine conversational history for ad targeting under any circumstances.
(2) process a user's chat log to: (A) determine whether to display an advertisement for a product or service to the user; (B) determine a product, service, or category of product or service to advertise to the user; or (C) customize an advertisement or how an advertisement is presented to the user;
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(3)(C)-(D)
Plain Language
Chatbot providers face two restrictions on processing adult users' data: (1) they may not use chat logs or personal data of adult users for training purposes unless the provider first obtains affirmative consent, and (2) they may not use chat logs or personal data for profiling — classifying users' personality or behavioral characteristics — beyond what is necessary to fulfill an express user request. The training definition carves out safety testing and legally required modifications, so those activities do not require consent. Similarly, chat log processing for user safety is excluded from the profiling prohibition.
(3) process a user's chat log or personal data: (C) of a user over 18 years of age for training purposes, unless the chatbot provider first obtains affirmative consent; or (D) to engage in profiling beyond what is necessary to fulfill an express request from the user;
Pre-filed 2026-07-01
D-01.6
9 V.S.A. § 4193b(a)(3)(A)-(B)
Plain Language
When a chatbot provider knows or should know that a user is under 18, the provider faces two distinct restrictions: (1) no processing of the minor's chat logs or personal data at all without affirmative consent from a parent or legal guardian, and (2) an absolute prohibition on using the minor's chat logs or personal data for model training — with no consent carve-out even from a parent. The knowledge standard is constructive — 'should have known based on knowledge fairly implied on the basis of objective circumstances' — meaning providers cannot avoid the obligation by simply not asking about age.
(3) process a user's chat log or personal data: (A) if the chatbot provider knows or should have known, based on knowledge fairly implied on the basis of objective circumstances, that the user is under 18 years of age without the affirmative consent of that user's parent or legal guardian; (B) for training purposes, if the chatbot provider knows or should have known, based on knowledge fairly implied on the basis of objective circumstances, that a user is under 18 years of age;
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(4)
Plain Language
This is a downstream use restriction on profiling outputs: even if a chatbot provider has legitimately profiled a user (e.g., to fulfill an express request), the resulting personality or behavioral classifications may not be used for any purpose beyond what is necessary to fulfill that express request. This prevents providers from building and then repurposing user behavioral profiles for marketing, content personalization, or other secondary purposes.
(4) use any classification or designation of a user's personality or behavioral characteristics created through profiling beyond what is necessary to fulfill an express request made by the user;
Pre-filed 2026-07-01
D-01.1
9 V.S.A. § 4193b(b)(1)-(2)
Plain Language
Users have the right to access their own chat logs at any time in a portable, downloadable, human-readable and machine-readable format. Chatbot providers must make this data available on demand. Providers may not discriminate or retaliate against users for exercising this access right — the same anti-retaliation protections that apply to training consent refusal also apply here, covering denial of service, price discrimination, and quality degradation.
(b) Right to access. A user has the right to access, in a portable and readily usable format and at any time, any of the user's own chat logs that a chatbot provider has retained. (1) Chat logs must be made available to users in a downloadable and human- and machine-readable format. (2) A chatbot provider shall not discriminate or retaliate against any user, including by denying products or services, charging different prices or rates for products or services, or providing lower-quality products or services to the user, for accessing their own chat logs.
Passed 2026-07-01
D-01.8
18 V.S.A. § 1893(a)-(b)
Plain Language
No person may collect or record neural data from a brain-computer interface without first providing the individual with a written notice explaining how the data will be used, then obtaining written informed consent. This is an affirmative opt-in consent requirement — collection is prohibited by default. The consent must be voluntary, informed as to the nature, benefits, risks, and consequences of collection, and may be given by an agent, guardian, or surrogate on behalf of an individual who lacks capacity.
(a) Prohibition. Subject to the limited exceptions provided in this section, no person shall: (1) collect or record an individual's neural data gathered from a brain-computer interface; or (2) share with a third party an individual's neural data gathered from a brain-computer interface. (b) Consent to collect. A person shall not collect or record an individual's neural data gathered from a brain-computer interface unless the person: (1) provides the individual with a written notice explaining how the person will use the individual's neural data; and (2) thereafter receives written informed consent from the individual to collect or record the individual's neural data.
Passed 2026-07-01
D-01.8
18 V.S.A. § 1893(c)
Plain Language
Sharing neural data with any third party requires a separate written informed consent process, distinct from the consent to collect. The person must identify the specific third party by name and address and explain the purposes for sharing before obtaining consent. This is more granular than most biometric consent statutes, which typically bundle collection and sharing consent.
(c) Consent to share. A person shall not share with a third party an individual's neural data gathered from a brain-computer interface unless the person: (1) provides the individual with a written request for the individual's neural data to be shared with a third party and for what purposes, including the name and address of the third party; and (2) thereafter receives written informed consent from the individual to share the individual's neural data with the third party.
Passed 2026-07-01
D-01.3
18 V.S.A. § 1893(d)
Plain Language
Individuals have the right to revoke consent for neural data collection or sharing at any time by written notice. The revocation process must be at least as easy as the original consent process. Upon receiving revocation notice, the entity must destroy all neural data records within 10 days, immediately cease sharing with all third parties, and notify all third parties that consent has been revoked. The 10-day destruction deadline creates a deletion obligation more aggressive than most data privacy laws.
(d) Revocation of consent. (1) An individual who has provided written informed consent allowing a person to collect, record, or share the individual's neural data pursuant to this section has the right to revoke consent at any time thereafter by providing written notice to the person initially receiving the consent. This revocation of consent notice shall be as easy or easier for the individual to provide as compared to the requirements for initially providing consent. (2) A person who receives written notice from an individual revoking consent pursuant to subdivision (1) of this subsection shall: (A) destroy all records of the individual's neural data not later than 10 days after receiving the notice; and (B) in the case of the revocation of consent to share an individual's neural data, immediately: (i) cease sharing an individual's neural data with all third parties upon receipt of the notice; and (ii) inform all third parties with whom the person has shared the individual's neural data that the individual has revoked consent.
Passed 2026-07-01
D-01.4
18 V.S.A. § 9761(a)-(b)
Plain Language
Suppliers of mental health chatbots are prohibited from selling or sharing Vermont users' individually identifiable health information or user inputs with third parties, with three narrow exceptions: (1) health care providers requesting data with user consent, (2) health plans at user request, and (3) contractors necessary for chatbot functionality, who must comply with HIPAA privacy and security rules as if the supplier were a HIPAA covered entity. This effectively extends HIPAA-equivalent obligations to mental health chatbot suppliers who would not otherwise be covered entities under federal law.
(a)(1) Except as provided in subdivision (2) of this subsection, a supplier of a mental health chatbot shall not sell to or share with any third party any: (A) individually identifiable health information of a Vermont user; or (B) user input of a Vermont user. (2) The prohibition set forth in subdivision (1) of this subsection shall not apply to individually identifiable health information that is: (A) requested by a health care provider with the consent of the Vermont user; (B) provided to a health plan of a Vermont user upon request of the Vermont user; or (C) shared in compliance with subsection (b) of this section. (b)(1) A supplier may share individually identifiable health information necessary to ensure the effective functionality of the mental health chatbot with another person with whom the supplier has a contract related to such functionality. (2) When sharing information pursuant to subdivision (1) of this subsection, the supplier and the other person shall comply with all applicable privacy and security provisions of 45 C.F.R. Part 160 and 45 C.F.R. Part 164, Subparts A and E, as if the supplier were a covered entity and the other person were a business associate, as those terms are defined in 45 C.F.R. § 160.103.
Pending 2026-07-01
D-01.8
§ 16-5EE-4(1)-(2)
Plain Language
Before collecting, using, or disclosing a consumer's genetic data, entities must provide both a high-level privacy policy overview and a detailed, publicly available privacy notice covering the entity's data practices. The entity must also obtain initial express consent from the consumer (or parent/guardian/power of attorney) that clearly describes how genetic data will be used, who within the entity can access test results, and how the data may be shared. This is a precondition to any genetic data collection — the consent must be obtained before data is collected, not after.
To safeguard the privacy, confidentiality, security, and integrity of a consumer's genetic data, an entity shall: (1) Provide clear and complete information regarding the entity's policies and procedures for the collection, use, or disclosure of genetic data by making available to a consumer: (A) A high-level privacy policy overview that includes basic, essential information about the entity's collection, use, or disclosure of genetic data; and (B) A prominent, publicly available privacy notice that includes, at a minimum, information about the entity's data collection, consent, use, access, disclosure, transfer, security, and retention and deletion practices for genetic data; (2) Obtain initial express consent from a consumer, parent, guardian, or power of attorney for the collection, use, or disclosure of the consumer's genetic data that: (A) Clearly describes the entity's use of the genetic data that the entity collects through the entity's genetic testing product or service; (B) Specifies the categories of individuals within the entity that have access to test results; and (C) Specifies how the entity may share the genetic data;
Pending 2026-07-01
D-01.8
§ 16-5EE-4(4)
Plain Language
Beyond the initial consent requirement, entities must obtain separate, purpose-specific express consent for each additional use of genetic data. Third-party transfers require separate consent naming the recipient. Secondary uses beyond the testing service's primary purpose, retention of biological samples after initial testing, research transfers, genetic-data-based marketing, third-party marketing, and sale of genetic data each require their own express consent. Research transfers require 'informed' express consent — a higher standard. Customized content or offers provided through the first-party entity's own websites, applications, or services are excluded from the definition of marketing and do not require separate consent. Transfers to processors operating under restrictive contracts are exempt from the third-party consent requirement.
(4) If the entity engages in any of the following, obtain a consumer's: (A) Separate express consent for: (i) The transfer or disclosure of the consumer's genetic data or biological sample to any third party other than the entity's processors, including the name of the third party to which the consumer's genetic data or biological sample will be transferred or disclosed with the consumer's express consent; (ii) The use of genetic data beyond the primary purpose of the entity's genetic testing product or service and inherent contextual uses; or (iii) The entity's retention of any biological sample provided by the consumer following the entity's completion of the initial testing service requested by the consumer; (B) Informed express consent for transfer or disclosure of the consumer's genetic data to third party persons for: (i) Research purposes; or (ii) Research conducted under the control of the entity for the purpose of publication or generalizable knowledge; and (C) Express consent for: (i) Marketing to a consumer based on the consumer's genetic data; (ii) Marketing by a third-party person to a consumer based on the consumer having ordered or purchased a genetic testing product or service. Marketing does not include the provision of customized content or offers on the websites or through the applications or services provided by the entity with the first-party relationship to the consumer; or (iii) Sale or other valuable consideration of the consumer's genetic data.
Pending 2026-07-01
D-01.1D-01.2
§ 16-5EE-4(6)(A)
Plain Language
Entities must develop and maintain a comprehensive security program for genetic data and provide consumers with individual rights mechanisms, including the ability to access their genetic data, delete it, revoke any previously granted consent, and request destruction of their biological sample. These consumer rights must be operationalized through an accessible process — the statute does not prescribe a format, but the capability must be available to consumers.
(6) Develop, implement, and maintain a comprehensive security program to protect a consumer's genetic data against unauthorized access, use, or disclosure; and (A) Provide a process for a consumer to: (i) Access the consumer's genetic data; (ii) Delete the consumer's genetic data; (iii) Revoke any consent provided by the consumer; and (iv) Request and obtain the destruction of the consumer's biological sample.
Pending
D-01.8
§ 15-17-3(b)
Plain Language
Before collecting any biometric identifier or biometric information — including fingerprints, voiceprints, retina or iris scans, or scans of hand or face geometry — a private entity must provide written notice to the individual (or their legal representative) that biometric data is being collected or stored, specify the purpose and duration of collection, storage, and use, and obtain a written release from the individual. All three steps must be completed before any collection occurs. In the employment context, the written release may be executed as a condition of employment. This is a pre-collection obligation that cannot be satisfied retroactively.
(b) No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (1) Informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (2) Informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (3) Receives a written release executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative.