D-01
Data Governance
Automated Processing Rights & Data Controls
Applies to: Developer, Deployer, Manufacturer, Professional, Government
Sectors: Employment, Financial Services, Healthcare
Bills — Enacted: 0 unique bills
Bills — Proposed: 65
Last Updated: 2026-03-29
Core Obligation

Individuals have rights to know, correct, and in some jurisdictions opt out of automated processing of their personal data for consequential decisions. Organizations face restrictions on using sensitive personal attributes in AI decision-making and must minimize data collection to what is necessary for the stated purpose. AI-generated inferences and derived attributes are themselves subject to these controls.

Sub-Obligations (7)
D-01.1 · Right to know: Individuals have the right to know that their personal data is being used in an automated decision-making system, and in some jurisdictions to receive a description of the categories of data used. (0 enacted, 19 proposed)
D-01.2 · Right to correct: Individuals have the right to correct inaccurate personal data used in automated decisions, and to have the correction reflected in pending and future decisions — not just in the underlying record. (0 enacted, 13 proposed)
D-01.3 · Right to opt out: Individuals have the right to opt out of automated processing of their personal data for consequential decisions. (0 enacted, 13 proposed)
D-01.4 · Data minimization: Data collected and generated in connection with AI systems — including behavioral data, inferences, and derived attributes — must be limited to what is necessary for the AI system's stated purpose. Secondary uses require separate justification. (0 enacted, 49 proposed)
D-01.5 · Sensitive attribute restrictions: AI systems may not use sensitive personal attributes (race, gender, religion, health status, sexual orientation, national origin, disability) as direct inputs to consequential automated decisions except where expressly permitted. Proxy variable restrictions also apply — systems may not be designed to infer sensitive attributes from non-sensitive proxies for use in consequential decisions. (0 enacted, 14 proposed)
D-01.6 · Age-Differentiated Parental Control and Privacy Tools: Operators must provide minor-specific and under-thirteen parental or guardian tools for managing privacy and account settings, including control over interaction data retention for personalization, use of personal data for AI training, and account deletion. Age assurance data must be minimized and immediately deleted upon determination. (0 enacted, 3 proposed)
D-01.8 · Biometric Data Pre-Collection Consent: Entities must provide written notice and obtain affirmative opt-in consent from individuals before collecting any biometric identifier, including specific notice of identifier type and collection purpose. Consent obtained from publicly available sources is insufficient unless the individual themselves made the data publicly available. (0 enacted, 16 proposed)
Bills That Map This Requirement (65 bills)
Bill · Status · Sub-Obligations · Section
Pending 2026-10-01
D-01.4
Section 2(f)
Plain Language
Covered entities are subject to a data minimization requirement: they may collect and store only information that (1) does not conflict with a 'trusted party's' best interests, (2) is sufficient for a legitimate purpose, (3) is relevant to that purpose, and (4) is the minimum amount needed. This is a three-prong necessity test layered on top of a best-interests constraint. The term 'trusted party' is not defined in the statute, creating significant ambiguity — it likely refers to the user or the minor's parent/guardian, but this is not explicit.
(f) Each covered entity shall collect and store only information that does not conflict with a trusted party's best interests, which must be: (1) Sufficient to fulfill a legitimate purpose of the covered entity; (2) Relevant to the legitimate purpose of the covered entity; and (3) The minimum amount of information needed for the legitimate purpose of the covered entity.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(1)
Plain Language
Chatbot providers may not process personal data to influence chatbot outputs unless the processing is necessary to fulfill a user's express request and the user has given affirmative consent. The affirmative consent standard is demanding: it requires a standalone, accessible disclosure in the user's language, with equal or easier decline mechanics, and cannot be obtained through terms of service, dark patterns, or user inaction. This effectively restricts personal data processing to a narrow necessity-plus-consent basis.
A chatbot provider may not: 1. Process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising purpose — including deciding whether to show an ad, selecting which product or service to advertise, or customizing ad content. This is an absolute ban with no consent exception, covering the full lifecycle of ad targeting and personalization based on chat interactions.
A chatbot provider may not: 2. Process a user's chat log: (a) To determine whether to display an advertisement for a product or service to a user. (b) To determine a product or service or category of a product or service to advertise to a user. (c) To customize an advertisement for presentation to a user.
Pending 2026-01-01
D-01.4, D-01.5
A.R.S. § 44-1383.01(A)(3)(d), (A)(4)
Plain Language
Chatbot providers may not use chat logs and personal data for profiling — classifying or designating personality traits or behavioral characteristics — beyond what is strictly necessary to fulfill a user's express request. This restriction applies regardless of consent. Processing chat logs for user safety or statutory compliance is excluded from the definition of profiling and is therefore not restricted by this provision.
A chatbot provider may not: 3. Process a user's chat log and personal data: (d) To engage in profiling beyond what is necessary to fulfill an express request. 4. Profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user.
Pending 2026-01-01
D-01.4
A.R.S. § 44-1383.01(A)(3)(c)
Plain Language
For adult users, chatbot providers may not use chat logs and personal data for training purposes unless the provider first obtains affirmative consent. Unlike the minor provision, which requires parental consent, here the adult user themselves must consent. The affirmative consent standard is the same demanding standard defined elsewhere in the statute.
A chatbot provider may not: 3. Process a user's chat log and personal data: (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent.
Pending 2026-01-01
D-01.1
A.R.S. § 44-1383.01(B)
Plain Language
Users have an unconditional right to access their own chat logs at any time. Upon request, the chatbot provider must deliver the chat logs in a downloadable, easily readable format. Providers may not retaliate against users who exercise this access right. This is a standing access right — no triggering conditions or limitations on frequency. An illustrative export sketch follows the quoted text.
A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user pursuant to subsection A paragraph 7 of this section that requests the user's chat.
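To make the format requirement concrete: a minimal export sketch in Python, assuming a hypothetical message schema (timestamp, role, content). The statute names no file format, so pretty-printed JSON here is one plausible reading of "downloadable and easy to read," not a statutory mandate.

```python
import json

# Hypothetical message schema: {"timestamp": ..., "role": ..., "content": ...}.
# The statute requires a "downloadable and easy to read format" but names none;
# pretty-printed JSON is one plausible choice, not a statutory mandate.
def export_chat_log(messages: list[dict]) -> bytes:
    readable = [
        {"when": m["timestamp"], "speaker": m["role"], "text": m["content"]}
        for m in messages
    ]
    return json.dumps(readable, indent=2, ensure_ascii=False).encode("utf-8")
```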
Passed 2026-01-01
D-01.4
Lab. Code § 1524(b)
Plain Language
Employers may not use an ADS to collect worker data for any purpose beyond what was disclosed in the pre-use notice required under Section 1522. This operates as a purpose limitation requirement — the scope of permissible data collection by the ADS is bounded by what the employer affirmatively disclosed to workers. Any undisclosed data collection constitutes a separate violation.
(b) An employer shall not use an ADS to collect worker data for a purpose that is not disclosed pursuant to the notice requirements in Chapter 2 (commencing with Section 1522).
Passed 2026-01-01
D-01.1, D-01.2
Lab. Code § 1524(e)-(f)
Plain Language
Workers have the right to request a copy of their own data that was primarily used by an ADS in making discipline, termination, or deactivation decisions, covering the most recent 12 months. This right is limited to one request per 12-month period. When providing the data, employers must anonymize any personal information belonging to customers, other workers, or other individuals — the worker receives only their own data, with third-party identifiers removed. This is a post-decision data access right, distinct from the pre-use notice about data categories. An illustrative fulfillment sketch follows the quoted text.
(e) A worker shall have the right to request, and an employer shall provide, a copy of the most recent 12 months of the worker's own data primarily used by an ADS to make a discipline, termination, or deactivation decision. A worker is limited to one request every 12 months for a copy of their own data used by an ADS to make a discipline, termination, or deactivation decision. (f) For purposes of safeguarding the privacy rights of consumers, workers, and individuals, when an employer is required to provide worker data pursuant to this part, that worker data shall be provided in a manner that anonymizes the customer's, other worker's, or individual's personal information.
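A minimal sketch of the fulfillment flow in Python. The request cap, the 12-month window, and third-party anonymization track § 1524(e)-(f); every type name and the redaction heuristic are hypothetical stand-ins, not anything the bill specifies.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AdsRecord:
    worker_id: str
    timestamp: datetime
    decision_type: str  # "discipline" | "termination" | "deactivation"
    payload: dict       # ADS inputs/outputs; may reference third parties

def fulfill_request(worker_id: str, records: list[AdsRecord],
                    last_request: datetime | None, now: datetime) -> list[dict]:
    # One request per 12 months (§ 1524(e)).
    if last_request is not None and now - last_request < timedelta(days=365):
        raise PermissionError("only one request permitted every 12 months")
    window_start = now - timedelta(days=365)
    return [
        {
            "timestamp": r.timestamp.isoformat(),
            "decision_type": r.decision_type,
            # Third-party personal information must be anonymized (§ 1524(f)).
            "payload": redact_third_parties(r.payload, requester=worker_id),
        }
        for r in records
        if r.worker_id == worker_id and r.timestamp >= window_start
    ]

def redact_third_parties(payload: dict, requester: str) -> dict:
    # Toy heuristic: treat any "*_id" string other than the requester's as a
    # third-party identifier. Real systems need schema-driven PII tagging.
    return {
        k: ("[REDACTED]" if k.endswith("_id") and v != requester else v)
        for k, v in payload.items()
    }
```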
Pending 2027-01-01
D-01.4
Lab. Code § 1522(a)(5)
Plain Language
Employers may not use individualized worker data as ADS inputs or outputs to determine compensation unless they can clearly demonstrate that any resulting pay differences for substantially similar work are justified by cost differentials in performing the task or that the data was directly related to the tasks the worker was hired to perform. This effectively creates a burden-shifting framework: the default is that individualized data-driven compensation is prohibited, and the employer must affirmatively prove a legitimate justification.
(5) Use or rely upon individualized worker data as inputs or outputs to inform compensation unless the employer can clearly demonstrate that any differences in compensation for substantially similar or comparable work assignments are based upon cost differentials in performing the task involved, or that the data was directly related to the tasks that the worker was hired to perform.
Pending 2027-01-01
D-01.1, D-01.2
Lab. Code § 1522(e)-(f)
Plain Language
Workers have the right to request and receive a copy of their own data used by an ADS in connection with discipline, termination, or deactivation decisions, covering the most recent 12-month period. This right is limited to one request per 12 months. When providing the data, the employer must anonymize any personal information belonging to customers, other workers, or other individuals to protect their privacy. This is a data access right — not a pre-decision notice — and is triggered by the worker's request rather than automatically.
(e) A worker shall have the right to request, and an employer shall provide, a copy of the most recent 12 months of the worker's own data primarily used by an ADS to make a disciplinary, termination, or deactivation decision. A worker is limited to one request every 12 months for a copy of their own data used by an ADS to make a disciplinary, termination, or deactivation decision.
(f) For purposes of safeguarding the privacy rights of consumers, workers, and individuals, when an employer is required to provide worker data pursuant to this part, that worker data shall be provided in a manner that anonymizes the customer's, other worker's, or individual's personal information.
Pending 2026-10-01
D-01.1
Sec. 4
Plain Language
Before collecting any personal data from an applicant or employee for use in an automated employment-related decision process, the deployer must provide a written notice covering: the purpose of collection, categories of data collected, retention period, who will access the data, and the individual's right to opt out of certain personal data processing under Connecticut's existing data privacy law (§ 42-518). This is a pre-collection notice — it must be delivered before data collection begins, not at the point of an employment decision.
Except as provided in subsection (b) of section 2 of this act, prior to collecting any personal data of an applicant for employment or employee in the state for processing in an automated employment-related decision process, a deployer shall provide to such applicant or employee a written notice disclosing: (1) The purpose of such data collection; (2) The categories of personal data that will be collected for processing in such automated employment-related decision process; (3) The retention period for any personal data collected; (4) The categories of persons who will have access to such personal data; and (5) Information concerning the right, under subparagraph (C) of subdivision (5) of subsection (a) of section 42-518 of the general statutes, to opt out of the processing of personal data for the purposes set forth in said subparagraph.
Pending 2026-07-01
D-01.4
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose user personal information unless it has been deidentified — meaning it cannot reasonably be linked to an identified or identifiable individual or their device. Sales or disclosures specifically authorized by federal law are exempt. When holding deidentified data, the company must: (1) take reasonable measures to prevent re-association with users, (2) maintain and use data only in deidentified form (reidentification is permitted only to test deidentification processes), (3) contractually bind data recipients to these same requirements, and (4) implement business processes to prevent inadvertent release. A company may demonstrate compliance during a cure period by showing a risk management program validated against NIST AI RMF / ISO 42001 with assessed controls for deidentification, contractual flow-down, non-reidentification, release prevention, monitoring, and auditing. An illustrative guard sketch follows the quoted text.
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
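The duties in subsection (2) translate naturally into programmatic guards. A minimal sketch, with hypothetical types; only the checks themselves (deidentified-only disclosure, contractual flow-down, reidentification solely to test the deidentification process) track the statute.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    records: list
    deidentified: bool  # cannot reasonably be linked to a person or device

@dataclass
class Recipient:
    name: str
    flow_down_contract_signed: bool  # § 501.9986(2)(c) obligation accepted

def disclose(dataset: Dataset, recipient: Recipient) -> list:
    # § 501.9986(1): only deidentified data may be sold or disclosed.
    if not dataset.deidentified:
        raise PermissionError("personal information may not be disclosed")
    # § 501.9986(2)(c): recipients must be contractually bound to the section.
    if not recipient.flow_down_contract_signed:
        raise PermissionError("recipient lacks required flow-down contract")
    return dataset.records

def reidentify(dataset: Dataset, purpose: str):
    # § 501.9986(2)(b): reidentification is allowed only to test whether the
    # deidentification process itself works; everything else is prohibited.
    if purpose != "test_deidentification_process":
        raise PermissionError("reidentification prohibited for this purpose")
    # ... testing logic would go here ...
```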
Pending 2026-07-01
D-01.6
Fla. Stat. § 501.1739(6)
Plain Language
Operators must protect the confidentiality of all age verification information collected from users, in accordance with the requirements of s. 501.1738. The full scope of confidentiality obligations is defined in that cross-referenced section, which is not part of this bill. Operators should consult s. 501.1738 for specific data handling, retention, and deletion requirements applicable to age verification data.
(6) An operator shall protect the confidentiality of age information provided by a user for age verification in accordance with s. 501.1738.
Failed 2026-07-01
D-01.6
Fla. Stat. § 501.1739(6)
Plain Language
Operators must protect the confidentiality of all age verification information collected from users, subject to the standards set forth in § 501.1738. This is a cross-reference obligation — the substantive confidentiality requirements are defined in the referenced statute, which likely includes data minimization and deletion requirements for age verification data. Practitioners should review § 501.1738 for the full scope of confidentiality protections required.
(6) An operator shall protect the confidentiality of age information provided by a user for age verification in accordance with s. 501.1738.
Failed 2026-07-01
D-01.4
Fla. Stat. § 501.9986(1)-(2)
Plain Language
AI technology companies may not sell or disclose users' personal information unless the information has been deidentified — meaning it cannot reasonably be linked to an identified or identifiable individual or their device. Sales authorized by federal law are excepted. Companies holding deidentified data must take reasonable measures to prevent re-association with users, maintain data in deidentified form, not attempt reidentification (except for testing deidentification processes), contractually require recipients to comply with the same rules, and implement processes to prevent inadvertent release. During enforcement, companies may present evidence of a risk management program aligned with NIST AI RMF/ISO 42001 that includes assessed controls for deidentification, contractual flow-down, non-reidentification, and auditing.
(1) An artificial intelligence technology company may not sell or disclose personal information of users unless the information is deidentified data. This subsection does not prohibit the sale or disclosure of information specifically authorized by federal law. (2) An artificial intelligence technology company in possession of deidentified data shall do all of the following: (a) Take reasonable measures to ensure that the data cannot be associated with a user. (b) Maintain and use the data in deidentified form. An artificial intelligence technology company may not attempt to reidentify the data, except that the artificial intelligence technology company may attempt to reidentify the data solely for the purpose of determining whether its deidentification processes satisfy the requirements of this section. (c) Contractually obligate a recipient of the deidentified data to comply with this section. (d) Implement business processes to prevent the inadvertent release of deidentified data.
Pending 2028-07-01
D-01.3
HRS § 321-__ (Consequential decisions; notice; statement; opt-out; corrections; appeal)(a)(4)
Plain Language
As part of the pre-decision written notice, health care providers must give patients the right to opt out of profiling — automated processing of their individually identifiable health information or personal data used to evaluate, analyze, or predict personal aspects — when the profiling furthers decisions with legal or similarly significant effects on the patient. This is an affirmative opt-out right that must be offered before the AI is used in the consequential decision. Though this obligation is embedded in the same subsection as the pre-decision notice, it is a distinct data governance right warranting separate compliance attention.
(4) Allows the patient to opt out of the processing of the patient's individually identifiable health information or other personal data for purposes of profiling in furtherance of decisions that have legal or similarly significant effects concerning the patient.
Pre-filed 2025-07-01
D-01.4
§ 554J.2(2)
Plain Language
Deployers must practice data minimization — they may only collect and store user information gathered through the chatbot to the extent necessary to fulfill the stated purpose for which the chatbot is made publicly available. Information collected beyond what is necessary for that purpose violates this provision. The provision contains no exception for secondary uses and no separate-justification pathway. An illustrative allow-list sketch follows the quoted text.
A deployer of a chatbot shall do all of the following: 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
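Minimization of this kind is often enforced at the ingestion boundary with an allow-list. A minimal sketch, assuming a hypothetical set of fields deemed necessary for the chatbot's stated purpose; the bill does not enumerate fields, so the allow-list is entirely illustrative.

```python
# Hypothetical allow-list: fields judged necessary for the chatbot's stated
# purpose. The bill does not enumerate fields; deciding what is "necessary"
# is the deployer's compliance judgment, not something this code settles.
NECESSARY_FIELDS = {"session_id", "message_text", "timestamp"}

def minimize_for_storage(user_event: dict) -> dict:
    # Drop everything outside the allow-list before persisting.
    return {k: v for k, v in user_event.items() if k in NECESSARY_FIELDS}
```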
Pending 2025-07-01
D-01.4
§ 554J.2(1)(b)
Plain Language
Deployers must minimize user information collected and stored by the public-facing chatbot to only what is necessary for the deployer's stated purpose in making the chatbot publicly available. This is a data minimization obligation — secondary uses of collected data beyond the chatbot's stated purpose are not permitted. The obligation applies to all data collected by the chatbot, not only personal data.
b. Limit the collection and storage of user information collected by the public-facing chatbot to what is necessary to fulfill the deployer's purpose for making the public-facing chatbot publicly available.
Pending 2026-07-01
D-01.4
§ 554J.2(2)
Plain Language
Deployers must minimize the user information their chatbot collects and stores, limiting it to what is necessary to fulfill the deployer's stated purpose for making the chatbot publicly available. This is a data minimization obligation — deployers cannot collect data beyond what is required for the chatbot's core purpose. Secondary uses or excessive retention are implicitly prohibited.
A deployer of a chatbot shall do all of the following: 2. Limit the collection and storage of user information collected by the chatbot to what is necessary to fulfill the deployer's purpose for making the chatbot publicly available.
Pending 2026-07-01
D-01.4
Iowa Code § 91F.3(1)(d)
Plain Language
Employers may not use an ADS to collect employee data for any purpose not already disclosed in the advance written notice required under § 91F.2. This functions as a purpose limitation — the employer's data collection through the ADS is constrained to the purposes described in the notice. Any new data collection purpose would require an updated notice before collection can begin.
d. Collect employee data for a purpose that is not disclosed pursuant to the notice requirements in section 91F.2.
Pending 2026-07-01
D-01.1, D-01.2
Iowa Code § 91F.5
Plain Language
Employees have the right to request a copy of their own data that was primarily used by an ADS in a discipline, termination, or deactivation decision, covering the most recent 12-month period. Employers must fulfill the request. The right is limited to one request per 12-month period. This is a data access right — it enables employees to understand what data drove adverse automated decisions about them, supporting their ability to challenge or correct that data.
An employee has the right to request a copy of the most recent twelve months of the employee's own data primarily used by an automated decision system to make a discipline, termination, or deactivation decision. An employer shall provide a copy upon request. An employee is limited to one such request every twelve months.
Pending 2026-07-01
Iowa Code § 91F.6
Plain Language
Whenever an employer provides employee data under this chapter (e.g., in response to a data access request under § 91F.5), it must anonymize the personal information of any third party — including customers, other employees, and other individuals — contained in that data. This privacy safeguard ensures that fulfilling one employee's data access request does not compromise the privacy of others whose data may appear in the same dataset.
For purposes of safeguarding the privacy rights of consumers, employees, and individuals, when an employer is required to provide employee data pursuant to this chapter, the employer shall provide the data in a manner that anonymizes the personal information of any customer, employee, or other individual.
Pending 2025-07-01
D-01.8
§ 554J.2(2)(a)-(b)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any biometric data, a private entity must provide the subject (or the subject's legal representative) with written notice of two things: (1) that the entity intends to collect the subject's biometric data, and (2) the specific purposes for collection and the length of time the entity intends to retain the data. This is a pre-collection requirement — the notice must be delivered before the biometric data is received. The bill does not require affirmative opt-in consent; written notice alone satisfies the obligation.
2. A private entity shall not collect, capture, purchase, or otherwise obtain an individual's biometric data unless, prior to receiving the biometric data, the private entity does all of the following: a. Informs the subject of the biometric data, or the subject's legal representative, in writing, that the private entity intends to collect the subject's biometric data. b. Informs the subject of the biometric data, or the subject's legal representative, in writing, of the purposes and length of time for which the private entity intends to retain the biometric data.
Pending 2025-07-01
D-01.4
§ 554J.2(3)
Plain Language
Private entities are categorically prohibited from selling, leasing, trading, or otherwise deriving profit from any individual's biometric data. This is an absolute prohibition — no consent mechanism or opt-in can override it. The prohibition covers any commercial transaction or monetization arrangement involving biometric data, not just direct sales.
3. A private entity shall not sell, lease, trade, or otherwise profit from an individual's biometric data.
Pending 2026-07-01
D-01.8
Idaho Code § 48-2101(2)(a)-(b)
Plain Language
Before capturing any biometric identifier for a commercial purpose, a person must inform the individual and obtain their consent. Consent cannot be inferred from the mere existence of the individual's image or media containing biometric identifiers on the internet or other publicly available sources — the individual themselves must have made the image or media publicly available for such availability to count as notice or consent. This is an affirmative opt-in requirement.
(2)(a) A person may not capture a biometric identifier of an individual for a commercial purpose unless the person: (i) Informs the individual before capturing the biometric identifier; and (ii) Receives the individual's consent to capture the biometric identifier. (b) For the purposes of this subsection, an individual has not been informed of and has not provided consent for the capture or storage of a biometric identifier for a commercial purpose based solely on the existence of an image or other media containing one (1) or more biometric identifiers of the individual on the internet or other publicly available source unless the image or other media was made publicly available by the individual to whom the biometric identifiers relate.
Pending 2026-07-01
D-01.4
Idaho Code § 48-2101(3)(a)
Plain Language
Persons or entities possessing commercially captured biometric identifiers may not sell, lease, or otherwise disclose them to third parties except in four narrow circumstances: (1) the individual consents for identification in case of disappearance or death, (2) the disclosure completes a financial transaction the individual requested, (3) the disclosure is required or permitted by law, or (4) the disclosure is made to or by law enforcement pursuant to a warrant. All other third-party disclosures are prohibited.
(3) Persons or entities possessing a biometric identifier of an individual that is captured for a commercial purpose: (a) May not sell, lease, or otherwise disclose the biometric identifier to another person unless: (i) The individual consents to the disclosure for identification purposes in the event of the individual's disappearance or death; (ii) The disclosure completes a financial transaction that the individual requested or authorized; (iii) The disclosure is required or permitted by state or federal law; or (iv) The disclosure is made by or to a law enforcement agency for a law enforcement purpose in response to a warrant;
Pending 2026-07-01
D-01.3
Idaho Code § 48-2101(3)(d)
Plain Language
Entities possessing commercially-captured biometric identifiers must provide individuals with a mechanism to revoke consent to storage and transmission at any time. Upon receiving a revocation, the entity must immediately destroy the biometric identifier — unless retention is independently required by another law. This is an ongoing opt-out right that must be available throughout the data lifecycle, not just at collection.
(d) Shall provide a method for an individual to revoke consent to the storage and transmission of a biometric identifier at any time and shall immediately destroy the biometric identifier upon receiving a revocation of consent unless maintaining the biometric identifier is required by another law.
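A minimal sketch of the revocation flow, assuming a hypothetical in-memory store and a set of legal retention holds. The immediate-destruction default and the other-law exception track § 48-2101(3)(d); everything else is illustrative.

```python
def handle_consent_revocation(store: dict, individual_id: str,
                              legal_retention_holds: set) -> str:
    # Sole statutory exception: retention independently required by another law.
    if individual_id in legal_retention_holds:
        return "retained: destruction blocked by independent legal requirement"
    # Default: destroy the biometric identifier immediately on revocation.
    store.pop(individual_id, None)
    return "destroyed"
```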
Pending 2026-01-01
D-01.1, D-01.2
Section 10(f)
Plain Language
Whenever an automated decision-making system collects data about employees, both the affected employees and their exclusive bargaining representatives have the right to view the data collected. This is a data access right — employees can see what data the system has gathered about them. The right extends to bargaining representatives, enabling union oversight of data collection practices. The statute does not specify a mechanism or timeframe for requests, but the right is unconditional whenever data collection is occurring.
(f) If an automated decision-making system is collecting employee data, employees and their exclusive bargaining representatives have a right to view the data collected by the automated decision-making system.
Pending 2026-01-01
D-01.3
Student Educational Technologies Rights Act § 15(a)(1), (a)(3), (b)
Plain Language
Students and parents have the right to opt out of school-issued electronic devices, electronic textbooks, electronic assignments, and predictive analytics systems. Opting out of predictive analytics may not result in academic penalty. When a student or parent exercises any of these opt-out rights, the school must provide a comparable analog alternative — such as paper assignments, physical textbooks, or physical copies of required reading. This creates an affirmative obligation on schools to maintain non-digital alternatives.
(a) It is the policy of this State that a student and the student's parent have the right to: (1) opt out of school-issued personal electronic devices, electronic textbooks, electronic required reading, or electronic or online assignments; (3) opt out of predictive analytics systems without academic penalty. (b) If a student or a student's parent exercises the right outlined in subsection (a), the school shall provide the student with a comparable analog version of what the educational technology provides. As used in this subsection, "comparable analog version" includes, but is not limited to, providing the assignment on physical paper, a physical copy of the required reading, or the option of a physical paper textbook.
Pending 2026-01-01
D-01.4
105 ILCS 85/10(3), (3.5)
Plain Language
Ed-tech operators may not sell or rent student information or data — including covered information or any other person's information collected for K-12 school purposes. Separately, operators may not permit AI to train on covered information unless the training is for K-12 school purposes or to improve the operability and functionality of the operator's own service. The sale/rent prohibition has a narrow carve-out for corporate acquisitions where the successor entity continues to comply with the Act. The definition of covered information has been expanded to include data gathered through AI and digital replicas.
(3) Sell or rent a student's information or data, including covered information or any other person's information collected by the operator for K through 12 school purposes. This subdivision (3) does not apply to the purchase, merger, or other type of acquisition of an operator by another entity if the operator or successor entity complies with this Act regarding previously acquired student information. (3.5) Permit artificial intelligence to train on covered information unless for K through 12 school purposes or in furtherance of improving operability and functionality of the operator's service.
Pending 2026-01-01
D-01.4, D-01.8
105 ILCS 85/15(2), 105 ILCS 85/10(4)(A)
Plain Language
An operator's AI model may not train on a student's covered information and retain that training data indefinitely unless the operator first provides written notice to the student or parent that data will be retained indefinitely, and obtains written consent. This is an affirmative opt-in consent requirement — the default is that indefinite retention of training data derived from student covered information is prohibited. The consent must be obtained before the training and retention occurs. This requirement also limits the 'improving operability' exception in Section 10(4)(A): disclosing covered information to third parties to train AI that is not for K-12 school purposes does not qualify as improving operability, even with consent. An illustrative consent-gate sketch follows the quoted text.
An operator's artificial intelligence model shall not train on a student's covered information and retain the training data indefinitely, unless it first: (A) informs the student or his or her parent in writing that the operator's artificial intelligence model will retain training data indefinitely; and (B) receives a written consent from the student or his or her parent.
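A minimal sketch of the consent gate, assuming hypothetical consent-record fields. The two statutory preconditions (written notice of indefinite retention, then written consent from the student or parent) gate the retention decision; the default is prohibition.

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    notified_in_writing: bool  # 85/15(2)(A): written notice of indefinite retention
    written_consent: bool      # 85/15(2)(B): written consent from student/parent

def may_retain_training_data(consent: ConsentRecord | None) -> bool:
    # Default is prohibition: indefinite retention of training data derived
    # from covered information requires prior notice and written consent.
    return (consent is not None
            and consent.notified_in_writing
            and consent.written_consent)
```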
Pending 2026-01-01
D-01.4
105 ILCS 85/10(4)(A)
Plain Language
Operators may disclose covered information for K-12 school purposes, but the recipient may not further disclose it except to improve the operator's own service operability. Critically, the 'improving operability' exception is now explicitly limited: disclosing covered information to any third party for the purpose of training AI that is not for K-12 school purposes does not qualify as improving operability. This prevents operators from using the operability exception as a loophole to feed student data into general-purpose AI training pipelines.
(A) In furtherance of the K through 12 school purposes of the site, service, application, or model if the recipient of the covered information disclosed under this clause (A) does not further disclose the information, unless done to allow or improve operability and functionality of the operator's site, service, or application. Improving operability does not include disclosing covered information to any third party to train artificial intelligence that is not for K through 12 school purposes.
Pending 2026-07-01
D-01.3
IC 22-5-10.4-13
Plain Language
When an employer uses an automated decision system to manage a covered individual on an ongoing basis (e.g., algorithmic scheduling, performance monitoring, task assignment), the individual has the right to opt out entirely and be managed by a human manager who has authority to make employment decisions. This is broader than the dispute/appeal rights in Section 10(2)(G) — it applies to ongoing algorithmic management, not just discrete employment decisions. The employer must ensure a human alternative manager is available and empowered to make all employment-related decisions for any individual who exercises this opt-out right.
Sec. 13. An employer that manages a covered individual through an automated decision system shall allow the covered individual to: (1) opt out of the management through the automated decision system; and (2) be managed through a human manager who is able to make employment related decisions with respect to the covered individual.
Pending 2026-07-01
D-01.4
Sec. 3(d)
Plain Language
Covered entities must minimize the collection, processing, use, and storage of age verification information to only what is strictly necessary for three purposes: verifying the user's age, obtaining parental consent, or maintaining compliance records. This is a data minimization obligation specific to age verification data — it prohibits secondary uses of age information beyond these three enumerated purposes.
(d) A covered entity shall protect the confidentiality of age information provided by a user for age verification by limiting the collection, processing, use and storage of such information to what is strictly necessary to verify a user's age, obtain verifiable parental consent or maintain compliance records.
Pending 2025-08-01
D-01.4
R.S. 23:973(B)
Plain Language
Employers may not use an ADS to collect worker data for any purpose that was not disclosed in the pre-use notice required by R.S. 23:972. This is a purpose limitation obligation — the ADS may only collect worker data for purposes the employer has already communicated to the worker. Any new data collection purpose requires updated notice before collection begins.
B. An employer shall not use an ADS to collect worker data for a purpose that is not disclosed pursuant to the notice requirements as provided in R.S. 23:972.
Pending 2025-08-01
D-01.1, D-01.2
R.S. 23:973(C)(4)(a)-(b)
Plain Language
Workers have the right to access all worker data collected, used, or produced by an ADS — including both input data and output data — and to correct errors in that data, including data used as corroborating evidence by a human reviewer. Workers may designate an authorized representative (who cannot be the employer) to request access on their behalf. This right applies to all ADS-related data, not just data used in adverse decisions.
(4)(a) An employer shall allow a worker to access worker data collected, used by, or produced by an ADS and correct errors in any input or output data used by or produced by the ADS or used as corroborating evidence by a human reviewer. (b) An affected worker shall be allowed to choose an authorized representative to request access to the worker's data on his behalf.
Pending 2025-08-01
D-01.1
R.S. 23:973(F)-(G)
Plain Language
Workers have the right to request — and employers must provide — a copy of the most recent 12 months of the worker's own data that was primarily used by an ADS to make a discipline, termination, or deactivation decision. This request is capped at once per 12-month period. When providing any worker data under this Part, the employer must anonymize third-party personal information (customers, other workers, or other individuals) to protect their privacy. This data portability right is distinct from the general data access right in §973(C)(4) — it specifically covers a full 12-month data extract for adverse decisions.
F. A worker has the right to request, and an employer shall provide, a copy of the most recent twelve months of the worker's own data primarily used by an ADS to make a discipline, termination, or deactivation decision. A worker shall be limited to one request every twelve months for a copy of his own data used by an ADS to make a discipline, termination, or deactivation decision. G. For purposes of safeguarding the privacy rights of consumers, workers, and individuals, when an employer is required to provide worker data pursuant to this Part, the worker data shall be provided in a manner that provides anonymity regarding the customer's, other worker's, or individual's personal information.
Pending 2026-01-01
D-01.4
R.S. 28:16(D)(1)-(2)
Plain Language
Operators may not sell or share any individually identifiable health information or user input with third parties. Three narrow exceptions apply: (1) a healthcare provider requests the information with the user's consent; (2) the user's health plan requests the information at the user's request; or (3) the operator shares data with a contracted party solely to ensure the chatbot's effective functionality. When sharing under any exception, the operator and the receiving entity must comply with HIPAA privacy and security rules (45 CFR Parts 160 and 164, Subparts A and E) as if the operator were a HIPAA covered entity and the receiving party were a business associate. This effectively extends HIPAA-equivalent protections to mental health chatbot operators who would not otherwise be covered entities.
(1) An operator of a mental health chatbot may not sell to or share with any third party any individually identifiable health information of a user or the user's input. This Subsection shall not apply to individually identifiable health information that is requested by a healthcare provider with the consent of the user, provided to a health plan of a user upon request of the user, or shared to ensure the effective functionality of the mental health chatbot with another party with which the operator has a contract related to such functionality. (2) When sharing information pursuant to this Subsection, the operator and the other entity shall comply with all applicable privacy and security provisions of 45 CFR Part 160 and 45 CFR Part 164, Subparts A and E, as if the operator were a covered entity and the other entity were a business associate, as such terms are defined in 45 CFR 160.103.
Pre-filed 2025-01-17
D-01.8
Ch. 110I, § 2(c)(i)
Plain Language
Covered entities may not process or transfer biometric data in any manner that the end user has not affirmatively consented to. Consent must be freely given, specific, informed, and unambiguous — a general terms-of-service acceptance is explicitly insufficient. Consent obtained through abusive trade practices is void. Passive actions such as hovering, muting, pausing, or closing content do not constitute consent. This effectively requires opt-in consent before any biometric data processing, with the scope of processing limited to the narrowly defined purpose stated in the consent.
(c) A covered entity shall not: (i) process or transfer biometric data in any manner not consented to by the end user;
Pre-filed 2025-01-17
D-01.5
Ch. 110I, § 2(a)
Plain Language
Covered entities owe a broad duty of loyalty to end users: they may not take any action in processing biometric data or designing biometric recognition technology that conflicts with an end user's best interests. This is a fiduciary-style obligation that goes beyond mere consent requirements — even if the end user consents, the covered entity cannot act contrary to the user's interests. The attorney general has rulemaking authority to interpret this provision further.
(a) A covered entity shall be prohibited from taking any actions with respect to processing biometric data or designing biometric recognition technologies that conflict with an end user's best interests.
Pre-filed 2025-01-17
D-01.4
Ch. 110I, § 2(c)(ii)-(iv)
Plain Language
Covered entities face three interlocking restrictions on biometric data transfers: (1) sale of biometric data to third parties is categorically prohibited; (2) disclosure to any other person or entity is only permitted if consistent with the duties of loyalty, care, and confidentiality; and (3) any person receiving biometric data must enter into a contract imposing the same fiduciary-style duties toward the end user. Together these provisions mean biometric data may never be sold and may only be shared under contractual safeguards that extend the full suite of end-user protections to the recipient.
(c) A covered entity shall not: (ii) engage in the sale of biometric data to a third party; (iii) disclose biometric data with any other person or entity except as consistent with the duties of loyalty, care, and confidentiality under subsections 2(a), 2(b) and 2(c)(i) and 2(c)(ii), respectively; or (iv) disclose or share biometric data with any other person unless that person enters into a contract with the covered entity that imposes on the person the same duties of care, loyalty, and confidentiality toward the end user as are imposed on the covered entity under this subsection.
Pre-filed 2025-01-14
D-01.4
Chapter 149B, § 2(a)
Plain Language
Employers may only use electronic monitoring tools to collect employee data if the monitoring serves one of six enumerated legitimate purposes (facilitating essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, or wages/benefits administration). Beyond purpose limitation, the tool must be narrowly tailored, implemented in the least invasive manner, limited to the fewest workers and least data necessary, and data must be deleted once the purpose is achieved. Off-duty monitoring is prohibited. Excess data must be disposed of by the vendor without disclosure to the employer. This creates a comprehensive data minimization regime for workplace surveillance.
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (i) the electronic monitoring tool is primarily used to accomplish any of the following purposes: (A) allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (B) ensuring the quality of goods and services; (C) conducting periodic assessment of worker performance; (D) ensuring or facilitating compliance with employment, labor, or other relevant laws; (E) protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (F) administering wages and benefits. The department of labor standards may establish additional exceptions under clause (i) through notice and comment rulemaking in compliance with chapter 30A. (ii) the specific type and activated capabilities of an electronic monitoring tool must be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under (i). (iii) the electronic monitoring tool may only be used to accomplish the employer's intended, legitimate purpose specified in (i), and must be customized and implemented in a manner ensuring that the execution of its duties undertaken in the manner least invasive to employees of the employer while accomplishing the employer's legitimate purposes as defined by (i); (iv) the specific form of electronic monitoring is limited to the smallest number of workers, collects the least amount of data and is collected no more frequently than is necessary to accomplish the purpose, and the data collected is deleted once the purpose has been achieved. (v) the employer must ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's intended, legitimate purpose is not disclosed to the employer and is promptly disposed of by the vendor; (vi) the employer must ensure that employee data is not collected when the employee is off-duty; and (vii) the employer must ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the commonwealth's data- and cyber- privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except as provided in section 2(c) and section 3(c) of this chapter.
Pre-filed 2025-01-14
D-01.1
Chapter 149B, § 2(b)
Plain Language
Before using any electronic monitoring tool, employers must provide prior written notice to and obtain written consent from all affected employees and candidates. The notice must also be posted conspicuously. The notice must contain eleven enumerated disclosures covering the monitoring purpose, the specific data collected, collection schedule and frequency, whether data feeds into ADS tools, how data will be used in employment decisions and discipline, productivity assessment use, data storage location and retention period, why the monitoring is the least invasive method, the employee's right to refuse data sale/transfer, and how to exercise statutory rights.
(b) Any employer that uses an electronic monitoring tool shall give prior written notice and must obtain written consent from all candidates and employees subject to electronic monitoring and must also post said notice in a conspicuous place which is readily available for viewing by candidates and employees, pursuant to sections 19B, 52C, and 190(i) of chapter 149 and section 99 of chapter 272. Such notice shall include, at a minimum, the following: (i) a description of the purpose for which the electronic monitoring tool will be used, as specified in subparagraph (i) of paragraph (a) of this subdivision; (ii) a description of the specific employee data to be collected, stored, secured, and disposed of (and the schedule therefore), and the activities, locations, communications, and job roles that will be electronically monitored by the tool; (iii) a description of the dates, times, and frequency that electronic monitoring will occur; (iv) whether and how any employee data collected by the electronic monitoring tool will be used as an input in an automated employment decision tool; (v) whether and how any employee data collected by the electronic monitoring tool will alone or in conjunction with an automated employment decision tool be used to make an employment decision by the employer or employment agency; (vi) whether and how any employee data collected by the electronic monitoring tool may be stored and utilized in discipline, in internal policy compliance, in administrative agency adjudications, and in litigation (whether or not it involves the employee as a party); (vii) whether any employee data collected by the electronic monitoring tool will be used to assess employees' productivity performance or to set productivity standards, and if so, how; (viii) a description of where any employee data collected by the electronic monitoring tool will be stored and the length of time it will be retained; (ix) an explanation for how the specific electronic monitoring practice is the least invasive means available to accomplish the monitoring purpose; (x) a statement that an employee is entitled to notice and maintains the right to refuse the sale, transfer, or disclosure of the employee's employee data subject to the provisions of section 2(f); and (xi) a clear and reasonably understandable description of how an employee can exercise the rights described in this chapter.
Pre-filed 2025-01-14
D-01.4
Chapter 149B, § 2(e)
Plain Language
Employers may not use electronically monitored employee data for any purpose beyond what was disclosed in the required notice. This is a secondary-use restriction — once data is collected for a stated purpose, repurposing it requires a new notice cycle. The reference to 'paragraph (c)' appears to be a drafting error and likely intends paragraph (b), which contains the notice requirements.
(e) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to paragraph (c) of subdivision one of this section.
Pre-filed 2025-01-14
D-01.3
Chapter 149B, § 2(f)
Plain Language
Employers are prohibited from selling, transferring, or disclosing employee monitoring data to third parties except where required by federal or state law, or where necessary for an impact assessment of an automated employment decision tool. This creates a near-absolute ban on data sharing from workplace monitoring tools.
(f) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the commonwealth, or necessary to do so to comply with an impact assessment of an automated employment decision tool pursuant to section one thousand twelve of this article.
Pre-filed 2025-01-16
D-01.8
Chapter 110I, Section 2(c)(i)-(iii)
Plain Language
Covered entities may not process, transfer, sell, or disclose biometric data without the end user's freely given, specific, informed, and unambiguous consent for a narrowly defined purpose. General terms-of-service acceptance is insufficient — consent must be purpose-specific and obtained through a clear affirmative action. Sale of biometric data to third parties is categorically prohibited. Any third-party recipient must be bound by contract to the same duties of care, loyalty, and confidentiality that apply to the covered entity itself. Consent obtained via abusive trade practices is void.
(c) A covered entity shall not: (i) process or transfer biometric data in any manner not consented to by the end user; (ii) engage in the sale of biometric data to a third party; (iii) disclose biometric data with any other person or entity except as consistent with the duties of loyalty, care, and confidentiality under subsections 2(a), 2(b) and 2(c)(i) and 2(c)(ii), respectively; or (iv) disclose or share biometric data with any other person unless that person enters into a contract with the covered entity that imposes on the person the same duties of care, loyalty, and confidentiality toward the end user as are imposed on the covered entity under this subsection.
Pre-filed 2025-01-16
D-01.3
Chapter 110I, Section 2(e)
Plain Language
Covered entities may not retaliate against or discriminate against end users who withhold consent to biometric data processing. Discrimination includes denying goods or services, charging different prices, degrading service quality, or even suggesting that the user will receive worse treatment. This anti-discrimination provision ensures that the consent right under Section 2(c) is meaningful — users cannot be punished for exercising it.
(e) A covered entity shall not discriminate against a consumer because of the withheld consent under this title, including, but not limited to: (i) denying goods or services to the end user; (ii) charging different prices or rates for goods or services, including through the use of discounts or other benefits or imposing penalties; (iii) providing a different level or quality of goods or services to the end user; (iv) suggesting that the end user will receive a different price or rate for goods or services or a different level or quality of goods or services.
Pre-filed 2025-01-16
D-01.5
Chapter 110I, Section 4(a)
Plain Language
Covered entities are categorically prohibited from using biometric data to help make decisions that produce legal effects or similarly significant effects on end users. This is not a requirement for human review or bias testing — it is an outright ban. The scope of 'similarly significant effects' is illustrated by a non-exhaustive list including financial services, housing, insurance, education, criminal justice, employment, healthcare, and access to basic necessities. This effectively prohibits using biometric recognition technology for consequential automated decision-making across all major life domains.
(a) Covered entities shall not use biometric data to help make decisions that produce legal effects or similarly significant effects concerning end users. Decisions that include legal effects or similarly significant effects concerning end users include, without limitation, denial or degradation of consequential services or support, such as financial or lending services, housing, insurance, educational enrollment, criminal justice, employment opportunities, health care services, and access to basic necessities, such as food and water.
Pre-filed 2025-01-17
D-01.8
Ch. 93M § 2(b)(1)-(3)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any biometric identifier or biometric information, a private entity must provide the individual (or their legally authorized representative) with written notice that biometric data is being collected, written notice of the specific purpose and duration of the collection and use, and must obtain informed written consent. Electronic consent is permitted. This is an affirmative opt-in requirement — collection without prior written notice and consent is prohibited regardless of the source of the data.
(b) No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (1) informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (2) informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (3) receives written consent executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative. Written consent may be obtained by electronic means.
Pre-filed 2025-01-17
D-01.4
Ch. 93M § 2(c)
Plain Language
Private entities are categorically prohibited from selling, leasing, trading, or otherwise profiting from biometric identifiers or biometric information. This is an absolute prohibition with no consent override — even with the individual's written consent, a private entity may not monetize biometric data. This effectively prevents commercial data brokerage of biometric information and limits permissible uses to the original stated purpose.
(c) No private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information.
Pre-filed 2025-01-17
D-01.4
Ch. 93M § 2(d)(1)-(4)
Plain Language
Private entities may not disclose, redisclose, or disseminate biometric identifiers or biometric information to third parties except in four narrow circumstances: (1) the individual provides written consent; (2) the disclosure completes a financial transaction the individual requested or authorized; (3) the disclosure is required by applicable law or ordinance; or (4) the disclosure is required by a valid warrant or subpoena. Outside these four exceptions, all sharing is prohibited. Note that redisclosure by downstream recipients is also covered — the prohibition runs to any entity in possession of the data, not just the original collector.
(d) No private entity in possession of a biometric identifier or biometric information may disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless: (1) the subject of the biometric identifier or biometric information or the subject's legally authorized representative provides written consent to the disclosure or redisclosure; (2) the disclosure or redisclosure completes a financial transaction requested or authorized by the subject of the biometric identifier or the biometric information or the subject's legally authorized representative; (3) the disclosure or redisclosure is required by state or federal law or municipal ordinance; or (4) the disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
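Because the four exceptions form a closed set, disclosure logic maps naturally onto a default-deny whitelist rather than a default-allow filter. A sketch under that reading, with hypothetical names throughout:

```python
from enum import Enum, auto
from typing import Optional

class DisclosureBasis(Enum):
    """The only four bases on which Sec. 2(d) permits disclosure."""
    WRITTEN_CONSENT = auto()        # 2(d)(1): subject or representative consents
    FINANCIAL_TRANSACTION = auto()  # 2(d)(2): completes a subject-authorized transaction
    REQUIRED_BY_LAW = auto()        # 2(d)(3): state/federal law or municipal ordinance
    WARRANT_OR_SUBPOENA = auto()    # 2(d)(4): valid court-issued warrant or subpoena

def disclosure_permitted(basis: Optional[DisclosureBasis], documented: bool) -> bool:
    """Default-deny: no enumerated basis, no disclosure. The same gate applies
    to redisclosure by downstream holders, not just the original collector.
    `documented` stands in for whatever evidence supports the claimed basis."""
    return basis is not None and documented
```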
Pending 2026-10-01
D-01.4
Commercial Law § 14–1330(F)(1)–(2)
Plain Language
Controllers must minimize the personal data they collect to what is reasonably necessary and proportionate for the purposes of this subtitle — broader collection is not permitted. In addition, controllers are categorically prohibited from using data about a user's emotional state or mental health vulnerabilities to tailor algorithms that increase the duration or frequency of chatbot use. This is both a data minimization obligation and an anti-manipulation restriction. Note that the statute uses 'controller' here without defining it, creating ambiguity about whether this means the operator, the developer, or both.
(F) (1) A controller shall limit the collection of personal data to what is reasonably necessary and proportionate to satisfy the requirements of this subtitle. (2) A controller may not use data regarding emotional state or mental health vulnerabilities to tailor algorithms to increase the duration or frequency of use of a chatbot.
Pending 2026-06-16
D-01.4
22 MRSA § 1730-B(5)
Plain Language
All records maintained by the licensed professional and all communications between the professional and a client or prospective client are confidential and may not be disclosed except as required under existing law. This applies to records generated or maintained using AI tools, including session transcripts, therapy notes, and data collected by AI systems. The provision reinforces existing confidentiality obligations in the AI context but does not create a new, independent standard — it extends existing therapy record confidentiality to AI-assisted records.
5. Disclosure of records and communications. All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional or between a client and a licensed professional are confidential and may not be disclosed except as required under law.
Pending 2026-06-16
D-01.4
10 MRSA § 1500-SS(2)
Plain Language
Deployers face a two-part data collection restriction. First, they may not collect or store any information that conflicts with a user's safety and well-being — this is an absolute prohibition regardless of purpose. Second, all other data collection must satisfy a data minimization standard: information may only be collected for a legitimate purpose, must be relevant to that purpose, and must be limited to the minimum amount necessary. This applies to all users, not just minors. The 'safety and well-being' restriction is notably vague and could be interpreted broadly — deployers should assess whether any collected data categories could foreseeably be used to a user's detriment.
2. User information collection and storage. A deployer shall collect and store only information that does not conflict with a user's safety and well-being. A deployer may not collect and store information except to fulfill a legitimate purpose of the deployer. A deployer may collect and store information that is adequate to fulfill a legitimate purpose of the deployer, but only to the extent that the information: A. Is relevant to that legitimate purpose; and B. Is the minimum amount of information necessary to fulfill that legitimate purpose.
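The Maine test decomposes into an absolute safety bar followed by three conjunctive minimization prongs, which a field-by-field collection review could encode roughly as follows (the field names are ours; the statute prescribes none):

```python
from dataclasses import dataclass

@dataclass
class CollectionItem:
    """Hypothetical review record for one proposed data field."""
    field_name: str
    conflicts_with_user_safety: bool  # absolute bar, regardless of purpose
    has_legitimate_purpose: bool      # collected only to fulfill a deployer purpose
    relevant_to_purpose: bool         # prong A: relevant link to that purpose
    is_minimum_necessary: bool        # prong B: minimum amount needed

def may_collect(item: CollectionItem) -> bool:
    # The safety/well-being bar applies first; the minimization prongs are conjunctive.
    if item.conflicts_with_user_safety:
        return False
    return (item.has_legitimate_purpose
            and item.relevant_to_purpose
            and item.is_minimum_necessary)
```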
Pending 2026-02-24
D-01.4
Sec. 5(1)-(2)
Plain Language
Employers are prohibited from using electronic monitoring tools or automated decision tools to collect covered individuals' data except for seven enumerated purposes. These permitted purposes include facilitating essential job functions, monitoring production quality, periodic performance assessment, legal compliance, protecting health and safety, administering wages and benefits (limited to city-of-work cost-of-living data), and any other purpose the Department of Labor and Economic Opportunity determines enables business operations. This is a purpose-limitation obligation — data collection outside these enumerated purposes is flatly prohibited.
Sec. 5. (1) Except as provided in this act, an employer shall not use an electronic monitoring tool or automated decisions tool to collect a covered individual's data. (2) An employer may use an electronic monitoring tool for only the following purposes: (a) To allow an employee to accomplish or facilitate an essential job function. (b) To monitor production processes or quality. (c) To periodically assess an employee's performance. (d) To ensure or facilitate compliance with state or federal labor or employment law. (e) To protect the health, safety, or security of covered individuals. (f) To administer wages and benefits, if it can be determined that the electronic monitoring system uses only data regarding the city where the covered individual works and the costs of living in that area. (g) To accomplish any other purpose that enables business operations as determined by the department.
Pending 2026-02-24
D-01.1D-01.2D-01.3
Sec. 5(3)(a)-(d)
Plain Language
Employers using electronic monitoring or automated decision tools must provide written notice to all affected covered individuals that the tool is in use, obtain written consent from each covered individual before monitoring or using the tool, ensure data accuracy and currency, and allow covered individuals to correct inaccurate data about themselves. The notice and consent requirements are prerequisites to lawful use — the employer cannot begin using the tool until both are satisfied.
(3) An employer that uses an electronic monitoring tool or automated decisions tool must do all of the following: (a) Provide written notice that the employer is using an electronic monitoring tool or automated decisions tool to all covered individuals who are subject to the tool. (b) Obtain written consent from each covered individual to electronically monitor or use an automated decisions tool on the covered individual in accordance with this act. (c) Ensure that data collected through the electronic monitoring tool or automated decisions tool is accurate and up to date. (d) Allow a covered individual to correct inaccurate data about that covered individual.
Pending 2026-02-24
D-01.4
Sec. 5(3)(e)-(h)
Plain Language
Employers must apply strict data minimization principles to their use of electronic monitoring and automated decision tools. Each tool must be narrowly tailored to its permitted purpose, use the least invasive means possible, apply to the smallest number of covered individuals necessary, collect the minimum amount of data, and operate no more frequently than necessary. Additionally, tools may not collect any employee data when the employee is off duty. These are ongoing operational obligations — not one-time design checks.
(e) Use the tool in a narrowly tailored manner to accomplish a purpose described in subsection (2) or section 4(2). (f) Use the tool through the least invasive means possible for the covered individual whom the tool monitors. (g) Ensure the tool applies to the smallest number of covered individuals, collects the least amount of data, and is used no more frequently than necessary to accomplish a purpose described in subsection (2) or section 4(2). (h) Ensure that the tool does not collect any data of an employee when the employee is off duty.
Pending 2026-02-24
D-01.5
Sec. 5(4)(a)-(c)
Plain Language
Employers are categorically prohibited from collecting certain types of data through electronic monitoring or automated decision tools, even when used for a permitted purpose. Prohibited data includes all health and medical information, any qualified characteristic (race, sex, disability, etc.), and a broad, non-exhaustive list of workplace activity data covering HR files, productivity metrics, workplace communications, device usage, geolocation, audio-video sensor data including biometrics, AI tool inputs/outputs linked to individuals, and online identifiers. Employers also may not use these tools to identify or punish protected labor activity, and may not monitor bathrooms, breakrooms, prayer areas, breast-milk expression areas, or other private spaces — including employees' homes, personal vehicles, or owned property. The workplace-activity data prohibition in (a)(iii) is remarkably broad and would appear to prohibit much of the data that even the permitted application-screening use in Sec. 4(2) would ordinarily require.
(4) An employer that uses an electronic monitoring tool for a purpose described in subsection (2) or an automated decisions tool for a purpose described in section 4(2) shall not do any of the following: (a) Collect any of the following data of a covered individual: (i) Health, medical, lifestyle, and wellness information, including, but not limited to, the covered individual's medical history, physical or mental condition, diet or physical activity patterns, heart rate, medical treatment or diagnosis by a health care professional, health insurance policy number, subscriber identification number, or other unique identifier used to identify the covered individual. (ii) A qualified characteristic. (iii) Information related to workplace activities, including, but not limited, all of the following: (A) Human resources information, including contents of a covered individual's personnel file or performance evaluations. (B) Work process information, such as productivity and efficiency information. (C) Information that captures workplace communications and interactions, including emails, texts, internal message boards, and customer interaction and ratings. (D) Device usage, including calls placed or geolocation information. (E) Audio-video information and other information collected from sensors, including movement tracking, thermal sensors, voiceprints, or facial, emotion, and gait recognition. (F) Inputs of or outputs generated by an automated decisions tool that are linked to a covered individual. (G) Online information, including a covered individual's internet protocol address, private social media activity, or other digital sources or unique identifiers associated with a covered individual. (b) Identify, punish, or obtain data about a covered individual who engages in an activity that is protected under state or federal labor or employment law. (c) Monitor bathrooms or other similar private areas, including, but not limited to, locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, areas designated to express breast milk, or areas designated for prayer or other religious activity. The prohibition under this subdivision includes data collection on the frequency of use of those private areas and conducting audio or visual monitoring of a workplace in an employee's residence, an employee's personal vehicle, or property owned or leased by an employee.
Pending 2026-08-01
D-01.8
Minn. Stat. § 325M.40, subd. 2
Plain Language
Any person must obtain an individual's consent before collecting biometric data from that individual. There is no exception for publicly available data or implied consent — consent must be affirmative and obtained prior to collection. Biometric data includes facial features, retina, iris, fingerprint, voiceprint, and hand or face geometry when capable of identifying an individual. The statute does not specify the form of consent (written vs. oral) but requires that it be obtained before collection. Voiceprint data retained by financial institutions or their affiliates (as defined in 15 U.S.C. § 6809) is exempt from this entire section.
A person is prohibited from collecting biometric data from an individual unless the person receives the individual's consent to collect the biometric data before the collection occurs.
Pending 2026-08-01
D-01.4
Minn. Stat. § 325M.40, subd. 3(1)-(1)(iv)
Plain Language
Any person who has obtained biometric data is prohibited from selling, leasing, or otherwise disclosing it to third parties except in four narrow circumstances: (1) the individual consents to disclosure for identification purposes in disappearance or death; (2) the disclosure completes a financial transaction the individual requested or authorized; (3) the disclosure is required or permitted by federal or state law; or (4) the disclosure is made to or by law enforcement pursuant to a warrant. Outside these four exceptions, all transfers of biometric data to third parties are prohibited — there is no general consent-based disclosure exception for commercial purposes.
A person who obtains biometric data: (1) must not sell, lease, or otherwise disclose the biometric data to another person unless: (i) the individual consents to the disclosure for identification purposes in the event of the individual's disappearance or death; (ii) the disclosure completes a financial transaction that the individual requested or authorized; (iii) the disclosure is required or permitted by a federal or state law; or (iv) the disclosure is made by or to a law enforcement agency for a law enforcement purpose in response to a warrant;
Pending 2026-08-01
D-01.1D-01.2
Minn. Stat. § 181.9923, subd. 2(a)-(b), subd. 3(a)-(d)
Plain Language
Workers have the right to request copies of all their data collected, used, or produced by an ADS, including inputs, outputs, and corroborating evidence used by human reviewers. Employers must respond within seven days. Workers also have the right to request corrections. Upon receiving a correction request, the employer must investigate and, if inaccuracy is confirmed, promptly correct the data, review and adjust any employment decisions based on the inaccurate data, and notify third parties that received or supplied the inaccurate data. If the employer determines the data is accurate, it must inform the worker of the decision, verification steps taken, and supporting evidence. This creates a comprehensive data access and correction regime that goes beyond typical data subject requests — corrections must cascade to affected employment decisions and third-party data recipients.
Subd. 2. Record requests. (a) A worker has the right to request a copy of: (1) any of the worker's data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) corroborating evidence used by a human reviewer. (b) The employer must provide copies of the data requested within seven days of receiving a worker's request. Subd. 3. Record corrections. (a) A worker has the right to request corrections to: (1) any worker data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) any corroborating evidence used by a human reviewer. (b) An employer that receives a request to correct any of the information listed in paragraph (a) must investigate and determine whether the disputed data is inaccurate. (c) If an employer determines that the disputed data is inaccurate, the employer must: (1) promptly correct the disputed data and inform the worker of the employer's decision and action; (2) review and adjust any employment-related decisions that were partially or solely based on the inaccurate data and inform the worker of the adjustment; and (3) inform any third parties with which the employer shared the inaccurate data, or from which the employer received the inaccurate data, of the error and direct those third parties to correct the data. (d) If an employer, upon investigation, determines that the disputed data is accurate, the employer must inform the worker of: (1) the decision not to amend the disputed data; (2) the steps taken to verify the accuracy of the data; and (3) the evidence supporting the decision not to amend the disputed data.
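Operationally, the regime pairs a hard response clock with a branching correction workflow. A schematic sketch, with hypothetical helper names and the required actions reduced to strings:

```python
from datetime import date, timedelta

RESPONSE_WINDOW = timedelta(days=7)  # Subd. 2(b): copies due within seven days

def response_deadline(request_received: date) -> date:
    return request_received + RESPONSE_WINDOW

def correction_actions(data_is_inaccurate: bool) -> list[str]:
    """Subd. 3(c)-(d): the required actions branch on the accuracy finding."""
    if data_is_inaccurate:
        return [
            "correct the disputed data and inform the worker",
            "review and adjust employment decisions based on the inaccurate data",
            "notify third parties that shared or received the data and direct correction",
        ]
    return [
        "inform the worker of the decision not to amend",
        "inform the worker of the verification steps taken",
        "inform the worker of the evidence supporting the decision",
    ]
```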
Pending 2026-08-01
D-01.4D-01.5
Minn. Stat. § 181.9924, subd. 1(b)
Plain Language
Employers are generally prohibited from using ADS with individualized worker data to set compensation. The prohibition has a narrow exception: the employer may use such a system only if all three conditions are met — the input data is directly related to job task ability (e.g., education, training, seniority), the inputs are clearly communicated to the worker, and the system is used no more than once per six months per worker or only in connection with a meaningful change in duties like hiring or promotion. This is a data minimization and purpose limitation control specific to the compensation context.
(b) An employer must not use an automated decision system that uses individualized worker data as inputs or outputs to set compensation, unless the employer can demonstrate that: (1) the input data is directly related to the ability of the worker to complete the task, such as education, training, experience, or seniority; (2) the inputs used are clearly communicated to the worker such that the worker knows their compensation is a function of the identified attributes; and (3) the employer uses the automated decision system either: (i) not more than once per six-month period per worker; or (ii) only in conjunction with a meaningful change in work duties, such as hiring or promotion.
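The three conditions are conjunctive, and the third is satisfiable by either branch. One way to check the gate (a sketch; the statute does not express the six-month period in days, so the 182-day constant is our approximation):

```python
from datetime import date, timedelta
from typing import Optional

SIX_MONTHS = timedelta(days=182)  # approximation of the statutory six-month period

def compensation_ads_use_allowed(
    inputs_job_related: bool,        # (1): education, training, experience, seniority
    inputs_communicated: bool,       # (2): worker knows pay is a function of these inputs
    last_use_for_worker: Optional[date],
    proposed_use: date,
    meaningful_duty_change: bool,    # (3)(ii): e.g., hiring or promotion
) -> bool:
    """All three conditions in subd. 1(b) must hold; (3) has two alternative branches."""
    if not (inputs_job_related and inputs_communicated):
        return False
    frequency_ok = (last_use_for_worker is None
                    or proposed_use - last_use_for_worker >= SIX_MONTHS)
    return frequency_ok or meaningful_duty_change
```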
Pending 2026-08-01
D-01.3
Minn. Stat. § 181.9922, subd. 1(e)-(f)
Plain Language
Workers must provide affirmative written consent before being subjected to an automated decision system — merely receiving the notice is insufficient. Additionally, if reasonable alternatives to the ADS exist, the worker must be allowed to opt out entirely. This is an unusually strong consent-and-opt-out regime: it is not limited to specific data types or sensitive categories but applies to the entire ADS. The 'reasonable alternatives' qualifier limits the opt-out right — if no alternative exists, the employer may require ADS participation after obtaining consent. Employers should document the availability or unavailability of alternatives.
(e) A job applicant or worker must receive the notice required under this section and respond with affirmative written consent before the worker or applicant is subject to an automated decision system. (f) If reasonable alternatives to the use of the automated decision system exist, the worker must be allowed to opt out of being subject to the automated decision system.
Pre-filed 2026-08-01
D-01.8
Minn. Stat. § 325M.40, subd. 2
Plain Language
Before collecting any biometric data from an individual, the collecting person must obtain the individual's consent. Consent must be obtained before collection occurs — retroactive consent is insufficient. The definition of biometric data is broad, covering facial images, facial features, retinas, irises, fingerprints, voiceprints, hand geometry, and face geometry that may be used to identify an individual. Voiceprint data retained by financial institutions (as defined under 15 U.S.C. § 6809) is exempt from this requirement under Subdivision 5.
A person is prohibited from collecting biometric data from an individual unless the person receives the individual's consent to collect the biometric data before the collection occurs.
Pre-filed 2026-08-01
D-01.4
Minn. Stat. § 325M.40, subd. 3(1)-(2)
Plain Language
Once biometric data is obtained, the holder faces two ongoing obligations. First, sale, lease, or other disclosure to third parties is prohibited except under four narrow exceptions: (1) the individual consents for identification in case of disappearance or death; (2) the disclosure completes a financial transaction the individual requested or authorized; (3) disclosure is required or permitted by federal or state law; or (4) disclosure is to or by law enforcement pursuant to a warrant. Second, the holder must store, transmit, and protect biometric data using reasonable care, at a level at least as protective as the holder's treatment of other confidential information. These are use-limitation and security safeguards that restrict what can be done with collected biometric data.
A person who obtains biometric data: (1) must not sell, lease, or otherwise disclose the biometric data to another person unless: (i) the individual consents to the disclosure for identification purposes in the event of the individual's disappearance or death; (ii) the disclosure completes a financial transaction that the individual requested or authorized; (iii) the disclosure is required or permitted by a federal or state law; or (iv) the disclosure is made by or to a law enforcement agency for a law enforcement purpose in response to a warrant; (2) must store, transmit, and protect from disclosure the biometric data using reasonable care and in a manner that is at least as or more protective than the manner in which the person stores, transmits, and protects other confidential information the person possesses;
Pre-filed 2026-08-01
D-01.4
Minn. Stat. § 325M.40, subd. 3(3)
Plain Language
Biometric data must be deleted and destroyed within a reasonable time, but no later than one year after the purpose for collecting it expires. If a federal or state law mandates a longer retention period, the data must still be destroyed no later than one year after that legally required retention period ends. For employer-collected biometric data used for security purposes, the collection purpose is deemed to expire upon termination of the employment relationship — meaning the one-year deletion clock starts at termination. This creates a hard outer boundary on biometric data retention tied to purpose expiration.
(3) must delete and destroy the biometric data within a reasonable time, but no later than one year from the date the purpose for collecting the data expires, unless the data is maintained pursuant to a federal or state law that requires a longer retention period, in which case the biometric data must be destroyed within a reasonable time frame but no later than one year from the date that the state or federal law retention period expires. If an employer collects an employee's biometric data for security purposes, the purpose for collecting the data expires upon termination of the employment relationship.
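Once the trigger events are known, the retention boundary is a pure date computation. A sketch of the outer bound, using a 365-day year as an assumption the statute does not make explicit:

```python
from datetime import date, timedelta
from typing import Optional

ONE_YEAR = timedelta(days=365)

def biometric_destruction_deadline(
    purpose_expired: date,
    legal_retention_ends: Optional[date] = None,
) -> date:
    """Outer bound on retention under subd. 3(3).

    One year after purpose expiration, or, if another law compels longer
    retention, one year after that retention period ends. For employer
    security-purpose data, purpose_expired is the employment end date.
    """
    anchor = legal_retention_ends if legal_retention_ends else purpose_expired
    return anchor + ONE_YEAR
```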
Pending 2026-09-01
D-01.1D-01.2
§ 181.9923, Subd. 2(a)-(b), Subd. 3(a)-(d)
Plain Language
Workers have a right to request copies of all their data collected, used, or produced by an ADS — including inputs, outputs, and corroborating evidence used by human reviewers — and employers must respond within seven days. Workers also have a right to request corrections to any of this data. Upon receiving a correction request, the employer must investigate and, if the data is inaccurate, promptly correct it, review and adjust any employment decisions based on the inaccurate data, and notify third parties that received or supplied the inaccurate data. If the employer determines the data is accurate, it must inform the worker of the decision, the verification steps taken, and the supporting evidence. The correction right is notably expansive: it extends not just to the underlying worker data but to ADS outputs and human reviewer evidence, and requires remediation of decisions already made based on inaccurate data.
Subd. 2. Record requests. (a) A worker has the right to request a copy of: (1) any of the worker's data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) corroborating evidence used by a human reviewer. (b) The employer must provide copies of the data requested within seven days of receiving a worker's request. Subd. 3. Record corrections. (a) A worker has the right to request corrections to: (1) any worker data collected, used, or produced by an automated decision system; (2) any input or output data used or produced by the automated decision system; and (3) any corroborating evidence used by a human reviewer. (b) An employer that receives a request to correct any of the information listed in paragraph (a) must investigate and determine whether the disputed data is inaccurate. (c) If an employer determines that the disputed data is inaccurate, the employer must: (1) promptly correct the disputed data and inform the worker of the employer's decision and action; (2) review and adjust any employment-related decisions that were partially or solely based on the inaccurate data and inform the worker of the adjustment; and (3) inform any third parties with which the employer shared the inaccurate data, or from which the employer received the inaccurate data, of the error and direct those third parties to correct the data. (d) If an employer, upon investigation, determines that the disputed data is accurate, the employer must inform the worker of: (1) the decision not to amend the disputed data; (2) the steps taken to verify the accuracy of the data; and (3) the evidence supporting the decision not to amend the disputed data.
Pre-filed 2026-08-28
D-01.8
§ 1.566(2)(1)-(3)
Plain Language
Before collecting any biometric identifier or biometric information, a private entity must provide written notice to the individual (or their authorized representative) that biometric data is being collected, disclose the specific purpose and duration of collection, storage, and use, and obtain a written release. Critically, a valid written release cannot be obtained through a general release or user agreement — it must be specific to biometric data. In the employment context, written releases may only authorize biometric collection for physical/electronic access control (without location tracking) or timekeeping, and may be made a condition of employment.
2. No private entity shall collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information unless it first: (1) Informs the person or customer, or the person's or customer's legally authorized representative, in writing that a biometric identifier or biometric information is being collected or stored; (2) Informs the person or customer, or the person's or customer's legally authorized representative, of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (3) Receives a written release executed by the person or customer, or the person's or customer's legally authorized representative.
Pre-filed 2026-08-28
D-01.5
§ 1.566(3)(2)
Plain Language
Private entities are categorically prohibited from selling, leasing, or trading any biometric identifier or biometric information they possess. There are no exceptions to this prohibition — even with consent, sale or trade of biometric data is not permitted. This is distinct from the disclosure restrictions in § 1.566(4), which allow disclosure with consent or for certain enumerated purposes; the sale/trade ban is absolute.
(2) No private entity in possession of a biometric identifier or biometric information shall sell, lease, or trade a person's or a customer's biometric identifier or biometric information.
Pre-filed 2026-08-28
D-01.4
§ 1.566(4)(1)-(4)
Plain Language
Private entities may not disclose, redisclose, or otherwise disseminate biometric identifiers or biometric information except in four narrow circumstances: (1) the individual or authorized representative provides a written release; (2) the disclosure completes a financial transaction the individual requested or authorized; (3) disclosure is required by law; or (4) disclosure is compelled by a valid warrant or subpoena. Any disclosure outside these four exceptions is a violation. Note that the written release requirement here is separate from the collection consent in § 1.566(2) — consent to collect does not automatically authorize disclosure to third parties.
4. No private entity in possession of a biometric identifier or biometric information shall disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless: (1) The person or customer, or the person's or customer's legally authorized representative, provides written release to the disclosure or redisclosure; (2) The disclosure or redisclosure completes a financial transaction requested or authorized by the person or customer, or the person's or customer's legally authorized representative; (3) The disclosure or redisclosure is required by state law, federal law, or municipal ordinance; or (4) The disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
Pre-filed 2026-08-28
D-01.4
§ 1.2058(5)(2)(e)
Plain Language
Covered entities must implement data minimization and security controls specific to age verification data. Collection must be limited to what is minimally necessary for age verification or compliance. The data must be protected against unauthorized access, transmitted only with industry-standard encryption, retained no longer than reasonably necessary, and may not be shared with, transferred to, or sold to any other entity. These are standalone data governance obligations specific to age verification data — they apply even if the covered entity uses a third party for verification.
(e) A covered entity shall: a. Establish, implement, and maintain reasonable data security to: (i) Limit collection of personal data to that which is minimally necessary to verify a user's age or maintain compliance with this section; and (ii) Protect such age verification data against unauthorized access; b. Protect such age verification data against unauthorized access; c. Protect the integrity and confidentiality of such data by only transmitting such data using industry-standard encryption protocols; d. Retain such data for no longer than is reasonably necessary to verify a user's age or maintain compliance with this section; and e. Not share with, transfer to, or sell to any other entity such data.
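Because these obligations are fixed rather than risk-weighted, they can be expressed as an invariant configuration that a compliance review asserts against. A sketch with names of our own choosing:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeVerificationDataPolicy:
    """One reading of § 1.2058(5)(2)(e) as a fixed configuration."""
    collect_only_minimum: bool = True   # (e)a.(i): minimally necessary fields only
    encrypted_in_transit: bool = True   # (e)c.: industry-standard encryption protocols
    retention_bounded: bool = True      # (e)d.: no longer than reasonably necessary
    sharing_allowed: bool = False       # (e)e.: no sharing, transfer, or sale

def policy_compliant(p: AgeVerificationDataPolicy) -> bool:
    return (p.collect_only_minimum and p.encrypted_in_transit
            and p.retention_bounded and not p.sharing_allowed)
```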
Pending 2025-10-01
D-01.1D-01.3
Section 1(1)
Plain Language
Manufacturers of publicly distributed online media in Montana that use AI to personalize, curate, or filter what individual users see must disclose that they are using AI for this purpose and must give users the ability to opt out. This applies whether the AI controls information flow entirely or only in part. Governmental entities are excluded. The bill does not define 'manufacturers of publicly distributed online media,' leaving significant ambiguity about which entities are covered — this could include social media platforms, news aggregators, or content recommendation services.
Manufacturers of publicly distributed online media in the state that use an artificial intelligence system to direct, control, or focus the information any one individual can see, whether entirely or in part, shall disclose the use of the system and provide a user with the option to opt out.
Pending 2026-01-01
D-01.1D-01.4
G.S. § 114B-4(b)(1)-(5)
Plain Language
Licensed health information chatbot operators must implement industry-standard encryption for data at rest and in transit, maintain detailed access logs, and perform security audits at least every six months. Data breaches must be reported to the Department within 24 hours and to affected consumers within 48 hours — this overrides any contrary state breach notification law. Operators must obtain explicit user consent for data collection and use, provide users access to their personal data, and honor user deletion requests. These data rights and security obligations apply to all licensees under the Chatbot Licensing Act.
(b) A licensee shall do all of the following: (1) Implement industry-standard encryption for data in transit and at rest, maintain detailed access logs, and conduct regular security audits no less than once every six (6) months. (2) Report any data breaches within twenty-four (24) hours to the Department and within forty-eight (48) hours to affected consumers, notwithstanding any provision of law to the contrary. (3) Obtain explicit user consent for data collection and use. (4) Provide users with access to their personal data. (5) Provide users with the ability to delete their data upon request.
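The reporting and audit clocks are explicit, so deadline computation is mechanical. A minimal sketch, assuming the breach clocks run from discovery (the quoted text does not say) and approximating six months as 182 days:

```python
from datetime import datetime, timedelta

DEPARTMENT_WINDOW = timedelta(hours=24)  # breach report to the Department
CONSUMER_WINDOW = timedelta(hours=48)    # breach report to affected consumers
AUDIT_INTERVAL = timedelta(days=182)     # security audits at least every six months

def breach_report_deadlines(discovered: datetime) -> dict[str, datetime]:
    """Both reporting clocks run from the same start point."""
    return {
        "department": discovered + DEPARTMENT_WINDOW,
        "consumers": discovered + CONSUMER_WINDOW,
    }

def next_audit_due(last_audit: datetime) -> datetime:
    return last_audit + AUDIT_INTERVAL
```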
Pending 2026-01-01
D-01.4
G.S. § 170-3(b)(5),(7)
Plain Language
Covered platforms must limit data collection and storage to information that is adequate, relevant, and necessary for a legitimate purpose — applying a data minimization standard framed through the duty of loyalty. Platforms must also act as loyal gatekeepers of user personal information when allowing government or third-party access, avoiding conflicts with users' best interests. The three-part adequacy/relevance/necessity test mirrors GDPR-style data minimization principles. Third-party data sharing must be evaluated against the user's best interests, not merely the platform's commercial interests.
(5) Duty of loyalty in collection. — A covered platform shall collect and store only that information that does not conflict with a trusting party's best interests. Such information must be (i) adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the platform; (ii) relevant, in the sense that the information has a relevant link to that legitimate purpose, and (iii) necessary, in the sense that it is the minimum amount of information which is needed for that legitimate purpose. (7) Duty of loyalty in gatekeeping. — A covered platform shall be a loyal gatekeeper of personal information from a trusted party, including avoiding conflicts to the best interests of trusting parties when allowing government or other third-party access to trusting parties and their data.
Pending 2026-01-01
D-01.4D-01.5
G.S. § 170-6(a)-(d)
Plain Language
Covered platforms must de-identify all user-related data collected through chatbot conversations or third-party cookies before storing or analyzing it. Sensitive personal information derived from chatbot use must not be incorporated into training datasets for any chatbot or generative AI system. Non-sensitive chatbot conversations must be stored for at least 60 days. For chatbots in healthcare, financial services, legal, government, mental health, education, or any domain primarily processing sensitive personal information, platforms must implement self-destructing messages that auto-delete 30 days after acquisition. All platforms must use transport encryption for user-chatbot communications. The training data prohibition on sensitive personal information is particularly significant — it effectively bars platforms from using user health, financial, identity, and communications data to improve their models.
(a) A covered platform must do each of the following: (1) Ensure that all user-related data disclosed collected through conversations between users and chatbots or through third-party cookies, undergoes a process of de-identification prior to storage and analysis; (2) Take reasonable care to prohibit the incorporation or inclusion of any sensitive personal information derived from a user during the use of a chatbot into an aggregate dataset used to train any chatbot or generative artificial intelligence system. (3) Store all chatbot conversations which does not include sensitive personal information for at least sixty (60) days. (b) Each covered platform that meets the standard set forth in subsection (a) of this section shall utilize self-destructing messages with a predetermined destruction period of thirty (30) days after the data has been acquired. (c) The requirements of subsection (b) of this section shall apply to all chatbots which are employed in: healthcare, financial services, the legal field, government services, mental health support, and education. In general, this applies to any domain, beyond those specifically listed, where chatbots are employed primarily for the processing or storage of sensitive personal information. (d) All covered platforms shall utilize transport encryption for all messages between a user and a chatbot.
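The retention rules cut in opposite directions: sensitive-domain conversations get a 30-day destruction ceiling, while non-sensitive conversations get a 60-day retention floor. One way to encode the distinction (a sketch; classifying a conversation as sensitive-domain is left to the caller):

```python
from datetime import datetime, timedelta
from typing import Optional

SENSITIVE_TTL = timedelta(days=30)        # (b): self-destruct 30 days after acquisition
NON_SENSITIVE_FLOOR = timedelta(days=60)  # (a)(3): retain for at least 60 days

def destruction_time(acquired: datetime, sensitive_domain: bool) -> Optional[datetime]:
    """Mandatory auto-delete timestamp; None means no ceiling applies."""
    return acquired + SENSITIVE_TTL if sensitive_domain else None

def earliest_permitted_deletion(acquired: datetime, sensitive_domain: bool) -> datetime:
    """Non-sensitive conversations carry a retention floor, not a ceiling;
    sensitive-domain ones must be destroyed at the 30-day mark."""
    if sensitive_domain:
        return acquired + SENSITIVE_TTL
    return acquired + NON_SENSITIVE_FLOOR
```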
Pending 2026-02-01
D-01.3
Sec. 4(4)(a)(iii)
Plain Language
Where Nebraska's data privacy law (§ 87-1107) provides consumers a right to opt out of profiling for consequential decisions, deployers of high-risk AI systems must affirmatively inform consumers of that right before using the system to make or substantially factor into a consequential decision. This provision creates a notification obligation about an existing right under another statute — it does not independently create the opt-out right itself.
(iii) If applicable, provide information to the consumer regarding the consumer's right to opt out of the processing of personal data concerning the consumer for any purpose of profiling in furtherance of decisions that produce legal or similarly significant effects concerning the consumer under subdivision (2)(e)(iii) of section 87-1107.
Passed
D-01.4
Section 3(a)-(b)
Plain Language
Business entities are categorically prohibited from selling, leasing, trading, sharing, or otherwise profiting from any information obtained by using a biometric surveillance system on consumers. There is no consent exception — the prohibition is absolute. This effectively treats biometric surveillance data as non-commercializable. Violations are independent unlawful practices under the Consumer Fraud Act, separate from the notice-and-use requirements in Section 2.
a. A business entity shall not sell, lease, trade, share, or otherwise profit from information obtained through the business entity's use of a biometric surveillance system on a consumer. b. A violation of this section shall be an unlawful practice and a violation of P.L.1960, c.39 (C.56:8-1 et seq.).
Passed 2026-01-01
D-01.8
Section 2(a)-(b)
Plain Language
Business entities are categorically prohibited from using biometric surveillance systems on consumers at their physical premises unless two conditions are met: (1) the business provides clear and conspicuous notice, which can be satisfied by posting a sign at the perimeter of the surveilled area, and (2) the system is used for a lawful purpose. This is a notice-and-lawful-purpose framework rather than an opt-in consent requirement — unlike Illinois BIPA, which requires written informed consent before any biometric identifier collection, New Jersey permits use with posted signage notice alone. The definition of facial recognition is notably broad, covering not only identification but also emotion inference, association tracking, and location inference from face, head, or body characteristics.
a. It shall be an unlawful practice and a violation of P.L.1960, c.39 (C.56:8-1 et seq.) for a business entity to use any biometric surveillance system on a consumer at the physical premises of the business entity, except as provided in subsection c. of this section. b. A business entity may use a biometric surveillance system on a consumer at the physical premises of the business entity, if: (1) the business entity provides clear and conspicuous notice to the consumer regarding its use of a biometric surveillance system; and (2) the biometric surveillance system is used for a lawful purpose. The business entity may satisfy the notice requirement of paragraph (1) of this section by posting a sign in a conspicuous location at the perimeter of any area where a biometric surveillance system is being used.
Passed 2026-01-01
D-01.4
Section 3(a)-(b)
Plain Language
Business entities are categorically prohibited from selling, leasing, trading, sharing, or otherwise profiting from any information obtained through biometric surveillance of consumers. This is a blanket prohibition on secondary use and monetization of biometric surveillance data — there is no consent exception. Unlike the notice-and-lawful-purpose framework in Section 2 that permits use with posted signage, this data commercialization prohibition is absolute. This is comparable to Illinois BIPA's prohibition on profiting from biometric identifiers, though BIPA structures it as a consent requirement rather than an outright ban.
a. A business entity shall not sell, lease, trade, share, or otherwise profit from information obtained through the business entity's use of a biometric surveillance system on a consumer. b. A violation of this section shall be an unlawful practice and a violation of P.L.1960, c.39 (C.56:8-1 et seq.).
Pre-filed 2026-02-02
D-01.4
Section 1.b.
Plain Language
Employers may only share applicant video interviews with service providers whose expertise or technology is necessary to evaluate the applicant's fitness for the position. This limits secondary use and distribution of applicant video data — the employer cannot share recordings with third parties for marketing, training unrelated AI models, or any purpose other than evaluating the specific applicant's suitability for the role.
b. An employer shall not share an applicant's video except with a service provider whose expertise or technology is necessary to evaluate the applicant's fitness for a position.
Pre-filed 2026-02-02
D-01.2
Section 1.c.
Plain Language
Applicants have the right to request deletion of their video interviews. Upon receiving the request, the employer must delete the videos within 30 days and instruct all third parties (including service providers) who received copies to delete them as well, including all backup copies. Third parties and service providers are independently obligated to comply with the employer's deletion instructions. This functions as a right-to-delete specific to AI video interview data.
c. Upon request from the applicant, an employer, within 30 days after receipt of the request, shall delete an applicant's interviews and instruct any other persons who received copies of the applicant's video interviews to also delete the videos, including all electronically generated backup copies. Any other person or service provider shall comply with the employer's instructions.
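The deletion duty cascades: the employer's own copies, backups, and every downstream recipient's copies all sit on a single 30-day clock. A schematic sketch with hypothetical task strings:

```python
from datetime import date, timedelta

DELETION_WINDOW = timedelta(days=30)  # Section 1.c.: delete within 30 days of the request

def deletion_deadline(request_received: date) -> date:
    return request_received + DELETION_WINDOW

def deletion_tasks(third_party_recipients: list[str]) -> list[str]:
    """The employer's own deletion plus an instruction to every recipient,
    covering electronically generated backups as well as primary copies."""
    tasks = ["delete applicant video interviews, including all backup copies"]
    tasks += [f"instruct {recipient} to delete all copies, including backups"
              for recipient in third_party_recipients]
    return tasks
```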
Pending 2025-04-27
D-01.4
State Tech. Law § 504(5)
Plain Language
Designers, developers, and deployers must ensure that data used in automated systems is appropriate and relevant to the system's purpose. Residents are also protected from compounded harms arising from data reuse — meaning data collected for one automated system purpose should not be repurposed in ways that compound risk or harm. This is a data minimization and purpose limitation obligation applied to AI system design and deployment.
5. New York residents are entitled to protection from inappropriate or irrelevant data use in the design, development, and deployment of automated systems, and from the compounded harm of its reuse.
Pending 2025-04-27
D-01.4
State Tech. Law § 506(1)-(2)
Plain Language
Automated systems must incorporate privacy protections by design and by default. Data collection must conform to reasonable user expectations and must be limited to data that is strictly necessary for the specific context of use. This is a combined privacy-by-design and data minimization obligation. The 'strictly necessary' standard is among the most restrictive formulations — it goes beyond 'reasonably necessary' or 'proportionate' standards used in other jurisdictions.
1. New York residents shall be protected from abusive data practices via built-in protections and shall maintain agency over the use of their personal data.
2. Privacy violations shall be mitigated through design choices that include privacy protections by default, ensuring that data collection conforms to reasonable expectations and that only strictly necessary data for the specific context is collected.
Pending 2025-04-27
D-01.1D-01.2D-01.3
State Tech. Law § 506(3)-(6)
Plain Language
Designers, developers, and deployers must respect New York residents' decisions regarding their data — including collection, use, access, transfer, and deletion — to the fullest extent possible, and must implement privacy-by-design alternatives where full user control is not feasible. Systems may not use dark patterns or privacy-invasive defaults that obscure user choice. Consent may only justify data collection where it can be meaningfully given — not through lengthy, incomprehensible terms of service. This effectively prohibits reliance on blanket consent buried in complex notices and requires that consent mechanisms be brief, plain-language, and context-specific.
3. Designers, developers, and deployers of automated systems must seek and respect the decisions of New York residents regarding the collection, use, access, transfer, and deletion of their data in all appropriate ways and to the fullest extent possible. Where not possible, alternative privacy by design safeguards must be implemented.
4. Automated systems shall not employ user experience or design decisions that obscure user choice or burden users with default settings that are privacy-invasive.
5. Consent shall be used to justify the collection of data only in instances where it can be appropriately and meaningfully given. Any consent requests shall be brief, understandable in plain language, and provide New York residents with agency over data collection and its specific context of use.
6. Any existing practice of complex notice-and-choice for broad data use shall be transformed, emphasizing clarity and user comprehension.
Pending 2025-04-27
D-01.4D-01.5
State Tech. Law § 506(7)
Plain Language
In sensitive domains — areas where activities can cause material harms to human rights, autonomy, dignity, or civil liberties — data and inferences about individuals may only be used for necessary functions. This data must be safeguarded by ethical review processes and subject to use prohibitions. The sensitive data definition is extremely broad, capturing genomic data, biometric data, behavioral data, geolocation data, criminal justice data, relationship history, and all data generated by minors. The combination of the broad sensitive data definition and the 'necessary functions only' restriction creates a strict purpose limitation regime for AI systems operating in these domains.
7. Enhanced protections and restrictions shall be established for data and inferences related to sensitive domains. In sensitive domains, individual data and related inferences may only be used for necessary functions, safeguarded by ethical review and use prohibitions.
Pending 2025-07-26
D-01.5
State Tech. Law § 522(1)-(3)
Plain Language
Licensees may share information and source code with third parties, but when biometric information (faceprints, voiceprints, fingerprints, gaitprints, irisprints, psychological profiles, or other identifying body/mind data) is shared, the receiving party becomes jointly liable for any harm or violations. The Secretary may prohibit specific persons from accessing a licensee's information or source code with written justification. This applies only to information received or generated by the licensee and source code created by the licensee — not to third-party system integration. The joint liability provision for biometric data sharing is a significant compliance consideration for data partnerships.
§ 522. Information and source code sharing. 1. Licensees shall be permitted to share information and source code with any third party, provided however, that where information is biometric information such party shall be jointly liable for any harm or violations under this article with the licensee. The secretary may, in their discretion, prohibit any person from accessing the information or source code of a licensee provided however that the secretary shall provide a written justification for such a prohibition. 2. For purposes of this section, "biometric information" shall include a person's: (a) faceprint; (b) voiceprint; (c) fingerprint; (d) gaitprint; (e) irisprint; (f) psychological profile; or (g) any other data related to a person's body or mind that can be used to identify a person. 3. This section shall only apply to the sharing of information received or generated by the licensee or source code created by the licensee and shall not apply to a third party integrating their systems with the licensee.
Pending 2025-05-26
D-01.8
Gen. Bus. Law § 676-b(2)(a)-(c)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any biometric identifier or biometric information from an individual, a private entity must: (1) provide written notice that biometric data is being collected or stored, (2) provide written notice of the specific purpose and duration of the collection, storage, and use, and (3) obtain a written release from the individual or their legally authorized representative. In the employment context, the written release may be executed as a condition of employment. All three steps must be completed before any collection occurs — retroactive notice and consent is insufficient.
2. No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (a) informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (b) informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (c) receives a written release executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative.
Pending 2025-05-26
D-01.4
Gen. Bus. Law § 676-b(1)
Plain Language
Any private entity possessing biometric identifiers or biometric information must develop and make publicly available a written policy setting a retention schedule and destruction guidelines. Biometric data must be permanently destroyed within a reasonable time — and no later than 60 days — after the earlier of (a) the data is no longer necessary for the purpose identified in the original notice or authorization, or (b) three years from the individual's last interaction with the entity. The entity must comply with its own published schedule and destruction guidelines unless compelled by a valid warrant or subpoena. This is both a documentation obligation (create and publish the policy) and a substantive data minimization obligation (actually destroy the data on schedule).
1. A private entity in possession of biometric identifiers or biometric information must develop a written policy, made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information within a reasonable time, but in no event later than sixty days, after it is no longer necessary to maintain for the permissible purpose or purposes identified in the notice or for which the individual provided valid authorization or within three years of the individual's last interaction with the private entity, whichever occurs first. Absent a valid warrant or subpoena issued by a court of competent jurisdiction, a private entity in possession of biometric identifiers or biometric information must comply with its established retention schedule and destruction guidelines.
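The deadline arithmetic follows the earlier-of reading above: take the earlier trigger, then add the 60-day destruction window. A sketch, approximating three years as 3 × 365 days:

```python
from datetime import date, timedelta

SIXTY_DAYS = timedelta(days=60)
THREE_YEARS = timedelta(days=3 * 365)  # approximation of the statutory three years

def ny_destruction_deadline(no_longer_necessary: date, last_interaction: date) -> date:
    """§ 676-b(1): destroy within 60 days of the earlier trigger, which is
    either purpose exhaustion or three years after the last interaction."""
    trigger = min(no_longer_necessary, last_interaction + THREE_YEARS)
    return trigger + SIXTY_DAYS
```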
Pending 2025-04-09
D-01.8
Gen. Bus. Law § 676-b(2)(a)-(c)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any person's biometric identifier or biometric information, a private entity must: (1) provide written notice that biometric data is being collected or stored; (2) provide written notice of the specific purpose and the length of the retention period; and (3) obtain a written release from the individual or their legally authorized representative. All three steps must be completed before any collection occurs. In the employment context, a written release executed as a condition of employment satisfies the consent requirement. This is an affirmative opt-in consent regime — no collection may proceed without prior written notice and consent.
2. No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first: (a) informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored; (b) informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and (c) receives a written release executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative.
Pending 2025-04-09
D-01.4
Gen. Bus. Law § 676-b(1)
Plain Language
Any private entity that possesses biometric identifiers or biometric information must create and publicly make available a written policy that establishes a retention schedule and guidelines for permanent destruction of that data. The data must be destroyed within a reasonable time — no later than 60 days — after the earlier of: (a) the data is no longer needed for the purpose identified in the notice or authorized by the individual, or (b) three years from the individual's last interaction with the entity. The entity must actually comply with its own retention schedule and destruction guidelines unless subject to a valid warrant or subpoena. This is a data minimization and lifecycle management obligation — entities must both publish the policy and adhere to it.
1. A private entity in possession of biometric identifiers or biometric information must develop a written policy, made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information within a reasonable time, but in no event later than sixty days, after it is no longer necessary to maintain for the permissible purpose or purposes identified in the notice or for which the individual provided valid authorization or within three years of the individual's last interaction with the private entity, whichever occurs first. Absent a valid warrant or subpoena issued by a court of competent jurisdiction, a private entity in possession of biometric identifiers or biometric information must comply with its established retention schedule and destruction guidelines.
Pending 2025-04-09
D-01.5
Gen. Bus. Law § 676-b(3)
Plain Language
Private entities are categorically prohibited from selling, leasing, trading, or otherwise profiting from any person's biometric identifier or biometric information. There are no exceptions to this prohibition — even with consent, monetization of biometric data is barred. This goes beyond a typical consent requirement and imposes an absolute ban on commercial exploitation of biometric data.
3. No private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information.
Pending 2025-04-09
Gen. Bus. Law § 676-b(4)(a)-(d)
Plain Language
Private entities may not disclose, redisclose, or disseminate biometric identifiers or biometric information to third parties unless one of four narrow exceptions applies: (1) consent from the individual or their authorized representative, (2) completing a financial transaction the individual requested or authorized, (3) the disclosure is required by law, or (4) pursuant to a valid warrant or subpoena. This effectively creates a closed set of permissible disclosure scenarios — any disclosure not falling within one of these four categories is unlawful.
4. No private entity in possession of a biometric identifier or biometric information may disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless: (a) the subject of the biometric identifier or biometric information or the subject's legally authorized representative consents to the disclosure or redisclosure; (b) the disclosure or redisclosure completes a financial transaction requested or authorized by the subject of the biometric identifier or the biometric information or the subject's legally authorized representative; (c) the disclosure or redisclosure is required by federal, state or local law or municipal ordinance; or (d) the disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
Pending 2025-09-05
D-01.8
Gen. Bus. Law § 1155(1)
Plain Language
News media employers may not — directly or through a third party — authorize the use of a news media worker's work product to train a generative AI system without first providing notice, obtaining the worker's consent, and giving the worker an opportunity to bargain over remuneration. Workers who decline consent may not be penalized. This creates a three-part prerequisite (notice, consent, bargaining opportunity) before any training use of worker output, plus an anti-retaliation protection for workers who refuse.
News media employers shall not directly or through a third party authorize the training of a generative artificial intelligence system on the work product of a news media worker without notice, consent and an opportunity to bargain over appropriate remuneration. A news media employer shall not penalize a news media worker for declining to consent to allow their work product to be used to train a generative artificial intelligence system.
Pending 2026-07-22
D-01.5
Exec. Law § 296(23)(a)
Plain Language
The express prohibition on using zip codes as a proxy for protected classes establishes a proxy-variable restriction on AI systems used in employment. Employers may not design or deploy AI tools that infer protected characteristics from zip codes (or, by extension under general Human Rights Law principles, other facially neutral proxies) for use in consequential employment decisions. This is a specific application of the broader principle that AI systems may not circumvent non-discrimination requirements through proxy variables.
(a) It shall be an unlawful discriminatory practice for an employer to use artificial intelligence for recruitment, hiring, promotion, renewal of employment, selection for training or apprenticeship, discharge, discipline, tenure, or the terms, privileges, or conditions of employment that has the effect of subjecting employees to discrimination on the basis of age, race, creed, color, national origin, citizenship or immigration status, sexual orientation, gender identity or expression, military status, sex, disability, predisposing genetic characteristics, familial status, marital status, or status as a victim of domestic violence or to use zip codes as a proxy for such protected classes.
Pending 2026-11-01
D-01.4
Section 3(B)(1)-(3)
Plain Language
Deployers must limit data collection and storage to information that does not conflict with users' best interests and that satisfies a three-part test: adequacy (sufficient to fulfill a legitimate purpose), relevance (linked to that purpose), and necessity (the minimum amount needed). This is a data minimization obligation functionally similar to GDPR's adequacy/relevance/necessity framework. The 'trusting party' language is unusual and undefined — it likely refers to the user but introduces ambiguity. Deployers should treat this as requiring purpose-limited data collection with a necessity floor.
B. Deployers shall collect and store only that information that does not conflict with a trusting party's best interests. Such information must be: 1. Adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the deployer; 2. Relevant, in the sense that the information has a relevant link to that legitimate purpose; and 3. Necessary, in the sense that it is the minimum amount of information which is needed for that legitimate purpose.
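The three-prong test plus the best-interests overlay is a conjunctive gate that can be encoded directly in a collection review. A minimal sketch, assuming each proposed data field is reviewed against a stated purpose; the field and function names are hypothetical, not drawn from the bill.

from dataclasses import dataclass

@dataclass
class CollectionReview:
    field_name: str
    legitimate_purpose: str
    adequate: bool    # sufficient to fulfill the legitimate purpose
    relevant: bool    # has a relevant link to that purpose
    necessary: bool   # the minimum amount needed for that purpose
    conflicts_with_best_interests: bool

def may_collect(review: CollectionReview) -> bool:
    # Failing any prong, or the best-interests overlay, bars collection.
    return (not review.conflicts_with_best_interests
            and review.adequate
            and review.relevant
            and review.necessary)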
Pending 2026-10-06
D-01.4
35 Pa.C.S. § 3503(b)(6)
Plain Language
Patient data used in connection with facility AI algorithms must not be used beyond the intended and stated purpose of those algorithms. This purpose limitation is layered on top of existing HIPAA requirements and state privacy law. Facilities should clearly define and document the stated purpose of each AI algorithm to establish the boundary for permissible data use.
(6) Patient data must not be used beyond the intended and stated purpose of the artificial intelligence-based algorithms, consistent with the laws of this Commonwealth and 42 U.S.C. Ch. 7 Subch. XI Part C (relating to administrative simplification), as applicable.
Pending 2026-10-06
D-01.4
40 Pa.C.S. § 5203(b)(8)
Plain Language
Covered person data used by insurers' AI algorithms must not be used beyond the intended and stated purpose of those algorithms, consistent with state law and HIPAA. This limits secondary use of patient data processed by AI utilization review tools.
(8) The data of the covered person must not be used beyond the intended and stated purpose of the artificial intelligence-based algorithms, consistent with Commonwealth law and 42 U.S.C. Ch. 7, Subch. XI Part C (relating to administrative simplification), as applicable.
Pending 2026-10-06
D-01.4
40 Pa.C.S. § 5303(b)(8)
Plain Language
MA or CHIP managed care plans must not use enrollee data beyond the intended and stated purpose of their AI algorithms, consistent with state law and HIPAA. This mirrors the insurer data limitation in § 5203(b)(8).
(8) The data of the covered person or enrollees must not be used beyond the intended and stated purpose of the artificial intelligence-based algorithms, consistent with the laws of this Commonwealth and the Health Insurance Portability and Accountability Act of 1996 (Public Law 104-191, 110 Stat. 1936), as applicable.
Pending 2026-04-01
D-01.4
12 Pa.C.S. § 7103(a)-(d)
Plain Language
Suppliers are prohibited from selling or sharing with third parties any individually identifiable health information of consumers or any consumer input collected through the chatbot. Three narrow exceptions exist: (1) a health care provider requests and the consumer gives written consent; (2) the consumer requests sharing with a health plan and gives written consent; or (3) sharing is necessary for chatbot functionality with a contracted third party, with consumer consent, and both the supplier and third party must comply with HIPAA privacy and security requirements as if the supplier were a covered entity and the third party a business associate. Written consent must include an acknowledgment that the consumer understands and agrees to the access. This is a significant data restriction — all consumer input, not just health information, is protected from third-party sale or sharing by default.
(a) Prohibition.--Except as provided under subsections (b) and (c), a supplier may not sell to or share with a third party the following: (1) Individually identifiable health information of a consumer. (2) Consumer input. (b) Applicability.--The prohibition under subsection (a) shall not apply if: (1) Either: (i) A health care provider requests access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (ii) The consumer requests that a health plan be provided access to the individually identifiable health information of the consumer and the consumer consents to the access in accordance with subsection (d). (2) The individually identifiable health information is shared in accordance with subsection (c). (c) Sharing information.-- (1) A supplier may share a consumer's individually identifiable health information if: (i) the sharing of the information is necessary to ensure the effective functionality of the chatbot with a third party with which the supplier has a contract related to the functionality; and (ii) the consumer consents to the sharing of the information in accordance with subsection (d). (2) When sharing information in accordance with this subsection, the supplier and the third party shall comply with all applicable privacy and security provisions of 45 CFR Pts. 160 (relating to general administrative requirements) and 164 (relating to security and privacy), as if the supplier were a covered entity and the third party were a business associate. (d) Consent.-- (1) A consumer may consent to access to individually identifiable health information of the consumer by a health care provider or health plan in accordance with this section. (2) To be effective, the consent under this subsection must: (i) Be in writing. (ii) Acknowledge that the consumer understands and agrees to the access of the individually identifiable health information of the consumer by a health care provider or health plan. (3) The consent under this subsection may involve the consumer initialing or signing the acknowledgment described in paragraph (2)(ii), checking a box, providing an electronic signature or hitting a button.
Pending 2026-01-28
D-01.4
R.I. Gen. Laws § 40.1-5.5-4
Plain Language
All records maintained by a licensed professional and all communications between the professional and a therapy-seeking individual must be kept confidential. Disclosure is permitted only under the existing exceptions in R.I. Gen. Laws § 40.1-5-26 (governing confidentiality of behavioral healthcare records). This extends existing confidentiality protections to the AI-assisted therapy context, ensuring that data generated through AI use in therapy sessions receives the same confidentiality treatment as traditional therapy records.
All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional shall be confidential and shall not be disclosed except as provided pursuant to the provisions of § 40.1-5-26.
Pending 2026-02-12
D-01.4
§ 28-5.2-2(a)-(b)
Plain Language
Employers may only use electronic monitoring tools to collect employee information if the tool serves one of six enumerated legitimate purposes (essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, or wage/benefits administration). The tool must be narrowly tailored to the stated purpose, implemented in the least invasive manner, and limited to the smallest number of workers and the least data necessary, and the collected data must be deleted once the purpose is achieved. Unnecessary data must not be disclosed to the employer and must be disposed of by the vendor. No monitoring is permitted when employees are off-duty. Necessary data must be stored consistently with state privacy laws and disposed of promptly when no longer needed.
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (1) The electronic monitoring tool is primarily used to accomplish any of the following legitimate purposes: (i) Allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (ii) Ensuring the quality of goods and services; (iii) Conducting periodic assessment of worker performance; (iv) Ensuring or facilitating compliance with employment, labor, or other relevant laws; (v) Protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (vi) Administering wages and benefits. (2) The department of labor and training standards may establish additional exceptions under this subsection, pursuant to chapter 35 of title 42 ("administrative procedures act.") (b)(1) The specific type and activated capabilities of an electronic monitoring tool shall be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under subsection (a)(1) of this section; (2) The electronic monitoring tool shall only be used to accomplish the employer's intended, legitimate purpose specified in subsection (a)(1) of this section, and shall be customized and implemented in a manner ensuring that the execution of its duties are undertaken in the manner least invasive to employees of the employer, while still accomplishing the employer's legitimate purposes as defined by subsection (a)(1) of this section; (3) The specific form of electronic monitoring is limited to the smallest number of workers, collection of the least amount of data which shall be collected no more frequently than is necessary to accomplish the purpose, and the data collected, shall be deleted once the purpose has been achieved; (4) The employer shall ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's intended, legitimate purpose shall not be disclosed to the employer and shall be promptly disposed of by the vendor; (5) The employer shall ensure that employee data is not collected when the employee is off-duty; and (6) The employer shall ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the state's data and cyber privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except, as provided in subsection (c) of this section.
Pending 2026-02-12
D-01.5
§ 28-5.2-2(e)
Plain Language
Even where monitoring serves a legitimate purpose, employers face twelve categorical prohibitions. Key among them: employers may not use monitoring tools to collect protected-class information (health, race, sex, gender identity, etc.), may not use facial recognition, gait analysis, voice analysis, or emotion recognition technology, may not monitor off-duty employees or private areas (bathrooms, locker rooms, breakrooms, prayer areas, employee residences), and may not surveil protected labor activity. Adverse employment actions based on continuous incremental time-tracking data are prohibited except for egregious misconduct. Adverse actions based on undisclosed performance standards or improperly noticed monitoring data are also prohibited. Employees are protected from retaliation for good-faith refusal to submit to practices they believe violate this section.
(e) Notwithstanding the allowable purposes for electronic monitoring described in subsection (a) of this section, an employer shall not: (1) Use an electronic monitoring tool in such a manner that results in a violation of labor, employment, civil rights law or any other law of the state; (2) Use an electronic monitoring tool or data collected via an electronic monitoring tool in such a manner as to threaten the health, welfare, safety, or legal rights of employees or the general public; (3) Use an electronic monitoring tool to monitor employees who are off-duty or not performing work-related tasks; (4) Use an electronic monitoring tool in order to obtain information about an employee's health, including health status and health conditions, the race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran or membership in any group protected from employment discrimination under title 28 or any other applicable law; (5) Use an electronic monitoring tool in order to identify, punish, or obtain information about employees engaging in activity protected under labor or employment law; (6) Conduct audio or visual monitoring of bathrooms or other similarly private areas, including locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, and areas designated to express breast milk, or areas designated for prayer or other religious activity, including data collection on the frequency of use of those private areas; (7) Conduct audio or visual monitoring of a workplace in an employee's residence, an employee's personal vehicle, or property owned or leased by an employee; (8) Use an electronic monitoring tool that incorporates facial recognition; (9) Use an electronic monitoring tool that incorporates gait, voice analysis, or emotion recognition technology; (10) Take adverse action against an employee, based, in whole or in part, on their opposition or refusal to submit to a practice that the employee believes in good faith violates this section; (11) Take adverse employment action against an employee on the basis of data collected via continuous incremental time-tracking tools, except in the case of egregious misconduct; or (12) Take adverse employment action against an employee based on any data collected via electronic monitoring, if such data measures an employee's performance in relation to a performance standard that has not been previously, clearly, and unmistakably disclosed to such employee, as well as to all other classes of employees to whom it applies in violation of this section, or if such data was collected without proper notice to employees or candidates pursuant to this section.
Pending 2026-02-12
D-01.4
§ 28-5.2-2(f)-(g)
Plain Language
Employers face two data governance restrictions on monitoring data: (1) a strict purpose limitation — data may only be used for the purposes disclosed in the prior written notice to employees, and (2) a transfer/sale prohibition — monitoring data may not be sold, transferred, or disclosed to any outside entity unless required by federal or state law or necessary for compliance with an impact assessment. These provisions effectively lock monitoring data into the use case described in the notice and prevent secondary commercial use.
(f) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to subsection (c) of this section. (g) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the state, or necessary to do so to comply with an impact assessment of an automated decision system used pursuant to this section.
Pending 2026-02-06
D-01.4
§ 28-5.2-2(a)-(b)
Plain Language
Employers may only use electronic monitoring tools to collect employee information if the monitoring is primarily used for one of six enumerated legitimate purposes (facilitating essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, or wage/benefit administration). Beyond the purpose limitation, the monitoring must be narrowly tailored: least invasive implementation, limited to the fewest workers and smallest data footprint necessary, no off-duty collection, prompt deletion of unnecessary data, and vendor disposal of data not needed for the legitimate purpose. The Department of Labor and Training may establish additional exceptions by rule.
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (1) The electronic monitoring tool is primarily used to accomplish any of the following legitimate purposes: (i) Allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (ii) Ensuring the quality of goods and services; (iii) Conducting periodic assessment of worker performance; (iv) Ensuring or facilitating compliance with employment, labor, or other relevant laws; (v) Protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (vi) Administering wages and benefits. (2) The department of labor and training standards may establish additional exceptions under this subsection, pursuant to chapter 35 of title 42 ("administrative procedures act.") (b)(1) The specific type and activated capabilities of an electronic monitoring tool shall be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under subsection (a)(1) of this section; (2) The electronic monitoring tool shall only be used to accomplish the employer's intended, legitimate purpose specified in subsection (a)(1) of this section, and shall be customized and implemented in a manner ensuring that the execution of its duties are undertaken in the manner least invasive to employees of the employer, while still accomplishing the employer's legitimate purposes as defined by subsection (a)(1) of this section; (3) The specific form of electronic monitoring is limited to the smallest number of workers, collection of the least amount of data which shall be collected no more frequently than is necessary to accomplish the purpose, and the data collected, shall be deleted once the purpose has been achieved; (4) The employer shall ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's intended, legitimate purpose shall not be disclosed to the employer and shall be promptly disposed of by the vendor; (5) The employer shall ensure that employee data is not collected when the employee is off-duty; and (6) The employer shall ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the state's data and cyber privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except, as provided in subsection (c) of this section.
Pending 2026-02-06
D-01.1
§ 28-5.2-2(c)
Plain Language
Before deploying any electronic monitoring tool, the employer must provide detailed prior written notice to all affected employees and candidates, obtain written acknowledgment from each, and post the notice conspicuously in the workplace. The notice must cover eleven enumerated elements: the monitoring purpose, specific data collected and its lifecycle, monitoring schedule, whether data feeds into an automated decision system (ADS), whether it informs employment decisions, use in discipline or litigation, productivity standard setting, data storage and retention details, a least-invasive-means explanation, the employee's right to refuse data sale/transfer, and instructions for exercising statutory rights.
(c) Any employer that uses an electronic monitoring tool shall give prior written notice and shall obtain written acknowledgment from all candidates and employees subject to electronic monitoring and shall also post said notice in a conspicuous place which is readily available for viewing by candidates for employment and employees. Such notice shall include, at a minimum, the following: (1) A description of the purpose for which the electronic monitoring tool will be used, as specified in subsection (a)(1) of this section; (2) A description of the specific employee data to be collected, stored, secured, and disposed of (and the schedule therefor), and the activities, locations, communications, and job roles that will be electronically monitored by the tool; (3) A description of the dates, times, and frequency that electronic monitoring will occur; (4) Whether and how any employee data collected by the electronic monitoring tool will be used as an input in an automated decision system; (5) Whether and how any employee data collected by the electronic monitoring tool will alone or in conjunction with an automated decision system be used to make an employment decision by the employer or employment agency; (6) Whether and how any employee data collected by the electronic monitoring tool may be stored and utilized in discipline, in internal policy compliance, in administrative agency adjudications, in litigation (whether or not it involves the employee or not as a party); (7) Whether any employee data collected by the electronic monitoring tool will be used to assess employees' productivity performance or to set productivity standards, and if so, how; (8) A description of where any employee data collected by the electronic monitoring tool will be stored and the length of time it will be retained; (9) An explanation for how the specific electronic monitoring practice is the least invasive means available to accomplish the monitoring purpose; (10) That an employee is entitled to notice and maintains the right to refuse the sale, transfer, or disclosure of their employee data, subject to the provisions of subsection (g) of this section; and (11) A clear and reasonably understandable description of how an employee can exercise the rights described in this chapter.
Pending 2026-02-06
D-01.5
§ 28-5.2-2(e)(1)-(9)
Plain Language
Even where monitoring is otherwise conducted for a legitimate purpose, employers face a set of absolute prohibitions. They may not use electronic monitoring to: violate any state law; threaten employee health, safety, welfare, or legal rights; monitor off-duty or non-work activities; obtain protected-class information (health, race, sex, gender identity, sexual orientation, pregnancy, genetic information, religion, veteran status, etc.); identify or punish protected labor activity; surveil private areas (bathrooms, locker rooms, breakrooms, prayer areas, lactation areas); monitor employees' homes, personal vehicles, or personal property; or deploy facial recognition, gait analysis, voice analysis, or emotion recognition technology. The facial recognition and biometric technology prohibitions are categorical — no legitimate purpose exception applies.
(e) Notwithstanding the allowable purposes for electronic monitoring described in subsection (a) of this section, an employer shall not: (1) Use an electronic monitoring tool in such a manner that results in a violation of labor, employment, civil rights law or any other law of the state; (2) Use an electronic monitoring tool or data collected via an electronic monitoring tool in such a manner as to threaten the health, welfare, safety, or legal rights of employees or the general public; (3) Use an electronic monitoring tool to monitor employees who are off-duty or not performing work-related tasks; (4) Use an electronic monitoring tool in order to obtain information about an employee's health, including health status and health conditions, the race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran or membership in any group protected from employment discrimination under title 28 or any other applicable law; (5) Use an electronic monitoring tool in order to identify, punish, or obtain information about employees engaging in activity protected under labor or employment law; (6) Conduct audio or visual monitoring of bathrooms or other similarly private areas, including locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, and areas designated to express breast milk, or areas designated for prayer or other religious activity, including data collection on the frequency of use of those private areas; (7) Conduct audio or visual monitoring of a workplace in an employee's residence, an employee's personal vehicle, or property owned or leased by an employee; (8) Use an electronic monitoring tool that incorporates facial recognition; (9) Use an electronic monitoring tool that incorporates gait, voice analysis, or emotion recognition technology;
Pending 2026-02-06
D-01.4
§ 28-5.2-2(f)-(g)
Plain Language
Employee data collected through electronic monitoring may only be used for the purposes disclosed in the written notice provided to employees under subsection (c). Any secondary use is prohibited. Additionally, employers may not sell, transfer, or disclose monitoring-collected employee data to any third party unless required by federal or state law, or necessary to comply with a required ADS impact assessment. This creates a strong purpose limitation and near-total prohibition on external data sharing.
(f) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to subsection (c) of this section. (g) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the state, or necessary to do so to comply with an impact assessment of an automated decision system used pursuant to this section.
Pre-filed 2026-01-01
D-01.4
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not process personal data to inform chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill an express user request, and (2) the user has provided affirmative consent. Affirmative consent has strict requirements — it cannot be obtained through terms of service, dark patterns, or inaction. This effectively creates a purpose-limitation and consent-gating obligation for all personal data processing in chatbot outputs.
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
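The provision is a two-condition gate, and the affirmative-consent definition turns on how consent was obtained. A minimal sketch, assuming the provider records a consent method per user; the enum values paraphrase the bill's exclusions (terms of service, inaction, continued use) and are illustrative rather than statutory terms.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ConsentMethod(Enum):
    STANDALONE_DISCLOSURE = "standalone_disclosure"  # can support affirmative consent
    TERMS_OF_SERVICE = "terms_of_service"            # insufficient under the bill
    INACTION = "inaction"                            # insufficient
    CONTINUED_USE = "continued_use"                  # insufficient

@dataclass
class ConsentRecord:
    method: ConsentMethod
    scope: str  # the processing the user actually agreed to

def may_process_personal_data(necessary_for_express_request: bool,
                              consent: Optional[ConsentRecord]) -> bool:
    # Both conditions must hold: necessity for an express user request
    # AND affirmative consent obtained through a standalone disclosure,
    # not terms-of-service acceptance, inaction, or continued use.
    if not necessary_for_express_request:
        return False
    return (consent is not None
            and consent.method is ConsentMethod.STANDALONE_DISCLOSURE)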
Pre-filed 2026-01-01
D-01.4
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising purpose — whether to decide whether to show an ad, select what to advertise, or customize ad content. This is an absolute prohibition with no consent override, unlike the personal data processing restriction in subsection (A)(1). The advertising prohibition extends to any use of chat log data to facilitate targeted or personalized advertising.
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
Pre-filed 2026-01-01
D-01.4
S.C. Code § 39-80-20(A)(3)(a)-(d)
Plain Language
Chatbot providers face layered restrictions on processing chat logs and personal data. For users the provider knows or reasonably should know are minors, all processing of chat logs and personal data requires parental or guardian affirmative consent, with no exception. For adults, training use of chat logs and personal data requires affirmative consent. For all users, profiling beyond what is necessary to fulfill an express request is prohibited. Training is defined narrowly to exclude safety testing and compliance-related adjustments, meaning providers can still process data for safety purposes without consent.
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
Pre-filed 2026-01-01
D-01.4
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not profile users — classifying or designating personality traits or behavioral characteristics — beyond what is strictly necessary to fulfill an express user request. This is a purpose-limitation restriction on profiling activity that applies to all users regardless of age. Safety-related processing is carved out from the definition of profiling and is therefore not restricted by this provision.
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
Pre-filed 2026-01-01
D-01.4
S.C. Code § 39-80-20(A)(5)-(6)
Plain Language
Chatbot providers are categorically prohibited from selling user chat logs — no consent override is available. Chat logs must also not be retained for more than ten years, unless retention is required for compliance with this chapter or other law. The definition of 'sell' has carve-outs for service-provider disclosures, user-directed disclosures with affirmative consent, and data the user made publicly available without restrictions.
(A) A chatbot provider may not: (5) sell a user's chat logs; (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
Pre-filed 2026-01-01
D-01.1D-01.2
S.C. Code § 39-80-20(B)
Plain Language
Users have a right to access their own chat logs at any time. Upon request, the chatbot provider must deliver the chat log in a downloadable, easily readable format. Providers may not discriminate or retaliate against users who exercise this access right. This is a data access right analogous to CCPA-style 'right to know' but specific to chatbot interaction records.
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
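The access right maps naturally onto a simple export routine. A minimal sketch, assuming messages are stored as dicts with role, timestamp, and text keys; indented JSON is one plausible reading of "downloadable and easy to read format," which the bill does not define.

import json
from datetime import datetime, timezone

def export_chat_log(messages: list[dict]) -> str:
    # Render a user's chat log as a self-describing, human-readable
    # JSON document suitable for download on request.
    return json.dumps(
        {
            "exported_at": datetime.now(timezone.utc).isoformat(),
            "message_count": len(messages),
            "messages": messages,  # e.g. {"role", "timestamp", "text"}
        },
        indent=2,
        ensure_ascii=False,
    )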
Pre-filed 2026-01-01
D-01.4
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical measures to prevent reidentification of deidentified data throughout its lifecycle — during processing, retention, and transfer. The data must be maintained without any reasonable means of reidentification. This is a continuing safeguard obligation, not a one-time deidentification step.
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
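The reidentification safeguard is a continuing engineering obligation, not a one-time step. A minimal sketch of one technical measure, keyed pseudonymization with the key held apart from the data; PSEUDONYM_KEY and the HMAC-SHA256 approach are illustrative choices, not requirements of the bill, and would sit alongside the physical and administrative measures the provision also demands.

import hashlib
import hmac
import os

# The key must live apart from the deidentified dataset (for example,
# in a secrets manager) so that holders of the data have no reasonable
# means of reidentification; destroying the key severs the link.
PSEUDONYM_KEY = os.environ["PSEUDONYM_KEY"].encode()

def pseudonymize(user_id: str) -> str:
    # Replace a direct identifier with a keyed HMAC-SHA256 token. This
    # is one technical measure; it must be paired with administrative
    # controls on the key and with treatment of quasi-identifiers in
    # the records themselves.
    return hmac.new(PSEUDONYM_KEY, user_id.encode(), hashlib.sha256).hexdigest()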
Pending 2025-01-13
S.C. Code § 40-1-730
Plain Language
All records maintained by a licensed professional — which would include any AI-generated notes, transcripts, or data products — and all communications between the professional and the patient are confidential. Disclosure is permitted only as required under the existing South Carolina confidentiality exception in Section 44-22-100 (governing disclosure of mental health records). This effectively extends existing therapy confidentiality obligations to cover AI-processed data and outputs.
All records kept by a licensed professional and all communications between an individual seeking therapy or psychotherapy services and a licensed professional shall be confidential and shall not be disclosed except as required pursuant to Section 44-22-100.
Pending 2025-01-01
D-01.4
S.C. Code § 39-80-20(A)(1)
Plain Language
Chatbot providers may not process any personal data to shape chatbot outputs unless two conditions are met: (1) the processing is necessary to fulfill a specific express request made by the user, and (2) the user has provided affirmative consent. The definition of affirmative consent is rigorous: it requires a clear standalone disclosure in an accessible and multilingual format, with the option to decline presented as prominently as the option to accept. Consent cannot be inferred from inaction, continued use, or broad terms of service. This effectively prohibits ambient personalization based on personal data unless users specifically ask for and consent to it.
(A) A chatbot provider may not: (1) process personal data to inform a chatbot output unless processing personal data is necessary to fulfill an express request that is made by a user and the user provides affirmative consent;
Pending 2025-01-01
D-01.4
S.C. Code § 39-80-20(A)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising purpose. This covers targeting (deciding whether to show an ad), selection (deciding which product category to advertise), and customization (tailoring the ad's content to the user). This is an absolute prohibition — there is no consent exception. The definition of chat log covers both user input data and all chatbot-generated output data.
(A) A chatbot provider may not: (2) process a user's chat log: (a) to determine whether to display an advertisement for a product or service to a user; (b) to determine a product or service or category of a product or service to advertise to a user; or (c) to customize an advertisement for presentation to a user;
Pending 2025-01-01
D-01.4D-01.6
S.C. Code § 39-80-20(A)(3)(a)-(d)
Plain Language
Chatbot providers face tiered restrictions on processing chat logs and personal data. For users the provider knows or reasonably should know are minors, all processing of chat logs and personal data requires parental or guardian affirmative consent, and training use specifically requires parental consent. For adult users, training use requires affirmative consent. For all users, profiling is prohibited beyond what is necessary to fulfill an express user request. The definition of training explicitly excludes safety testing, harm-mitigation adjustments, and compliance-related actions, so providers may use chat data for those safety purposes without consent. Similarly, processing chat logs for user safety is excluded from the profiling prohibition.
(A) A chatbot provider may not: (3) process a user's chat log and personal data: (a) if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (b) for training purposes if the chatbot provider knows or reasonably should have known that based on knowledge of objective circumstances the user is a minor and the user's parent or legal guardian did not provide affirmative consent; (c) for training purposes if the user is an adult, unless the chatbot provider first obtains affirmative consent; or (d) to engage in profiling beyond what is necessary to fulfill an express request;
Pending 2025-01-01
D-01.4
S.C. Code § 39-80-20(A)(4)
Plain Language
Chatbot providers may not build user profiles based on personality traits or behavioral characteristics beyond what is strictly necessary to satisfy a specific user request. This prohibits speculative or pre-emptive profiling, background personality modeling, and behavioral classification that the user did not ask for. Processing chat logs for user safety or legal compliance is excluded from this prohibition.
(A) A chatbot provider may not: (4) profile a user based on any classification or designation of the user's personality or behavioral characteristic beyond what is necessary to fulfill an express request made by the user;
Pending 2025-01-01
S.C. Code § 39-80-20(A)(5)-(6)
Plain Language
Chatbot providers are categorically prohibited from selling user chat logs — there is no consent exception for sales. The definition of 'sell' is broad, covering any exchange for monetary or other valuable consideration, but carves out processor disclosures, user-directed disclosures with affirmative consent, and data the user intentionally made public. Additionally, providers must delete chat logs after ten years unless retention is required for compliance with this chapter or other law. These are absolute restrictions with no opt-in override.
(A) A chatbot provider may not: (5) sell a user's chat logs; (6) retain a user's chat log for more than ten years, unless retention is necessary to comply with this chapter or otherwise required by law;
Pending 2025-01-01
S.C. Code § 39-80-20(A)(7)
Plain Language
Chatbot providers may not discriminate or retaliate against users who refuse to consent to training use of their chat logs or personal data. This includes denying services, charging higher prices, or degrading service quality. Users must be able to decline consent to training use without any service consequences.
(A) A chatbot provider may not: (7) discriminate or retaliate against a user, including: (a) denying products or services to the user; (b) charging different prices or rates for products or services to the user; or (c) providing lower quality products or services to the user for refusing to consent to the use of chat logs or personal data for training purposes.
Pending 2025-01-01
S.C. Code § 39-80-20(B)
Plain Language
Users have a right to access and download their own chat logs at any time. Chatbot providers must produce the chat log in a downloadable, easy-to-read format upon request. Providers may not discriminate or retaliate against users who exercise this access right. This is an on-demand data portability right with no limit on frequency of requests.
(B) A user has a right to access the user's own chat logs at any time. A chatbot provider shall provide a user's own chat log on request by the user and shall provide the chat log in a downloadable and easy to read format. A chatbot provider may not discriminate or retaliate against a user that requests the user's chat.
Pending 2025-01-01
S.C. Code § 39-80-20(E)
Plain Language
Chatbot providers must implement physical, administrative, and technical measures to ensure deidentified data cannot be reidentified. The provider must process, retain, and transfer deidentified data without any reasonable means of reidentification. This is a continuing technical obligation — not a one-time deidentification step — requiring ongoing safeguards throughout the data lifecycle.
(E) A chatbot provider shall take the necessary physical, administrative, and technical measures to prevent deidentified data from being reidentified and to process, retain, and transfer deidentified data without any reasonable means of reidentification.
Pending 2026-07-01
D-01.3
§ 2.2-1202.2(B)(3)
Plain Language
State agencies must provide all individuals subject to automated employment decisions the right to opt out of the automated decision system entirely. In addition, agencies must provide a separate process for individuals with disabilities to seek accommodations for the system. This creates both a universal opt-out right and a disability-specific accommodation process.
Provide to all individuals the right to opt out of the use of the automated decision system for employment decisions and a process by which individuals with disabilities may seek accommodations for the automated decision system;
Pending 2026-07-01
D-01.3
§ 15.2-1500.2(B)(3)
Plain Language
Local government entities must provide all individuals subject to automated employment decisions the right to opt out entirely, plus a separate disability accommodation process. This mirrors the state agency opt-out obligation in § 2.2-1202.2(B)(3) but applies at the local government level.
Provide to all individuals the right to opt out of the use of the automated decision system for employment decisions and a process by which individuals with disabilities may seek accommodations for the automated decision system;
Pending 2027-01-01
D-01.4
§ 59.1-618
Plain Language
Operators may not use a minor's inputs to train the companion chatbot's underlying model unless the minor's parent or guardian has provided affirmative written consent specifically authorizing use of the minor's personal information for that purpose. This is an opt-in requirement — consent must be affirmative and written, and must be specific to the training purpose. General terms of service acceptance would not satisfy this requirement. The prohibition applies to the underlying model training, not to session-level personalization or contextual memory.
An operator shall not train the underlying model of a companion chatbot with the inputs of a minor unless the minor's parent or guardian has affirmatively provided written consent to the operator to use the minor's personal information for that specific purpose.
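Operationally this reads as a filter on the training corpus. A minimal sketch, assuming the operator can attach a minor flag and a verified written parental-consent flag to each input; the field names are hypothetical, and adult inputs pass through here because this provision imposes no adult-consent condition (other provisions may).

from dataclasses import dataclass

@dataclass
class ChatInput:
    user_id: str
    text: str
    user_is_minor: bool
    parental_training_consent: bool  # affirmative written consent on file

def training_eligible(inputs: list[ChatInput]) -> list[ChatInput]:
    # Drop any minor's inputs from the training corpus unless a parent
    # or guardian has given affirmative written consent specific to the
    # training purpose.
    return [i for i in inputs
            if not i.user_is_minor or i.parental_training_consent]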
Pre-filed 2026-07-01
D-01.4
§ 59.1-615(C)
Plain Language
Deployers must limit data collection and storage to information that does not conflict with a user's best interests and that meets a three-part test: adequacy (sufficient for a legitimate purpose), relevance (linked to that purpose), and necessity (the minimum amount needed). This is a data minimization obligation that applies to all users, not just minors. The 'user's best interests' standard is subjective and undefined, which creates compliance ambiguity — it goes beyond typical necessity-based minimization by adding an affirmative user-interest requirement.
C. A deployer shall collect and store only such information as does not conflict with a user's best interests. Such information shall be (i) adequate, in the sense that it is sufficient to fulfill a legitimate purpose of the deployer; (ii) relevant, in the sense that the information has a relevant link to such legitimate purpose; and (iii) necessary, in the sense that it is the minimum amount of information that is needed for such legitimate purpose.
Pre-filed 2025-07-01
D-01.4
21 V.S.A. § 495q(b)
Plain Language
Employers may not electronically monitor employees unless the monitoring satisfies all five conditions: (1) it serves one of seven enumerated permissible purposes (e.g., essential job function, safety, compliance, periodic performance assessment); (2) the specific monitoring form is necessary and used exclusively for that purpose; (3) it is the least invasive means available; (4) it applies to the smallest number of employees, collects the minimum data, at the minimum frequency necessary; and (5) only authorized persons access the data and only for the noticed purpose and duration. This creates a multi-factor test that must be satisfied in full — failure on any prong makes the monitoring unlawful.
(b) Employee monitoring restricted. An employer shall not engage in electronic monitoring of an employee unless all of the following requirements are met: (1) the employer's purpose in utilizing the electronic monitoring is to: (A) assist or allow the employee to accomplish an essential job function; (B) monitor production processes or quality; (C) ensure compliance with applicable employment or labor laws; (D) protect the health, safety, or security of the employee, clients, or the public; (E) secure the employer's physical or digital property; (F) conduct periodic assessment of employee performance; or (G) track time worked or production output for purposes of determining the employee's compensation; (2) the specific form of electronic monitoring is necessary to accomplish the purpose identified pursuant to subdivision (1) of this subsection and is used exclusively to accomplish that purpose; (3) the specific form of electronic monitoring is the least invasive means, with respect to the employee, of accomplishing the purpose identified pursuant to subdivision (1) of this subsection; (4) the specific form of electronic monitoring is used with the smallest number of employees, collects the smallest amount of data necessary to accomplish the purpose identified pursuant to subdivision (1) of this subsection, and is collected not more frequently than necessary to accomplish that purpose; and (5) the employer ensures that only authorized persons have access to any data produced through the electronic monitoring and that the data is only used for the purpose and duration that the employee has been notified of pursuant to subsection (c) of this section.
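Because every prong must hold, the test reduces to a conjunction. A minimal sketch with prong names paraphrased from the statute; the purpose strings are illustrative shorthand for subdivisions (b)(1)(A) through (G), not statutory text.

from dataclasses import dataclass

PERMISSIBLE_PURPOSES = {
    "essential_job_function", "production_or_quality",
    "legal_compliance", "health_safety_security",
    "property_security", "periodic_performance_assessment",
    "compensation_tracking",
}

@dataclass
class MonitoringPlan:
    purpose: str
    necessary_and_exclusive: bool        # prong (2)
    least_invasive_means: bool           # prong (3)
    minimized_scope_and_frequency: bool  # prong (4)
    access_limited_to_noticed_use: bool  # prong (5)

def monitoring_permitted(plan: MonitoringPlan) -> bool:
    # Failure on any prong makes the monitoring unlawful.
    return (plan.purpose in PERMISSIBLE_PURPOSES
            and plan.necessary_and_exclusive
            and plan.least_invasive_means
            and plan.minimized_scope_and_frequency
            and plan.access_limited_to_noticed_use)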
Pre-filed 2025-07-01
D-01.1
21 V.S.A. § 495q(c)(1)-(3)
Plain Language
Employers must provide each affected employee with a detailed written notice at least 15 calendar days before beginning any electronic monitoring. The notice must cover 14 specific items, including the monitoring method, purpose, data use, technologies used, who has data access, retention periods, employee rights, and complaint instructions. If monitoring tracks productivity or performance, additional disclosures about standards, measurement methods, and consequences are required. The notice must be in plain language in the employee's primary language, include a cover sheet summary, and must be updated whenever the employer makes significant changes to monitoring practices.
(c) Required notice for employee monitoring. (1) At least 15 calendar days prior to commencing any form of electronic monitoring, an employer shall provide notice of the electronic monitoring to each employee who will be subject to it. The notice shall, at a minimum, include the following information: (A) the specific form of electronic monitoring; (B) a description of the intended purpose of the electronic monitoring and why the electronic monitoring is necessary to accomplish that purpose; (C) a description of how any data generated by the electronic monitoring will be used, including whether and how the data generated by the electronic monitoring will be used to inform employment-related decisions; (D) a description of the technologies that will be used to conduct the electronic monitoring; (E) a description of the specific activities, locations, communications, and job roles that will be electronically monitored; (F) the name of any person conducting electronic monitoring on the employer's behalf and any associated contract language related to the monitoring; (G) the name of any person, apart from the employer, who will have access to any data generated by the electronic monitoring and the reason why the person will have access to the data; (H) the positions within the employer that will have access to any data generated by the electronic monitoring; (I) when, where, and how frequently monitoring will occur; (J) the period of time for which any data generated by the electronic monitoring will be retained by the employer or another person and when that data will be destroyed; (K) notice of how an employee may access the data generated by the electronic monitoring and the process to correct any errors in the data; (L) a cover sheet that concisely summarizes the details contained in the notice; (M) notice of an employee's rights pursuant to this section and the judicial and administrative remedies available for redressing the wrongful use of electronic monitoring; and (N) instructions on how an employee can file a complaint against an employer for violations of this section. (2) If an employer uses electronic monitoring to track employee productivity or performance, the employer shall include the following information in the notice required by subdivision (1) of this subsection: (A) the performance or productivity standards by which employees will be assessed and how employees will be measured against those standards; (B) how performance or productivity data will be monitored and collected, including the identity of the employees subject to such monitoring and when, where, and how the monitoring and data collection will occur; and (C) any adverse consequences for failing to meet a performance or productivity standard and whether there is any bonus or incentive program associated with meeting or exceeding each standard. (3)(A) Notice of electronic monitoring provided pursuant to this section shall be written in plain, clear, and concise language and provided to each employee in the employee's primary language. (B) An employer shall provide a new, updated notice to employees if it makes any significant changes to the manner of electronic monitoring or to the way that the employer utilizes the electronic monitoring or any data generated by it.
Pre-filed 2025-07-01
D-01.1
21 V.S.A. § 495q(c)(4)
Plain Language
The 15-day advance notice requirement for electronic monitoring may be bypassed if the employer has reasonable grounds to believe an employee is engaged in illegal conduct, conduct violating others' legal rights, or creating a hostile work environment — and the monitoring is reasonably likely to produce evidence of that conduct, is narrowly tailored to identifying it, and otherwise complies with all other provisions of this section. This is an exception to the notice requirement only — all other substantive monitoring restrictions still apply.
(4) Notwithstanding subdivisions (1) and (2) of this subsection, prior notice of electronic monitoring shall not be required if: (A) the employer has reasonable grounds to believe that the employee is engaged in conduct that: (i) is illegal; (ii) violates the legal rights of the employer or another employee; or (iii) creates a hostile work environment; and (B) the electronic monitoring is reasonably likely to produce evidence of the conduct, is otherwise conducted in compliance with the provisions of this section, and is narrowly tailored to the purpose of identifying the conduct.
Pre-filed 2025-07-01
D-01.1
21 V.S.A. § 495q(c)(5)
Plain Language
Employers using electronic monitoring must annually provide each employee with a list of all monitoring systems currently in use in relation to that employee, in the employee's primary language. 'Currently in use' encompasses systems actively used, used within the past 90 days, or intended for use within the next 30 days. This is a recurring disclosure obligation separate from the initial 15-day pre-monitoring notice.
(5)(A) An employer that utilizes electronic monitoring shall annually provide each of its employees with a list of all electronic monitoring systems currently in use by the employer in relation to that employee. The list shall be provided in the primary language of the employee. (B) As used in this subdivision (5), "currently in use" means that the employer: (i) is currently using the system in relation to the employee; (ii) used the electronic monitoring system in relation to the employee within the past 90 days; or (iii) intends to use the electronic monitoring system in relation to the employee within the next 30 days.
Pre-filed 2025-07-01
D-01.5
21 V.S.A. § 495q(d)
Plain Language
Even where electronic monitoring satisfies the permissible-purpose and necessity requirements of subsection (b), it is categorically prohibited for nine enumerated uses. These include: monitoring off-duty employees, monitoring to suppress the exercise of legal rights, audio-visual monitoring of private areas (bathrooms, breakrooms, lactation rooms), tracking usage frequency of private areas, monitoring employee residences or personal vehicles (except for health/safety or data security), collecting protected-characteristic information, taking adverse action based on continuous incremental time-tracking data, and any monitoring that harms employee health, safety, or legal rights. Subdivision (7) is especially broad, covering an extensive list of protected characteristics including neurodiversity, the pursuit or receipt of reproductive health care, and political affiliation.
(d) Prohibitions on employee monitoring. Notwithstanding the purposes for electronic monitoring set forth in subdivision (b)(1) of this section, electronic monitoring shall not be used: (1) in any manner that violates State or federal labor, employment, civil rights, or human rights laws; (2) in relation to employees who are off-duty and not performing work-related tasks, including employees on-call; (3) to identify, punish, or obtain information about employees exercising legal rights, including rights guaranteed by labor and employment laws; (4) for audio-visual monitoring of bathrooms, locker rooms, changing areas, breakrooms, smoking areas, areas designated for the expression of breast milk, employee cafeterias, lounges, or other similarly private areas; (5) to determine the frequency with which employees visit or use bathrooms, locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, or other similarly private areas; (6) for monitoring, including audio-visual monitoring, of any space within an employee's residence or personal vehicle, or a property owned or rented by the employee, unless the monitoring is necessary to ensure the employee's health and safety or to verify the security of employer or client data; (7) to obtain information about an employee's actual or perceived age, color, disability, ethnicity, genetic information, limited proficiency in the English language, national origin, race, religion, pursuit or receipt of reproductive health care, sex, sexual orientation, gender identity or expression, marital status, family responsibilities, personal appearance, immigration status, political affiliation or association, neurodiversity, veteran status, or other classification protected under State or federal law; (8) to take adverse employment action against an employee on the basis of data collected via continuous incremental time-tracking tools; or (9) in a manner that harms health or safety or violates the legal rights of any employee.
Pre-filed 2025-07-01
D-01.4
21 V.S.A. § 495q(e)
Plain Language
Employers may not require employees to install monitoring applications on personal devices, or to wear or attach monitoring devices to their person or clothing, unless the monitoring is both necessary for essential job functions and limited to the times and activities required for those functions. Location tracking must be disabled outside essential-function activity periods. Physical implantation of monitoring devices on an employee's body is categorically prohibited — no exception applies.
(e) Restriction of employee monitoring through personal devices.
(1) An employer shall not require an employee to install an application on a personal device for purposes of electronic monitoring or to wear a device or attach, embed, or physically implant a device on the employee's clothing that can be used for electronic monitoring, unless the electronic monitoring is:
(A) necessary to accomplish the employee's essential job function; and
(B) limited to only the times and activities necessary to accomplish the essential job functions.
(2) Any location tracking function of an application or device shall be disabled outside of the times when the employee is engaged in activities necessary to accomplish essential job functions.
(3) An employer shall not require an employee to physically implant a device on the employee's body for purposes of employee monitoring.
Pre-filed 2025-07-01
D-01.1, D-01.2
21 V.S.A. § 495q(j)
Plain Language
Employees have the right to request and receive, within seven days, access to any data about them that was produced or used by electronic monitoring or an automated decision system. Employees may also request correction of errors, and within seven days the employer must either correct the data and explain what was done, or explain why the data was not corrected and what verification steps were taken. Both responses must be in plain, clear, concise language. The seven-day turnaround is unusually fast compared to most state data access frameworks.
(j) Employee right to access and correct data.
(1) Within seven days of receiving a request, an employer shall provide an employee with access to any data that relates to the employee that was produced or utilized by electronic monitoring or an automated decision system used by the employer.
(2) Within seven days of receiving a request to correct potential errors identified by an employee, an employer shall:
(A) correct the erroneous information or data and provide the employee with a notice that complies with subdivision (c)(3)(A) of this section, explaining the steps taken by the employer; or
(B) provide the employee with a notice explaining that the employer has not corrected the information or data and describing the steps the employer has taken to verify the accuracy of the disputed information or data.
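To make the turnaround concrete, here is a minimal Python sketch of the subsection (j) workflow; the CorrectionRequest type and notice strings are hypothetical, and only the seven-day deadline and the two permissible responses come from the statute.

```python
from dataclasses import dataclass
from datetime import date, timedelta

RESPONSE_DEADLINE = timedelta(days=7)  # § 495q(j): respond within seven days

@dataclass
class CorrectionRequest:
    received: date          # date the employee's request arrived
    field: str              # data element the employee disputes
    proposed_value: str     # correction the employee proposes

def response_due(req: CorrectionRequest) -> date:
    """Latest lawful response date under (j)(1)-(2)."""
    return req.received + RESPONSE_DEADLINE

def respond(req: CorrectionRequest, record: dict, is_erroneous: bool) -> str:
    if is_erroneous:
        record[req.field] = req.proposed_value
        # (j)(2)(A): correct the data and notify the employee of the steps taken
        return f"Corrected '{req.field}'; notice of steps taken sent."
    # (j)(2)(B): explain the refusal and describe the verification performed
    return f"'{req.field}' not corrected; notice describing verification steps sent."
```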
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(1)
Plain Language
Chatbot providers may not process any personal data beyond the user's direct input data to inform chatbot outputs, unless the processing is necessary to fulfill an express user request and the user has given affirmative consent. 'Affirmative consent' is defined with strict requirements — it must be a clear standalone request, cannot be bundled in general terms of use, and cannot be inferred from inaction or continued use. This effectively creates a data minimization requirement: by default, only input data may be used to generate outputs.
A chatbot provider shall not: (1) process personal data other than input data to inform chatbot outputs unless the processing of personal data is necessary to fulfill an express request made by a user and that user has provided affirmative consent;
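A minimal sketch of the default-deny structure this creates, with illustrative flag names; processing beyond input data requires both necessity and affirmative consent.

```python
def may_inform_outputs(is_user_input: bool, *,
                       necessary_for_express_request: bool,
                       affirmative_consent: bool) -> bool:
    # Modeled on 9 V.S.A. § 4193b(a)(1): input data may always inform outputs;
    # any other personal data requires necessity plus affirmative consent.
    if is_user_input:
        return True
    return necessary_for_express_request and affirmative_consent
```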
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(2)
Plain Language
Chatbot providers are categorically prohibited from using a user's chat logs for any advertising-related purposes — including deciding whether to show an ad, selecting what category of ad to show, or customizing how an ad is presented. This is an absolute prohibition with no consent override. Note that 'advertisement' is broadly defined to include any promotional content displayed in exchange for monetary or other valuable consideration, including data-sharing arrangements between the chatbot provider and the advertiser.
A chatbot provider shall not: (2) process a user's chat log to: (A) determine whether to display an advertisement for a product or service to the user; (B) determine a product, service, or category of product or service to advertise to the user; or (C) customize an advertisement or how an advertisement is presented to the user;
Pre-filed 2026-07-01
D-01.4, D-01.6
9 V.S.A. § 4193b(a)(3)
Plain Language
This provision imposes four distinct data processing restrictions: (A) For known or constructively known minor users, all processing of chat logs or personal data requires parental or guardian affirmative consent. (B) Minor users' chat logs and personal data may never be used for model training — this is an absolute prohibition with no consent override. (C) Adult users' chat logs and personal data may be used for training only with prior affirmative consent. (D) Profiling — classifying personality or behavioral characteristics — may not exceed what is necessary to fulfill an express user request. Note that 'training' carves out safety testing and compliance activities, and 'profiling' carves out processing for user safety purposes.
A chatbot provider shall not: (3) process a user's chat log or personal data:
(A) if the chatbot provider knows or should have known, based on knowledge fairly implied on the basis of objective circumstances, that the user is under 18 years of age without the affirmative consent of that user's parent or legal guardian;
(B) for training purposes, if the chatbot provider knows or should have known, based on knowledge fairly implied on the basis of objective circumstances, that a user is under 18 years of age;
(C) of a user over 18 years of age for training purposes, unless the chatbot provider first obtains affirmative consent; or
(D) to engage in profiling beyond what is necessary to fulfill an express request from the user;
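A minimal Python sketch of the (a)(3) gate, with illustrative names; note the order of checks, since the minor training bar cannot be cured by any consent.

```python
def may_process_chat_data(*, minor: bool, purpose: str,
                          user_consent: bool, guardian_consent: bool,
                          needed_for_express_request: bool) -> bool:
    # Modeled on 9 V.S.A. § 4193b(a)(3); all names are illustrative.
    if minor and purpose == "training":
        return False                    # (a)(3)(B): absolute bar, no consent override
    if minor and not guardian_consent:
        return False                    # (a)(3)(A): guardian affirmative consent required
    if purpose == "training" and not user_consent:
        return False                    # (a)(3)(C): adult training needs prior consent
    if purpose == "profiling" and not needed_for_express_request:
        return False                    # (a)(3)(D): profiling limited to express requests
    return True                         # remaining uses still subject to (a)(1)-(2)
```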
Pre-filed 2026-07-01
D-01.4
9 V.S.A. § 4193b(a)(4)
Plain Language
Even where profiling results already exist, chatbot providers may not use any classification or designation of a user's personality or behavioral characteristics beyond what is necessary to fulfill an express user request. This is a use restriction on profiling outputs: where § 4193b(a)(3)(D) prohibits engaging in profiling itself, this provision restricts downstream use of profiling-derived classifications.
A chatbot provider shall not: (4) use any classification or designation of a user's personality or behavioral characteristics created through profiling beyond what is necessary to fulfill an express request made by the user;
Pre-filed 2026-07-01
D-01.1
9 V.S.A. § 4193b(b)-(b)(2)
Plain Language
Users have the right to access any of their own retained chat logs at any time, in a portable, downloadable, human- and machine-readable format. Chat logs include both the user's input data and the chatbot's generated outputs. Chatbot providers may not discriminate or retaliate against users for exercising this access right — including through service denial, price changes, or quality degradation.
(b) Right to access. A user has the right to access, in a portable and readily usable format and at any time, any of the user's own chat logs that a chatbot provider has retained. (1) Chat logs must be made available to users in a downloadable and human- and machine-readable format. (2) A chatbot provider shall not discriminate or retaliate against any user, including by denying products or services, charging different prices or rates for products or services, or providing lower-quality products or services to the user, for accessing their own chat logs.
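One way a provider might satisfy the portability requirement, sketched with JSON as a qualifying human- and machine-readable format; the payload fields are illustrative, not prescribed by the bill.

```python
import json
from datetime import datetime, timezone

def export_chat_logs(user_id: str, logs: list[dict]) -> str:
    """Return all retained chat logs for a user in a downloadable,
    human- and machine-readable format (JSON is one qualifying choice).
    Each log entry should carry both the user's input data and the
    chatbot's generated outputs, which the definition of 'chat log' covers."""
    payload = {
        "user_id": user_id,
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "chat_logs": logs,  # e.g. [{"input": ..., "output": ..., "timestamp": ...}]
    }
    return json.dumps(payload, indent=2, ensure_ascii=False)
```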
Passed 2026-07-01
D-01.8
18 V.S.A. § 1893(a)-(b)
Plain Language
No person may collect or record neural data from a brain-computer interface unless they first provide the individual with a written notice explaining how the data will be used, and then receive written informed consent. This is an affirmative opt-in requirement — collection is prohibited by default. The written notice must precede and be separate from the consent itself. Consent must be voluntary, from an individual with capacity, and given after full disclosure of the nature, benefits, risks, and consequences.
(a) Prohibition. Subject to the limited exceptions provided in this section, no person shall:
(1) collect or record an individual's neural data gathered from a brain-computer interface; or
(2) share with a third party an individual's neural data gathered from a brain-computer interface.
(b) Consent to collect. A person shall not collect or record an individual's neural data gathered from a brain-computer interface unless the person:
(1) provides the individual with a written notice explaining how the person will use the individual's neural data; and
(2) thereafter receives written informed consent from the individual to collect or record the individual's neural data.
Passed 2026-07-01
D-01.8
18 V.S.A. § 1893(c)
Plain Language
Before sharing any individual's neural data from a brain-computer interface with a third party, the person must provide a written request to the individual specifying the purpose for sharing and the name and address of the third party, and must receive written informed consent. This is a separate consent requirement from collection — even if an individual consented to collection, sharing requires its own specific, written informed consent naming each third-party recipient.
(c) Consent to share. A person shall not share with a third party an individual's neural data gathered from a brain-computer interface unless the person: (1) provides the individual with a written request for the individual's neural data to be shared with a third party and for what purposes, including the name and address of the third party; and (2) thereafter receives written informed consent from the individual to share the individual's neural data with the third party.
Passed 2026-07-01
D-01.3
18 V.S.A. § 1893(d)
Plain Language
Individuals have the right to revoke consent to collect, record, or share neural data at any time. The revocation mechanism must be at least as easy as the original consent process. Upon receiving a revocation notice, the person must destroy all records of the individual's neural data within 10 days, immediately cease all third-party sharing, and notify all third parties with whom neural data was shared. This creates both a deletion right and a downstream notification obligation — merely stopping collection is insufficient.
(d) Revocation of consent.
(1) An individual who has provided written informed consent allowing a person to collect, record, or share the individual's neural data pursuant to this section has the right to revoke consent at any time thereafter by providing written notice to the person initially receiving the consent. This revocation of consent notice shall be as easy or easier for the individual to provide as compared to the requirements for initially providing consent.
(2) A person who receives written notice from an individual revoking consent pursuant to subdivision (1) of this subsection shall:
(A) destroy all records of the individual's neural data not later than 10 days after receiving the notice; and
(B) in the case of the revocation of consent to share an individual's neural data, immediately:
(i) cease sharing an individual's neural data with all third parties upon receipt of the notice; and
(ii) inform all third parties with whom the person has shared the individual's neural data that the individual has revoked consent.
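A minimal sketch of the revocation workflow, with hypothetical store and recipient interfaces; the statute supplies only the immediate cessation duty, the third-party notification duty, and the 10-day destruction deadline.

```python
from datetime import date, timedelta

DESTRUCTION_DEADLINE = timedelta(days=10)  # § 1893(d)(2)(A)

def handle_revocation(notice_received: date, store, third_parties: list) -> date:
    """Workflow modeled on § 1893(d)(2); 'store' and 'third_parties'
    are illustrative stand-ins for real systems."""
    # (d)(2)(B)(i): cease all third-party sharing immediately on receipt
    store.disable_sharing()
    # (d)(2)(B)(ii): inform every prior recipient that consent was revoked
    for recipient in third_parties:
        recipient.notify("Individual has revoked consent to share neural data.")
    # (d)(2)(A): destroy all records of the neural data within 10 days
    store.destroy_all_neural_data()
    return notice_received + DESTRUCTION_DEADLINE  # latest lawful destruction date
```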
Passed 2026-07-01
D-01.4
18 V.S.A. § 9761(a)-(b)
Plain Language
Suppliers of mental health chatbots are broadly prohibited from selling or sharing Vermont users' individually identifiable health information or user inputs with third parties. Three narrow exceptions apply: (1) when a health care provider requests the information with user consent, (2) when the user requests the information be sent to their health plan, or (3) when sharing with a contractor is necessary for the chatbot's effective functionality. In the contractor-sharing exception, both the supplier and contractor must comply with HIPAA privacy and security rules as if the supplier were a HIPAA covered entity and the contractor a business associate. Notably, user input is protected absolutely — even the contractor exception applies only to individually identifiable health information, not to user input.
(a)(1) Except as provided in subdivision (2) of this subsection, a supplier of a mental health chatbot shall not sell to or share with any third party any:
(A) individually identifiable health information of a Vermont user; or
(B) user input of a Vermont user.
(2) The prohibition set forth in subdivision (1) of this subsection shall not apply to individually identifiable health information that is:
(A) requested by a health care provider with the consent of the Vermont user;
(B) provided to a health plan of a Vermont user upon request of the Vermont user; or
(C) shared in compliance with subsection (b) of this section.
(b)(1) A supplier may share individually identifiable health information necessary to ensure the effective functionality of the mental health chatbot with another person with whom the supplier has a contract related to such functionality.
(2) When sharing information pursuant to subdivision (1) of this subsection, the supplier and the other person shall comply with all applicable privacy and security provisions of 45 C.F.R. Part 160 and 45 C.F.R. Part 164, Subparts A and E, as if the supplier were a covered entity and the other person were a business associate, as those terms are defined in 45 C.F.R. § 160.103.
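A minimal sketch of the resulting sharing gate, with illustrative flag names; user input is hard-coded as unshareable because no exception reaches it.

```python
def may_share(data_type: str, *,
              provider_requested_with_user_consent: bool = False,
              user_requested_health_plan: bool = False,
              functionality_contractor: bool = False) -> bool:
    # Modeled on 18 V.S.A. § 9761(a)-(b); flag names are illustrative.
    if data_type == "user_input":
        return False  # protected absolutely; no exception applies
    if data_type == "identifiable_health_info":
        return (provider_requested_with_user_consent   # (a)(2)(A)
                or user_requested_health_plan          # (a)(2)(B)
                or functionality_contractor)           # (a)(2)(C)/(b): HIPAA-style duties attach
    return False  # default-deny anything else
```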
Pending 2026-07-01
D-01.8
§ 16-5EE-4(1)-(2)
Plain Language
Before collecting, using, or disclosing a consumer's genetic data, an entity must provide both a high-level privacy policy overview and a prominent, publicly available privacy notice covering at minimum the entity's data collection, consent, use, access, disclosure, transfer, security, and retention and deletion practices. The entity must also obtain initial express consent that clearly describes how the genetic data will be used, who within the entity can access test results, and how the data may be shared. Consent must come from the consumer or, where applicable, a parent, guardian, or power of attorney.
To safeguard the privacy, confidentiality, security, and integrity of a consumer's genetic data, an entity shall:
(1) Provide clear and complete information regarding the entity's policies and procedures for the collection, use, or disclosure of genetic data by making available to a consumer:
(A) A high-level privacy policy overview that includes basic, essential information about the entity's collection, use, or disclosure of genetic data; and
(B) A prominent, publicly available privacy notice that includes, at a minimum, information about the entity's data collection, consent, use, access, disclosure, transfer, security, and retention and deletion practices for genetic data;
(2) Obtain initial express consent from a consumer, parent, guardian, or power of attorney for the collection, use, or disclosure of the consumer's genetic data that:
(A) Clearly describes the entity's use of the genetic data that the entity collects through the entity's genetic testing product or service;
(B) Specifies the categories of individuals within the entity that have access to test results; and
(C) Specifies how the entity may share the genetic data;
Pending 2026-07-01
D-01.8
§ 16-5EE-4(4)
Plain Language
For several categories of heightened-risk activities, an entity must obtain separate, specific express consent beyond the initial consent. These activities include: transferring genetic data or biological samples to third parties (with the third party's name disclosed), using genetic data beyond the primary testing purpose, retaining biological samples after completing testing, transferring data for research purposes, and marketing based on genetic data or selling genetic data. Each category requires its own consent. Transfers to processors under qualifying contracts are exempt from the third-party transfer consent requirement, but the processor contract must prohibit any use beyond the contracted services.
(4) If the entity engages in any of the following, obtain a consumer's:
(A) Separate express consent for:
(i) The transfer or disclosure of the consumer's genetic data or biological sample to any third party other than the entity's processors, including the name of the third party to which the consumer's genetic data or biological sample will be transferred or disclosed with the consumer's express consent;
(ii) The use of genetic data beyond the primary purpose of the entity's genetic testing product or service and inherent contextual uses; or
(iii) The entity's retention of any biological sample provided by the consumer following the entity's completion of the initial testing service requested by the consumer;
(B) Informed express consent for transfer or disclosure of the consumer's genetic data to third party persons for:
(i) Research purposes; or
(ii) Research conducted under the control of the entity for the purpose of publication or generalizable knowledge; and
(C) Express consent for:
(i) Marketing to a consumer based on the consumer's genetic data;
(ii) Marketing by a third-party person to a consumer based on the consumer having ordered or purchased a genetic testing product or service. Marketing does not include the provision of customized content or offers on the websites or through the applications or services provided by the entity with the first-party relationship to the consumer; or
(iii) Sale or other valuable consideration of the consumer's genetic data.
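A minimal sketch of how the per-activity consent tiers could be tracked, with illustrative activity labels; the key point is that the initial consent under subdivision (2) never substitutes for these.

```python
# Consent tiers modeled on § 16-5EE-4(4); activity labels are illustrative.
REQUIRED_CONSENT = {
    "third_party_transfer":     "separate_express",  # (4)(A)(i), third party named
    "use_beyond_primary":       "separate_express",  # (4)(A)(ii)
    "retain_sample_after_test": "separate_express",  # (4)(A)(iii)
    "research_transfer":        "informed_express",  # (4)(B)
    "marketing_or_sale":        "express",           # (4)(C)
}

def consent_satisfied(activity: str, consents: set[tuple[str, str]]) -> bool:
    """Each listed activity needs its own consent of the right tier;
    the initial collection consent under (2) does not carry over."""
    tier = REQUIRED_CONSENT.get(activity)
    return tier is not None and (activity, tier) in consents
```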
Pending 2026-07-01
D-01.1, D-01.2
§ 16-5EE-4(6)(A)
Plain Language
Entities must develop and maintain a comprehensive security program to protect genetic data from unauthorized access, use, or disclosure. In addition, entities must provide consumers with a process to: access their own genetic data, request deletion of that data, revoke any previously provided consent, and request destruction of their biological samples. These are ongoing operational obligations — the security program and consumer-rights processes must be actively maintained, not merely established at launch.
(6) Develop, implement, and maintain a comprehensive security program to protect a consumer's genetic data against unauthorized access, use, or disclosure; and
(A) Provide a process for a consumer to:
(i) Access the consumer's genetic data;
(ii) Delete the consumer's genetic data;
(iii) Revoke any consent provided by the consumer; and
(iv) Request and obtain the destruction of the consumer's biological sample.
Pending 2026-07-01
D-01.4
§ 16-5EE-4(7)
Plain Language
Genetic data and samples of West Virginia residents collected in the state may not be stored in any country sanctioned by the U.S. Office of Foreign Assets Control (OFAC) or designated as a foreign adversary under 15 C.F.R. § 7.4(a). Any transfer or storage of this data outside the United States requires the resident's consent. This is both a categorical prohibition (sanctioned and foreign-adversary countries) and a consent-gated restriction (all other non-U.S. storage). Note a drafting quirk: the provision refers to 'biometric samples' and 'biometric data,' where the rest of the article regulates genetic data and biological samples.
(7) Genetic data and biometric samples of West Virginia residents collected in the state may not be stored within the territorial boundaries of any country currently sanctioned in any way by the United States office of foreign asset control or designated as a foreign adversary under 15 CFR 7.4(a). Genetic data or biometric data of West Virginia residents collected in the state may only be transferred or stored outside the United States with the consent of the resident.
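A minimal sketch of the storage-location check; the country set is a placeholder, since the authoritative OFAC and 15 C.F.R. § 7.4(a) lists change over time and would need a live source.

```python
# Placeholder set; the authoritative OFAC and 15 C.F.R. § 7.4(a) lists change over time.
SANCTIONED_OR_ADVERSARY = {"CN", "RU", "IR", "KP", "CU", "VE"}

def storage_permitted(country_code: str, resident_consented: bool) -> bool:
    # Modeled on § 16-5EE-4(7).
    if country_code in SANCTIONED_OR_ADVERSARY:
        return False                  # categorical bar; consent is irrelevant
    if country_code != "US":
        return resident_consented     # consent-gated for all other non-U.S. storage
    return True                       # domestic storage is unrestricted by (7)
```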
Pending 2026-06-06
D-01.8
§ 15-17-3(b)
Plain Language
Before collecting, capturing, purchasing, or otherwise obtaining any biometric identifier or biometric information, a private entity must provide the individual (or their legally authorized representative) with written notice that biometric data is being collected or stored, written notice of the specific purpose and duration of collection, storage, and use, and must obtain a written release from the individual. All three steps must be completed before the data is obtained. In the employment context, a written release executed as a condition of employment satisfies the consent requirement. This is structurally identical to the Illinois BIPA §15(b) informed consent requirement.
(b) No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person's or a customer's biometric identifier or biometric information, unless it first:
(1) Informs the subject or the subject's legally authorized representative in writing that a biometric identifier or biometric information is being collected or stored;
(2) Informs the subject or the subject's legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used; and
(3) Receives a written release executed by the subject of the biometric identifier or biometric information or the subject's legally authorized representative.
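A minimal sketch of the pre-collection gate, with illustrative field names; all three steps must be complete before any identifier is obtained.

```python
from dataclasses import dataclass

@dataclass
class BiometricConsent:
    notified_of_collection: bool    # (b)(1): written notice that data is collected/stored
    notified_of_purpose_term: bool  # (b)(2): written notice of specific purpose and duration
    written_release: bool           # (b)(3): executed written release

def may_collect(c: BiometricConsent) -> bool:
    # Modeled on § 15-17-3(b): collection is lawful only after all three steps.
    return (c.notified_of_collection
            and c.notified_of_purpose_term
            and c.written_release)
```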
Pending 2026-06-06
D-01.4
§ 15-17-3(c)
Plain Language
Private entities are categorically prohibited from selling, leasing, trading, or otherwise profiting from any person's biometric identifier or biometric information. This is an absolute prohibition with no exceptions — there is no consent carve-out that would permit monetization even with the individual's agreement. This goes beyond typical data minimization requirements by entirely banning commercial exploitation of biometric data.
(c) No private entity in possession of a biometric identifier or biometric information may sell, lease, trade, or otherwise profit from a person's or a customer's biometric identifier or biometric information.
Pending 2026-06-06
D-01.4
§ 15-17-3(d)
Plain Language
Private entities may not disclose, redisclose, or disseminate a person's biometric identifier or biometric information except in four narrow circumstances: (1) the subject or their authorized representative consents; (2) the disclosure completes a financial transaction the subject requested or authorized; (3) disclosure is required by law or ordinance; or (4) disclosure is required by a valid warrant or subpoena. Any disclosure outside these four categories is a violation. Unlike the collection consent requirement in § 15-17-3(b), consent for disclosure does not need to be in writing — the statute says 'consents' without specifying a writing requirement.
(d) No private entity in possession of a biometric identifier or biometric information may disclose, redisclose, or otherwise disseminate a person's or a customer's biometric identifier or biometric information unless:
(1) The subject of the biometric identifier or biometric information or the subject's legally authorized representative consents to the disclosure or redisclosure;
(2) The disclosure or redisclosure completes a financial transaction requested or authorized by the subject of the biometric identifier or the biometric information or the subject's legally authorized representative;
(3) The disclosure or redisclosure is required by state or federal law or municipal ordinance; or
(4) The disclosure is required pursuant to a valid warrant or subpoena issued by a court of competent jurisdiction.
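A minimal sketch of the disclosure gate, with illustrative flag names; any disclosure outside the four exceptions fails.

```python
def may_disclose(*, subject_consents: bool,
                 completes_requested_transaction: bool,
                 required_by_law_or_ordinance: bool,
                 valid_warrant_or_subpoena: bool) -> bool:
    # Modeled on § 15-17-3(d); note (d)(1) says 'consents' with no writing
    # requirement, unlike the written release demanded by (b)(3).
    return (subject_consents
            or completes_requested_transaction
            or required_by_law_or_ordinance
            or valid_warrant_or_subpoena)
```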