S-35
MA · State · USA
Status: Pre-filed
Proposed Effective Date: 2025-01-14
Massachusetts Senate No. 35 — An Act fostering artificial intelligence responsibility
Summary

Regulates employer use of electronic monitoring tools and automated decision systems (ADS) in employment contexts in Massachusetts. Employers may only use electronic monitoring for enumerated legitimate purposes, must provide detailed written notice and obtain consent, and must narrowly tailor monitoring to be the least invasive means available. Automated employment decision tools require independent impact assessments evaluating disparate impact across protected classes before use and annually thereafter, with results submitted to the Department of Labor Standards for a public registry. Employers must maintain meaningful human oversight over consequential employment decisions and may not rely primarily on electronic monitoring data or ADS outputs. The bill also restricts state agency use of automated decision systems unless specifically authorized by law, with mandatory impact assessments. Creates a private right of action with liquidated damages of double restitution, attorney fees, and potential punitive damages.

Enforcement & Penalties
Enforcement Authority
The Attorney General and the Department of Labor Standards have enforcement authority. Both the Attorney General and the Division of Licensing are directed to promulgate rules and regulations. Enforcement includes civil citations or orders under section 27C. A private right of action is available to individuals subjected to adverse employment action based on conduct prohibited by the Act. Individual liability extends to the president, treasurer, and responsible managers of the employer.
Penalties
Employees subjected to adverse employment action may recover restitution and consequential damages; liquidated damages constituting double the amount of restitution; pre- and post-judgment interest; and reasonable attorneys' fees and costs. A court may also impose punitive damages where appropriate. The statute does not limit the availability of other remedies at law or in equity.
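The remedy structure above is arithmetic enough to sketch. The following is an illustrative model only, not legal advice and not part of the bill: the class and field names are hypothetical labels, interest and punitive damages are omitted because they are court-determined, and it assumes liquidated damages stack on top of restitution rather than replacing it.

```python
from dataclasses import dataclass

@dataclass
class Remedies:
    """Illustrative sketch of the recoverable amounts described above."""
    restitution: float
    consequential: float
    fees_and_costs: float

    @property
    def liquidated(self) -> float:
        # Liquidated damages constitute double the amount of restitution.
        return 2 * self.restitution

    @property
    def total(self) -> float:
        # Assumes restitution and liquidated damages are both recovered
        # (i.e., treble restitution overall); interest/punitives excluded.
        return self.restitution + self.consequential + self.liquidated + self.fees_and_costs

r = Remedies(restitution=10_000, consequential=2_500, fees_and_costs=4_000)
print(r.liquidated)  # 20000
print(r.total)       # 36500.0 (10k + 2.5k + 20k + 4k)
```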
Who Is Covered
"Employer", any person who directly or indirectly, or through an agent or any other person, employs, exercises, or reserves control, individually or jointly, over the wages, benefits, other compensation, hours, working conditions, access to work or job opportunities, or other terms or conditions of employment, of any worker, including the commonwealth, county, town, city, school district, public authority or other governmental subdivision of any kind. "Employer" includes any of the employer's agents, contractors, or subcontractors.
"Vendor", any person or entity who sells, distributes, or develops for sale an automated employment decision tool to be used in an employment decision made by an employer in the commonwealth. "Vendor" includes any of the vendor's agents, contractors, or subcontractors.
What Is Covered
"Automated Decision System (ADS)," any computational process, automated system, or algorithm utilizing machine learning, statistical modeling, data analytics, artificial intelligence, or similar methods that issues an output, including a score, classification, ranking, or recommendation, that is used to assist or replace human decision making on decisions that impact natural persons. "Automated decision tool" does not include a tool that does not assist or replace employment decision processes and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, database, data set, or other compilation of data.
"Electronic monitoring tool", any system, application, or instrument that facilitates the collection of data concerning worker activities or communications by any means other than direct observation by a natural person, including but not limited to the use of a computer, telephone, wire, radio, camera, electromagnetic, photoelectronic, or photo-optical system, or obtaining employee data from a third-party.
Compliance Obligations · 21 obligations
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Employment
Chapter 149B, § 2(a)
Plain Language
Employers may only use electronic monitoring tools to collect employee data if the monitoring serves one of six enumerated legitimate purposes (facilitating essential job functions, quality assurance, periodic performance assessment, legal compliance, health/safety/security, or wages/benefits administration). Beyond purpose limitation, the tool must be narrowly tailored, implemented in the least invasive manner, limited to the fewest workers and least data necessary, and data must be deleted once the purpose is achieved. Off-duty monitoring is prohibited. Excess data must be disposed of by the vendor without disclosure to the employer. This creates a comprehensive data minimization regime for workplace surveillance.
Statutory Text
(a) It shall be unlawful for an employer to use an electronic monitoring tool to collect employee information unless: (i) the electronic monitoring tool is primarily used to accomplish any of the following purposes: (A) allowing a worker to accomplish or facilitating the accomplishment of an essential job function; (B) ensuring the quality of goods and services; (C) conducting periodic assessment of worker performance; (D) ensuring or facilitating compliance with employment, labor, or other relevant laws; (E) protecting the health, safety, or security of workers, or the security of the employer's facilities or computer networks; or (F) administering wages and benefits. The department of labor standards may establish additional exceptions under clause (i) through notice and comment rulemaking in compliance with chapter 30A. (ii) the specific type and activated capabilities of an electronic monitoring tool must be narrowly tailored to accomplish the employer's intended, legitimate purpose specified under (i). (iii) the electronic monitoring tool may only be used to accomplish the employer's intended, legitimate purpose specified in (i), and must be customized and implemented in a manner ensuring that the execution of its duties undertaken in the manner least invasive to employees of the employer while accomplishing the employer's legitimate purposes as defined by (i); (iv) the specific form of electronic monitoring is limited to the smallest number of workers, collects the least amount of data and is collected no more frequently than is necessary to accomplish the purpose, and the data collected is deleted once the purpose has been achieved. 
(v) the employer must ensure that any employee data that is collected utilizing an electronic monitoring tool that is not necessary to accomplish the employer's intended, legitimate purpose is not disclosed to the employer and is promptly disposed of by the vendor; (vi) the employer must ensure that employee data is not collected when the employee is off-duty; and (vii) the employer must ensure that any employee data collected utilizing an electronic monitoring tool that is necessary to accomplish the employer's intended, legitimate purpose, is stored consistent with the commonwealth's data- and cyber- privacy laws, promptly disposed of as soon as the data is no longer needed, and is not utilized by the employer, the vendor or any other third party for any reason except as provided in section 2(c) and section 3(c) of this chapter.
D-01 Automated Processing Rights & Data Controls · D-01.1 · Deployer · Employment
Chapter 149B, § 2(b)
Plain Language
Before using any electronic monitoring tool, employers must provide prior written notice to and obtain written consent from all affected employees and candidates. The notice must also be posted conspicuously. The notice must contain eleven enumerated disclosures covering the monitoring purpose, the specific data collected, collection schedule and frequency, whether data feeds into ADS tools, how data will be used in employment decisions and discipline, productivity assessment use, data storage location and retention period, why the monitoring is the least invasive method, the employee's right to refuse data sale/transfer, and how to exercise statutory rights.
Statutory Text
(b) Any employer that uses an electronic monitoring tool shall give prior written notice and must obtain written consent from all candidates and employees subject to electronic monitoring and must also post said notice in a conspicuous place which is readily available for viewing by candidates and employees, pursuant to sections 19B, 52C, and 190(i) of chapter 149 and section 99 of chapter 272. Such notice shall include, at a minimum, the following: (i) a description of the purpose for which the electronic monitoring tool will be used, as specified in subparagraph (i) of paragraph (a) of this subdivision; (ii) a description of the specific employee data to be collected, stored, secured, and disposed of (and the schedule therefore), and the activities, locations, communications, and job roles that will be electronically monitored by the tool; (iii) a description of the dates, times, and frequency that electronic monitoring will occur; (iv) whether and how any employee data collected by the electronic monitoring tool will be used as an input in an automated employment decision tool; (v) whether and how any employee data collected by the electronic monitoring tool will alone or in conjunction with an automated employment decision tool be used to make an employment decision by the employer or employment agency; (vi) whether and how any employee data collected by the electronic monitoring tool may be stored and utilized in discipline, in internal policy compliance, in administrative agency adjudications, and in litigation (whether or not it involves the employee as a party); (vii) whether any employee data collected by the electronic monitoring tool will be used to assess employees' productivity performance or to set productivity standards, and if so, how; (viii) a description of where any employee data collected by the electronic monitoring tool will be stored and the length of time it will be retained; (ix) an explanation for how the specific electronic monitoring 
practice is the least invasive means available to accomplish the monitoring purpose; (x) a statement that an employee is entitled to notice and maintains the right to refuse the sale, transfer, or disclosure of the employee's employee data subject to the provisions of section 2(f); and (xi) a clear and reasonably understandable description of how an employee can exercise the rights described in this chapter.
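Compliance teams often track a notice requirement like § 2(b) as a checklist. A minimal sketch under stated assumptions: the eleven keys below are my own shorthand labels for the statutory disclosures (i) through (xi), not terms from the bill, and a "complete" notice here just means every item has non-empty content.

```python
# Hypothetical labels for the eleven § 2(b) disclosures, in statutory order.
REQUIRED_DISCLOSURES = (
    "purpose",                      # (i)   monitoring purpose under § 2(a)(i)
    "data_collected",               # (ii)  data, activities, locations, roles
    "schedule_frequency",           # (iii) dates, times, frequency
    "ads_input",                    # (iv)  whether data feeds an ADS tool
    "use_in_decisions",             # (v)   use in employment decisions
    "use_in_discipline_litigation", # (vi)  discipline, adjudications, litigation
    "productivity_use",             # (vii) productivity assessment/standards
    "storage_and_retention",        # (viii) where stored, how long
    "least_invasive_explanation",   # (ix)  why this is the least invasive means
    "right_to_refuse_sale",         # (x)   right to refuse sale/transfer
    "how_to_exercise_rights",       # (xi)  how to exercise statutory rights
)

def missing_disclosures(notice: dict) -> set:
    """Return the labels whose disclosure is absent or empty."""
    return {k for k in REQUIRED_DISCLOSURES if not notice.get(k)}

draft = {"purpose": "quality assurance", "data_collected": "keystroke counts"}
print(len(missing_disclosures(draft)))  # 9 items still unaddressed
```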
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · Employment
Chapter 149B, § 2(c)
Plain Language
Employers must maintain contemporaneous, true, and accurate records of all electronically monitored data for three years, and destroy the data no later than 37 months after collection absent employee consent to longer retention. Employers must implement reasonable administrative, technical, and physical data security measures appropriate to the data's volume and nature. Employees have the right to request corrections to erroneous data. The 37-month window slightly exceeds the 3-year record preservation obligation, giving employers a one-month buffer.
Statutory Text
(c) An employer shall establish, maintain, and preserve for three years contemporaneous, true, and accurate records of data collected via an electronic monitoring tool to ensure compliance with employee or commissioner requests for data. The employer shall destroy any employee information collected via an electronic monitoring tool no later than thirty-seven months after collection unless the employee has provided written and informed consent to the retention of their data by the employer. An employer shall establish, implement and maintain reasonable administrative, technical and physical data security practices to protect the confidentiality, integrity and accessibility of employee data appropriate to the volume and nature of the employee data at issue. An employee shall have the right to request corrections to erroneous employee data.
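The retention timeline in § 2(c) reduces to date arithmetic. This is an illustrative sketch only, assuming "months" and "years" mean calendar months; `add_months` is a helper I introduce, not statutory language, and the day-clamping behavior for month-end dates is my assumption.

```python
from datetime import date
from typing import Optional

def add_months(d: date, months: int) -> date:
    """Advance a date by whole calendar months, clamping the day if needed."""
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    # Clamp to the last valid day of the target month (e.g., Jan 31 + 1 month).
    for day in (d.day, 30, 29, 28):
        try:
            return date(year, month, day)
        except ValueError:
            continue

def destruction_deadline(collected: date, employee_consented: bool) -> Optional[date]:
    # Data must be destroyed no later than 37 months after collection,
    # unless the employee gave written, informed consent to longer retention.
    return None if employee_consented else add_months(collected, 37)

def record_preservation_until(collected: date) -> date:
    # Records must be established, maintained, and preserved for three years.
    return add_months(collected, 36)

collected = date(2025, 1, 14)
print(record_preservation_until(collected))    # 2028-01-14
print(destruction_deadline(collected, False))  # 2028-02-14 (one-month buffer)
```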
S-02 Prohibited Conduct & Output Restrictions · Deployer · Employment · Biometrics
Chapter 149B, § 2(d)
Plain Language
Even where electronic monitoring is otherwise permissible, employers face twelve categorical prohibitions. Key prohibitions include: no monitoring to obtain protected-class information (health, race, sex, etc.); no monitoring of off-duty employees; no audio/visual monitoring of bathrooms, breakrooms, lactation rooms, prayer areas, or employees' homes/vehicles; a near-total ban on gait, voice analysis, and emotion recognition technology; facial recognition only for facility/worker security; no retaliation against employees who oppose monitoring in good faith; no adverse action based on continuous time-tracking data except for egregious misconduct; and no adverse action based on undisclosed performance standards.
Statutory Text
(d) Notwithstanding the allowable purposes for electronic monitoring described in paragraph (a) of subdivision one of this section, an employer shall not: (i) use an electronic monitoring tool in such a manner that results in a violation of labor, employment, civil rights law or any other law of the commonwealth; (ii) use an electronic monitoring tool or data collected via an electronic monitoring tool in such a manner as to threaten the health, welfare, safety, or legal rights of employees or the general public; (iii) use an electronic monitoring tool to monitor employees who are off-duty and not performing work-related tasks; (iv) use an electronic monitoring tool in order to obtain information about an employee's health, including health status and health conditions, the race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran or membership in any group protected from employment discrimination under chapter 151B or any other applicable law; (v) use an electronic monitoring tool in order to identify, punish, or obtain information about employees engaging in activity protected under labor or employment law; (vi) conduct audio or visual monitoring of bathrooms or other similarly private areas, including locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, areas designated to express breast milk, or areas designated for prayer or other religious activity, including data collection on the frequency of use of those private areas; (vii) conduct audio or visual monitoring of a workplace in an employee's residence, an employee's personal vehicle, or property owned or leased by an employee; (viii) use an electronic monitoring tool that incorporates facial recognition, unless such technology is necessary to 
protect the security of workers or the security of the employer's facilities; (ix) use an electronic monitoring tool that incorporates gait, voice analysis, or emotion recognition technology; (x) take adverse action against an employee based in whole or in part on their opposition or refusal to submit to a practice that the employee believes in good faith violates this article; (xi) take adverse employment action against an employee on the basis of data collected via continuous incremental time-tracking tools except in the case of egregious misconduct; or (xii) take adverse employment action against an employee based on any data collected via electronic monitoring if such data measures an employee's performance in relation to a performance standard that has not been previously, clearly, and unmistakably disclosed to such employee as well as to all other classes of employees to whom it applies in violation of subparagraph (vi) of paragraph (b) of subdivision one of this section, or if such data was collected without proper notice to employees or candidates pursuant to sections 19B, 52C, and 190(i) of chapter 149 and section 99 of chapter 272.
D-01 Automated Processing Rights & Data Controls · D-01.4 · Deployer · Employment
Chapter 149B, § 2(e)
Plain Language
Employers may not use electronically monitored employee data for any purpose beyond what was disclosed in the required notice. This is a secondary-use restriction: once data is collected for a stated purpose, repurposing it requires a new notice cycle. The reference to 'paragraph (c)' appears to be a drafting error and likely intends paragraph (b), which contains the notice requirements.
Statutory Text
(e) An employer shall not use employee data collected via an electronic monitoring tool for purposes other than those specified in the notice provided pursuant to paragraph (c) of subdivision one of this section.
D-01 Automated Processing Rights & Data Controls · D-01.3 · Deployer · Employment
Chapter 149B, § 2(f)
Plain Language
Employers are prohibited from selling, transferring, or disclosing employee monitoring data to third parties except where required by federal or state law, or where necessary for an impact assessment of an automated employment decision tool. This creates a near-absolute ban on data sharing from workplace monitoring tools.
Statutory Text
(f) An employer shall not sell, transfer, or disclose employee data collected via an electronic monitoring tool to any other entity unless it is required to do so under federal law or the laws of the commonwealth, or necessary to do so to comply with an impact assessment of an automated employment decision tool pursuant to section one thousand twelve of this article.
Other · Employment
Chapter 149B, § 2(g)
Plain Language
Employers may not require employees to physically implant data-collecting devices (including subcutaneous implants or wearable accessories), install monitoring applications on personal devices, or use devices with location tracking unless the tracking is limited to work hours and strictly necessary for essential job functions. This is a categorical prohibition on the most invasive forms of workplace surveillance hardware and software.
Statutory Text
(g) An employer shall not require employees to: (i) physically implant devices that collect or transmit data, including devices that are installed subcutaneously or incorporated into items of clothing or personal accessories; (ii) install applications on personal devices that collect or transmit employee data or to wear or embed those devices; or (iii) carry or use any device with location tracking applications or services enabled unless the location tracking is: (A) conducted during work hours only; and (B) strictly necessary to accomplish essential job functions and narrowly limited to only the activities and times necessary to accomplish essential job functions.
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Employment
Chapter 149B, § 2(h)
Plain Language
When employment decisions (hiring, promotion, discipline, termination, compensation) are based in whole or part on electronically monitored data, an employer may not rely primarily on that data. Three requirements apply: (1) the employer must establish meaningful human oversight — including a designated internal reviewer with expertise, authority to override, and adequate time/resources; (2) a human decision-maker must actually review the monitoring data, verify its accuracy, address pending correction requests, and exercise independent judgment; and (3) the human must consider non-monitoring information such as supervisory evaluations, personnel files, work products, or peer reviews.
Statutory Text
(h) An employer shall not rely primarily on employee data collected through electronic monitoring when making hiring, promotion, disciplinary decisions up to and including termination, or compensation decisions. For an employer to satisfy the requirements of this paragraph: (i) An employer shall establish meaningful human oversight of such decisions based in whole or in part on data collected through electronic monitoring. (ii) A human decision-maker must actually review any information collected through electronic monitoring, verify that such information is accurate and up to date, review any pending employee requests to correct erroneous data, and exercise independent judgment in making each such decision; and (iii) The human decision-maker must consider information other than information collected through electronic monitoring when making each such decision, such as but not limited to, supervisory or managerial evaluations, personnel files, employee work products, or peer reviews.
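The three § 2(h) conditions behave like a conjunctive gate: all must hold before a covered decision may proceed. A minimal sketch under stated assumptions: the field names are my hypothetical labels for the statutory elements, and reducing "meaningful human oversight" to booleans is a deliberate simplification for illustration, not legal advice.

```python
from dataclasses import dataclass

@dataclass
class DecisionReview:
    """Hypothetical record of a human reviewer's § 2(h) steps."""
    reviewer_examined_monitoring_data: bool   # actually reviewed the data
    data_verified_accurate_and_current: bool  # verified accuracy and currency
    correction_requests_resolved: bool        # reviewed pending corrections
    independent_judgment_exercised: bool      # not a rubber stamp
    non_monitoring_sources_considered: bool   # e.g., evaluations, work products

def oversight_satisfied(review: DecisionReview) -> bool:
    # Every element must hold; failing any one means the decision would
    # rely primarily on electronic monitoring data.
    return all(vars(review).values())

incomplete = DecisionReview(True, True, True, True, False)
print(oversight_satisfied(incomplete))  # False
```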
H-01 Human Oversight of Automated Decisions · H-01.1 · H-01.2 · Deployer · Employment
Chapter 149B, § 2(i)
Plain Language
When an employment decision is based in whole or part on electronically monitored data, the employer must disclose to affected employees — at least 30 days before the decision takes effect — four categories of information: that monitoring data was used, which specific tools were used and how they work, the specific data and derived judgments used in the decision, and any non-monitoring information also used. The 30-day advance notice requirement is unusually long and creates a significant operational constraint, effectively requiring employers to finalize their decision rationale a month before implementation.
Statutory Text
(i) When an employer makes a hiring, promotion, termination, disciplinary or compensation decision based in whole or part on data gathered through the use of electronic monitoring, it shall disclose to affected employees no less than thirty days prior to the decision going into effect: (i) that the decision was based in whole or part on data gathered through electronic monitoring; (ii) the specific electronic monitoring tool or tools used to gather such data, how the tools work to gather and analyze the data, and the increments of time in which the data is gathered; (iii) the specific data, and judgments based upon such data, used in the decision-making process; and (iv) any information used in the decision-making process gathered through sources other than electronic monitoring.
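The § 2(i) disclosure window is a simple deadline computation. An illustrative sketch, assuming "thirty days" means thirty calendar days and that disclosure on exactly the thirtieth day before the effective date counts as "no less than thirty days"; function names are mine, not the statute's.

```python
from datetime import date, timedelta

def latest_disclosure_date(decision_effective: date) -> date:
    # Disclosure must reach affected employees no less than 30 days
    # before the decision goes into effect.
    return decision_effective - timedelta(days=30)

def disclosure_is_timely(disclosed: date, decision_effective: date) -> bool:
    return disclosed <= latest_disclosure_date(decision_effective)

effective = date(2025, 6, 30)
print(latest_disclosure_date(effective))                   # 2025-05-31
print(disclosure_is_timely(date(2025, 5, 15), effective))  # True
print(disclosure_is_timely(date(2025, 6, 10), effective))  # False, too late
```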
H-02 Non-Discrimination & Bias Assessment · H-02.3 · H-02.6 · Deployer · Employment
Chapter 149B, § 2(j)
Plain Language
Before using electronic monitoring (alone or with an ADS), an employer must have an independent impact assessment conducted. The assessment must be completed within one year before use (or within six months of the statute's effective date for existing monitoring). The auditor must be independent with no financial or legal conflicts. The assessment must evaluate data protection practices, identify allowable purposes, describe potential legal violations and mitigation steps, and assess negative impacts on employee privacy and job quality. The five-year look-back independence requirement for auditors is unusually strict.
Statutory Text
(j) It shall be unlawful for an employer to use electronic monitoring, alone or in conjunction with an automated employment decision system, unless the employer's proposed use of electronic monitoring has been the subject of an impact assessment. Such impact assessments must: (i) be conducted no more than one year prior to the use of such electronic monitoring, or where the electronic monitoring began before the effective date of this article, within six months of the effective date of this article; (ii) be conducted by an independent and impartial party with no financial or legal conflicts of interest; (iii) evaluate whether the data protection and security practices surrounding the electronic monitoring are consistent with applicable law and cybersecurity industry best practices; (iv) identify which allowable purpose(s) described in this chapter; (vi) consider and describe any other ways in which the electronic monitoring could result in a violation of applicable law and, for any finding that a violation of law may occur, any necessary or appropriate steps to prevent such violation of law; and (vii) consider and describe whether the electronic monitoring may negatively impact employees' privacy and job quality, including wages, hours, and working conditions.
H-02 Non-Discrimination & Bias Assessment · H-02.3 · H-02.6 · H-02.7 · H-02.8 · Deployer · Employment
Chapter 149B, § 3(a)-(b)
Plain Language
Before using any ADS for employment decisions, an employer must have a comprehensive independent impact assessment conducted. The assessment must cover thirteen enumerated elements including: modeling techniques and attributes, scientific validity, proxy analysis for protected classes, training data disparities, output disparate impact, disability accessibility, post-deployment adverse impact risks, least-discriminatory-method analysis, legal compliance, privacy/job quality impacts, and a catch-all discrimination risk assessment. The completed assessment must be submitted to the Department of Labor Standards within 60 days for inclusion in a public registry, and distributed to affected employees. Annual follow-up assessments are required for as long as the tool remains in use, evaluating any changes in validity or disparate impact.
Statutory Text
(a) It shall be unlawful for an employer to use an automated employment decision tool for an employment decision, alone or in conjunction with electronic monitoring, unless such tool has been the subject of an impact assessment. Impact assessments must: (i) be conducted no more than one year prior to the use of such tool, or where the tool was in use by the employer before the effective date of this article, within six months of the effective date of this article; (ii) be conducted by an independent and impartial party with no financial or legal conflicts of interest; (iii) identify and describe the attributes and modeling techniques that the tool uses to produce outputs; (iv) evaluate whether those attributes and techniques are a scientifically valid means of evaluating an employee or candidate's performance or ability to perform the essential functions of a role, and whether those attributes may function as a proxy for belonging to a protected class under chapter 151B or any other applicable law; (v) consider, identify, and describe any disparities in the data used to train or develop the tool and describe how those disparities may result in a disparate impact on persons based on their race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran, and what actions may be taken by the employer or vendor of the tool to reduce or remedy any disparate impact; (vi) consider, identify, and describe any outputs produced by the tool that may result in a disparate impact on persons based on their race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, 
ancestry or status as a veteran, and what actions may be taken by the employer or vendor of the tool to reduce or remedy that disparate impact; (vii) evaluate whether the use of the tool may limit accessibility for persons with disabilities, or for persons with any specific disability, and what actions may be taken by the employer or vendor of the tool to reduce or remedy the concern; (viii) consider and describe potential sources of adverse impact against individuals or groups based on race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran that may arise after the tool is deployed; (ix) identify and describe any other assessment of risks of discrimination or a disparate impact of the tool on individuals or groups based on race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran that arise over the course of the impact assessment, and what actions may be taken to reduce or remedy that risk; (x) for any finding of a disparate impact or limit on accessibility, evaluate whether the data set, attribute, or feature of the tool at issue is the least discriminatory method of assessing a candidate's performance or ability to perform job functions; (xi) consider and describe any other ways in which the tool could result in a violation of applicable law and, for any finding that a violation of law may occur, any necessary or appropriate steps to prevent such violation of law; (xii) consider and describe whether use of the tool may negatively impact employees' privacy and job quality, including wages, hours, and 
working conditions; and (xiii) be submitted in its entirety or an accessible summary form to the department for inclusion in a public registry of such impact assessments within sixty days of completion and distributed to employees who may be subject to the tool. (b) An employer shall conduct or commission subsequent impact assessments each year that the tool is in use to assist or replace employment decisions. Subsequent impact assessments shall comply with the requirements of paragraph (a) of this section, and shall assess and describe any change in the validity or disparate impact of the tool.
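The assessment cadence in § 3(a)-(b) is a set of date windows. An illustrative sketch under stated assumptions: "one year" is treated as 365 days and "sixty days" as 60 calendar days (the bill does not define either), and the function names are mine.

```python
from datetime import date, timedelta

def assessment_valid_for_use(assessed: date, first_use: date) -> bool:
    # The assessment must predate use, and be conducted no more than
    # one year prior to the use of the tool.
    return timedelta(0) <= first_use - assessed <= timedelta(days=365)

def registry_submission_deadline(completed: date) -> date:
    # Results must reach the department's public registry within
    # sixty days of completion.
    return completed + timedelta(days=60)

def next_annual_assessment_due(last_assessed: date) -> date:
    # Subsequent assessments are required each year the tool stays in use.
    return last_assessed + timedelta(days=365)

print(assessment_valid_for_use(date(2024, 9, 1), date(2025, 3, 1)))  # True
print(registry_submission_deadline(date(2025, 3, 1)))                # 2025-04-30
print(next_annual_assessment_due(date(2025, 3, 1)))                  # 2026-03-01
```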
G-01 AI Governance Program & Documentation · G-01.3 · G-01.4 · Deployer · Developer · Employment
Chapter 149B, § 3(c)-(d)
Plain Language
Employers and their vendors must retain all documentation necessary for impact assessments, including data sources, technical specifications, developer identities, historical use data, and a version history of the tool. Vendors must grant employers a license to access this documentation for sharing with labor organizations or courts as required by law. Documentation must be stored in a form that is legible and accessible to auditors, per commissioner specifications. Employee data collected for impact assessments must be handled to protect privacy, cannot be shared with the employer, and may only be shared with others strictly necessary for the assessment.
Statutory Text
(c) An employer or its vendor shall retain all documentation pertaining to the design, development, use, and data of an automated employment decision tool that may be necessary to conduct an impact assessment. To the extent held by a vendor, the employer shall be granted a license to access this documentation and share this documentation with a labor organization to the extent required by federal or state law, or to the extent required by a court or agency in connection with employment or labor litigation. This includes but is not limited to the source of the data used to develop the tool, the technical specifications of the tool, individuals involved in the development of the tool, and historical use data for the tool. Such documentation must include a historical record of versions of the tool, such that an employer shall be able to attest in the event of litigation disputing an employment decision, the nature and specifications of the tool as it was used at the time of that employment decision. Such documentation shall be stored in accordance with such record-keeping, data retention, and security requirements as the commissioner may specify, and in such a manner as to be legible and accessible to the party conducting an impact assessment. (d) If an initial or subsequent impact assessment requires the collection of employee data to assess a tool's disparate impact on employees, such data shall be collected, processed, stored, retained, and disposed of in such a manner as to protect the privacy of employees, and shall comply with any data retention and security requirements specified by the commissioner. Employee data provided to auditors for the purpose of an impact assessment shall not be shared with the employer, nor shall it be shared with any person, business entity, or other organization unless strictly necessary for the completion of the impact assessment.
H-02 Non-Discrimination & Bias Assessment · H-02.3 · Deployer · Developer · Employment
Chapter 149B, § 3(e)-(f)
Plain Language
If an impact assessment finds disparate impact or accessibility limitations, the employer must immediately cease using the tool until remediation is complete. The employer must take reasonable steps to remedy the issue and document those steps in writing to employees, the auditor, and the Department. If the employer disputes the finding or believes its remediation is sufficient, it must provide a written explanation showing the tool is the least discriminatory method available. Separately, it is unlawful for any auditor, vendor, or employer to manipulate, conceal, or misrepresent impact assessment results — this is an independent prohibition that applies to all three parties.
Statutory Text
(e) If an initial or subsequent impact assessment concludes that a data set, feature, or application of the automated employment decision tool results in a disparate impact on individuals or groups based on race, color, religious creed, national origin, sex, gender identity, sexual orientation, genetic information, pregnancy or a condition related to said pregnancy including, but not limited to, lactation or the need to express breast milk for a nursing child, ancestry or status as a veteran, or unlawfully limits accessibility for persons with disabilities, an employer shall refrain from using the tool until it: (i) takes reasonable and appropriate steps to remedy that disparate impact or limit on accessibility and describe in writing to employees, the auditor, and the department what steps were taken; and (ii) if the employer believes the impact assessment finding of a disparate impact or limit on accessibility is erroneous, or that the steps taken in accordance with subparagraph (i) of this paragraph sufficiently address those findings such that the tool may be lawfully used in accordance with this article, describes in writing to employees, the auditor, and the department how the data set, feature, or application of the tool is the least discriminatory method of assessing an employee's performance or ability to complete essential functions of a position. (f) It shall be unlawful for an independent auditor, vendor, or employer to manipulate, conceal, or misrepresent the results of an impact assessment.
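The bill requires auditors to assess whether a tool "results in a disparate impact" but does not prescribe a metric. One convention auditors commonly use is a selection-rate comparison such as the EEOC four-fifths rule. The sketch below is purely illustrative: the function names, the (selected, total) input shape, and the 0.8 threshold are assumptions, not anything the bill specifies.

```python
# Illustrative disparate-impact check via selection-rate ratios.
# The 0.8 cutoff follows the EEOC four-fifths convention -- an
# assumption; S-35 itself does not define a quantitative metric.

def selection_rate(selected: int, total: int) -> float:
    """Fraction of a group's candidates who received the favorable outcome."""
    return selected / total if total else 0.0

def adverse_impact_ratios(groups: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate."""
    rates = {g: selection_rate(s, t) for g, (s, t) in groups.items()}
    top = max(rates.values())
    return {g: (r / top if top else 0.0) for g, r in rates.items()}

def flagged_groups(groups: dict[str, tuple[int, int]],
                   threshold: float = 0.8) -> list[str]:
    """Groups whose ratio falls below the threshold, suggesting disparate impact."""
    return [g for g, r in adverse_impact_ratios(groups).items() if r < threshold]

# Example: (selected, total) per group. group_b's rate is 0.30 vs
# group_a's 0.45, a ratio of ~0.67, below the 0.8 threshold.
data = {"group_a": (45, 100), "group_b": (27, 90)}
print(flagged_groups(data))  # ['group_b']
```

Under § 3(e), any such finding would obligate the employer to stop using the tool pending remediation and written documentation to employees, the auditor, and the department.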
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Employment
Chapter 149B, § 4(a)-(b)
Plain Language
Employers using ADS tools to evaluate employees or candidates must provide detailed notice at least 10 business days before use. The notice must cover six categories: that an ADS tool will be used, the qualifications/characteristics assessed and data inputs/outputs, data sources and retention policies, results of the most recent impact assessment including disparate impact findings, how to request an alternative non-ADS process or accommodation, and how to request reevaluation or file a civil complaint. The notice must be in plain language, included in job postings, posted on the employer's website in all regularly used languages, provided directly to candidates in their language, and made accessible for persons with disabilities.
Statutory Text
(a) Any employer that uses an automated employment decision tool to assess or evaluate an employee or candidate shall notify employees and candidates subject to the tool no less than ten business days before such use: (i) that an automated employment decision tool will be used in connection with the assessment or evaluation of such employee or candidate; (ii) the job qualifications and characteristics that such automated employment decision tool will assess, what employee or candidate data or attributes the tool will use to conduct that assessment, and what kind of outputs the tool will produce as an evaluation of such employee or candidate; (iii) what employee or candidate data is collected for the automated employment decision tool, the source of such data and the employer's data retention policy. Information pursuant to this section shall not be disclosed where such disclosure would violate local, state, or federal law, or interfere with a law enforcement investigation; (iv) the results of the most recent impact assessment of the automated employment decision tool, including any findings of a disparate impact and associated response from the employer, or information about how to access that information if publicly available; (v) information about how an employee or candidate may request an alternative selection process or accommodation that does not involve the use of an automated employment decision tool and details about that alternative process or accommodation process; and (vi) information about how the employee or candidate may: (A) request reevaluation of the employment decision made by the automated employment decision tool in accordance with section one thousand thirteen of this article; and (B) notification of the employee or candidate's right to file a complaint in a civil court in accordance with section seven of this chapter or otherwise exercise the rights described in this chapter. 
(b) The notice required by this section shall be: (i) written in clear and plain language; (ii) included in each job posting or advertisement for each position for which the automated employment decision tool will be used; (iii) posted on the employer's website in any language that the employer regularly uses to communicate with employees; (iv) provided directly to each candidate who applies for a position in the language with which that candidate communicates with the employer; (v) made available in formats that are reasonably accessible to and usable by individuals with disabilities; and (vi) otherwise presented in a manner that ensures the notice clearly and effectively communicates the required information to employees.
S-02 Prohibited Conduct & Output Restrictions · Deployer · Employment · Biometrics
Chapter 149B, § 5(a)
Plain Language
Seven categorical prohibitions apply to ADS use in employment: no use that violates any law; no use that harms employee health or safety including through dangerous productivity quotas; no personality, behavior, belief, or emotional state predictions about employees or candidates; no interference with protected labor activity; no wage deductions for time exercising legal rights; no deviation from the tool's post-impact-assessment specifications; and no facial recognition, gait, or emotion recognition technology. The ban on behavior and personality prediction is notably broad and would restrict common pre-employment assessment tools.
Statutory Text
(a) Notwithstanding the provisions of subdivision one of this section, an employer shall not, alone or in conjunction with an electronic monitoring tool, use an automated decision tool: (i) in such a manner that results in a violation of labor, employment, or civil rights law or any other law of the commonwealth; (ii) in a manner that harms or is likely to harm the health or safety of employees, including by setting productivity quotas in a manner that is likely to cause physical or mental illness or injury; (iii) to make predictions about an employee or candidate for employment's behavior, beliefs, intentions, personality, emotional state, or other characteristic or behavior; (iv) to predict, interfere with, restrain, or coerce employees engaging in activity protected under labor and employment law; (v) to subtract from an employee's wages time spent exercising their legal rights; (vi) in a manner that deviates from the specification of the automated employment decision tool as implemented after the incorporation of any alterations made pursuant to the impact assessment required by subdivision one of this section; or (vii) that involves facial recognition, gait, or emotion recognition technologies.
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Employment
Chapter 149B, § 5(b)-(c)
Plain Language
Employers may not rely primarily on ADS output for consequential employment decisions. Four requirements apply: meaningful human oversight with a qualified internal reviewer (evaluated based on tool complexity, the reviewer's training and experience, and ability to consult experts); actual human review of ADS output with independent judgment; the human must consider non-ADS information; and the employer itself must consider non-ADS information. Additionally, employers may not condition employment consideration on consent to ADS evaluation and may not disadvantage anyone who requests an accommodation. The reviewer competency standard is more detailed than most jurisdictions, providing multi-factor guidance.
Statutory Text
(b) An employer shall not rely primarily on output from an automated decision tool when making hiring, promotion, termination, disciplinary, or compensation decisions. For an employer to satisfy the requirements of this paragraph: (i) An employer must establish meaningful human oversight of such decisions based in whole or in part on the output of automated employment decision tools. In determining whether an internal reviewer employs the requisite knowledge and skill to provide meaningful human oversight, relevant factors include the relative complexity and specialized nature of the automated decision tool, the reviewer's general experience, the reviewer's training and experience in the field, the preparation and study the reviewer is able to give the matter and whether it is feasible to refer the matter to, or associate or consult with, an expert with established competence in the field automated decision tools. (ii) A human decision-maker must actually review any output of an automated employment decision tool and exercise independent judgment in making each such decision; (iii) The human decision-maker must consider information other than automated employment decision tool outputs when making each such decision, such as but not limited to supervisory or managerial evaluations, personnel files, employee work products, or peer reviews; and (iv) An employer shall consider information other than automated employment decision tool outputs when making hiring, promotion, termination, disciplinary, or compensation decisions, such as supervisory or managerial evaluations, personnel files, employee work products, or peer reviews. (c) An employer shall not require employees or candidates to consent to the use of an automated employment decision tool in an employment decision in order to be considered for an employment decision, nor shall an employer discipline or disadvantage an employee or candidate for employment as a result of their request for accommodation.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Deployer · Employment
Chapter 149B, § 6(a)
Plain Language
Employees who exercise independent judgment or hold professional licenses are protected from retaliation when they refuse to follow AI system output, provided four conditions are met: (1) the employee has independent judgment or required licensure; (2) the employee notified their employer that the AI output may cause harm, illegality, or licensing violations, and the employer failed to adjust; (3) the refusal was in good faith based on training/experience; and (4) the urgency of potential harm does not allow time for correction through department action. This is a conditional safe harbor for professional override of AI outputs, similar to whistleblower protections but specifically tailored to AI output refusal.
Statutory Text
a) An employee shall be protected from termination, disciplinary action, retaliation, or other adverse employment action for refusing to follow the output of an artificial intelligence system, automated decision system, algorithm, or other similar technology if the following conditions are met: i) The employee holds independent judgment and discretion in executing their work duties, or the work duties to be performed by the employee require licensure or certification by the commonwealth as a condition of employment, independent accreditation by the employer; ii) The employee has notified a supervisor, manager, or their employer that the output from the artificial intelligence system, automated decision system, algorithm, or other similar technology may, in the employee's professional opinion and/or educational or work related- experience, lead to harm of a natural person, damage to physical property, an illegal action, an action contrary to the licensure or certification requirements of the Federal government, commonwealth, or an applicable private licensing or certifying authority, or an outcome contrary to the goal of the employer, and the employer refused or otherwise failed to adjust the output; iii) The employee has refused to follow the output in good faith and with the knowledge or reasonable belief, based upon training, education, or experience, that the output would cause harm or have an adverse impact; and iv) Due to the urgency of the potential harm or adverse impact, there is not enough time for the output to be corrected through department action.
Other · Employment
Chapter 149B, § 7
Plain Language
Employees may not be penalized for exercising their rights under this chapter. Employer retaliation against employees who complain, assist investigations, or testify is independently unlawful and subject to civil citation under section 27C. Employees subjected to adverse employment actions based on prohibited conduct may file civil actions against the employer and individual officers (president, treasurer, responsible managers), recovering restitution, consequential damages, liquidated damages equal to double the restitution amount, pre- and post-judgment interest, attorney fees, and potentially punitive damages. This section establishes the enforcement mechanism rather than creating a standalone compliance obligation.
Statutory Text
No employee shall be penalized by an employer in any way as a result of any action on the part of an employee to seek the employee's rights under the provisions of this chapter. Any employer who discharges or in any other manner discriminates against any employee because such employee has made a complaint to the attorney general or any other department, agency, or person, or assists the attorney general or department in any investigation under this chapter, or has instituted, or caused to be instituted any proceeding under or related to this chapter, or has testified or is about to testify in any such proceedings, shall have violated this section and shall be punished or shall be subject to a civil citation or order as provided in section 27C. An individual subjected to an adverse employment action based on conduct prohibited by this Act may file a civil action against an employer, as well as the president, treasurer, and any responsible managers of the employer in their individual capacity. If liability is found, the employee shall be entitled to restitution and consequential damages, as well as liquidated damages constituting double the amount of restitution, pre- and post- judgment interest, reasonable attorneys' fees and costs. Where appropriate, a court may also impose punitive damages. Nothing in this section shall limit the availability of other remedies at law or in equity.
PS-01 Government AI Accountability · PS-01.4 · Government · Government System · Automated Decisionmaking
M.G.L. c. 30, § 66
Plain Language
State agencies and their contractors are categorically prohibited from using any automated decision system for three categories of functions — public assistance delivery, functions materially impacting individual rights/safety/welfare, or functions affecting statutory or constitutional rights — unless the specific use is authorized by law. This is an extremely restrictive default-prohibition approach: agencies must affirmatively identify statutory authorization before deploying any ADS in these contexts. The scope is broad enough to cover virtually any consequential government ADS use.
Statutory Text
Any agency or department of the commonwealth, or any entity acting on behalf of an agency or department, shall be prohibited from, directly or indirectly, utilizing or applying any automated decision system in performing any function that: (i) is related to the delivery of any public assistance benefit; (ii) will have a material impact on the rights, civil liberties, safety, or welfare of any individual within the commonwealth; or (iii) affects any statutorily or constitutionally provided right of an individual; unless such utilization or application is specifically authorized in law.
PS-01 Government AI Accountability · PS-01.4 · Government · Government System · Automated Decisionmaking
M.G.L. c. 30B, § 24(a)
Plain Language
State executive branch entities may not procure, purchase, or acquire any service or system utilizing automated decision systems unless the use is specifically authorized by law. This extends the prohibition on government ADS use from operational deployment (Section 66) to the procurement stage — agencies cannot even acquire ADS tools without statutory authorization, creating a procurement-stage gate in addition to the deployment-stage gate.
Statutory Text
a) No executive office, department, division, agency, or commission of the commonwealth shall authorize any procurement, purchase, or acquisition of any service or system utilizing, or relying on, automated decision systems, except where the use of such system is specifically authorized in law.
PS-01 Government AI Accountability · PS-01.2 · Government · Government System · Automated Decisionmaking
M.G.L. c. 30B, § 24(b)-(d)
Plain Language
State agencies that have statutory authorization to use ADS must still conduct comprehensive impact assessments before deployment, every two years thereafter, and before any material system change. Assessments must cover six areas: the system's objectives; its effectiveness in achieving those objectives; its underlying algorithms and training data; testing for accuracy, fairness, bias and discrimination, cybersecurity and privacy risks, public health and safety risks, and foreseeable misuse; sensitive and personal data use and storage; and mechanisms for notifying affected individuals. If an assessment finds discriminatory or biased outcomes, the agency must immediately cease all use of the system and of any information it produced. Assessments must be submitted to the Governor, Senate President, and House Speaker at least 60 days before implementation and published on the agency's website (with limited redaction authority for public safety, individual privacy, or IT security concerns, accompanied by an explanatory statement).
Statutory Text
b) No state agency shall utilize or apply any automated decision system unless the agency, or an entity acting on behalf of such state agency, shall have conducted an impact assessment for the application and use of such automated decision system. Following the first impact assessment, an impact assessment shall be conducted at least once every two years. An impact assessment shall be conducted prior to any material change to the automated decision-making system that may change the outcome or effect of such system. Such impact assessments shall include: i) a description of the objectives of the automated decision system; ii) an evaluation of the ability of the automated decision system to achieve its stated objectives; iii) a description and evaluation of the objectives and development of the automated decision system including: 1) A summary of the underlying algorithms, computational modes, and artificial intelligence tools that are used within the automated decision system; and 2) The design and training data used to develop the automated decision-making process. 
iv) testing for: 1) Accuracy, fairness, bias, and discrimination, and an assessment of whether the use of the automated decision-making system produces discriminatory results on the basis of a consumer's or a class of consumers' actual or perceived race, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, biometric information, source of income, or disability and outlines mitigations for any identified performance differences in outcomes across relevant groups impacted by such use; 2) Any cybersecurity vulnerabilities and privacy risks resulting from the deployment and use of the automated decision-making system, and the development or existence of safeguards to mitigate the risks; 3) Any public health or safety risks resulting from the deployment and use of the automated decision-making system; 4) Any reasonably foreseeable misuse of the automated decision-making system and the development or existence of safeguards against such misuse; v) the extent to which the deployment and use of the automated decision-making system requires the input of sensitive and personal data, how that data is used and stored, and any control users may have over their data; and vi) the notification mechanism or procedure, if any, by which individuals impacted by the utilization of the automated decision-making system may be notified of the use of such automated decision-making system and of the individual's personal data, and informed of their rights and options relating to such use. c) Notwithstanding the provisions of this section or any other law, if an impact assessment finds that the automated decision-making system produces discriminatory or biased outcomes, the state agency shall cease any utilization, application, or function of such automated decision-making system, and of any information produced using that system. 
d) Any impact assessment conducted pursuant to this section shall be submitted to the governor, the president of the senate, and the speaker of the house at least 60 days prior to the implementation of the automated decision-making system that is the subject of such assessment. The impact statement of an automated decision-making system that is approved and utilized, shall be published on the website of the relevant agency. If the state agency makes a determination that the disclosure of any information required in the impact assessment would result in a substantial negative impact on health or safety of the public, infringe upon the privacy rights of individuals, or significantly impact the state agency's ability to protect its information technology, it may redact such information, provided that an explanatory statement on the process by which the state agency made such determination is published along with the redacted impact assessment.