HB-4980
IL · State · USA
● Pending
Proposed Effective Date
2026-01-01
Illinois HB 4980 — Meaningful Human Control of Artificial Intelligence Act
Summary

Imposes obligations on public employers (including contractors and subcontractors) that use automated decision-making systems in functions affecting public assistance programs, employee rights, or employee welfare. Requires meaningful and continuing human review of all such systems; pre-deployment and biennial impact assessments signed by an independent auditor; employee notification and appeals processes; and publication of impact assessments. Prohibits the use of automated systems for behavioral or emotional predictions about employees, wage deductions for exercising legal rights, employment decisions without human review, and facial, gait, or emotion recognition. Protects employees from retaliation for refusing to follow AI outputs. Enforced by the Department of Labor with a $5,000 per-violation civil penalty, plus a private right of action for lost compensation with an equal amount in liquidated damages, compensatory damages and up to $500 per violation, and attorney's fees.

Enforcement & Penalties
Enforcement Authority
Department of Labor has primary enforcement authority, including power to investigate, issue cease-and-desist orders, conduct inspections, compel testimony via subpoena, and assess civil penalties. Any interested party (employee or exclusive bargaining representative) may file a complaint with the Department within 180 days of an alleged violation; the Department must notify the employer within 120 days. Investigations may include public hearings. Separately, any interested party or aggrieved employee may bring a private action in circuit court without exhausting administrative remedies. Class actions are permitted on behalf of similarly situated interested parties.
Penalties
Administrative enforcement: $5,000 per violation payable to the Department, plus affirmative relief including rehiring, reinstatement, and back pay. Private right of action: (i) lost wages, salary, employment benefits, or other compensation denied or lost plus an equal amount in liquidated damages; (ii) compensatory damages and up to $500 for each violation; (iii) in the case of unlawful retaliation, all legal or equitable relief as may be appropriate; and (iv) attorney's fees and costs. Statutory per-violation damages do not require proof of actual monetary harm.
Who Is Covered
"Employer" means a public body or any entity acting on behalf of a public body, including, but not limited to, contractors and subcontractors.
"Employee" means any person employed by a public body, State agency, or any entity acting on behalf of a State agency, including, but not limited to, contractors and subcontractors.
"Interested party" means an employee with an interest in compliance with this Act, or an exclusive bargaining representative of an employee.
What Is Covered
"Automated decision-making system" means any software that uses algorithms, computational models, or artificial intelligence techniques, or a combination thereof, to automate, support, or replace human decision-making, without any meaningful human review, including, without limitation, systems that process data and apply predefined rules or machine learning algorithms to analyze the data and generate conclusions, recommendations, outcomes, assumptions, projections, or predictions. "Automated decision-making system" does not include any software used primarily for basic computerized processes, such as calculators, spellcheck tools, autocorrect functions, spreadsheets, electronic communications, or any tool that relates only to internal management affairs, such as inventory control and ordering or processing payments, that does not adversely affect the rights, liberties, benefits, safety, or welfare of any individual in this State.
Compliance Obligations (11 obligations)
H-01 Human Oversight of Automated Decisions · H-01.6 · Government · Government System · Employment · Automated Decisionmaking
Section 10(a)
Plain Language
Public employers may not use, procure, or acquire any automated decision-making system for functions related to public assistance administration, employee rights, civil liberties, safety, or welfare without meaningful and continuing human review. The human reviewer must understand the system's risks and limitations, be trained on the system, have actual authority to intervene and override outputs (including rejecting uncorroborated outputs), and have adequate time and resources. This is not a one-time gate — human review must be continuing throughout the system's operation. The obligation covers both direct use and indirect use through contractors and subcontractors.
Statutory Text
(a) An employer shall not use or apply, or authorize any procurement, purchase, or acquisition of any service or system using or relying on any automated decision-making system, directly or indirectly, without meaningful and continuing human review when performing any function that: (1) is related to the administration of any public assistance program; (2) will have an adverse impact on the rights, civil liberties, safety, or welfare of any employee in this State; or (3) affects any statutorily or constitutionally provided rights of an employee.
H-01 Human Oversight of Automated Decisions · H-01.3 · H-01.4 · H-01.5 · Government · Government System · Employment · Automated Decisionmaking
Section 10(b)
Plain Language
Before or at the time of any automated decision affecting an employee under a covered function, the employer must: (1) notify the affected employee that the decision was made using an automated decision-making system; (2) provide an appeals process for employees directly impacted by such decisions; and (3) provide the opportunity for an alternative human review by an individual working for or on behalf of the employer, independent of the automated system. All three requirements are prerequisites to use — the system may not be used without them in place. The alternative review must be by a human who is independent of the automated system, meaning they cannot simply rubber-stamp the system's output.
Statutory Text
(b) An employer shall not use or apply any automated decision-making system, directly or indirectly, to perform any function described in subsection (a) without providing: (1) a notice to any affected employee no later than the time a decision is issued to that employee that a decision concerning the employee was made using an automated decision-making system; (2) an appeals process for decisions made by automated decision-making system in which an employee is impacted as a direct result of the use of the automated decision-making system; and (3) the opportunity for an affected employee to have an appropriate alternative review, by an individual working for or on behalf of the employer with respect to the decision, independent of the automated decision-making system.
S-02 Prohibited Conduct & Output Restrictions · Government · Government System · Employment · Automated Decisionmaking · Biometrics
Section 10(c)
Plain Language
Public employers face four categorical prohibitions on automated decision-making system use: (1) predicting employees' or candidates' behavior, beliefs, intentions, personality, or emotional state; (2) automatically deducting wages for time spent exercising legal rights; (3) using such systems for any employment decision — including hiring, firing, promotion, discipline, performance evaluation, work assignment, productivity requirements, and workplace safety — for employees, candidates, independent contractors, subcontractors, or interns; and (4) any use involving facial recognition, gait recognition, or emotion recognition. These are outright prohibitions, not disclosure-triggered obligations. Note that prohibition (3) is extremely broad — it effectively bars automated decision-making systems from the entire employment lifecycle without the meaningful human review required by Section 10(a).
Statutory Text
(c) An employer shall not use or apply any automated decision-making system, directly or indirectly: (1) to make predictions about an employee's or employment candidate's behavior, beliefs, intentions, personality, emotional state, or other characteristics or behaviors; (2) to subtract from an employee's wages for time spent exercising the employee's legal rights; (3) in relation to performance evaluation, hiring, recruitment, discipline, promotion, termination, duties, assignment of work, access to work opportunities, productivity requirements, workplace health and safety, or other terms or conditions of employment for any persons classified as employees, candidates for employment, independent contractors, subcontractors, or interns; or (4) that involves facial recognition, gait recognition, or emotion recognition technologies.
D-01 Automated Processing Rights & Data Controls · D-01.1 · D-01.2 · Government · Government System · Employment · Automated Decisionmaking
Section 10(f)
Plain Language
Whenever an automated decision-making system collects data about employees, both the affected employees and their exclusive bargaining representatives have the right to view the data collected. This is a data access right — employees can see what data the system has gathered about them. The right extends to bargaining representatives, enabling union oversight of data collection practices. The statute does not specify a mechanism or timeframe for requests, but the right is unconditional whenever data collection is occurring.
Statutory Text
(f) If an automated decision-making system is collecting employee data, employees and their exclusive bargaining representatives have a right to view the data collected by the automated decision-making system.
Other · Government System · Employment · Automated Decisionmaking
Section 10(e)
Plain Language
Before procuring, purchasing, or using any automated decision-making system, employers must give prior notice to any labor organization and negotiate with exclusive bargaining representatives of potentially affected employees. Additionally, the system's deployment must not result in any employee displacement (including reductions in hours, wages, or benefits), transfer of existing or future employee duties to the automated system, or any negative impact on employee rights, benefits, civil service status, or collective bargaining membership. This is both a mandatory bargaining obligation and a substantive workforce-protection restriction — the system simply may not cause workforce reduction or displacement.
Statutory Text
(e) The procurement, purchase, acquisition, or use of an automated decision-making system shall not occur without prior notice to a labor organization and negotiations between the employer and any exclusive representatives of potentially affected employees and shall not result in: (1) discharge, displacement, or loss of position, including partial displacement, such as a reduction in hours, wages, or other employment benefits; (2) transfer of existing duties and functions currently performed by employees to an automated decision-making system; (3) transfer of future duties and functions ordinarily performed by employees to an automated decision-making system; or (4) any negative impact on the rights, benefits, and privileges of all existing employees, including terms and conditions of employment, civil service status, and collective bargaining unit membership, which shall be preserved and protected.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · H-02.2 · H-02.3 · H-02.5 · H-02.6 · Government · Government System · Employment · Automated Decisionmaking
Section 15(a)-(b)
Plain Language
Before deploying any automated decision-making system, the employer must complete an initial impact assessment at least 30 days prior to implementation. The assessment must be signed by both the human reviewer(s) responsible for meaningful human review and a qualified independent auditor who has had no involvement with, employment relationship with, or financial interest in the system's developer or deployer within the preceding five years. Subsequent assessments are required at least every two years and before any material change. Each assessment must include, in plain language: (1) system objectives and effectiveness evaluation; (2) a description of algorithms, AI tools, design, and training; (3) testing across seven risk categories — disparate impact on protected characteristics, accessibility, privacy and job quality, cybersecurity, public health/safety, foreseeable misuse, and sensitive data handling; and (4) an employee notification mechanism. The independent auditor conflict-of-interest bar is strict — a five-year lookback covering development involvement, employment, and direct or material indirect financial interest.
Statutory Text
(a) An employer seeking to use or apply an automated decision-making system permitted under Section 10 shall conduct an initial impact assessment, 30 days prior to implementation of the automated decision-making system, bearing the signature of: (1) one or more individuals responsible for meaningful human review of the system; and (2) an independent auditor. A person shall not be an independent auditor under this subsection if, at any point in the 5 years preceding the impact assessment, that person: (i) was involved in using, developing, offering, licensing, or deploying the automated decision-making system under review; (ii) had an employment relationship with a developer or deployer that uses, offers, or licenses the automated decision-making system under review; or (iii) had a direct or material indirect financial interest in a developer or deployer that uses, offers, or licenses the automated decision-making system under review. (b) Following the initial impact assessment, additional impact assessments shall be conducted at least once every 2 years and prior to any material changes to the automated decision-making system. 
Each impact assessment shall include, in plain language: (1) a description of the objectives of the automated decision-making system; (2) an evaluation of the system's ability to achieve those objectives; (3) a description and evaluation of the algorithms, computational models, and artificial intelligence tools used, including: (A) a summary of underlying algorithms and artificial intelligence tools; and (B) a description of the design and training to be used; (4) testing for: (A) disparate impact or discrimination based on protected characteristics, including, but not limited to discriminating against, persons based on their race, color, religious creed, national origin, sex, disability or perceived disability, gender identity, sexual orientation, genetic information, pregnancy or a condition related to pregnancy, ancestry, or status as a veteran and any actions to mitigate any impacts; (B) accessibility limitations for persons with disabilities; (C) privacy and job quality impacts, including wages, hours, and conditions and safeguards; (D) cybersecurity vulnerabilities and safeguards; (E) public health or safety risks; (F) foreseeable misuse and safeguards; and (G) use, storage, and control of sensitive or personal data; and (5) a notification mechanism for employees impacted by the use of the automated decision-making system.
H-02 Non-Discrimination & Bias Assessment · Government · Government System · Employment · Automated Decisionmaking
Section 15(c)
Plain Language
If an impact assessment reveals the automated system produces discriminatory, biased, or inaccurate outcomes — or fails any of the employee notice, appeals, and alternative review requirements of Section 10(b) — the employer must immediately halt all use of the system and all reliance on its outputs. The employer must also take all steps necessary to remedy the identified harms. This is a mandatory shutdown obligation — there is no cure period or mitigation alternative. The system cannot resume until the deficiencies are resolved.
Statutory Text
(c) If an impact assessment finds that an automated decision-making system produces discriminatory, biased, or inaccurate outcomes or fails to meet or negatively impacts any of the measures described in subsection (b) of Section 10, the employer shall immediately cease any use or function of that system and of any information produced by it, and shall take all steps necessary to remedy the discriminatory, biased or inaccurate outcomes produced by the automated decision-making system.
G-02 Public Transparency & Documentation · G-02.4 · Government · Government System · Employment · Automated Decisionmaking
Section 15(d)-(e)
Plain Language
Employers must notify affected employees and exclusive bargaining representatives of the results of each impact assessment and provide a copy of the assessment upon request. Additionally, each impact assessment must be published on the employer's website. Publication is subject to redaction limitations under Section 20, which permits redaction where disclosure would substantially harm public health or safety, infringe privacy, impair cybersecurity, or reveal security-related technology details — but any redaction must be accompanied by a published explanatory statement.
Statutory Text
(d) The employer shall notify affected employees and any exclusive bargaining representative, the results of each impact assessment, and provide a copy of the impact assessment upon request. (e) Each impact assessment shall be published on the employer's website, subject to the limitations set forth in Section 20.
R-02 Regulatory Disclosure & Submissions · R-02.1 · Government · Government System · Employment · Automated Decisionmaking
Section 20(a)-(c)
Plain Language
State agencies must submit each impact assessment to the Governor and General Assembly at least 30 days before deploying the automated system. Other public bodies must submit the assessment to their director or governing body leadership on the same 30-day pre-implementation timeline. Employers may redact information from the assessment under two circumstances: (1) where disclosure would substantially harm public health/safety, infringe privacy, or impair IT/operational security; or (2) where the assessment covers security, fraud detection, or anti-harassment technology. In either case, the redacted assessment must be published alongside an explanatory statement describing the redaction rationale.
Statutory Text
(a) Each impact assessment conducted by a State agency under this Act shall be submitted to the Governor and the General Assembly at least 30 days prior to implementation of the automated decision-making system that is the subject of the assessment. Each impact assessment conducted by any other public body under this Act shall be submitted to the director of the public body or the executive officers or primary administrator of the relevant governing body at least 30 days prior to implementation of the automated decision-making system that is the subject of the assessment. (b) If the employer determines that disclosure of any information in the impact assessment would result in a substantial negative impact on public health or safety, infringe upon privacy rights, or significantly impair the employer's ability to protect its information technology or operational assets, the information may be redacted, if an explanatory statement describing the determination process for redaction is published along with the redacted assessment. (c) If the impact assessment covers technology used to prevent, detect, protect against, or respond to security incidents, identity theft, fraud, harassment, or other illegal activity, the employer may redact related information, if an explanatory statement describing the determination process for redaction is published along with the redacted assessment.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Government · Government System · Employment · Automated Decisionmaking
Section 25
Plain Language
Employees are protected from termination, discipline, retaliation, or any adverse employment action for refusing to follow an automated system's output when five conditions are met: (1) the employee's role involves independent judgment or requires state licensure/certification; (2) the employee notified a supervisor that the output may cause harm, illegality, or a bad outcome and the employer failed to correct it; (3) the employee is engaging in concerted activity for mutual aid and protection; (4) the refusal is in good faith based on training, education, or experience; and (5) urgency prevented waiting for a correction. Note that the statute uses 'and' connecting all five conditions — a strict reading requires all five to be satisfied simultaneously for protection to apply, which significantly narrows the anti-retaliation shield.
Statutory Text
An employee shall be protected from termination, disciplinary action, retaliation, or other adverse employment action for refusing to follow the output of an automated decision-making system if: (1) the employee exercises independent judgment and discretion in the employee's duties, or the employee's duties require State licensure, certification, or accreditation; (2) the employee notifies a supervisor or manager that the system's output may, in the employee's professional opinion, lead to harm, illegality, or an outcome contrary to the employer's goals, and the employer fails to correct the output; (3) the employee is engaging in concerted activity for the purpose of mutual aid and protection; (4) the employee refuses to follow the output in good faith based on training, education, or experience; and (5) due to urgency, there is insufficient time for correction.
H-01 Human Oversight of Automated Decisions · Government · Government System · Employment · Automated Decisionmaking
Section 10(d)
Plain Language
The deployment of an automated decision-making system must not diminish existing employee rights under collective bargaining agreements or alter existing representational or bargaining relationships between employers and labor organizations. This is a preservation clause — it creates no new obligation but confirms that AI adoption cannot be used to circumvent existing labor agreements.
Statutory Text
(d) The use of an automated decision-making system shall not affect: (1) existing rights of employees covered by a collective bargaining agreement; or (2) existing representational relationships among labor organizations or bargaining relationships between an employer and a labor organization.