HB-1421
IN · State · USA
● Pending
Proposed Effective Date
2026-07-01
Indiana House Bill No. 1421 — Ban on Employer Use of Automated Decision Systems
Summary

Regulates employers with 11 or more covered individuals (including state agencies and political subdivisions) that use automated decision systems for employment-related decisions such as hiring, firing, promotion, pay, and scheduling. Prohibits employers from relying exclusively on an automated decision system for any employment decision. Permits use of automated decision system outputs only if the system has undergone predeployment testing for efficacy, anti-discrimination compliance, and NIST AI RMF conformance; is independently tested for bias at least annually with publicly available results; a human with relevant experience independently corroborates the output; and the covered individual receives detailed documentation and dispute/appeal rights. Requires comprehensive pre-use disclosures to all covered individuals. Enforced through a private right of action (with mandatory pre-suit notice to the Department of Labor) and Department of Labor complaint and investigation authority, with statutory damages ranging from $5,000 to $100,000 per violation.

Enforcement & Penalties
Enforcement Authority
Dual enforcement: the Indiana Department of Labor may receive complaints, investigate violations, and require employers to file reports or answers relating to automated decision system use. Private right of action available to any covered individual or labor organization adversely affected by an alleged violation. Before filing suit, the plaintiff must provide written notice to the Department, which then has 60 days to decide whether to intervene. The Department may also adopt rules to implement the chapter.
Penalties
Prevailing plaintiff may recover: (1) actual damages or up to treble damages; (2) statutory damages of $5,000–$20,000 per violation of Sections 10–13, or $10,000–$40,000 for willful or repeated violations of those sections; (3) statutory damages of $5,000–$50,000 per violation of Section 14 (anti-retaliation), or $10,000–$100,000 for willful or repeated violations; (4) injunctive relief; (5) equitable relief; (6) temporary relief, including reinstatement, for retaliation claims; and (7) mandatory reasonable attorney's fees and costs. Statutory damage amounts are adjusted annually by CPI-U beginning in FY 2027. In setting statutory damages, the court considers the nature, seriousness, number, persistence, and duration of the violations, the employer's willfulness, and the employer's financial condition.
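The statutory damages tiers above reduce to a small lookup table. A minimal sketch, assuming a simplified model: the tier amounts come from the summary above, but the single CPI-U multiplier is this sketch's own assumption (the bill only says amounts are adjusted annually by CPI-U beginning FY 2027 and does not prescribe an indexing formula here).

```python
# (low, high) statutory damages per violation, in dollars, keyed by
# section group and whether the violation was willful or repeated.
DAMAGES_TIERS = {
    ("sec10_13", False): (5_000, 20_000),    # Sections 10-13, ordinary
    ("sec10_13", True):  (10_000, 40_000),   # Sections 10-13, willful/repeated
    ("sec14",    False): (5_000, 50_000),    # Section 14 (anti-retaliation)
    ("sec14",    True):  (10_000, 100_000),  # Section 14, willful/repeated
}

def statutory_damages_range(section: str, willful_or_repeated: bool,
                            cpi_u_factor: float = 1.0) -> tuple[int, int]:
    """Return the (min, max) statutory damages per violation.

    `cpi_u_factor` is a hypothetical cumulative CPI-U multiplier
    (1.0 = no adjustment, i.e. before FY 2027); the statute's actual
    indexing mechanics are not reproduced in this summary.
    """
    low, high = DAMAGES_TIERS[(section, willful_or_repeated)]
    return round(low * cpi_u_factor), round(high * cpi_u_factor)
```

For example, a willful Section 14 retaliation violation before any CPI-U adjustment exposes the employer to `statutory_damages_range("sec14", True)`, i.e. $10,000–$100,000 per violation.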
Who Is Covered
"employer" means the following: (1) A sole proprietor, corporation, partnership, limited liability company, or other entity that: (A) employs; or (B) otherwise engages for the performance of work for remuneration; eleven (11) or more covered individuals. (2) A state agency (as defined in IC 22-2-20-5). (3) A political subdivision (as defined in IC 36-1-2-13). (b) The term includes: (1) any person who acts, directly or indirectly, in the interest of an employer in relation to any covered individual performing work for remuneration for the employer; and (2) any successor in interest of an employer. (c) The term does not include a labor organization (as defined in IC 22-6-6-5), other than when the labor organization acts as an employer, or anyone acting in the capacity of an officer or agent of the labor organization.
What Is Covered
"automated decision system" means a system, software, or process, including a system, software, or process derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that: (1) uses computation, in whole or in part, to: (A) determine outcomes; (B) make or aid decisions, including through evaluations, metrics, or scoring; (C) inform policy implementation; or (D) collect data or observations; and (2) is not passive computing infrastructure.
Compliance Obligations · 11 obligations
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-10(1)
Plain Language
Employers are categorically prohibited from relying exclusively on an automated decision system — with no human involvement — to make any employment-related decision affecting a covered individual. This is an absolute prohibition: no amount of predeployment testing, disclosure, or documentation can cure a fully automated employment decision. Every employment decision using an automated decision system must include meaningful human involvement.
Statutory Text
An employer may not: (1) rely exclusively on an automated decision system in making an employment related decision with respect to a covered individual;
H-02 Non-Discrimination & Bias Assessment · H-02.1 · H-02.2 · H-02.7 · H-02.8 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-10(2)(A)-(B)
Plain Language
Before an employer may use any automated decision system output in an employment decision, the system must have completed predeployment testing and validation covering four areas: (i) system efficacy, (ii) compliance with a comprehensive list of federal employment discrimination statutes (Title VII, ADEA, ADA, GINA, EPA, Rehabilitation Act, Pregnant Workers Fairness Act), (iii) absence of discriminatory impact across race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), national origin, age, disability, and genetic information, and (iv) compliance with the NIST AI Risk Management Framework (January 2023) or its successor. Additionally, the system must be independently tested for discriminatory impact or bias at least annually, and the results of that independent testing must be made publicly available. The annual testing requirement is ongoing — not a one-time predeployment check.
Statutory Text
An employer may not: (2) use an automated decision system output in making an employment related decision with respect to a covered individual unless: (A) the automated decision system used to generate the automated decision system output has had predeployment testing and validation with respect to: (i) the efficacy of the system; (ii) the compliance of the system with applicable employment discrimination laws, including Title VII of the Civil Rights Act of 1964 (42 U.S.C. 2000e et seq.), the Age Discrimination in Employment Act of 1967 (29 U.S.C. 621 et seq.), Title I of the Americans with Disabilities Act of 1990 (42 U.S.C. 12111 et seq.), Title II of the Genetic Information Nondiscrimination Act of 2008 (42 U.S.C. 2000ff et seq.), Section 6(d) of the Fair Labor Standards Act of 1938 (29 U.S.C. 206(d)), Sections 501 and 505 of the Rehabilitation Act of 1973 (29 U.S.C. 791 and 29 U.S.C. 793), and the Pregnant Workers Fairness Act (42 U.S.C. 2000gg); (iii) the lack of any potential discriminatory impact of the system, including discriminatory impact based on race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, or disability, and genetic information (including family medical history); and (iv) the compliance of the system with the Artificial Intelligence Risk Management Framework released by the National Institute of Standards and Technology on January 26, 2023, or a successor framework; (B) the automated decision system is, not less than annually, independently tested for discriminatory impact described in clause (A)(iii) or potential biases and the results of the test are made publicly available;
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-10(2)(E)
Plain Language
As a condition of using automated decision system output in any employment decision, the employer must have a human with appropriate and relevant experience independently corroborate the output through meaningful oversight. This is not a rubber-stamp review — the human must have subject-matter expertise relevant to the employment decision and must exercise independent judgment. The statute separately requires that the appeal reviewer (Section 10(2)(G)(ii)) be a different human than the one performing corroboration, creating a two-person minimum for human oversight.
Statutory Text
the employer independently corroborates, via meaningful oversight by a human with appropriate and relevant experience, the automated decision system output;
H-01 Human Oversight of Automated Decisions · H-01.1 · H-01.2 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-10(2)(F)
Plain Language
Within seven days after making an employment-related decision using an automated decision system output, the employer must provide the affected covered individual — at no cost — with comprehensive, plain-language documentation covering: (1) a description of the automated decision system, (2) a plain-language description and explanation of the input data used, plus a machine-readable copy of that data, (3) how the output was used in the decision, and (4) the reasoning for using the output. This is an individualized post-decision disclosure — not a general policy notice. The documentation must be 'full, accessible, and meaningful,' which likely requires more than boilerplate language.
Statutory Text
not later than seven (7) days after making the employment related decision, the employer provides full, accessible, and meaningful documentation in plain language and at no cost to the covered individual on the automated decision system output, including: (i) a description of the automated decision system used to generate the automated decision system output; (ii) a description and explanation, in plain language, of the input date [sic] to the automated decision system used to generate the automated decision system output and a machine readable copy of the data; (iii) a description and explanation of how the automated decision system output was used in making the employment related decision; and (iv) the reasoning for the use of the automated decision system output in the employment related decision;
H-01 Human Oversight of Automated Decisions · H-01.4 · H-01.5 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-10(2)(G)
Plain Language
After receiving the post-decision documentation, the covered individual must be allowed to (1) dispute the automated decision system output itself to a qualified human, through a process that is accessible, equitable, and not unreasonably burdensome, and (2) separately appeal the employment-related decision to a different qualified human — one who was not the person who corroborated the output under Section 10(2)(E). This creates two distinct rights: a challenge to the AI output and an appeal of the ultimate decision, with the appeal reviewer required to be independent from the initial corroboration step.
Statutory Text
the employer allows the covered individual to, after receiving the documentation described in clause (F): (i) dispute, in a manner that is accessible, equitable, and does not pose an unreasonable burden on the covered individual, the automated decision system output to a human with appropriate and relevant experience; and (ii) appeal the employment related decision to a human with appropriate and relevant experience who is not the human for purposes of the corroboration under clause (E).
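Taken together, Section 10(1) and the Section 10(2) clauses quoted above form a conjunctive gate: every condition must hold before an automated decision system output may be used, and the appeal reviewer must be a different human than the corroborator. A minimal boolean sketch of that gate; the field names are this sketch's own, not statutory terms, and only the clauses reproduced in this summary are modeled (any clauses of Section 10(2) not quoted here are omitted).

```python
from dataclasses import dataclass

@dataclass
class AdsUseRecord:
    """Illustrative checklist for the quoted clauses of IC 22-5-10.4-10(2)."""
    predeployment_tested: bool             # 10(2)(A): efficacy, discrimination-law
                                           # compliance, impact testing, NIST AI RMF
    annual_independent_bias_test: bool     # 10(2)(B): yearly test, results public
    human_corroborated: bool               # 10(2)(E): independent human corroboration
    documentation_within_7_days: bool      # 10(2)(F): post-decision documentation
    dispute_and_appeal_offered: bool       # 10(2)(G): dispute + appeal rights
    appeal_reviewer_is_corroborator: bool  # 10(2)(G)(ii): must be False

def may_use_ads_output(r: AdsUseRecord) -> bool:
    """All conditions are conjunctive; failing any one bars use of the output."""
    return (r.predeployment_tested
            and r.annual_independent_bias_test
            and r.human_corroborated
            and r.documentation_within_7_days
            and r.dispute_and_appeal_offered
            and not r.appeal_reviewer_is_corroborator)
```

Note that no combination of the other conditions cures a `True` in the last field: reusing the corroborating human as the appeal reviewer fails the gate on its own, mirroring the two-person minimum described above.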
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-11(a)-(c)
Plain Language
Employers must provide a comprehensive advance disclosure to every covered individual describing: the fact that automated decision system outputs are or will be used, a detailed description of the system (data types collected, characteristics measured, job-relatedness of those characteristics, measurement methodology, and plain-language interpretation guidance), the identity of the system operator, how the output factors into employment decisions, and how to dispute or appeal. For employees hired on or before July 1, 2026, the disclosure must be provided by August 1, 2026. For individuals hired after July 1, 2026 — including candidates — the disclosure must be provided before hiring. Any significant change to the disclosed information, or significant new information becoming available, triggers a 30-day update obligation. This is a pre-decision disclosure obligation distinct from the post-decision documentation required under Section 10(2)(F).
Statutory Text
Sec. 11. (a) An employer that uses or intends to use an automated decision system output in making an employment related decision with respect to a covered individual shall, in accordance with subsections (b) and (c), disclose to the covered individual: (1) that the employer uses or intends to use an automated decision system output in making an employment related decision; (2) a description and explanation of the automated decision system used or intended to be used to generate the automated decision system output, including: (A) the types of data collected or intended to be collected as inputs to the automated decision system and the circumstances of the collection; (B) the characteristics that the automated decision system measures or is intended to measure, such as the knowledge, skills, or abilities of the covered individual; (C) how the characteristics relate or would relate to any function required for the work or potential work of the covered individual; (D) how the system measures or is intended to measure the characteristics; and (E) how the covered individual can interpret the automated decision system output in plain language; (3) the identity of the covered individual or entity that operates the automated decision system that provides the automated decision system output; (4) how the employer uses or intends to use the automated decision system output in making the employment related decision; and (5) how the covered individual may dispute or appeal an employment related decision made with respect to the covered individual using an automated decision system output. (b) An employer shall provide the disclosures required by subsection (a) to a covered individual as follows: (1) In the case of a covered individual who was hired on or before July 1, 2026, the disclosure must be provided to the covered individual not later than August 1, 2026. 
(2) In the case of a covered individual who is hired after July 1, 2026, the disclosure must be provided to the covered individual before hiring. (c) Not later than thirty (30) days after: (1) any information provided by an employer to a covered individual through a disclosure required by subsection (a) significantly changes; or (2) any significant new information required to be provided in the disclosure becomes available; the employer shall provide the covered individual with an updated disclosure.
S-01 AI System Safety Program · S-01.5 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-10(2)(A)(iv)
Plain Language
As a precondition to using automated decision system output in any employment decision, the employer must validate the system's compliance with the NIST AI Risk Management Framework (version released January 26, 2023) or any successor framework. This is a mandatory compliance requirement, not a safe harbor — the NIST AI RMF is referenced as a required baseline rather than an optional benchmark. Employers must be prepared to demonstrate this compliance as part of their predeployment testing documentation.
Statutory Text
the compliance of the system with the Artificial Intelligence Risk Management Framework released by the National Institute of Standards and Technology on January 26, 2023, or a successor framework;
Other · Employment · Automated Decisionmaking
IC 22-5-10.4-12
Plain Language
Employers must train every person or entity that operates the automated decision system or uses its outputs. The required curriculum lists eight items covering seven distinct topics: input data, the appeals process, potential biases, system limitations, potential adverse effects on covered individuals (enumerated twice in the bill, likely a drafting error), potential errors or problems, and examples of inappropriate uses. The training must encompass both the employer's internal staff and any external vendor operating the system on the employer's behalf. The statute does not specify training frequency, format, or certification requirements.
Statutory Text
Sec. 12. An employer that uses or intends to use an automated decision system output in making an employment related decision with respect to a covered individual shall train any individual or entity that operates the automated decision system or uses the automated decision system output on: (1) the input information used by the automated decision system; (2) the appeals process for the automated decision system output; (3) potential biases in automated decision systems; (4) any limitations of the automated decision system; (5) any potential adverse effects to covered individuals due to the automated decision system; (6) any potential adverse effects to covered individuals due to the automated decision system; (7) any potential errors or problems related to the automated decision system; and (8) examples of inappropriate uses of the automated decision system.
D-01 Automated Processing Rights & Data Controls · D-01.3 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-13
Plain Language
When an employer uses an automated decision system to manage a covered individual on an ongoing basis (e.g., algorithmic scheduling, performance monitoring, task assignment), the individual has the right to opt out entirely and be managed by a human manager who has authority to make employment decisions. This is broader than the dispute/appeal rights in Section 10(2)(G) — it applies to ongoing algorithmic management, not just discrete employment decisions. The employer must ensure a human alternative manager is available and empowered to make all employment-related decisions for any individual who exercises this opt-out right.
Statutory Text
Sec. 13. An employer that manages a covered individual through an automated decision system shall allow the covered individual to: (1) opt out of the management through the automated decision system; and (2) be managed through a human manager who is able to make employment related decisions with respect to the covered individual.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-14
Plain Language
Employers are prohibited from discriminating or retaliating — including through intimidation, threats, coercion, or harassment — against any covered individual for exercising rights under this chapter or for filing complaints, seeking assistance, participating in proceedings, providing information, or testifying (or being about to testify) in connection with this chapter. The protection extends to individuals acting at the request of the covered individual. This also covers 'worker privacy related concerns' raised with government entities or worker representatives, broadening the anti-retaliation protection beyond violations of this specific chapter. Violations carry enhanced statutory damages ($5,000–$50,000 per violation, or $10,000–$100,000 for willful/repeated violations) and temporary relief including reinstatement.
Statutory Text
Sec. 14. An employer may not discriminate or retaliate, including through intimidation, threats, coercion, or harassment, against any covered individual: (1) for exercising or attempting to exercise any right provided under this chapter; or (2) because the covered individual or another individual acting at the request of the covered individual has: (A) filed a written or oral complaint to the employer or a federal, state, or local government entity of a violation of this chapter; (B) sought assistance or intervention with respect to a worker privacy related concern from the employer, a federal, state, or local government, or a worker representative; (C) instituted, caused to be instituted, or otherwise participated in any inquiry or proceeding under this chapter; (D) given, or is about to give, any information in connection with any inquiry or proceeding relating to any right provided under this chapter; or (E) testified, or is about to testify, in any inquiry or proceeding relating to any right provided under this chapter.
R-02 Regulatory Disclosure & Submissions · R-02.2 · Deployer · Employment · Automated Decisionmaking
IC 22-5-10.4-15
Plain Language
The Department of Labor has broad investigative and reporting authority. It may receive complaints, investigate potential violations, and require employers to file annual or special reports — or answer specific written questions — about their use of automated decision systems for employment decisions. When the Department requires a report, the employer must comply within the manner and timeframe the Department specifies. Separately, employers have a standing recordkeeping obligation: they must maintain, preserve, and make available to the Department all records pertaining to compliance with this chapter. This recordkeeping duty is ongoing and not contingent on a Department request.
Statutory Text
Sec. 15. (a) The department may do the following: (1) Receive complaints regarding alleged violations of this chapter. (2) Investigate any facts, conditions, practices, or matters as the department deems necessary or appropriate to determine whether an employer has violated this chapter. (3) Require an employer to file with the department, on a form prescribed by the department, annual or special reports or answers in writing to specific questions relating to the use of an automated decision system for employment related decisions. (b) If the department requires an employer to file a report or answers under subsection (a)(3), the employer shall file the report or answers in the manner and time period required by the department. (c) An employer shall maintain, keep, preserve, and make available to the department records pertaining to compliance with this chapter.