HB-0035
IL · State · USA
● Pending
Proposed Effective Date
2025-01-01
Illinois HB 0035 — Artificial Intelligence Systems Use in Health Insurance Act
Summary

Requires health insurance issuers authorized to do business in Illinois to ensure that adverse consumer outcomes — including denials, reductions, or terminations of health insurance coverage or benefits — do not result solely from the use of AI systems or predictive models. All such AI-informed decisions must be meaningfully reviewed by a human with override authority; when the decision is an adverse determination under the Managed Care Reform and Patient Rights Act, the reviewer must be a clinical peer. Issuers must maintain an AI systems program covering governance, risk management, and internal audit. The Department of Insurance has oversight authority including the power to request documentation on AI governance, third-party AI diligence, and compliance. The Department may also adopt rules for disclosure standards regarding AI use, including notice before AI use, notice after an adverse decision, explanation of how personal information informs decisions, data correction processes, and appeal instructions.

Enforcement & Penalties
Enforcement Authority
Department of Insurance. Enforcement is agency-initiated through investigations and market conduct actions. The Department has authority to review the development, implementation, and use of AI systems or predictive models and their outcomes. The Department may request information and documentation from health insurance issuers and other persons described in subsection (b) of Section 132 of the Illinois Insurance Code, who must comply with such requests. No private right of action is created.
Penalties
The Act does not specify monetary penalties, statutory damages, or civil remedies. Enforcement is through the Department of Insurance's existing regulatory authority, including investigations and market conduct actions under the Illinois Insurance Code.
Who Is Covered
"Health insurance issuer" has the meaning given to that term in Section 5 of the Illinois Health Insurance Portability and Accountability Act. "Health insurance issuer" includes a company offering accident and health insurance under Class 1(b) or 2(a) of Section 4 of the Illinois Insurance Code, a dental service plan corporation, a health maintenance organization, a limited health service organization, or a health services plan corporation transacting or authorized to transact business under the Department's jurisdiction.
Compliance Obligations · 5 obligations
HC-01 Healthcare AI Decision Restrictions · HC-01.1 · HC-01.2 · Deployer · Healthcare
Section 10(b)
Plain Language
Health insurance issuers may not deny, reduce, or terminate health insurance coverage or benefits based solely on the output of an AI system or predictive model. Every such AI-informed adverse decision must be meaningfully reviewed by a human who has authority to override the AI's determination, following procedures the Department of Insurance will establish by rule. When the adverse outcome constitutes an adverse determination under the Managed Care Reform and Patient Rights Act, the human reviewer must be a clinical peer as defined under that Act — i.e., a licensed physician or health professional in the same or similar specialty as the treating provider.
Statutory Text
(b) A health insurance issuer authorized to do business in this State shall not issue an adverse consumer outcome with regard to the denial, reduction, or termination of health insurance coverage or benefits that result solely from the use or application of any AI system or predictive model. Any decision-making process concerning the denial, reduction, or termination of insurance plans or benefits that results from the use of AI systems or predictive models shall be meaningfully reviewed, in accordance with review procedures established by Department rules, by an individual with authority to override the AI systems and the determinations of the AI systems. When an adverse consumer outcome is an adverse determination regulated under the Managed Care Reform and Patient Rights Act, the individual with authority to override the AI systems and the determinations of the AI systems shall be a clinical peer as required and defined under that Act.
R-02 Regulatory Disclosure & Submissions · R-02.2 · Deployer · Healthcare
Section 10(a)
Plain Language
Health insurance issuers must comply with Department of Insurance requests for information and documentation during investigations or market conduct actions. The Department's authority extends to reviewing the development, implementation, and use of AI systems and predictive models, including their outcomes. Required documentation categories include AI governance and risk management protocols, pre-acquisition and pre-utilization diligence on third-party AI systems and data, monitoring and auditing records for third-party tools, and compliance records for the issuer's AI systems program. This is a respond-on-demand obligation — issuers must maintain documentation in a form that can be produced when the Department requests it.
Statutory Text
(a) The Department's regulatory oversight of health insurance coverage includes oversight of the use of AI systems or predictive models to make or support adverse consumer outcomes. The Department's authority in an investigation or market conduct action includes review regarding the development, implementation, and use of AI systems or predictive models and the outcomes from the use of those AI systems or predictive models. The Department may also request other information or documentation relevant to an investigation or market conduct action, and a health insurance issuer or any other person described in subsection (b) of Section 132 of the Illinois Insurance Code must comply with that request. The Department's inquiries may include, but are not limited to, questions regarding any specific model, AI system, or application of a model or AI system. The Department may also make requests for information and documentation relating to AI systems governance, risk management, and use protocols; information and documentation relating to the health insurance issuer's preacquisition and preutilization diligence, monitoring, and auditing of data or AI systems developed or used by a third party; and information and documentation relating to implementation and compliance with the health insurance issuer's AI systems program.
G-01 AI Governance Program & Documentation · G-01.1 · Deployer · Healthcare
Section 20(b)
Plain Language
Health insurance issuers must establish and maintain an AI systems program — defined as their controls and processes for responsible AI use, including governance, risk management, and internal audit functions — that includes policies and procedures ensuring compliance with this Act by all employees, directors, trustees, agents, representatives, and contractors involved in administering coverage. The issuer bears ultimate responsibility for noncompliance regardless of whether a third party performed the noncompliant action. Separately, third parties and other persons are not relieved of liability for failing to cooperate with Department investigations or market conduct actions.
Statutory Text
(b) A health insurance issuer shall ensure that its health insurance coverage is administered in conformity with this Act. The health insurance issuer's AI systems program shall include policies and procedures to ensure such conformity by all employees, directors, trustees, agents, representatives, and persons directly or indirectly contracted to administer the health insurance coverage. The health insurance issuer shall be responsible for any noncompliance under this Act with respect to its health insurance coverage. Nothing in this Section relieves any other person from liability for failure to comply with the Department's investigations or market conduct actions related to a health insurance issuer's compliance with this Act.
HC-01 Healthcare AI Decision Restrictions · HC-01.6 · HC-01.7 · Deployer · Healthcare
Section 15
Plain Language
The Department of Insurance may adopt rules establishing disclosure standards for health insurance issuers' use of AI systems affecting consumers. Potential rule content includes pre-use notice, post-adverse-decision notice, explanation of how personal information informs decisions, a process for correcting inaccurate information, and appeal instructions. This is currently a rulemaking authorization — the specific disclosure obligations will not be operative until the Department promulgates rules. However, issuers should anticipate that future rules may require all of these disclosure elements and should begin developing compliance infrastructure accordingly.
Statutory Text
Section 15. Disclosure of AI system utilization. The Department of Insurance may adopt rules that include standards for the full and fair disclosure of a health insurance issuer's use of AI systems that may impact consumers, that set forth the manner, content, and required disclosures including notice before the use of AI systems, notice after an adverse decision, the way personal information is used to inform decisions, a process for correcting inaccurate information, and instructions for appealing decisions.
Other · Healthcare
Section 20(a)
Plain Language
Health insurance issuers must comply with this Act when making or supporting consumer-impacting decisions through AI systems and machine learning. This provision also confirms that all existing federal and state insurance laws — including unfair trade practices and unfair discrimination laws — continue to apply to AI-driven decisions. This is a general compliance and preservation clause that creates no new independent obligation beyond the Act's other operative sections.
Statutory Text
(a) All health insurance issuers authorized to do business in Illinois shall comply with this Act regarding any decisions impacting consumers that are made or supported by AI systems and machine learning, and must comply with all applicable insurance laws and regulations, including laws addressing unfair trade practices and unfair discrimination. All decisions made and actions taken by authorized health insurance issuers using AI systems must comply with applicable federal and State laws, regulations, and rules.