HB-1915
OK · State · USA
● Pre-filed
Proposed Effective Date
2025-11-01
Oklahoma HB 1915 — An Act relating to artificial intelligence (AI) devices in health care
Summary

Imposes governance, safety, and oversight obligations on hospitals, physician practices, and other health care facilities (deployers) that implement AI or machine learning-enabled medical devices for patient care. Requires that AI devices be used exclusively by qualified end-users (licensed physicians with specific training), who must review and validate all AI-generated data before patient care decisions and retain authority to override AI outputs. Deployers must establish an AI governance group, maintain a device inventory, implement a Quality Assurance Program, conduct regular performance evaluations and risk assessments, document use cases and training procedures, and continuously monitor device performance. Enforcement is by the State Department of Health, which has rulemaking authority and may set penalties for noncompliance. No private right of action is created.

Enforcement & Penalties
Enforcement Authority
The State Department of Health has authority to enforce the act and to make rules to enforce it. Enforcement is agency-initiated. No private right of action is created. No cure period or safe harbor is specified.
Penalties
Noncompliance will result in penalties set by the State Department of Health. The statute does not specify penalty amounts, ranges, or types of remedies. The Department is authorized to set penalties by rule.
Who Is Covered
"Deployer" means a hospital, physician practice, or other health care facility responsible for implementing an AI device for patient care purposes.
"Qualified end-user" means a user of an AI device that is a licensed physician with the necessary qualifications and training to independently provide the same diagnostic, prognostic, or therapeutic procedure without the aid of the AI device, and who possesses specific qualifications and training in the use of the AI device, including the ability to assess the validity of its output.
What Is Covered
"Artificial intelligence (AI) device" or "machine learning-enabled device" means a medical device as defined by Section 201(h)(1) of the Federal Food, Drug, and Cosmetic Act (FD&C Act) that includes a machine-based function that, based on training data, infers from the input it receives how to generate outputs that enhance or support a medical diagnosis, prognosis, or treatment;
Compliance Obligations · 13 obligations
Other · Healthcare
63 O.S. § 5502(A)
Plain Language
All AI/ML-enabled medical devices used in healthcare settings must comply with FDA regulations and relevant federal guidance on AI/ML software medical devices. This provision does not create a new state-law obligation but rather cross-references and incorporates existing federal requirements.
Statutory Text
A. All artificial intelligence (AI) devices or machine learning-enabled devices used in health care settings that meet the definition of a medical device under Section 201(h)(1) of the Federal Food, Drug, and Cosmetic Act (FD&C Act) shall be deployed and utilized in accordance with federal regulations established by the U.S. Food and Drug Administration (FDA) and other federal agencies, including relevant guidance on AI or machine learning-enabled software medical devices.
HC-02 AI in Licensed Professional Practice Restrictions · HC-02.1 · Deployer · Professional · Healthcare
63 O.S. § 5502(B)
Plain Language
AI medical devices may only be used by licensed physicians who (1) are independently qualified to perform the same diagnostic, prognostic, or therapeutic procedure without the AI device, and (2) have specific training in the AI device's use, including the ability to assess output validity. No other staff — nurses, technicians, or unlicensed personnel — may operate the device. Deployers must ensure this exclusivity is maintained in practice.
Statutory Text
B. An AI device shall be used exclusively by a qualified end-user.
S-01 AI System Safety Program · S-01.5 · Deployer · Healthcare
63 O.S. § 5502(C)
Plain Language
Deployers (hospitals, physician practices, and other healthcare facilities) must implement and maintain a formal Quality Assurance Program to ensure AI devices are used safely, effectively, and in compliance with the act. This is a standing programmatic obligation — not a one-time setup — that requires ongoing maintenance. The specific components of the QA Program are detailed in Section 4 of the act (63 O.S. § 5504).
Statutory Text
C. Deployers shall implement and maintain a Quality Assurance Program, as outlined in Section 4 of this act, to ensure the safe, effective, and compliant use of AI devices in patient care.
HC-02 AI in Licensed Professional Practice Restrictions · HC-02.1 · Professional · Deployer · Healthcare
63 O.S. § 5503(A)
Plain Language
Before any patient care decision is made based on AI device output, a qualified end-user (licensed physician with appropriate training) must review and validate the AI-generated data for accuracy. This review must follow the deployer's documented policies and procedures. No AI output may be acted upon in patient care without this human validation step.
Statutory Text
A. All relevant artificial intelligence (AI) device-generated data shall be reviewed for accuracy and validated by a qualified end-user in accordance with deployer-documented policies and procedures before patient care decisions are rendered.
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Professional · Healthcare
63 O.S. § 5503(B)
Plain Language
The qualified end-user must have actual authority to amend or overrule any output from the AI device based on their independent professional judgment. Deployers and other entities are prohibited from pressuring the physician to accept, ignore, or alter their clinical judgment regarding AI outputs. This is not merely a right to override — it is an affirmative prohibition on institutional interference with physician judgment regarding AI device outputs.
Statutory Text
B. The qualified end-user of the AI device shall retain authority to amend or overrule outputs from the device based on their professional judgment, and without pressure from the deployer or any other entity to ignore or alter professional judgement.
S-01 AI System Safety Program · S-01.4 · S-01.7 · Deployer · Healthcare
63 O.S. § 5503(C)
Plain Language
Deployers must conduct and document regular performance evaluations and risk assessments of each AI device. The evaluations should incorporate feedback solicited from qualified end-users and, when feasible, participation in national specialty society AI assessment registries. When performance concerns are identified, deployers must take corrective action to mitigate patient risk. This is an ongoing obligation — not a one-time pre-deployment assessment — covering the entire lifecycle of the deployed device.
Statutory Text
C. Deployers of an AI device shall conduct and document regular performance evaluations and risk assessments of the device. Such evaluations and assessments should be informed by invited feedback from qualified end-users and, when applicable, participation in national specialty society-administered AI assessment registries. Whenever AI device performance concerns are identified, deployers shall implement appropriate corrective actions to mitigate risk to patients.
G-01 AI Governance Program & Documentation · G-01.3 · G-01.4 · Deployer · Healthcare
63 O.S. § 5503(D)
Plain Language
All documentation related to AI device use must comply with state and federal medical record-keeping requirements and be accessible for regulatory review by the State Department of Health. In addition, deployers must specifically track and document instances where a qualified end-user overrides or disagrees with AI device outputs, maintaining a summary report that includes the frequency and nature of overrides and the percentage or number of such disagreements. This creates both a general documentation compliance obligation and a specific AI-override tracking obligation.
Statutory Text
D. All documentation shall comply with state and federal medical record-keeping requirements and be accessible for regulatory review. Documentation of relevant instances where a qualified end-user overrides or disagrees with AI device-generated outputs must be maintained through a summary report indicating the frequency and nature of overrides. Deployers shall document the percentage or number of such overrides or disagreements.
G-01 AI Governance Program & Documentation · G-01.6 · Deployer · Healthcare
63 O.S. § 5504(A)
Plain Language
Every deployer of an AI medical device must establish a formal AI governance group that includes representation from qualified end-users (licensed physicians trained on the devices). This group is the designated body responsible for overseeing compliance with all provisions of the act. This is a structural governance requirement — the deployer must create and maintain this body, not merely designate a single compliance officer.
Statutory Text
A. Deployers of any artificial intelligence (AI) device shall establish an AI governance group with representation from qualified end-users. This governance group is responsible for overseeing compliance with this act.
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · Healthcare
63 O.S. § 5504(B)
Plain Language
Deployers must maintain a current inventory of all deployed AI devices. For each device, the deployer must ensure that instructions for use and relevant safety and effectiveness documentation are accessible to all qualified end-users. This is an ongoing maintenance obligation — the inventory must be kept up to date as devices are added or removed.
Statutory Text
B. Deployers shall maintain an updated inventory of deployed AI devices, with device instructions for use and any relevant safety and effectiveness documentation made accessible to all qualified end-users of the device.
Other · Healthcare
63 O.S. § 5504(C)
Plain Language
Deployers must comply with all provisions of this act and with existing federal and state security, privacy, and nondiscrimination regulations. The State Department of Health is the enforcement authority and has rulemaking power. Penalties for noncompliance will be set by the Department. This provision does not create a new compliance obligation beyond the act's other sections but establishes the enforcement framework.
Statutory Text
C. Deployers of AI devices shall ensure compliance with all requirements herein, as well as with applicable federal and state security, privacy, and nondiscrimination regulations. Noncompliance will result in penalties set by the State Department of Health, which shall have the authority to enforce and make rules to enforce this act.
S-01 AI System Safety Program · S-01.1 · Deployer · Healthcare
63 O.S. § 5504(D)
Plain Language
Before deploying an AI medical device, deployers must conduct a diligent review and selection process. While the statute does not specify the elements of this process, it requires that the selection of AI devices not be ad hoc — deployers must be able to demonstrate a deliberate evaluation process was followed. This is a pre-deployment obligation that applies to each AI device the deployer selects for use in patient care.
Statutory Text
D. Deployers shall have a diligent review and selection process for the deployed AI device.
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · Healthcare
63 O.S. § 5504(E)
Plain Language
For each deployed AI device, deployers must create and maintain documentation covering (1) the intended use case for the device in their clinical setting and (2) the training procedure for users. This ensures that each AI device has a documented purpose and that there is a formal training protocol for the qualified end-users who will operate it.
Statutory Text
E. Deployers shall document the use case and user training procedure for the AI device.
S-01 AI System Safety Program · S-01.4 · Deployer · Healthcare
63 O.S. § 5504(F)-(G)
Plain Language
Deployers must continuously monitor all deployed AI devices for performance issues, with specific attention to patient safety and care quality impacts. As part of this monitoring, deployers must participate in national specialty society-administered AI assessment registries when feasible. The feasibility qualifier means participation is mandatory when a relevant registry exists and participation is practicable, but not when no applicable registry exists or participation would be impracticable. This is a continuous post-deployment obligation distinct from the periodic performance evaluations required by Section 5503(C).
Statutory Text
F. Deployers shall continuously monitor the performance of all deployed AI devices, including assessing any impact on patient safety or the quality of patient care.
G. In conducting performance monitoring described in subsection F of this section, deployers must participate in national specialty society-administered artificial intelligence assessment registries when feasible.