HB-1915
OK · State · USA
● Pending
Proposed Effective Date
2025-11-01
Oklahoma HB 1915 — An Act relating to artificial intelligence (AI) devices in health care
Summary

Regulates AI and machine learning-enabled medical devices used in Oklahoma healthcare settings. Requires that all such devices comply with FDA regulations and be used exclusively by qualified end-users — licensed physicians trained both in the underlying clinical procedure and in the specific AI device. Deployers (hospitals, physician practices, and other healthcare facilities) must implement a Quality Assurance Program, establish an AI governance group, maintain an inventory of deployed AI devices, conduct regular performance evaluations and risk assessments, and ensure all documentation meets state and federal medical record-keeping requirements. The State Department of Health is designated as the enforcement authority with rulemaking power; no private right of action is created.

Enforcement & Penalties
Enforcement Authority
The State Department of Health has the authority to enforce the act and to adopt rules for its enforcement. Enforcement is agency-initiated. No private right of action is created. No cure period or safe harbor is specified.
Penalties
Noncompliance will result in penalties set by the State Department of Health. No specific dollar amounts, penalty ranges, or remedy types are specified in the statute; penalty-setting authority is delegated to the Department.
Who Is Covered
"Deployer" means a hospital, physician practice, or other health care facility responsible for implementing an AI device for patient care purposes;.
"Qualified end-user" means a user of an AI device that is a licensed physician with the necessary qualifications and training to independently provide the same diagnostic, prognostic, or therapeutic procedure without the aid of the AI device, and who possesses specific qualifications and training in the use of the AI device, including the ability to assess the validity of its output.
What Is Covered
"Artificial intelligence (AI) device" or "machine learning-enabled device" means a medical device as defined by Section 201(h)(1) of the Federal Food, Drug, and Cosmetic Act (FD&C Act) that includes a machine-based function that, based on training data, infers from the input it receives how to generate outputs that enhance or support a medical diagnosis, prognosis, or treatment;
Compliance Obligations · 13 obligations
Other · Healthcare
63 O.S. § 5502(A)
Plain Language
This provision mandates that all AI medical devices used in Oklahoma healthcare settings must comply with existing FDA regulations and guidance. It does not create a new affirmative obligation beyond what federal law already requires — it codifies the requirement to follow federal regulations at the state level, which enables state enforcement of federal compliance.
Statutory Text
A. All artificial intelligence (AI) devices or machine learning-enabled devices used in health care settings that meet the definition of a medical device under Section 201(h)(1) of the Federal Food, Drug, and Cosmetic Act (FD&C Act) shall be deployed and utilized in accordance with federal regulations established by the U.S. Food and Drug Administration (FDA) and other federal agencies, including relevant guidance on AI or machine learning-enabled software medical devices.
HC-02 AI in Licensed Professional Practice Restrictions · HC-02.1 · Deployer · Professional · Healthcare
63 O.S. § 5502(B)
Plain Language
Only qualified end-users may operate AI medical devices. This means the user must be a licensed physician who can independently perform the same clinical procedure without the AI device and who has been specifically trained on the AI device itself, including the ability to assess whether its outputs are valid. No non-physician staff, nurse practitioner, or untrained physician may use the AI device. Deployers must ensure access is restricted accordingly.
Statutory Text
B. An AI device shall be used exclusively by a qualified end-user.
S-01 AI System Safety Program · S-01.5 · Deployer · Healthcare
63 O.S. § 5502(C)
Plain Language
Deployers must establish and maintain an ongoing Quality Assurance Program for their AI devices. This is a cross-reference to the detailed governance requirements in Section 4 (§ 5504), which specifies the components of the program including the governance group, inventory, review and selection processes, use case documentation, and continuous monitoring. This provision creates the overarching mandate; the specific requirements are mapped separately under their respective Section 4 provisions.
Statutory Text
C. Deployers shall implement and maintain a Quality Assurance Program, as outlined in Section 4 of this act, to ensure the safe, effective, and compliant use of AI devices in patient care.
HC-02 AI in Licensed Professional Practice Restrictions · HC-02.1 · Deployer · Professional · Healthcare
63 O.S. § 5503(A)
Plain Language
Before any patient care decision is made based on AI device output, a qualified end-user (licensed physician with appropriate training) must review and validate the AI-generated data for accuracy. The deployer must have documented policies and procedures governing this review process. This is a mandatory human-in-the-loop requirement — no AI output may be acted upon in patient care without prior physician validation.
Statutory Text
A. All relevant artificial intelligence (AI) device-generated data shall be reviewed for accuracy and validated by a qualified end-user in accordance with deployer-documented policies and procedures before patient care decisions are rendered.
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Professional · Healthcare
63 O.S. § 5503(B)
Plain Language
Qualified end-users must retain full authority to amend or override any AI device output based on their own professional clinical judgment. Deployers and other entities are prohibited from pressuring the physician to accept, ignore, or alter the AI's recommendations. This creates both a positive right (the physician can always override) and a negative obligation (no one may pressure the physician to defer to the AI). This is a structural safeguard ensuring meaningful human control over AI-assisted clinical decisions.
Statutory Text
B. The qualified end-user of the AI device shall retain authority to amend or overrule outputs from the device based on their professional judgment, and without pressure from the deployer or any other entity to ignore or alter professional judgement.
S-01 AI System Safety Program · S-01.4 · S-01.7 · Deployer · Healthcare
63 O.S. § 5503(C)
Plain Language
Deployers must regularly evaluate AI device performance and conduct risk assessments, documenting the results. These evaluations should incorporate feedback solicited from the licensed physicians who use the devices and, where feasible, participation in national specialty society AI assessment registries. When performance concerns surface, deployers must take corrective action to mitigate patient risk. This is a continuing post-deployment obligation — not satisfied by pre-deployment testing alone.
Statutory Text
C. Deployers of an AI device shall conduct and document regular performance evaluations and risk assessments of the device. Such evaluations and assessments should be informed by invited feedback from qualified end-users and, when applicable, participation in national specialty society-administered AI assessment registries. Whenever AI device performance concerns are identified, deployers shall implement appropriate corrective actions to mitigate risk to patients.
G-01 AI Governance Program & Documentation · G-01.3 · G-01.4 · Deployer · Healthcare
63 O.S. § 5503(D)
Plain Language
All AI device documentation must meet state and federal medical record-keeping standards and be available for regulatory inspection. Deployers must specifically maintain summary reports documenting when qualified end-users override or disagree with AI device outputs, including the frequency and nature of those overrides and the percentage or number of disagreements. This creates a quantitative tracking obligation — deployers must know and record how often their physicians reject AI recommendations and why.
Statutory Text
D. All documentation shall comply with state and federal medical record-keeping requirements and be accessible for regulatory review. Documentation of relevant instances where a qualified end-user overrides or disagrees with AI device-generated outputs must be maintained through a summary report indicating the frequency and nature of overrides. Deployers shall document the percentage or number of such overrides or disagreements.
G-01 AI Governance Program & Documentation · G-01.6 · Deployer · Healthcare
63 O.S. § 5504(A)
Plain Language
Deployers must establish a formal AI governance group that includes representation from the qualified end-users (licensed physicians) who actually use the AI devices. This governance group is responsible for overseeing compliance with all requirements of the act. This goes beyond designating a single individual — it requires a multi-stakeholder governance body with practitioner representation.
Statutory Text
A. Deployers of any artificial intelligence (AI) device shall establish an AI governance group with representation from qualified end-users. This governance group is responsible for overseeing compliance with this act.
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · Healthcare
63 O.S. § 5504(B)
Plain Language
Deployers must maintain a current inventory of all AI devices they have deployed, along with each device's instructions for use and any relevant safety and effectiveness documentation. All of this must be made accessible to the qualified end-users (licensed physicians) who use the devices. This is both an inventory obligation and an internal documentation accessibility requirement — physicians must be able to find and review device documentation.
Statutory Text
B. Deployers shall maintain an updated inventory of deployed AI devices, with device instructions for use and any relevant safety and effectiveness documentation made accessible to all qualified end-users of the device.
Other · Healthcare
63 O.S. § 5504(C)
Plain Language
This provision serves two functions: (1) it restates the general obligation to comply with this act and with existing federal and state security, privacy, and nondiscrimination regulations, and (2) it designates the State Department of Health as the enforcement authority with rulemaking power and states that noncompliance will result in Department-set penalties. It creates no new independent compliance obligation — the specific duties are established in other sections.
Statutory Text
C. Deployers of AI devices shall ensure compliance with all requirements herein, as well as with applicable federal and state security, privacy, and nondiscrimination regulations. Noncompliance will result in penalties set by the State Department of Health, which shall have the authority to enforce and make rules to enforce this act.
S-01 AI System Safety Program · S-01.1 · Deployer · Healthcare
63 O.S. § 5504(D)
Plain Language
Before deploying an AI device, deployers must have a diligent review and selection process in place. This is a pre-deployment evaluation requirement — deployers cannot simply adopt any AI device without first conducting due diligence on the device's suitability, safety, and effectiveness for their intended clinical use case. The statute does not specify what 'diligent review' entails, which may be further defined by State Department of Health rulemaking.
Statutory Text
D. Deployers shall have a diligent review and selection process for the deployed AI device.
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · Healthcare
63 O.S. § 5504(E)
Plain Language
Deployers must create and maintain documentation of the intended use case for each AI device and the training procedures that qualified end-users must complete before using it. This ensures there is a written record of why an AI device was deployed and how users were prepared to use it, which supports both internal governance and regulatory review.
Statutory Text
E. Deployers shall document the use case and user training procedure for the AI device.
S-01 AI System Safety Program · S-01.4 · S-01.7 · Deployer · Healthcare
63 O.S. § 5504(F)-(G)
Plain Language
Deployers must continuously monitor the performance of every deployed AI device, with specific attention to patient safety impacts and care quality. As part of this monitoring, deployers must participate in national specialty society-administered AI assessment registries when feasible. The registry participation requirement is qualified by feasibility — if no applicable registry exists for a given specialty or device, the obligation does not apply. The continuous monitoring obligation itself, however, is mandatory and ongoing.
Statutory Text
F. Deployers shall continuously monitor the performance of all deployed AI devices, including assessing any impact on patient safety or the quality of patient care.
G. In conducting performance monitoring described in subsection F of this section, deployers must participate in national specialty society-administered artificial intelligence assessment registries when feasible.