Sec. 5(8)-(10)
Plain Language
The Attorney General may designate federal laws, regulations, or guidance documents as substantially equivalent to (or stricter than) the act's safety incident reporting requirements. A frontier developer or large chatbot provider may then declare its intent to comply via a designated federal framework instead. Once declared, compliance with the federal standard is deemed compliance with the state obligation; failure to meet the federal standard, however, is itself a violation of the Nebraska act. The Attorney General must revoke a designation if its prerequisites are no longer met, and a developer or provider may modify or revoke its own declaration. This is a safe harbor mechanism, not a new affirmative obligation.
Statutory Text
(8) The Attorney General may adopt and promulgate rules and regulations designating one or more federal laws, regulations, or guidance documents that meet all of the following conditions for the purposes of subsection (9) of this section:

(a) The law, regulation, or guidance document imposes or states standards or requirements for safety incident reporting that are substantially equivalent to, or stricter than, those required by this section for critical safety incidents, child safety incidents, or both. A law, regulation, or guidance document may satisfy this subdivision even if it does not require safety incident reporting to the State of Nebraska; and

(b) The law, regulation, or guidance document is intended to assess, detect, or mitigate catastrophic risk, child safety risk, or both.

(9)(a) A frontier developer or large chatbot provider that intends to comply with all or part of this section by complying with the requirements of, or meeting the standards stated by, a federal law, regulation, or guidance document designated pursuant to subsection (8) of this section by the Attorney General shall declare its intent to do so to the Attorney General.
(b) After a frontier developer or large chatbot provider has declared its intent pursuant to subdivision (9)(a) of this section, the following shall apply:

(i) To the extent that such developer or provider meets the standards of, or complies with the requirements imposed or stated by, the designated federal law, regulation, or guidance document, such developer or provider shall be deemed in compliance with the obligations under this section pertaining to:

(A) Critical safety incidents, if such designated law, regulation, or document is intended to assess, detect, or mitigate catastrophic risk; and

(B) Child safety incidents, if such designated law, regulation, or document is intended to assess, detect, or mitigate child safety risk; and

(ii) The failure by such developer or provider to meet the standards of, or comply with the requirements stated by, such designated law, regulation, or document, shall be considered a violation of the Transparency in Artificial Intelligence Risk Management Act.

(c) Subdivision (9)(b) of this section shall not apply to a frontier developer or large chatbot provider to the extent that:

(i) Such developer or provider makes a declaration of intent to the Attorney General to modify or revoke a declaration of intent under subdivision (9)(a) of this section; or

(ii) The Attorney General revokes a rule or regulation pursuant to subsection (10) of this section.

(10) The Attorney General shall revoke a rule or regulation adopted under or promulgated under subsection (8) of this section if the requirements of subsection (8) are no longer met.
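The designation, declaration, and revocation mechanics above behave like a small decision procedure. The following is a minimal sketch of that logic; all names are illustrative, not statutory terms of art, and the three-way outcome (state rules apply / deemed compliant / violation) is this editor's reading of subsections (9)(b) and (9)(c).

```python
from dataclasses import dataclass
from enum import Enum, auto

class RiskScope(Enum):
    CATASTROPHIC = auto()   # maps to critical safety incidents, (9)(b)(i)(A)
    CHILD_SAFETY = auto()   # maps to child safety incidents, (9)(b)(i)(B)

@dataclass
class Designation:
    """An AG rule designating a federal framework under subsection (8)."""
    scopes: set                 # which risk scopes the framework addresses
    prerequisites_met: bool     # if False, AG must revoke under subsection (10)

@dataclass
class Declaration:
    """A developer/provider declaration of intent under subdivision (9)(a)."""
    active: bool                # False once modified or revoked under (9)(c)(i)

def state_obligation_status(designation: Designation,
                            declaration: Declaration,
                            meets_federal_standard: bool,
                            incident_scope: RiskScope) -> str:
    """Return 'state_rules_apply', 'deemed_compliant', or 'violation'."""
    # Safe harbor only operates while the designation stands and the
    # declaration has not been revoked -- subdivisions (9)(c) and (10).
    if not designation.prerequisites_met or not declaration.active:
        return "state_rules_apply"
    # The framework must cover the relevant incident type -- (9)(b)(i).
    if incident_scope not in designation.scopes:
        return "state_rules_apply"
    # (9)(b)(i): meeting the federal standard is deemed state compliance.
    if meets_federal_standard:
        return "deemed_compliant"
    # (9)(b)(ii): failing the federal standard is itself a violation.
    return "violation"
```

Note the asymmetry the sketch captures: once a declaration is made, the federal standard is not merely an alternative but the operative obligation, so missing it produces a violation of the Nebraska act rather than a fallback to the state rules.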