A-00235
NY · State · USA
● Pending
New York Assembly Bill 235 — An Act to amend the general business law, in relation to unauthorized depictions of public officials generated by artificial intelligence
Summary

Requires owners, licensees, or operators of visual or audio generative AI systems accessible to New York residents to implement reasonable methods to prohibit users from creating unauthorized realistic depictions of public officeholders and candidates for public office, within 60 days of receiving notice from the covered person. Operators must also provide an accessible notice mechanism for covered persons to submit opt-out requests. The bill creates a private right of action with statutory damages of $100 per depiction, capped at $100,000 in the aggregate, and provides safe harbors for reasonable compliance methods and for depictions that are incidental or unforeseeable. The bill does not apply where a third party with no ownership or control over the underlying generative model processes the outputs.

Enforcement & Penalties
Enforcement Authority
Private right of action by covered persons (public officeholders and candidates for public office). No designated agency enforcer. A covered person who has sent notice and whose realistic depictions are generated after the prescribed compliance period may bring a civil action against the owner, licensee, or operator of the covered system. The covered person's agent, employee, or representative acting at the direction of the covered person may also provide notice. No cure period beyond the 60-day implementation window triggered by notice.
Penalties
$100 per depiction, capped at $100,000 in the aggregate per covered person. Statutory damages do not require proof of actual monetary harm. No liability where the owner, licensee, or operator implemented a reasonable method but was unable to prevent the unauthorized depictions, nor for depictions that are incidental or created in an unforeseeable way.
Who Is Covered
"Covered person" shall mean any person holding a public office or a candidate for public office. In the context of notice, covered person shall include the covered person's agent, employee or representative acting at the direction of the covered person.
What Is Covered
"Visual or audio generative artificial intelligence system" or "covered system" shall mean any artificial intelligence system that is accessible to New York residents whose primary function is to generate visual or auditory media.
Compliance Obligations (5 obligations)
CP-02 Non-Consensual Intimate Imagery · Deployer · Content Generation
Gen. Bus. Law § 390-f(2)
Plain Language
Within 60 days of receiving notice from a public officeholder or candidate (or their authorized representative) that the person does not want realistic depictions of themselves generated, the owner, licensee, or operator of a covered generative AI system must implement a reasonable method to prevent users from creating such depictions. A method is "reasonable" if it is consistent with industry standards, not overly burdensome on the system, cost-effective to implement and maintain, and up to date. This obligation is triggered only upon receipt of notice — there is no proactive duty to prevent depictions of all public officials absent notice. Safe harbor: the operator is not liable if it implemented a reasonable method but was still unable to prevent the depictions, nor for depictions that are incidental or created in an unforeseeable way (per § 390-f(5)).
Statutory Text
2. The owner, licensee or operator of a visual or audio generative artificial intelligence system shall implement a reasonable method to prohibit its users from creating unauthorized realistic depictions of a covered person within sixty days of being notified by such covered person that such covered person does not want a realistic depiction of themselves to be generated by the owner, licensee or operator's system. An implemented method to prevent the unauthorized creation of realistic depictions of a covered person shall be considered reasonable when the owner, licensee or operator of the covered system has implemented a method that, in relation to the method of user inputs used by the covered system, is consistent with industry standards, not overly burdensome on the system, cost-effective to implement and maintain and is up to date.
CP-02 Non-Consensual Intimate Imagery · Deployer · Content Generation
Gen. Bus. Law § 390-f(3)
Plain Language
Operators must provide an accessible notice mechanism through which public officeholders and candidates can request that realistic depictions of themselves not be generated. The mechanism must be easy to access, understand, complete, and send, and must provide clear, timely status updates to the sender. Operators may require reasonable identification to verify and process requests. This is a standalone procedural obligation — even before any depiction is generated, operators must have this intake mechanism in place.
Statutory Text
3. The owner, licensee or operator of a covered system shall implement a reasonable method for covered persons to send notice to them under this section provided that such method is easy to access, understand, complete and send and that such method provides clear updates to the sender on the status of their request in a timely manner. The owner, licensee or operator of a covered system may request a reasonable means of identification to process such requests.
CP-02 Non-Consensual Intimate Imagery · Deployer · Content Generation
Gen. Bus. Law § 390-f(4)
Plain Language
Operators may optionally implement safeguards allowing the covered person (or their authorized agent) to continue using the system to generate their own likeness, even after submitting an opt-out notice. However, if those safeguards are not reasonable — meaning they are inconsistent with industry standards, overly burdensome, not cost-effective, or not up to date — the operator is liable as though it had failed to implement the opt-out method entirely. This is a permissive carve-out with a conditional liability hook, not a standalone affirmative obligation. It creates no new compliance duty, but it puts operators on notice that implementing an authorized-use exception carries the same reasonableness standard as the core opt-out obligation.
Statutory Text
4. Nothing in this section shall prohibit the owner, licensee or operator of a covered system from implementing reasonable safeguards to permit the covered person, their agent, employee or representative to use such covered system to generate realistic depictions of such covered person, provided however that such owner, licensee or operator of such covered system shall be liable in the same manner as if they had violated subdivision two of this section where such safeguards are not reasonable. A safeguard is considered reasonable for purposes of this subdivision where the owner, licensee or operator of a covered system implements measures that are consistent with industry standards, not overly burdensome on the system, cost-effective to implement and maintain and are up to date.
Other · Content Generation
Gen. Bus. Law § 390-f(5)
Plain Language
This provision establishes the damages framework and safe harbors for violations of the core opt-out obligation. Covered persons may recover $100 per unauthorized depiction, up to $100,000 in the aggregate, when the operator failed to implement a reasonable prevention method within the prescribed 60-day period. Operators are not liable if they implemented a reasonable method but could not prevent the depiction, or if the depiction was incidental or created in an unforeseeable way. This is a liability provision, not an independent compliance obligation.
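As a quick illustration, the damages framework reduces to a simple capped multiplication. The sketch below is illustrative only — the function name and constants are ours, not the bill's, and it ignores the safe harbors and notice-period conditions that determine whether liability attaches at all.

```python
# Illustrative sketch of the statutory damages arithmetic in
# Gen. Bus. Law § 390-f(5): $100 per unauthorized depiction,
# capped at $100,000 in the aggregate per covered person.

PER_DEPICTION = 100        # dollars per depiction
AGGREGATE_CAP = 100_000    # aggregate cap per covered person

def statutory_damages(depiction_count: int) -> int:
    """Total statutory damages for a given number of depictions."""
    return min(depiction_count * PER_DEPICTION, AGGREGATE_CAP)

print(statutory_damages(50))     # 5000
print(statutory_damages(2_000))  # 100000 (cap reached at 1,000 depictions)
```

Note that the cap binds at 1,000 depictions; beyond that point, additional depictions do not increase the operator's exposure to that covered person.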
Statutory Text
5. The owner, licensee or operator of a covered system shall be liable to a covered person in an amount of one hundred dollars per depiction, but not more than one hundred thousand dollars in the aggregate, generated on their system by a user other than the covered person or their agent, employee, representative, or another at the direction of them created outside of the period prescribed by this section where such owner, licensee or operator of such covered system fails to implement a reasonable method to prevent the unauthorized creation of realistic depictions of a covered person within the periods prescribed in this section. The owner, licensee or operator of a covered system shall not be liable to a covered person where such owner, licensee or operator is unable to prevent the unauthorized depictions of such covered person after implementing a reasonable method, nor shall they be liable for any unauthorized realistic depictions that are incidental or were created in an unforeseeable way.
Other · Content Generation
Gen. Bus. Law § 390-f(6)
Plain Language
The entire section does not apply where a third party — with no ownership or control over the underlying generative model — processes the visual or audio outputs of the system. This exempts downstream processors or intermediaries who handle generated outputs but do not own or control the model that created them. This is a scope limitation, not an independent obligation.
Statutory Text
6. This section shall not apply to the owner, licensee or operator of a visual or audio generative artificial intelligence system where the visual or audio outputs of the system are processed by a third party that has no ownership or control over the underlying generative model.