O.C.G.A. § 10-16-3(e)-(j)
Plain Language
Deployers (or their contracted third parties) must complete a comprehensive impact assessment for each automated decision system before deployment, at least annually thereafter, and within 90 days after any intentional and substantial modification is made available. The assessment must cover the system's purpose and use cases, algorithmic discrimination risks and mitigation, accessibility impacts, labor law compliance risks, privacy intrusion risks, input and output data categories, validity and reliability analysis, transparency measures, and post-deployment monitoring; an assessment completed after a substantial modification must also state the extent to which actual use was consistent with the developer's intended uses. If the assessment reveals a risk of algorithmic discrimination, the deployer may not deploy the system until reasonable steps are taken to search for and implement less discriminatory alternative decision methods. A single assessment may cover a comparable set of systems. An impact assessment completed to comply with another law or regulation satisfies this obligation if it is reasonably similar in scope and effect. All impact assessments and related records must be retained throughout deployment and for at least three years after final deployment. Small deployers meeting all § 10-16-6 conditions are exempt from this requirement.
Statutory Text
(e) Except as otherwise provided for in this chapter:
(1) A deployer, or a third party contracted by the deployer, that deploys an automated decision system shall complete an impact assessment for the automated decision system; and
(2) A deployer, or a third party contracted by the deployer, shall complete an impact assessment for a deployed automated decision system at least annually and within 90 days after any intentional and substantial modification to the automated decision system is made available.
(f) An impact assessment completed pursuant to subsection (e) of this Code section shall include, at a minimum, and to the extent reasonably known by or available to the deployer:
(1) A statement by the deployer disclosing the purpose, intended use cases, and deployment context of, and benefits afforded by, the automated decision system;
(2) An analysis of whether the deployment of the automated decision system poses any known or reasonably foreseeable risks of:
(A) Algorithmic discrimination and, if so, the nature of the algorithmic discrimination and the steps that have been taken to mitigate the risks;
(B) Limits on accessibility for individuals who are pregnant, breastfeeding, or disabled, and, if so, what reasonable accommodations the deployer may provide that would mitigate any such limitations on accessibility;
(C) Any violation of state or federal labor laws, including laws pertaining to wages, occupational health and safety, and the right to organize; or
(D) Any physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers if such intrusion:
(i) Would be offensive to a reasonable person; and
(ii) May be redressed under the laws of this state;
(3) A description of the categories of data the automated decision system processes as inputs and the outputs the automated decision system produces;
(4) If the deployer used data to customize the automated decision system, an overview of the categories of data the deployer used to customize the automated decision system;
(5) An analysis of the automated decision system's validity and reliability in accordance with contemporary social science standards, and a description of any metrics used to evaluate the performance and known limitations of the automated decision system;
(6) A description of any transparency measures taken concerning the automated decision system, including any measures taken to disclose to a consumer that the automated decision system is in use when the automated decision system is in use;
(7) A description of the post-deployment monitoring and user safeguards provided concerning the automated decision system, including the oversight, use, and learning process established by the deployer to address issues arising from the deployment of the automated decision system; and
(8) When such impact assessment is completed following an intentional and substantial modification to an automated decision system, a statement disclosing the extent to which the automated decision system was used in a manner that was consistent with, or varied from, the developer's intended uses of the automated decision system.
(g) If the analysis required by paragraph (2) of subsection (f) of this Code section reveals a risk of algorithmic discrimination, the deployer shall not deploy the automated decision system until the developer or deployer takes reasonable steps to search for and implement less discriminatory alternative decision methods.
(h) A single impact assessment may address a comparable set of automated decision systems deployed by a deployer.
(i) If a deployer, or a third party contracted by the deployer, completes an impact assessment for the purpose of complying with another applicable law or regulation, the impact assessment shall satisfy the requirements established in this Code section if the impact assessment is reasonably similar in scope and effect to the impact assessment that would otherwise be completed pursuant to this Code section.
(j) A deployer shall maintain the most recently completed impact assessment for an automated decision system, all records concerning each impact assessment, and all prior impact assessments, if any, throughout the period of time that the automated decision system is deployed and for at least three years following the final deployment of the automated decision system.