SB-167
GA · State · USA
● Pending
Proposed Effective Date
2025-07-01
Georgia Senate Bill 167 — An Act to amend Title 10 of the Official Code of Georgia Annotated, relating to commerce and trade, so as to provide broadly for private entities that employ certain AI systems to guard against discrimination caused by such systems
Summary

Georgia SB 167 regulates developers and deployers of automated decision systems used for consequential decisions affecting consumers in education, employment, government services, financial services, healthcare, housing, insurance, and legal services. Developers must disclose system documentation to the Attorney General and deployers, address algorithmic discrimination risks, and publish public use case summaries. Deployers must implement risk management programs, complete annual impact assessments, provide detailed pre-decision and post-decision notices to consumers including explanation of principal factors and appeal rights, publish impact assessments, and report discovered algorithmic discrimination to the AG within 90 days. AI systems intended to interact with consumers must disclose that the consumer is interacting with AI. Enforcement is exclusively through the Attorney General under Georgia's Fair Business Practices Act, with an affirmative defense for entities that self-discover and quickly cure violations. A small deployer exemption applies where the system affects fewer than 1,000 consumers annually and the deployer has fewer than 15 employees, subject to conditions.
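The small deployer exemption mentioned above turns on two numeric thresholds. A minimal eligibility sketch (illustrative only; the statute attaches further conditions in § 10-16-6 that are not modeled here, and the field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Deployer:
    """Minimal, hypothetical model of a deployer for the exemption check."""
    consumers_affected_annually: int
    employee_count: int

def qualifies_for_small_deployer_exemption(d: Deployer) -> bool:
    # Thresholds from the summary: the system affects fewer than 1,000
    # consumers annually AND the deployer has fewer than 15 employees.
    # The exemption is "subject to conditions" not captured here.
    return d.consumers_affected_annually < 1000 and d.employee_count < 15
```

Both thresholds are strict ("fewer than"), so a deployer at exactly 1,000 consumers or 15 employees falls outside the exemption.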

Enforcement & Penalties
Enforcement Authority
Enforcement is agency-initiated by the Attorney General under the Fair Business Practices Act of 1975 (Part 2 of Article 15 of Chapter 1 of Title 10 of the O.C.G.A.). An affirmative defense is available if the developer or deployer discovers the violation through adversarial testing, red teaming, or internal review; cures it within seven days; reports it to the AG and affected consumers; is otherwise in compliance with a recognized risk management framework; and the violation was inadvertent, affected fewer than 100 consumers, and could not have been discovered through reasonable diligence. The statute expressly preserves all existing rights, claims, remedies, presumptions, and defenses available at law or in equity, and states that the affirmative defense applies only to AG enforcement actions.
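The affirmative defense is conjunctive: every condition listed above must hold. A sketch of that conjunction (illustrative only, not legal advice; the parameter names are assumptions and the statute's exact wording controls):

```python
def affirmative_defense_available(
    discovered_via_testing_or_review: bool,   # adversarial testing, red teaming, or internal review
    cured_within_7_days: bool,
    reported_to_ag_and_consumers: bool,
    compliant_with_recognized_framework: bool,
    inadvertent: bool,
    consumers_affected: int,
    discoverable_with_reasonable_diligence: bool,
) -> bool:
    # All conditions must be satisfied simultaneously; failing any one
    # of them defeats the defense.
    return (
        discovered_via_testing_or_review
        and cured_within_7_days
        and reported_to_ag_and_consumers
        and compliant_with_recognized_framework
        and inadvertent
        and consumers_affected < 100
        and not discoverable_with_reasonable_diligence
    )
```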
Penalties
Violations are enforceable through the Fair Business Practices Act of 1975, which provides for injunctive relief, civil penalties, and other remedies available under that Act. The statute expressly preserves all rights, claims, remedies, presumptions, and defenses available at law or in equity, meaning existing common law causes of action are not displaced. No specific statutory damages amount is set by this chapter; remedies are those available under the FBPA.
Who Is Covered
'Deployer' means a person doing business in this state that deploys an automated decision system.
'Developer' means a person doing business in this state that develops or intentionally and substantially modifies an artificial intelligence system.
What Is Covered
'Automated decision system' means a computational process derived from machine learning, statistical modeling, data analytics, or an artificial intelligence system that, when deployed, issues a simplified output, including, but not limited to, a score, classification, or recommendation, that is used to assist or replace human discretionary decision making and materially impacts natural persons. Such term shall not include a tool that does not assist or replace processes for making consequential decisions and that does not materially impact natural persons, including, but not limited to, a junk email filter, firewall, antivirus software, calculator, spreadsheet, or other tool that does no more than organize data already in possession of the deployer of the automated decision system.
Compliance Obligations · 20 obligations
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(a)
Plain Language
Developers are categorically prohibited from selling, distributing, or making available to deployers any automated decision system that results in algorithmic discrimination. The prohibition covers discrimination or disparate impact across a broad set of protected characteristics in the context of consequential decisions. Self-testing for bias mitigation and diversity expansion are carved out, as are private clubs exempt under the Civil Rights Act.
Statutory Text
No developer shall sell, distribute, or otherwise make available to deployers an automated decision system that results in algorithmic discrimination.
R-02 Regulatory Disclosure & Submissions · R-02.1 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(b)
Plain Language
Developers must submit comprehensive documentation about each automated decision system to the Attorney General, in a form the AG prescribes. The required documentation covers foreseeable uses and misuses, system purpose and benefits, training data summaries, known discrimination risks and mitigation measures, pre-deployment evaluation methodology, data governance measures, usage and monitoring instructions, and any other information deployers need for compliance. Trade secret redactions are permitted under § 10-16-2(f) but not where the information is necessary for deployer compliance.
Statutory Text
Except as provided in subsection (f) of this Code section, a developer of an automated decision system shall provide certain information regarding such automated decision system to the Attorney General, in a form and manner prescribed by the Attorney General. Such information shall include, at a minimum: (1) A general statement describing the reasonably foreseeable uses and known harmful or inappropriate uses of the automated decision system; (2) Documentation disclosing: (A) The purpose of the automated decision system; (B) The intended benefits and uses of the automated decision system; (C) High-level summaries of the types of data used to train the automated decision system; (D) Known or reasonably foreseeable limitations of the automated decision system, including known or reasonably foreseeable risks of algorithmic discrimination arising from the intended uses of the automated decision system; (E) The measures the developer has taken to mitigate known or reasonably foreseeable risks of algorithmic discrimination; (F) How the automated decision system was evaluated for performance and mitigation of algorithmic discrimination before the automated decision system was offered, sold, leased, licensed, given, or otherwise made available to the deployer; (G) The data governance measures used to cover the training data sets and the measures used to examine the suitability of data sources, possible biases, and appropriate mitigation; (H) How the automated decision system should be used, not be used, and be monitored by an individual when the automated decision system is used to make, or assist in making, a consequential decision; and (I) All other information necessary to allow the deployer to comply with the requirements of Code Section 10-16-3; and (3) Any additional documentation that is reasonably necessary to assist the deployer in understanding the outputs and monitoring the performance of the automated decision system for risks of algorithmic discrimination.
T-03 Training Data Disclosure · T-03.3 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(c)
Plain Language
When a developer distributes an automated decision system to a deployer or another developer, it must share all the documentation it provides to the AG — including training data governance measures and impact assessment artifacts such as model cards and data set cards — to the extent feasible. This ensures deployers have sufficient information to complete their own impact assessments. A developer that is also the deployer of its own system is exempt from generating this documentation unless the system is provided to an unaffiliated deployer.
Statutory Text
(1) Except as provided in subsection (f) of this Code section, a developer that offers, sells, leases, licenses, gives, or otherwise makes available to a deployer or other developer an automated decision system shall make available to the deployer or other developer, to the extent feasible, all of the information required to be provided to the Attorney General by subsection (b) of this Code section, as well as the documentation and information, through artifacts such as model cards, data set cards, or other impact assessments, necessary for a deployer or third party contracted by a deployer to complete an impact assessment pursuant to subsection (e) of Code Section 10-16-3. (2) A developer that also serves as a deployer for an automated decision system is not required to generate the documentation required by this subsection unless the automated decision system is provided to an unaffiliated entity acting as a deployer.
G-02 Public Transparency & Documentation · G-02.4 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(d)
Plain Language
Developers must publish and maintain on their public website or in a public use case inventory a clear summary describing the types of automated decision systems they offer and how they manage algorithmic discrimination risks. This statement must be kept current and updated within 90 days of any intentional and substantial modification to a covered system.
Statutory Text
(1) A developer shall make available to the public, in a manner that is clear and readily available on the developer's public website or in a public use case inventory, a statement summarizing: (A) The types of automated decision systems that the developer has developed or intentionally and substantially modified and currently makes available to a deployer or other developer; and (B) How the developer manages known or reasonably foreseeable risks of algorithmic discrimination. (2) A developer shall update the statement described in paragraph (1) of this subsection: (A) As necessary to ensure that the statement remains accurate; and (B) No later than 90 days after the developer intentionally and substantially modifies any automated decision system described in such statement.
H-02 Non-Discrimination & Bias Assessment · H-02.1, H-02.2 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(e)(1)
Plain Language
Developers must affirmatively address algorithmic discrimination, invalidity, and errors by ensuring representative training data, implementing data governance, testing for disparate impact, and exploring less discriminatory alternatives. This is not a one-time pre-deployment obligation — developers must continue assessing and mitigating discrimination risk for the entire period any deployer uses the system.
Statutory Text
A developer of an automated decision system shall take steps to address risks of algorithmic discrimination, invalidity, and errors, including, but not limited to, ensuring suitability and representativeness of data sources, implementing data governance measures, testing the automated decision system for disparate impact, and searching for less discriminatory alternative decision methods. Developers shall continue assessing and mitigating the risk of algorithmic discrimination in their automated decision systems so long as such automated decision systems are in use by any deployer.
R-01 Incident Reporting · R-01.3 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(e)(2)
Plain Language
When a developer discovers — through its own testing or via a credible deployer report — that its automated decision system has caused or is likely to have caused algorithmic discrimination, the developer must notify the Attorney General and all known deployers or other developers within 90 days. This is a mandatory disclosure triggered by discovery of actual or likely discrimination, not a routine reporting obligation.
Statutory Text
A developer of an automated decision system shall disclose to the Attorney General, in a form and manner prescribed by the Attorney General, and to all known deployers or other developers of the automated decision system, any known or reasonably foreseeable risks of algorithmic discrimination arising from the intended uses of the automated decision system without unreasonable delay but no later than 90 days after the date on which: (A) The developer discovers through the developer's ongoing testing and analysis that the developer's automated decision system has been deployed and has caused or is reasonably likely to have caused algorithmic discrimination; or (B) The developer receives from a deployer a credible report that the automated decision system has been deployed and has caused algorithmic discrimination.
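The 90-day outer bound above runs from either trigger date (the developer's own discovery, or receipt of a credible deployer report). A date-arithmetic sketch, assuming calendar days:

```python
from datetime import date, timedelta

def discrimination_report_deadline(trigger: date) -> date:
    # "no later than 90 days after" the discovery or the credible report.
    # Disclosure is also required "without unreasonable delay", so this
    # is an outer bound, not a target.
    return trigger + timedelta(days=90)
```

For example, a discovery on 2025-07-01 would make 2025-09-29 the last permissible disclosure date under a plain calendar-day reading.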
R-02 Regulatory Disclosure & Submissions · R-02.2 · Developer · Automated Decisionmaking
O.C.G.A. § 10-16-2(g)
Plain Language
The Attorney General may demand any documentation or records required under the developer obligations section, and the developer must produce them within seven days. Records submitted to the AG are exempt from Georgia's open records law. Developers may designate materials as trade secrets or proprietary, and disclosure does not waive attorney-client privilege or work-product protection. The seven-day production window is significantly shorter than the 90-day norm in other jurisdictions.
Statutory Text
The Attorney General may require that a developer disclose to the Attorney General, within seven days and in a form and manner prescribed by the Attorney General, any documentation or records required by this Code section, including, but not limited to, the statement or documentation described in subsection (b) of this Code section. The Attorney General may evaluate such statement or documentation to ensure compliance with this chapter, and, notwithstanding the provisions of Article 4 of Chapter 18 of Title 50, relating to open records, such records shall not be open to inspection by or made available to the public. In a disclosure pursuant to this subsection, a developer may designate the statement or documentation as including proprietary information or a trade secret. To the extent that any information contained in the statement or documentation includes information subject to attorney-client privilege or work-product protection, the disclosure does not constitute a waiver of the privilege or protection.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-3(a)
Plain Language
Deployers are categorically prohibited from using an automated decision system in any manner that results in algorithmic discrimination. This mirrors the developer prohibition in § 10-16-2(a) but applies at the deployment stage rather than the distribution stage.
Statutory Text
No deployer of an automated decision system shall use an automated decision system in a manner that results in algorithmic discrimination.
G-01 AI Governance Program & Documentation · G-01.1, G-01.2 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-3(b)-(c)
Plain Language
Deployers must implement a formal risk management policy and program governing their use of automated decision systems. The program must specify the principles, processes, and personnel for identifying, documenting, and mitigating algorithmic discrimination risks. It must be iterative and regularly updated over the system lifecycle. The program must consider the NIST AI RMF, ISO/IEC 42001, or an equivalent framework — or any framework the AG designates — as well as the deployer's size, system scope, and data sensitivity. A single program may cover multiple systems. Small deployers meeting the exemption criteria in § 10-16-6 are exempt from this obligation.
Statutory Text
Except as provided in Code Section 10-16-6, a deployer of an automated decision system shall implement a risk management policy and program to govern the deployer's deployment of the automated decision system. The risk management policy and program shall specify and incorporate the principles, processes, and personnel that the deployer uses to identify, document, and mitigate known or reasonably foreseeable risks of algorithmic discrimination. The risk management policy and program shall be an iterative process planned, implemented, and regularly and systematically reviewed and updated over the life cycle of an automated decision system, requiring regular, systematic review and updates. A risk management policy and program implemented and maintained pursuant to this subsection shall take into consideration: (1) Either: (A) The guidance and standards set forth in the latest version of the Artificial Intelligence Risk Management Framework published by the National Institute of Standards and Technology of the United States Department of Commerce, standard ISO/IEC 42001 of the International Organization for Standardization, or another nationally or internationally recognized risk management framework for artificial intelligence systems, if the standards are substantially equivalent to or more stringent than the requirements of this chapter; or (B) Any risk management framework for artificial intelligence systems that the Attorney General, in the Attorney General's discretion, may designate; (2) The size and complexity of the deployer; (3) The nature and scope of the automated decision systems deployed by the deployer, including the intended uses of the automated decision systems; and (4) The sensitivity and volume of data processed in connection with the automated decision systems deployed by the deployer. A risk management policy and program implemented pursuant to this Code section may cover multiple automated decision systems deployed by the deployer.
G-01 AI Governance Program & Documentation · G-01.3 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-3(d)
Plain Language
Deployers must establish and follow written policies for acquiring and relying on third-party automated decision systems, including contractual controls ensuring developers provide all necessary compliance documentation. They must also maintain procedures for reporting errors or algorithmic discrimination back to developers, and for remediating incorrect information in their own systems. This creates a documented vendor management and error-correction framework.
Statutory Text
Each deployer shall establish and adhere to: (1) Written standards, policies, procedures, and protocols for the acquisition, use of, or reliance on automated decision systems developed by third-party developers, including reasonable contractual controls ensuring that the developer statements and summaries described in subsection (b) of Code Section 10-16-2 include all information necessary for the deployer to fulfill its obligations under this Code section; (2) Procedures for reporting any incorrect information or evidence of algorithmic discrimination to a developer for further investigation and mitigation, as necessary; and (3) Procedures to remediate and eliminate incorrect information from its automated decision systems that the deployer has identified or has been reported to a developer.
H-02 Non-Discrimination & Bias Assessment · H-02.3, H-02.8, H-02.10 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-3(e)-(j)
Plain Language
Deployers must complete an impact assessment before deploying each automated decision system and repeat it at least annually and within 90 days of any intentional and substantial modification. The assessment must cover system purpose and benefits, algorithmic discrimination risk analysis with mitigation steps, accessibility limitations, labor law risks, privacy intrusion risks, data inputs and outputs, validity and reliability analysis against social science standards, transparency measures, and post-deployment monitoring. If the assessment reveals discrimination risk, deployment is blocked until less discriminatory alternatives are implemented. A single assessment may cover comparable systems. Assessments completed for other laws count if reasonably similar in scope. All assessments and associated records must be retained for at least three years after final deployment.
Statutory Text
(e) Except as otherwise provided for in this chapter: (1) A deployer, or a third party contracted by the deployer, that deploys an automated decision system shall complete an impact assessment for the automated decision system; and (2) A deployer, or a third party contracted by the deployer, shall complete an impact assessment for a deployed automated decision system at least annually and within 90 days after any intentional and substantial modification to the automated decision system is made available. (f) An impact assessment completed pursuant to subsection (e) of this Code section shall include, at a minimum, and to the extent reasonably known by or available to the deployer: (1) A statement by the deployer disclosing the purpose, intended use cases, and deployment context of, and benefits afforded by, the automated decision system; (2) An analysis of whether the deployment of the automated decision system poses any known or reasonably foreseeable risks of: (A) Algorithmic discrimination and, if so, the nature of the algorithmic discrimination and the steps that have been taken to mitigate the risks; (B) Limits on accessibility for individuals who are pregnant, breastfeeding, or disabled, and, if so, what reasonable accommodations the deployer may provide that would mitigate any such limitations on accessibility; (C) Any violation of state or federal labor laws, including laws pertaining to wages, occupational health and safety, and the right to organize; or (D) Any physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers if such intrusion: (i) Would be offensive to a reasonable person; and (ii) May be redressed under the laws of this state; (3) A description of the categories of data the automated decision system processes as inputs and the outputs the automated decision system produces; (4) If the deployer used data to customize the automated decision system, an overview of the categories of data the 
deployer used to customize the automated decision system; (5) An analysis of the automated decision system's validity and reliability in accordance with contemporary social science standards, and a description of any metrics used to evaluate the performance and known limitations of the automated decision system; (6) A description of any transparency measures taken concerning the automated decision system, including any measures taken to disclose to a consumer that the automated decision system is in use when the automated decision system is in use; (7) A description of the post-deployment monitoring and user safeguards provided concerning the automated decision system, including the oversight, use, and learning process established by the deployer to address issues arising from the deployment of the automated decision system; and (8) When such impact assessment is completed following an intentional and substantial modification to an automated decision system, a statement disclosing the extent to which the automated decision system was used in a manner that was consistent with, or varied from, the developer's intended uses of the automated decision system. (g) If the analysis required by paragraph (2) of subsection (f) of this Code section reveals a risk of algorithmic discrimination, the deployer shall not deploy the automated decision system until the developer or deployer takes reasonable steps to search for and implement less discriminatory alternative decision methods. (h) A single impact assessment may address a comparable set of automated decision systems deployed by a deployer. 
(i) If a deployer, or a third party contracted by the deployer, completes an impact assessment for the purpose of complying with another applicable law or regulation, the impact assessment shall satisfy the requirements established in this Code section if the impact assessment is reasonably similar in scope and effect to the impact assessment that would otherwise be completed pursuant to this Code section. (j) A deployer shall maintain the most recently completed impact assessment for an automated decision system, all records concerning each impact assessment, and all prior impact assessments, if any, throughout the period of time that the automated decision system is deployed and for at least three years following the final deployment of the automated decision system.
H-02 Non-Discrimination & Bias Assessment · H-02.8 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-3(k)
Plain Language
Deployers must conduct at least annual reviews of each deployed automated decision system specifically to verify it is not causing algorithmic discrimination. This is a standalone periodic review obligation separate from the annual impact assessment, focused specifically on ongoing discrimination detection rather than the broader assessment required by subsection (e).
Statutory Text
At least annually a deployer, or a third party contracted by the deployer, shall review the deployment of each automated decision system deployed by the deployer to ensure that the automated decision system is not causing algorithmic discrimination.
H-02 Non-Discrimination & Bias Assessment · H-02.5 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-3(l)
Plain Language
Deployers must publicly post on their websites all impact assessments completed in the last three years. The Attorney General prescribes the form and manner. This is a rolling publication obligation — as new assessments are completed, they must be published and remain available for three years.
Statutory Text
Deployers shall publish on their public websites all impact assessments completed within the preceding three years in a form and manner prescribed by the Attorney General.
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-4(a)
Plain Language
Before or at the time an automated decision system is used for a consequential decision about a consumer, the deployer must notify the consumer and provide: the system's purpose and the nature of the decision; deployer contact information; a plain-language description covering what attributes the system measures, how it measures them, why they are relevant, what human components exist, how automated components inform the decision, and a link to a public page with system logic, outputs, data sources, and the most recent impact assessment results; and instructions to access the deployer's public statement under § 10-16-5. This is a comprehensive pre-decision disclosure combining AI identity notice with detailed system explanation.
Statutory Text
No later than the time that a deployer deploys an automated decision system to make, or assist in making, a consequential decision concerning a consumer, the deployer shall: (1) Notify the consumer that the deployer has deployed an automated decision system to make, or assist in making, a consequential decision; and (2) Provide to the consumer: (A) A statement disclosing the purpose of the automated decision system and the nature of the consequential decision; (B) The contact information for the deployer; (C) A description, in plain language, of the automated decision system, which description shall, at a minimum, include: (i) A description of the personal characteristics or attributes that the system will measure or assess; (ii) The method by which the system measures or assesses those attributes or characteristics; (iii) How those attributes or characteristics are relevant to the consequential decisions for which the system should be used; (iv) Any human components of such system; (v) How any automated components of such system are used to inform such consequential decision; and (vi) A direct link to a publicly accessible page on the deployer's public website that contains a plain-language description of the logic used in the system, including the key parameters that affect the output of the system; the system's outputs; the types and sources of data collected from natural persons and processed by the system when it is used to make, or assists in making, a consequential decision; and the results of the most recent impact assessment, or an active link to a web page where a consumer can review those results; and (D) Instructions on how to access the statement required by Code Section 10-16-5.
H-01 Human Oversight of Automated Decisions · H-01.1, H-01.2, H-01.4, H-01.5 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-4(b)-(d)
Plain Language
Within one business day of making or assisting in a consequential decision, the deployer must send the affected consumer a detailed post-decision notice covering: the principal factors and variables driving the decision, the degree of AI contribution, data sources used, how the consumer's personal data informed the decision, the consumer's right to correct data and provide supplemental information, what actions could have or could in the future change the outcome, how to correct inaccurate personal data, and how to appeal an adverse decision with human review if technically feasible. All notices must be provided directly, in plain language, in all languages the deployer uses commercially, and in disability-accessible formats. Critically, if a deployer cannot provide these notices and explanations, it may not use the automated decision system for the consequential decision at all.
Statutory Text
(b) A deployer that has used an automated decision system to make, or assist in making, a consequential decision concerning a consumer shall transmit to such consumer within one business day after such decision a notice that includes: (1) A specific and accurate explanation that identifies the principal factors and variables that led to the consequential decision, including: (A) The degree to which, and manner in which, the automated decision system contributed to the consequential decision; (B) The source or sources of the data processed by the automated decision system; and (C) A plain-language explanation of how the consumer's personal data informed these principal factors and variables when the automated decision system made, or assisted in making, the consequential decision; (2) Information about consumers' right to correct, and how the consumer can submit corrections and provide supplementary information relevant to, the consequential decision; (3) What actions, if any, the consumer might have taken to secure a different decision and the actions that the consumer might take to secure a different decision in the future; (4) Information on opportunities to correct any incorrect personal data that the automated decision system processed in making, or assisting in making, the consequential decision; and (5) Information on opportunities to appeal an adverse consequential decision concerning the consumer arising from the deployment of an automated decision system, which appeal shall, if technically feasible, allow for human review. 
(c)(1) A deployer shall provide the notice, statement, contact information, and description required by subsections (a) and (b) of this Code section: (A) Directly to the consumer; (B) In plain language; (C) In all languages in which the deployer, in the ordinary course of the deployer's business, provides contracts, disclaimers, sale announcements, and other information to consumers; and (D) In a format that is accessible to consumers with disabilities. (2) If the deployer is unable to provide the notice, statement, contact information, and description directly to the consumer, the deployer shall make such information available in a manner that is reasonably calculated to ensure that the consumer receives it. (d) No deployer shall use an automated decision system to make, or assist in making, a consequential decision if it cannot provide notices and explanations that satisfy the requirements of this Code section.
G-02 Public Transparency & Documentation · G-02.4 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-5(a)-(b)
Plain Language
Deployers must publish on their public website, and periodically update, a clear summary covering: the types of automated decision systems they currently deploy, how they manage known or reasonably foreseeable risks of algorithmic discrimination from each system, and detailed information about the nature, source, and extent of the data they collect and use. The small deployer exemption in § 10-16-6 is available. This is the deployer-side analog of the developer public statement obligation in § 10-16-2(d), with an additional data collection disclosure element.
Statutory Text
(a) Except as provided in Code Section 10-16-6, a deployer shall make available, in a manner that is clear and readily available on the deployer's public website, a statement summarizing: (1) The types of automated decision systems that are currently deployed by the deployer; (2) How the deployer manages known or reasonably foreseeable risks of algorithmic discrimination that may arise from the deployment of each such automated decision system; and (3) In detail, the nature, source, and extent of the information collected and used by the deployer. (b) A deployer shall periodically update the statement described in subsection (a) of this Code section.
R-01 Incident Reporting · R-01.3 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-7
Plain Language
Deployers that discover their automated decision system has caused algorithmic discrimination must notify the Attorney General without unreasonable delay, and no later than 90 days after the discovery. This parallels the developer notification obligation in § 10-16-2(e)(2) but applies at the deployer level. The notice must be submitted in the form and manner prescribed by the AG.
Statutory Text
If a deployer deploys an automated decision system and subsequently discovers that the automated decision system has caused algorithmic discrimination, the deployer, without unreasonable delay, but no later than 90 days after the date of the discovery, shall send to the Attorney General, in a form and manner prescribed by the Attorney General, a notice disclosing the discovery.
R-02 Regulatory Disclosure & Submissions · R-02.2 · Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-9
Plain Language
The Attorney General may demand any chapter-required documentation from a deployer (or its contracted third party) with a seven-day production deadline. Materials submitted are exempt from Georgia's open records law. Deployers may designate trade secrets and proprietary information, and privilege and work-product protections are preserved. This mirrors the developer on-demand disclosure in § 10-16-2(g).
Statutory Text
The Attorney General may require that a deployer, or a third party contracted by the deployer, disclose to the Attorney General, no later than seven days after and in a form and manner prescribed by the Attorney General, any documentation or records required by this chapter. The Attorney General may evaluate the risk management policy, impact assessment, or records to ensure compliance with this chapter, and the risk management policy, impact assessment, and such records, notwithstanding the provisions of Article 4 of Chapter 18 of Title 50, relating to open records, shall not be open to inspection by or made available to the public. In a disclosure pursuant to this Code section, a deployer may designate the statement or documentation as including proprietary information or a trade secret. To the extent that any information contained in the risk management policy, impact assessment, or records is subject to attorney-client privilege or work-product protection, the disclosure does not constitute a waiver of the privilege or protection.
T-01 AI Identity Disclosure · T-01.1 · Developer/Deployer · Automated Decisionmaking
O.C.G.A. § 10-16-11(a)-(b)
Plain Language
Deployers and developers that make available any AI system intended to interact with consumers must ensure disclosure to each interacting consumer that the consumer is interacting with an AI system. This is a conditional obligation: no disclosure is required where it would be obvious to a reasonable person that the interaction is with AI. Because it reaches all AI systems intended for consumer interaction, not just automated decision systems, this provision is broader in scope than most others in the chapter.
Statutory Text
(a) Except as provided in subsection (b) of this Code section, a deployer or other developer that deploys, offers, sells, leases, licenses, gives, or otherwise makes available an artificial intelligence system that is intended to interact with consumers shall ensure the disclosure to each consumer who interacts with the artificial intelligence system that the consumer is interacting with an artificial intelligence system. (b) Disclosure is not required under subsection (a) of this Code section under circumstances in which it would be obvious to a reasonable person that the person is interacting with an artificial intelligence system.
Other · Automated Decisionmaking
O.C.G.A. § 10-16-13(a)
Plain Language
Violations of this chapter are enforceable through Georgia's Fair Business Practices Act of 1975. This provision does not create a new compliance obligation — it designates the existing FBPA enforcement mechanism as the vehicle for enforcing the chapter's substantive requirements.
Statutory Text
A violation of the requirements established in this chapter shall be enforceable through the provisions of Part 2 of Article 15 of Chapter 1 of this title, the 'Fair Business Practices Act of 1975.'