SB-00435
CT · State · USA
● Pending
Proposed Effective Date
2026-10-01
Connecticut Raised Bill No. 435 — An Act Concerning Automated Decision Systems Protections for Employees
Summary

Imposes comprehensive requirements on developers and deployers of automated employment-related decision processes used in Connecticut. Deployers must disclose to applicants and employees when they are interacting with automated systems, provide pre-collection data notices, deliver detailed pre-decision notices, and furnish adverse-decision explanations with data examination and appeal rights. Human review is required for all final or determinative employment decisions. Deployers must contract with Labor Commissioner-approved independent auditors for pre-deployment and annual bias audits, and may not deploy systems with identified disparate impact absent business necessity and corrective action. State agencies face additional restrictions requiring specific legal authorization before using AI technology. The bill also amends Connecticut's anti-discrimination statute (§ 46a-60) to expressly cover discriminatory effects of automated employment decision processes and adds AI technology use as a mandatory subject of collective bargaining. Enforcement is split between the Attorney General (CUTPA), a private right of action for employees, and CHRO for discrimination claims.

Enforcement & Penalties
Enforcement Authority
Split enforcement. Violations of Sections 3–8 constitute unfair or deceptive trade practices under § 42-110b and are enforced solely by the Attorney General; the private right of action under § 42-110g is expressly excluded for those violations. However, Section 12 creates a separate private right of action for employees aggrieved by violations of Sections 3–9. The Commission on Human Rights and Opportunities (CHRO) retains enforcement authority over discriminatory practices under amended § 46a-60, including AI-related employment discrimination. The Labor Commissioner administers bias audit standards, maintains the auditor registry, and approves corrective actions for disparate impact findings.
Penalties
Under Section 12, an aggrieved employee may recover damages and equitable relief together with costs and reasonable attorney's fees. No statutory minimum is specified. Under Section 11, AG enforcement is available through the Connecticut Unfair Trade Practices Act (§ 42-110b), which provides for civil penalties, injunctive relief, and restitution. CHRO remedies for discriminatory practices under § 46a-60 include back pay, compensatory damages, and cease-and-desist orders.
Who Is Covered
"Deployer" means a person doing business in the state who deploys an automated employment-related decision process in the state;
"Developer" means a person doing business in the state who develops, or intentionally and substantially modifies, an automated employment-related decision process;
What Is Covered
"Automated employment-related decision process" (A) means a computational process that makes, assists in making or is used in the course of making an employment-related decision, (B) includes, but is not limited to, a computational process that (i) uses a computer-based assessment or test to (I) make a predictive assessment concerning an applicant for employment or employee, (II) measure the skills, dexterity, reaction time or any other ability or characteristic of an applicant for employment or employee, (III) measure the personality traits, aptitude, attitude or cultural fit of an applicant for employment or employee, or (IV) screen, evaluate, categorize or recommend an applicant for employment or employee, (ii) directs job advertisements or other recruiting materials to targeted groups, (iii) screens resumes for particular terms or patterns, (iv) analyzes a facial expression, word choice or voice captured during an online interview, or (v) analyzes data acquired from a third party concerning an applicant for employment or an employee, and (C) does not include any word processing, spreadsheet, map navigation, web hosting, domain registration, networking, caching, Internet web site loading, data storage, firewall, anti-virus, anti-malware, spam and robocall filtering, spellchecking, calculator, database or similar software or technology insofar as such software or technology does not make an employment-related decision;
Compliance Obligations · 15 obligations
Other · Developer · Employment
Sec. 2(a)-(b)
Plain Language
Developers must supply deployers with all information the deployer needs to comply with the bill's disclosure, notice, and adverse-decision obligations (Sections 3–6). Alternatively, a developer may contractually assume those deployer duties, but the contract must be binding and must clearly specify which duties the developer has taken on. This creates a supply-chain accountability mechanism — either the developer enables the deployer's compliance or the developer takes on the obligations directly.
Statutory Text
(a) Except as provided in subsection (b) of this section, the developer of an automated employment-related decision process that is deployed in the state shall provide to the deployer of such automated employment-related decision process all information that such deployer requires to perform such deployer's duties under sections 3 to 6, inclusive, of this act. (b) The developer of an automated employment-related decision process may enter into a contract with a deployer of the automated employment-related decision process to assume the deployer's duties under sections 3 to 6, inclusive, of this act. The contract shall be binding and clearly set forth which of the deployer's duties under sections 3 to 6, inclusive, of this act the developer has assumed.
T-01 AI Identity Disclosure · T-01.1 · Deployer · Employment
Sec. 3(a)-(b)
Plain Language
Deployers must disclose to every applicant or employee who interacts with an automated employment-related decision process that they are interacting with an automated system. This obligation is conditional — no disclosure is required if a reasonable person would find it obvious they are interacting with an automated system. The developer may contractually assume this obligation under Section 2(b).
Statutory Text
(a) Except as provided in subsection (b) of this section and subsection (b) of section 2 of this act, a deployer who deploys an automated employment-related decision process that is intended to interact with an applicant for employment or employee in the state shall ensure that it is disclosed to each such applicant or employee who interacts with such process that such applicant or employee is interacting with an automated employment-related decision process. (b) No disclosure shall be required under subsection (a) of this section under circumstances in which a reasonable person would deem it obvious that such person is interacting with an automated employment-related decision process.
D-01 Automated Processing Rights & Data Controls · D-01.1 · Deployer · Employment
Sec. 4
Plain Language
Before collecting any personal data from an applicant or employee for use in an automated employment-related decision process, the deployer must provide a written notice covering: the purpose of collection, categories of data collected, retention period, who will access the data, and the individual's right to opt out of certain personal data processing under Connecticut's existing data privacy law (§ 42-518). This is a pre-collection notice — it must be delivered before data collection begins, not at the point of an employment decision.
Statutory Text
Except as provided in subsection (b) of section 2 of this act, prior to collecting any personal data of an applicant for employment or employee in the state for processing in an automated employment-related decision process, a deployer shall provide to such applicant or employee a written notice disclosing: (1) The purpose of such data collection; (2) The categories of personal data that will be collected for processing in such automated employment-related decision process; (3) The retention period for any personal data collected; (4) The categories of persons who will have access to such personal data; and (5) Information concerning the right, under subparagraph (C) of subdivision (5) of subsection (a) of section 42-518 of the general statutes, to opt out of the processing of personal data for the purposes set forth in said subparagraph.
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Employment
Sec. 5
Plain Language
Before making any employment-related decision in which an automated process is used as the decision-maker or a substantial factor, the deployer must provide the affected applicant or employee a detailed written pre-decision notice. The notice must cover eight required elements: that an automated system is being used, its purpose, opt-out rights under CT data privacy law, deployer contact information, availability of human review, how to request reevaluation, a link to the most recent bias audit summary, and how to request additional documentation. This is distinct from the Section 4 data-collection notice — Section 5 is triggered by impending decision-making, not data collection.
Statutory Text
Except as provided in subsection (b) of section 2 of this act, a deployer who has deployed an automated employment-related decision process to make, or be a substantial factor in making, an employment-related decision concerning an applicant for employment or employee in the state shall, before such employment-related decision is made, provide to such applicant or employee a written notice disclosing: (1) That the deployer has deployed an automated employment-related decision process; (2) The purpose of the automated employment-related decision process and the nature of such employment-related decision; (3) Information concerning the right, under subparagraph (C) of subdivision (5) of subsection (a) of section 42-518 of the general statutes, to opt out of the processing of personal data for the purposes set forth in said subparagraph; (4) Contact information for the deployer; (5) The availability of human review pursuant to section 7 of this act; (6) Information concerning how such applicant or employee may request a reevaluation of any employment-related decision made in whole or in part by such automated employment-related decision process; (7) A link to the summary of the most recent bias audit required pursuant to section 8 of this act; and (8) Information concerning how to request additional documentation or information about such automated employment-related decision process.
H-01 Human Oversight of Automated Decisions · H-01.1 · H-01.2 · H-01.4 · H-01.5 · Deployer · Employment
Sec. 6(a)-(b)
Plain Language
When an automated employment decision process makes or substantially contributes to an adverse employment decision, the deployer must provide the affected individual three things: (1) a high-level explanation of the principal reasons for the adverse decision — including how the automated system contributed, what data types were processed, and data sources; (2) an opportunity to examine the data used, correct inaccuracies, and appeal based on incorrect data with human review; and (3) upon request, a copy of the most recent bias audit. The explanation must be delivered directly, in plain language, in all languages the deployer ordinarily uses for business communications in the state, and in disability-accessible format.
Statutory Text
(a) Except as provided in subsection (b) of section 2 of this act, a deployer who has deployed an automated employment-related decision process to make, or be a substantial factor in making, an employment-related decision concerning an applicant for employment or employee in the state shall, if such employment-related decision is adverse to such applicant or employee, provide to such applicant or employee: (1) A high-level statement disclosing the principal reason or reasons for such adverse employment-related decision, including, but not limited to, (A) the degree to which, and manner in which, the automated employment-related decision process contributed to such adverse employment-related decision, (B) the type of data that were processed by such automated employment-related decision process in making, or as a substantial factor in making, such adverse employment-related decision, and (C) the source of the data described in subparagraph (B) of this subdivision; (2) An opportunity to (A) examine the data the automated employment-related decision process processed in making, or as a substantial factor in making, such adverse employment-related decision, (B) correct any incorrect data described in subparagraph (A) of this subdivision, and (C) appeal such adverse employment-related decision if such adverse employment-related decision is based upon any incorrect data described in subparagraph (A) of this subdivision. Such appeal shall allow for human review; and (3) Upon request by such applicant or employee, or such applicant or employee's representative, a copy of the most recent bias audit required pursuant to section 8 of this act. 
(b) A deployer who is required to provide a high-level statement to an applicant for employment or employee in the state pursuant to subdivision (1) of subsection (a) of this section shall provide such statement: (1) Directly to such applicant or employee; (2) In plain language; (3) In all languages in which such deployer, in the ordinary course of such deployer's business, provides contracts, disclaimers, sales announcements and other information to persons in the state; and (4) In a format that is accessible to individuals with disabilities.
H-01 Human Oversight of Automated Decisions · H-01.6 · Deployer · Employment
Sec. 7(a)-(c)
Plain Language
Deployers must implement meaningful human review over every automated employment-related decision process. The human reviewer must have authority to change decisions, understand the system's limitations including bias risks, and not rely solely on the automated output. Specifically, the reviewer must confirm data accuracy and may modify or veto automated recommendations before any adverse decision. Deployers must also establish procedures to pause, correct, or reverse erroneous outputs, and must maintain logs of all human review reports and interventions. Critically, Section 7(c) imposes an absolute prohibition: no automated system may make a final or determinative employment decision without human review.
Statutory Text
(a) For the purposes of this section "human review" means a review conducted by a qualified individual who (1) has the authority to make or change an employment-related decision, (2) understands the capabilities, limitations and risks of the automated employment-related decision process, including, but not limited to, patterns of bias, disparate impact and data quality issues, and (3) does not rely solely on the content, decision, prediction or recommendation generated by the automated employment-related decision process in making a final or determinative employment-related decision. (b) (1) A deployer who has deployed an automated employment-related decision process in making, or as a substantial factor in making, an employment-related decision concerning an applicant for employment or employee in the state shall implement human review over such automated employment-related decision process by providing for review of the content, decisions, predictions or recommendations generated by the automated employment-related decision process and any other information relevant to such content, decision, prediction or recommendation in order to confirm the accuracy of data processed by such automated employment-related decision process and, when appropriate, modify or veto any such content, decision, prediction or recommendation generated by such automated decision-making process prior to any adverse employment-related decision. (2) A deployer shall (A) establish procedures necessary to pause, correct or reverse erroneous or harmful content, decision, prediction or recommendation generated by an automated employment-related decision process, and (B) establish and maintain logs listing all human review reports and any intervention taken by an individual conducting such human review. 
(c) No automated employment-related decision process shall be used by a deployer in making a final or determinative employment-related decision without human review over such final or determinative employment-related decision.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · H-02.2 · H-02.6 · H-02.7 · H-02.4 · H-02.5 · H-02.10 · Deployer · Employment
Sec. 8(a)-(f)
Plain Language
Deployers must contract with a Labor Commissioner-approved independent auditor for a bias audit before deployment and annually thereafter; the initial audit must be completed no later than one year before intended deployment. Each audit must evaluate performance and error rates across subgroups, assess disparate impact against protected classes, examine data sources and output quality, evaluate scoring thresholds, and test for less discriminatory alternatives. The auditor must have no financial or operational interest in the deployer or developer. Within 30 days of audit completion, the deployer must file the full report and a plain-language summary with the Labor Commissioner and publish the summary on its website. If the audit identifies disparate impact, the system may not be deployed or continue operating unless the deployer demonstrates business necessity, has implemented Commissioner-approved corrective actions, and either no less discriminatory alternative exists or one has been implemented. All audit records must be retained for at least five years and made available to the Commissioner on request.
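The bill requires auditors to assess disparate impact across subgroups but does not prescribe a metric. As a purely illustrative sketch, the four-fifths (80%) rule from the EEOC's Uniform Guidelines is one convention an auditor might apply to selection-rate data; nothing in SB-00435 mandates this particular test, and the group labels and threshold below are assumptions:

```python
# Hypothetical disparate-impact screen an auditor might run on selection
# outcomes. The 0.8 threshold is the EEOC four-fifths convention, an
# assumption for illustration only -- SB-00435 prescribes no metric.
from collections import Counter

def selection_rates(outcomes):
    """outcomes: iterable of (group, selected_bool) -> {group: rate}."""
    totals = Counter(g for g, _ in outcomes)
    hits = Counter(g for g, sel in outcomes if sel)
    return {g: hits[g] / totals[g] for g in totals}

def adverse_impact_ratios(rates):
    """Ratio of each group's selection rate to the highest group's rate."""
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Toy data: group A selected 40/100, group B selected 20/100.
outcomes = ([("A", True)] * 40 + [("A", False)] * 60
            + [("B", True)] * 20 + [("B", False)] * 80)
rates = selection_rates(outcomes)        # A: 0.4, B: 0.2
ratios = adverse_impact_ratios(rates)    # A: 1.0, B: 0.5
flagged = [g for g, r in ratios.items() if r < 0.8]
print(flagged)                           # group B falls below the threshold
```

A flag from a screen like this would not itself end the inquiry; under Section 8(d) it would trigger the business-necessity, corrective-action, and less-discriminatory-alternative analysis described above.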
Statutory Text
(a) (1) Prior to deploying an automated employment-related decision process, and annually thereafter, a deployer shall contract with an independent auditor to complete a bias audit. Such bias audit shall be done not later than one year prior to the date the deployer intends to deploy such automated employment-related decision process. (2) Each bias audit conducted pursuant to this subsection shall: (A) Evaluate the automated employment-related decision process performance and error rates across relevant subgroups; (B) Assess disparate impact caused by the automated employment-related decision process against protected classes; (C) Examine the sources of data processed by the automated employment-related decision process and quality of content, decisions, predictions or recommendations generated by the automated employment-related decision process; (D) Evaluate the effects of any thresholds, scoring or ranking criteria utilized by the automated employment-related decision process; and (E) Test for less discriminatory alternatives or adjustments to such automated employment-related decision process. (3) No deployer shall contract with an independent auditor who (A) has a financial or operational interest in the deployer or developer of the automated employment-related decision process, or (B) has not been approved by the Labor Commissioner pursuant to subsection (b) of this section. (b) The Labor Commissioner shall establish and implement an approval process of independent auditors to conduct bias audits pursuant to subsection (a) of this section and shall maintain a registry of independent auditors approved by such process. 
(c) Not later than thirty days after completing a bias audit pursuant to subsection (a) of this section, the deployer shall (1) in a form and manner prescribed by the Labor Commissioner, file a bias audit report and a plain-language summary of such report with the commissioner, and (2) publish a plain-language summary of such audit report on the deployer's Internet web site in a conspicuous place accessible to applicants for employment and employees. Such summary shall include (A) the methodology used in such bias audit, (B) the key findings and identified risks found by such bias audit, and (C) any corrective actions taken by the deployer. (d) No automated employment-related decision process shall be deployed or continue to be deployed by a deployer if the most recent bias audit conducted pursuant to subsection (a) of this section identified any disparate impact caused by such automated employment-related decision process, except where the deployer can demonstrate (1) a business necessity, (2) such deployer has implemented corrective actions approved by the Labor Commissioner, and (3) that either (A) no less discriminatory alternative is available, or (B) a less discriminatory alternative has been implemented by the deployer. (e) Each deployer shall maintain records relating to bias audits required pursuant to subsection (a) of this section for a period of not less than five years and shall make such records available to the Labor Commissioner upon request. (f) The Labor Commissioner may adopt regulations, in accordance with the provisions of chapter 54 of the general statutes, necessary to carry out the purposes of this section, including, but not limited to, establishing minimum qualifications for independent auditors and methodologic requirements for bias audits required pursuant to subsection (a) of this section.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Deployer · Developer · Employment
Sec. 9
Plain Language
Employers, deployers, developers, labor organizations, and any other person are prohibited from retaliating against any applicant or employee who files a complaint about violations, assists in investigations, objects to or refuses to participate in activities they reasonably believe violate the act, or exercises any rights under Sections 3–8. This is a broad anti-retaliation provision covering all parties in the employment relationship, not just the deployer.
Statutory Text
No employer, deployer, developer, labor organization or any other person shall discharge or in any manner discriminate or retaliate against, any applicant for employment or employee because such applicant or employee: (1) Filed a complaint, provided information or otherwise assisted in an investigation or proceeding concerning any alleged violation of sections 3 to 8 of this act; (2) Objected to or refused to participate in any activity that such applicant or employee reasonably believed to be in violation of sections 3 to 8 of this act; or (3) Exercised any rights granted under the provisions of sections 3 to 8 of this act.
Other · Employment
Sec. 10(a)-(b)
Plain Language
No person is required to disclose trade secrets or information protected from disclosure under state or federal law in satisfying their obligations under Sections 2–8. However, if information is withheld on this basis, the withholding party must notify the person from whom information is being withheld, disclosing that information is being withheld and the basis for the decision. This is a qualified safe harbor — the trade secret exception is available but triggers its own transparency obligation.
Statutory Text
(a) No provision of sections 2 to 8, inclusive, of this act shall be construed to require any person to disclose any information that is a trade secret or otherwise protected from disclosure under state or federal law. (b) If a person withholds any information under subsection (a) of this section, the person shall send a notice to the person from whom such information is being withheld. Such notice shall disclose (1) that such person is withholding such information, and (2) the basis for such person's decision to withhold such information.
Other · Deployer · Employment
Sec. 13
Plain Language
When employees are represented by a union or employee organization, the deployer must provide written advance notice before any testing, deployment, or material modification of an automated employment decision system, and must bargain in good faith with the union over the system's purpose, scope, and anticipated impacts. This creates a mandatory bargaining obligation specific to unionized workplaces that goes beyond mere notice.
Statutory Text
Where an applicant for employment or employee is represented by an employee organization, a deployer shall (1) provide written notice prior to any testing, deployment or material modification of an automated employment-related decision process, and (2) engage in good faith bargaining with such employee organization regarding the purpose, scope and anticipated impacts of such automated employment-related decision process.
PS-01 Government AI Accountability · PS-01.2 · PS-01.4 · Government · Employment · Government System
Sec. 14(b)-(c)
Plain Language
State agencies face a categorical prohibition on using AI technology for public benefit delivery or any function materially impacting individual rights, civil liberties, safety, or welfare unless specifically authorized by law. Procurement of AI technology is similarly restricted to specifically authorized uses. When procurement is authorized, the agency must obtain a full independent bias audit (meeting the same Section 8 standards as private-sector deployers), submit the audit to the Commissioner of Administrative Services, and post it publicly on the agency's website at least 60 days before deployment. PII may be redacted from the published audit.
Statutory Text
(b) (1) No state agency, or any entity acting on behalf of a state agency, shall, directly or indirectly, utilize or apply any artificial intelligence technology in performing any function that (A) is related to the delivery of any public assistance benefit to individuals in the state by such agency, or (B) will have a material impact on the rights, civil liberties, safety or welfare of individuals in the state, unless such utilization or application is specifically authorized by law. (2) No state agency shall authorize any procurement, purchase or acquisition of any artificial intelligence technology, except where the use of such system is specifically authorized by law. (3) If a state agency is authorized to procure, purchase or acquire an artificial intelligence technology, the state agency shall contract with an independent auditor to complete a bias audit pursuant to subsection (a) of section 8 of this act. (c) Any bias audit completed pursuant to subdivision (3) of subsection (b) of this section shall be submitted to the Commissioner of Administrative Services, in a form and manner prescribed by the commissioner, and posted on the agency's Internet web site not later than sixty days prior to deployment of such artificial intelligence technology. Any agency may redact any data in such impact statement to remove personally identifiable information of any individual.
Other · Government · Employment · Government System
Sec. 15 (amending § 7-468(a)); Sec. 16 (amending § 5-271(a))
Plain Language
These provisions amend Connecticut's municipal (§ 7-468) and state employee (§ 5-271) collective bargaining statutes to explicitly add 'the use of artificial intelligence technology by an employer' as a mandatory subject of collective bargaining. Employees have the protected right to bargain collectively over AI technology use alongside wages, hours, and other conditions of employment.
Statutory Text
(a) Employees shall have, and shall be protected in the exercise of, the right of self-organization, to form, join or assist any employee organization, to bargain collectively through representatives of their own choosing on questions of wages, hours and other conditions of employment, including, but not limited to, the use of artificial intelligence technology by an employer and to engage in other concerted activities for the purpose of collective bargaining or other mutual aid or protection, free from actual interference, restraint or coercion.
Other · Deployer · Employment
Sec. 17
Plain Language
During the term of any collective bargaining agreement, employers may not use AI technology in any way that modifies or impairs the agreement, the rights and benefits of bargaining unit members (including by reducing wages or hours or assuming employee duties), the union's role as exclusive representative, or the employer-union relationship. This is a broad prohibition that could effectively prevent certain AI-driven workforce changes — including automation of bargaining-unit work — during the life of a CBA without renegotiation.
Statutory Text
During the term of a written collective bargaining agreement entered into by an employer and a designated employee organization, no artificial intelligence technology shall be used by or on behalf of the employer in any manner that: (1) Modifies or impairs such agreement in any way, including, but not limited to, any such use that has the effect of modifying or impairing the rights, benefits and privileges accorded to the employee members of the bargaining unit that is represented by such designated employee organization, by, among other things, (A) reducing the wages, fringe benefits or nonovertime hours of such employee members, or (B) assuming the duties and functions of such employee members; (2) Modifies or impairs the designated employee organization's role as the exclusive representative of the bargaining unit for the purposes of such agreement; or (3) Modifies or impairs the relationship between the employer and the designated employee organization with respect to such agreement.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment
Sec. 18 (amending § 46a-60(b)(1)(A))
Plain Language
This amends Connecticut's anti-discrimination statute to expressly prohibit using an automated employment-related decision process in any manner that has the discriminatory effect of refusing to hire, discharging, or discriminating against individuals based on protected characteristics — including race, sex, age, disability, veteran status, and others. This is a disparate-impact standard: the employer need not intend discrimination; the effect is sufficient. Notably, the amendment requires courts and the CHRO to consider evidence of anti-bias testing or proactive compliance efforts — including quality, recency, scope, results, and response — as a mitigating factor. This creates a practical safe-harbor-like incentive for deployers who conduct robust bias audits under Section 8.
Statutory Text
(A) For an employer, by the employer or the employer's agent, except in the case of a bona fide occupational qualification or need, to refuse to hire or employ or to bar or to discharge from employment any individual or to discriminate against any individual in compensation or in terms, conditions or privileges of employment because of, or to use an automated employment-related decision process in any manner that has the effect of causing the employer to refuse to hire or employ or to bar or to discharge from employment any individual or to discriminate against any individual in compensation or in terms, conditions or privileges of employment on the basis of, the individual's race, color, religious creed, age, sex, gender identity or expression, marital status, national origin, ancestry, present or past history of mental disability, intellectual disability, learning disability, physical disability, including, but not limited to, blindness, status as a veteran, status as a victim of domestic violence, status as a victim of sexual assault or status as a victim of trafficking in persons. In any action for a discriminatory practice in violation of this subparagraph involving an automated employment-related decision process, the commission or the court shall consider any evidence, or lack of evidence, of anti-bias testing or similar proactive efforts to avoid such discriminatory practice, including, but not limited to, the quality, efficacy, recency and scope of such testing or efforts, the results of such testing or efforts and the response thereto.
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Employment
Sec. 18 (amending § 46a-60(b)(1)(B))
Plain Language
Under the amended anti-discrimination statute, it is a discriminatory practice for employers to fail to provide advance written notice that an automated employment-related decision process will be used in employment decisions affecting an individual. The notice must at minimum disclose the trade name of the automated system and the types and sources of personal information the system will process. This creates a separate notice obligation within Connecticut's anti-discrimination framework — enforced by CHRO — in addition to the deployer notice obligations in Sections 4 and 5.
Statutory Text
(B) For an employer, by the employer or the employer's agent, to fail to provide to any individual advance written notice disclosing, at a minimum, that an automated employment-related decision process will be used to make, to assist in making or in the course of making a decision to hire or employ or to bar or to discharge from employment, or concerning the compensation or terms, conditions or privileges of employment, of such individual. Such notice shall, at a minimum, disclose the trade name of the automated employment-related decision process and the types and sources of personal information concerning the individual that the automated employment-related decision process will process or analyze.