SB-00435
CT · State · USA
● Pending
Proposed Effective Date
2026-10-01
Connecticut Raised Bill No. 435 — An Act Concerning Automated Decision Systems Protections for Employees
Summary

Imposes comprehensive obligations on deployers and developers of automated employment-related decision processes operating in Connecticut. Deployers must disclose to applicants and employees when they are interacting with an automated system, provide pre-collection data notices, issue pre-decision notices with information about the system's purpose and available rights, and provide detailed adverse-decision explanations with data access and appeal rights. All final or determinative employment decisions using these systems require meaningful human review. Deployers must contract with Labor Commissioner-approved independent auditors for annual bias audits, file results with the Commissioner, and publish summaries publicly. The bill also restricts state agency AI procurement, requires collective bargaining over AI use, amends Connecticut's antidiscrimination statute to cover AI-driven disparate impact, and creates both AG enforcement (as CUTPA violations) and a private right of action for aggrieved employees.

Enforcement & Penalties
Enforcement Authority
Violations of Sections 3 to 8 constitute unfair or deceptive trade practices under Conn. Gen. Stat. § 42-110b(a) and are enforced solely by the Attorney General. The private right of action under the Connecticut Unfair Trade Practices Act (§ 42-110g) is expressly excluded for violations of Sections 3 to 8. However, Section 12 independently creates a private right of action for employees aggrieved by violations of Sections 3 to 9 (which includes the anti-retaliation provision). Section 18 amendments to § 46a-60 are enforced through the Commission on Human Rights and Opportunities and existing antidiscrimination enforcement mechanisms. The Labor Commissioner oversees the bias audit approval process and receives bias audit reports.
Penalties
An aggrieved employee may bring a civil action in Superior Court to recover damages and equitable relief together with costs and reasonable attorney's fees. No statutory minimum or per-violation amount is specified. Violations of Sections 3 to 8 also constitute unfair or deceptive trade practices enforceable by the Attorney General, carrying penalties available under the Connecticut Unfair Trade Practices Act. Discrimination claims under § 46a-60 carry remedies available through the Commission on Human Rights and Opportunities.
Who Is Covered
"Deployer" means a person doing business in the state who deploys an automated employment-related decision process in the state;.
"Developer" means a person doing business in the state who develops, or intentionally and substantially modifies, an automated employment-related decision process;.
What Is Covered
"Automated employment-related decision process" (A) means a computational process that makes, assists in making or is used in the course of making an employment-related decision, (B) includes, but is not limited to, a computational process that (i) uses a computer-based assessment or test to (I) make a predictive assessment concerning an applicant for employment or employee, (II) measure the skills, dexterity, reaction time or any other ability or characteristic of an applicant for employment or employee, (III) measure the personality traits, aptitude, attitude or cultural fit of an applicant for employment or employee, or (IV) screen, evaluate, categorize or recommend an applicant for employment or employee, (ii) directs job advertisements or other recruiting materials to targeted groups, (iii) screens resumes for particular terms or patterns, (iv) analyzes a facial expression, word choice or voice captured during an online interview, or (v) analyzes data acquired from a third party concerning an applicant for employment or an employee, and (C) does not include any word processing, spreadsheet, map navigation, web hosting, domain registration, networking, caching, Internet web site loading, data storage, firewall, anti-virus, anti-malware, spam and robocall filtering, spellchecking, calculator, database or similar software or technology insofar as such software or technology does not make an employment-related decision;
Compliance Obligations · 21 obligations
T-01 AI Identity Disclosure · T-01.1 · Deployer · Employment · Automated Decisionmaking
Sec. 3(a)-(b)
Plain Language
Deployers must disclose to each applicant or employee who interacts with an automated employment-related decision process that the person is interacting with an automated system. This disclosure is not required where a reasonable person would deem it obvious they are interacting with an automated process. The obligation may be contractually shifted to the developer under Section 2(b).
Statutory Text
(a) Except as provided in subsection (b) of this section and subsection (b) of section 2 of this act, a deployer who deploys an automated employment-related decision process that is intended to interact with an applicant for employment or employee in the state shall ensure that it is disclosed to each such applicant or employee who interacts with such process that such applicant or employee is interacting with an automated employment-related decision process. (b) No disclosure shall be required under subsection (a) of this section under circumstances in which a reasonable person would deem it obvious that such person is interacting with an automated employment-related decision process.
D-01 Automated Processing Rights & Data Controls · D-01.1 · Deployer · Employment · Automated Decisionmaking
Sec. 4
Plain Language
Before collecting any personal data from an applicant or employee for use in an automated employment-related decision process, the deployer must provide written notice covering: the purpose of collection, the categories of data to be collected, the retention period, who will have access to the data, and information about the right to opt out of personal data processing under Connecticut's existing data privacy law (§ 42-518). This is a pre-collection notice obligation — it must be provided before data collection begins, not at the time of the decision.
Statutory Text
Except as provided in subsection (b) of section 2 of this act, prior to collecting any personal data of an applicant for employment or employee in the state for processing in an automated employment-related decision process, a deployer shall provide to such applicant or employee a written notice disclosing: (1) The purpose of such data collection; (2) The categories of personal data that will be collected for processing in such automated employment-related decision process; (3) The retention period for any personal data collected; (4) The categories of persons who will have access to such personal data; and (5) Information concerning the right, under subparagraph (C) of subdivision (5) of subsection (a) of section 42-518 of the general statutes, to opt out of the processing of personal data for the purposes set forth in said subparagraph.
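The five required disclosures lend themselves to a simple completeness check before any collection begins. The sketch below is illustrative only: the statute specifies the content of the notice, not a data format, and all names here are our own.

```python
# Illustrative sketch of a Sec. 4 pre-collection notice completeness check.
# Field and function names are assumptions, not terms from the bill.
from dataclasses import dataclass, asdict

@dataclass
class PreCollectionNotice:
    purpose: str                  # (1) purpose of the data collection
    data_categories: list[str]    # (2) categories of personal data collected
    retention_period: str         # (3) retention period for collected data
    access_categories: list[str]  # (4) categories of persons with access
    opt_out_info: str             # (5) § 42-518(a)(5)(C) opt-out information

def is_complete(notice: PreCollectionNotice) -> bool:
    """All five disclosures must be present before collection begins."""
    return all(bool(value) for value in asdict(notice).values())
```

A notice missing any of the five fields fails the check, which mirrors the statute's structure: the obligation attaches before collection, so an incomplete notice means collection cannot lawfully start.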
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Employment · Automated Decisionmaking
Sec. 5
Plain Language
Before making an employment-related decision that is made or substantially informed by an automated process, the deployer must provide the affected applicant or employee written notice covering eight categories of information: that an automated process is being used, its purpose and the nature of the decision, opt-out rights, deployer contact information, availability of human review, how to request reevaluation, a link to the most recent bias audit summary, and how to request further documentation. This is a pre-decision notice distinct from the pre-collection notice in Section 4.
Statutory Text
Except as provided in subsection (b) of section 2 of this act, a deployer who has deployed an automated employment-related decision process to make, or be a substantial factor in making, an employment-related decision concerning an applicant for employment or employee in the state shall, before such employment-related decision is made, provide to such applicant or employee a written notice disclosing: (1) That the deployer has deployed an automated employment-related decision process; (2) The purpose of the automated employment-related decision process and the nature of such employment-related decision; (3) Information concerning the right, under subparagraph (C) of subdivision (5) of subsection (a) of section 42-518 of the general statutes, to opt out of the processing of personal data for the purposes set forth in said subparagraph; (4) Contact information for the deployer; (5) The availability of human review pursuant to section 7 of this act; (6) Information concerning how such applicant or employee may request a revaluation of any employment-related decision made in whole or in part by such automated employment-related decision process; (7) A link to the summary of the most recent bias audit required pursuant to section 8 of this act; and (8) Information concerning how to request additional documentation or information about such automated employment-related decision process.
H-01 Human Oversight of Automated Decisions · H-01.1 · H-01.2 · H-01.5 · Deployer · Employment · Automated Decisionmaking
Sec. 6(a)-(b)
Plain Language
When an automated employment-related decision process makes or substantially contributes to an adverse decision about an applicant or employee, the deployer must provide: (1) a high-level, plain-language explanation of the principal reasons for the adverse decision, including the degree and manner of the automated process's contribution and the types and sources of the data used; (2) the opportunity to examine the data used, correct inaccurate data, and appeal the decision with human review if it was based on incorrect data; and (3) upon request, a copy of the most recent bias audit. The explanation must be provided directly to the individual, in plain language, in all languages used in the deployer's ordinary course of business, and in an accessible format.
Statutory Text
(a) Except as provided in subsection (b) of section 2 of this act, a deployer who has deployed an automated employment-related decision process to make, or be a substantial factor in making, an employment-related decision concerning an applicant for employment or employee in the state shall, if such employment-related decision is adverse to such applicant or employee, provide to such applicant or employee: (1) A high-level statement disclosing the principal reason or reasons for such adverse employment-related decision, including, but not limited to, (A) the degree to which, and manner in which, the automated employment-related decision process contributed to such adverse employment-related decision, (B) the type of data that were processed by such automated employment-related decision process in making, or as a substantial factor in making, such adverse employment-related decision, and (C) the source of the data described in subparagraph (B) of this subdivision; (2) An opportunity to (A) examine the data the automated employment-related decision process processed in making, or as a substantial factor in making, such adverse employment-related decision, (B) correct any incorrect data described in subparagraph (A) of this subdivision, and (C) appeal such adverse employment-related decision if such adverse employment-related decision is based upon any incorrect data described in subparagraph (A) of this subdivision. Such appeal shall allow for human review; and (3) Upon request by such applicant or employee, or such applicant or employee's representative, a copy of the most recent bias audit required pursuant to section 8 of this act. 
(b) A deployer who is required to provide a high-level statement to an applicant for employment or employee in the state pursuant to subdivision (1) of subsection (a) of this section shall provide such statement: (1) Directly to such applicant or employee; (2) In plain language; (3) In all languages in which such deployer, in the ordinary course of such deployer's business, provides contracts, disclaimers, sales announcements and other information to persons in the state; and (4) In a format that is accessible to individuals with disabilities.
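Sec. 6(a)(1) and 6(b) together define both the content of the adverse-decision statement and how it must be delivered. A compact sketch of the combined checklist, with a data model of our own devising (the bill specifies the requirements but no format):

```python
# Illustrative checklist for the Sec. 6 adverse-decision statement.
# Field names are assumptions; citations in comments map to the bill.
from dataclasses import dataclass

@dataclass
class AdverseDecisionStatement:
    principal_reasons: list[str]   # 6(a)(1): principal reason(s)
    ads_contribution: str          # 6(a)(1)(A): degree and manner of contribution
    data_types: list[str]          # 6(a)(1)(B): types of data processed
    data_sources: list[str]        # 6(a)(1)(C): sources of that data
    delivered_directly: bool       # 6(b)(1): provided directly to the individual
    plain_language: bool           # 6(b)(2)
    business_languages: set[str]   # 6(b)(3): languages used in ordinary business
    accessible_format: bool        # 6(b)(4)

def satisfies_sec6(stmt: AdverseDecisionStatement, required_langs: set[str]) -> bool:
    """True only if both the content (6(a)(1)) and delivery (6(b)) prongs hold."""
    content_ok = all([stmt.principal_reasons, stmt.ads_contribution,
                      stmt.data_types, stmt.data_sources])
    delivery_ok = (stmt.delivered_directly and stmt.plain_language
                   and required_langs <= stmt.business_languages
                   and stmt.accessible_format)
    return content_ok and delivery_ok
```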
H-01 Human Oversight of Automated Decisions · H-01.4 · H-01.6 · Deployer · Employment · Automated Decisionmaking
Sec. 7(a)-(c)
Plain Language
Deployers must implement mandatory human review over all automated employment-related decision processes before any adverse or final/determinative employment decision is made. The human reviewer must be a qualified individual with authority to change the decision, understanding of the system's limitations and bias risks, and who does not rely solely on the automated output. Deployers must also establish procedures to pause, correct, or reverse erroneous outputs, and must maintain logs of all human review activities and interventions. No automated process may be used for a final or determinative employment decision without human review — this is an absolute prohibition, not merely an option available upon request.
Statutory Text
(a) For the purposes of this section "human review" means a review conducted by a qualified individual who (1) has the authority to make or change an employment-related decision, (2) understands the capabilities, limitations and risks of the automated employment-related decision process, including, but not limited to, patterns of bias, disparate impact and data quality issues, and (3) does not rely solely on the content, decision, prediction or recommendation generated by the automated employment-related decision process in making a final or determinative employment-related decision. (b) (1) A deployer who has deployed an automated employment-related decision process in making, or as a substantial factor in making, an employment-related decision concerning an applicant for employment or employee in the state shall implement human review over such automated employment-related decision process by providing for review of the content, decisions, predictions or recommendations generated by the automated employment-related decision process and any other information relevant to such content, decision, prediction or recommendation in order to confirm the accuracy of data processed by such automated employment-related decision process and, when appropriate, modify or veto any such content, decision, prediction or recommendation generated by such automated decision-making process prior to any adverse employment-related decision. (2) A deployer shall (A) establish procedures necessary to pause, correct or reverse erroneous or harmful content, decision, prediction or recommendation generated by an automated employment-related decision process, and (B) establish and maintain logs listing all human review reports and any intervention taken by an individual conducting such human review. 
(c) No automated employment-related decision process shall be used by a deployer in making a final or determinative employment-related decision without human review over such final or determinative employment-related decision.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · H-02.2 · H-02.6 · H-02.7 · Deployer · Employment · Automated Decisionmaking
Sec. 8(a)(1)-(3)
Plain Language
Deployers must engage a Labor Commissioner-approved independent auditor to conduct a bias audit before deploying any automated employment-related decision process and annually thereafter. The initial audit must be completed no later than one year before intended deployment. The audit must evaluate performance and error rates across subgroups, assess disparate impact against protected classes, examine data sources and output quality, evaluate thresholds and scoring criteria, and test for less discriminatory alternatives. The auditor must have no financial or operational interest in the deployer or developer and must be on the Commissioner's approved registry.
Statutory Text
(a) (1) Prior to deploying an automated employment-related decision process, and annually thereafter, a deployer shall contract with an independent auditor to complete a bias audit. Such bias audit shall be done not later than one year prior to the date the deployer intends to deploy such automated employment-related decision process. (2) Each bias audit conducted pursuant to this subsection shall: (A) Evaluate the automated employment-related decision process performance and error rates across relevant subgroups; (B) Assess disparate impact caused by the automated employment-related decision process against protected classes; (C) Examine the sources of data processed by the automated employment-related decision process and quality of content, decisions, predictions or recommendations generated by the automated employment-related decision process; (D) Evaluate the effects of any thresholds, scoring or ranking criteria utilized by the automated employment-related decision process; and (E) Test for less discriminatory alternatives or adjustments to such automated employment-related decision process. (3) No deployer shall contract with an independent auditor who (A) has a financial or operational interest in the deployer or developer of the automated employment-related decision process, or (B) has not been approved by the Labor Commissioner pursuant to subsection (b) of this section.
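The bill requires the audit to assess disparate impact across protected classes but does not prescribe a metric. A common heuristic in employment-selection analysis is the four-fifths (80%) rule on selection rates; the sketch below uses that heuristic purely for illustration, with all names and thresholds being our assumptions rather than anything the bill mandates.

```python
# Illustrative disparate-impact screen using the four-fifths rule.
# Raised Bill 435 does not prescribe this (or any) metric; names and
# the 0.8 threshold are assumptions drawn from common audit practice.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps subgroup -> (selected, total); returns selection rates."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items() if tot > 0}

def impact_ratios(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Each subgroup's selection rate divided by the highest subgroup rate.
    Ratios below ~0.8 conventionally flag possible disparate impact for the
    auditor to examine (Sec. 8(a)(2)(B))."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical screening outcomes by subgroup:
ratios = impact_ratios({"group_a": (40, 100), "group_b": (24, 100)})
# group_b's ratio is about 0.6, below the 0.8 heuristic -> flag for review
```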
R-02 Regulatory Disclosure & Submissions · R-02.1 · Deployer · Employment · Automated Decisionmaking
Sec. 8(c)
Plain Language
Within 30 days of completing each bias audit, deployers must both (1) file the full bias audit report and a plain-language summary with the Labor Commissioner and (2) publish the plain-language summary on their website in a conspicuous, accessible location. The summary must cover methodology, key findings and identified risks, and corrective actions taken. This creates both a regulatory filing and a public disclosure obligation tied to each annual bias audit cycle.
Statutory Text
(c) Not later than thirty days after completing a bias audit pursuant to subsection (a) of this section, the deployer shall (1) in a form and manner prescribed by the Labor Commissioner, file a bias audit report and a plain-language summary of such report with the commissioner, and (2) publish a plain-language summary of such audit report on the deployer's Internet web site in a conspicuous place accessible to applicants for employment and employees. Such summary shall include (A) the methodology used in such bias audit, (B) the key findings and identified risks found by such bias audit, and (C) any corrective actions taken by the deployer.
H-02 Non-Discrimination & Bias Assessment · Deployer · Employment · Automated Decisionmaking
Sec. 8(d)
Plain Language
A deployer may not deploy or continue deploying an automated employment-related decision process that has been found in its most recent bias audit to cause disparate impact, unless the deployer can demonstrate all three of: (1) business necessity, (2) implementation of corrective actions approved by the Labor Commissioner, and (3) either that no less discriminatory alternative exists or that a less discriminatory alternative has been implemented. This is a deployment-gating obligation — disparate impact findings trigger a conditional ban unless all three conditions are satisfied.
Statutory Text
(d) No automated employment-related decision process shall be deployed or continue to be deployed by a deployer if the most recent bias audit conducted pursuant to subsection (a) of this section identified any disparate impact caused by such automated employment-related decision process, except where the deployer can demonstrate (1) a business necessity, (2) such deployer has implemented corrective actions approved by the Labor Commissioner, and (3) that either (A) no less discriminatory alternative is available, or (B) a less discriminatory alternative has been implemented by the deployer.
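The exception in Sec. 8(d) is conjunctive, with a disjunctive third prong: conditions (1) AND (2) AND ((3)(A) OR (3)(B)). A minimal sketch of that gating logic, with field names of our own (the bill defines the conditions but no data model):

```python
# Sketch of the Sec. 8(d) deployment gate. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class AuditPosture:
    disparate_impact_found: bool        # finding of the most recent bias audit
    business_necessity: bool            # condition (1)
    corrective_actions_approved: bool   # condition (2): Labor Commissioner approval
    no_less_discriminatory_alt: bool    # condition (3)(A)
    less_discriminatory_alt_used: bool  # condition (3)(B)

def may_deploy(p: AuditPosture) -> bool:
    """Deployment is barred after a disparate impact finding unless the
    deployer demonstrates (1) and (2) and ((3)(A) or (3)(B))."""
    if not p.disparate_impact_found:
        return True
    return (p.business_necessity
            and p.corrective_actions_approved
            and (p.no_less_discriminatory_alt or p.less_discriminatory_alt_used))
```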
G-01 AI Governance Program & Documentation · G-01.3 · G-01.4 · Deployer · Employment · Automated Decisionmaking
Sec. 8(e)
Plain Language
Deployers must retain all bias audit records for at least five years and produce them to the Labor Commissioner upon request. This is both a recordkeeping and a regulatory access obligation — the five-year retention period exceeds the typical two- to three-year standard in other jurisdictions.
Statutory Text
(e) Each deployer shall maintain records relating to bias audits required pursuant to subsection (a) of this section for a period of not less than five years and shall make such records available to the Labor Commissioner upon request.
Other · Developer · Employment · Automated Decisionmaking
Sec. 2(a)-(b)
Plain Language
Developers must provide deployers with all information the deployer needs to comply with the disclosure and notice obligations under Sections 3 through 6. Alternatively, developers may contractually assume some or all of those deployer duties. The contract must be binding and must clearly specify which duties the developer has assumed. This creates a supply-chain information flow obligation and allows contractual duty-shifting.
Statutory Text
(a) Except as provided in subsection (b) of this section, the developer of an automated employment-related decision process that is deployed in the state shall provide to the deployer of such automated employment-related decision process all information that such deployer requires to perform such deployer's duties under sections 3 to 6, inclusive, of this act. (b) The developer of an automated employment-related decision process may enter into a contract with a deployer of the automated employment-related decision process to assume the deployer's duties under sections 3 to 6, inclusive, of this act. The contract shall be binding and clearly set forth which of the deployer's duties under sections 3 to 6, inclusive, of this act the developer has assumed.
G-03 Whistleblower & Anti-Retaliation Protections · G-03.3 · Deployer · Developer · Employment · Automated Decisionmaking
Sec. 9
Plain Language
No employer, deployer, developer, labor organization, or any other person may retaliate against an applicant or employee for: filing a complaint or assisting in an investigation of violations of the automated decision provisions; objecting to or refusing to participate in activities the person reasonably believes violate the act; or exercising any rights under Sections 3 through 8. This is a broad anti-retaliation provision covering the full range of protected activities. It is independently actionable under Section 12's private right of action.
Statutory Text
No employer, deployer, developer, labor organization or any other person shall discharge or in any manner discriminate or retaliate against, any applicant for employment or employee because such applicant or employee: (1) Filed a complaint, provided information or otherwise assisted in an investigation or proceeding concerning any alleged violation of sections 3 to 8 of this act; (2) Objected to or refused to participate in any activity that such applicant or employee reasonably believed to be in violation of sections 3 to 8 of this act; or (3) Exercised any rights granted under the provisions of sections 3 to 8 of this act.
Other · Employment · Automated Decisionmaking
Sec. 10(a)-(b)
Plain Language
No provision of the act requires disclosure of trade secrets or information otherwise protected from disclosure under state or federal law. However, when information is withheld on this basis, the withholding party must notify the person from whom it is being withheld, stating that information is being withheld and the basis for the decision. This is a safe harbor that modifies the disclosure obligations throughout the act, paired with a notice-of-withholding requirement.
Statutory Text
(a) No provision of sections 2 to 8, inclusive, of this act shall be construed to require any person to disclose any information that is a trade secret or otherwise protected from disclosure under state or federal law. (b) If a person withholds any information under subsection (a) of this section, the person shall send a notice to the person from whom such information is being withheld. Such notice shall disclose (1) that such person is withholding such information, and (2) the basis for such person's decision to withhold such information.
Other · Employment · Automated Decisionmaking
Sec. 11
Plain Language
Violations of the automated employment decision provisions (Sections 3-8) are declared to be unfair or deceptive trade practices under Connecticut's Unfair Trade Practices Act. However, enforcement is limited to the Attorney General — the CUTPA private right of action under § 42-110g is expressly excluded. This is an enforcement hook that activates the AG's existing CUTPA enforcement powers but creates no new compliance obligation.
Statutory Text
Any violation of the provisions of sections 3 to 8, inclusive, of this act shall constitute an unfair or deceptive trade practice for the purposes of subsection (a) of section 42-110b of the general statutes and shall be enforced solely by the Attorney General. The provisions of section 42-110g of the general statutes shall not apply to any such violation.
Other · Employment · Automated Decisionmaking
Sec. 12
Plain Language
Aggrieved employees may bring a private civil action in Superior Court for violations of any provision from Section 3 (AI interaction disclosure) through Section 9 (anti-retaliation). Available remedies include damages, equitable relief, costs, and reasonable attorney's fees. Notably, this private right of action reaches one section further than the Attorney General's CUTPA enforcement under Section 11, which covers only Sections 3 to 8: it also covers the Section 9 anti-retaliation provision. This is a remedial provision, not an independent compliance obligation.
Statutory Text
An employee aggrieved by a violation of sections 3 to 9, inclusive, of this act may bring a civil action in the Superior Court to recover damages and equitable relief together with costs and reasonable attorney's fees.
Other · Deployer · Employment · Automated Decisionmaking
Sec. 13
Plain Language
Where employees are represented by a union or employee organization, the deployer must provide written notice to the organization before any testing, deployment, or material modification of an automated employment-related decision process. The deployer must also engage in good-faith bargaining with the organization about the purpose, scope, and anticipated impacts of the system. This is a labor-relations obligation that goes beyond standard AI transparency requirements by requiring pre-deployment negotiation with organized labor.
Statutory Text
Where an applicant for employment or employee is represented by an employee organization, a deployer shall (1) provide written notice prior to any testing, deployment or material modification of an automated employment-related decision process, and (2) engage in good faith bargaining with such employee organization regarding the purpose, scope and anticipated impacts of such automated employment-related decision process.
PS-01 Government AI Accountability · PS-01.2 · PS-01.4 · Government · Government System · Automated Decisionmaking
Sec. 14(b)-(c)
Plain Language
State agencies are broadly prohibited from using AI technology in any function related to public assistance benefits, or in any function that will materially impact rights, civil liberties, safety, or welfare, unless the use is specifically authorized by law. Agencies likewise may not procure, purchase, or acquire AI technology except where its use is specifically authorized by law. When procurement is authorized, the agency must contract with an independent auditor for a bias audit meeting the same requirements as the private-sector bias audit in Section 8. The completed bias audit must be submitted to the Commissioner of Administrative Services and posted on the agency's website at least 60 days before deployment; personally identifiable information may be redacted.
Statutory Text
(b) (1) No state agency, or any entity acting on behalf of a state agency, shall, directly or indirectly, utilize or apply any artificial intelligence technology in performing any function that (A) is related to the delivery of any public assistance benefit to individuals in the state by such agency, or (B) will have a material impact on the rights, civil liberties, safety or welfare of individuals in the state, unless such utilization or application is specifically authorized by law. (2) No state agency shall authorize any procurement, purchase or acquisition of any artificial intelligence technology, except where the use of such system is specifically authorized by law. (3) If a state agency is authorized to procure, purchase or acquire an artificial intelligence technology, the state agency shall contract with an independent auditor to complete a bias audit pursuant to subsection (a) of section 8 of this act. (c) Any bias audit completed pursuant to subdivision (3) of subsection (b) of this section shall be submitted to the Commissioner of Administrative Services, in a form and manner prescribed by the commissioner, and posted on the agency's Internet web site not later than sixty days prior to deployment of such artificial intelligence technology. Any agency may redact any data in such impact statement to remove personally identifiable information of any individual.
Other · Government · Employment · Government System
Sec. 15 (amending § 7-468(a))
Plain Language
This amendment to existing municipal collective bargaining law (§ 7-468) adds the use of AI technology by an employer as an explicit subject of collective bargaining for municipal employees. It confirms that AI deployment is a mandatory bargaining subject. This modifies existing labor relations law but does not create a new standalone AI compliance obligation.
Statutory Text
(a) Employees shall have, and shall be protected in the exercise of, the right of self-organization, to form, join or assist any employee organization, to bargain collectively through representatives of their own choosing on questions of wages, hours and other conditions of employment, including, but not limited to, the use of artificial intelligence technology by an employer and to engage in other concerted activities for the purpose of collective bargaining or other mutual aid or protection, free from actual interference, restraint or coercion.
Other · Government · Employment · Government System
Sec. 16 (amending § 5-271(a))
Plain Language
This amendment to existing state employee collective bargaining law (§ 5-271) adds AI technology use by an employer as an explicit mandatory subject of collective bargaining for state employees. Like Section 15 for municipal employees, this confirms AI deployment is a mandatory bargaining subject but creates no new standalone compliance obligation.
Statutory Text
(a) Employees shall have, and shall be protected in the exercise of the right of self-organization, to form, join or assist any employee organization, to bargain collectively through representatives of their own choosing on questions of wages, hours, the use of artificial intelligence technology by an employer and other conditions of employment, except as provided in subsection (d) of section 5-272, and to engage in other concerted activities for the purpose of collective bargaining or other mutual aid or protection, free from actual interference, restraint or coercion.
Other · Deployer · Employment · Automated Decisionmaking
Sec. 17
Plain Language
During the term of a collective bargaining agreement, an employer may not use AI technology in any way that modifies or impairs the agreement, including by reducing wages, benefits, or hours, or by having AI assume the duties and functions of bargaining unit members. Nor may AI use undermine the union's role as exclusive representative or the employer-union relationship. This is effectively a prohibition on using AI to circumvent collective bargaining agreements: a labor-relations constraint rather than a standard AI compliance obligation.
Statutory Text
During the term of a written collective bargaining agreement entered into by an employer and a designated employee organization, no artificial intelligence technology shall be used by or on behalf of the employer in any manner that: (1) Modifies or impairs such agreement in any way, including, but not limited to, any such use that has the effect of modifying or impairing the rights, benefits and privileges accorded to the employee members of the bargaining unit that is represented by such designated employee organization, by, among other things, (A) reducing the wages, fringe benefits or nonovertime hours of such employee members, or (B) assuming the duties and functions of such employee members; (2) Modifies or impairs the designated employee organization's role as the exclusive representative of the bargaining unit for the purposes of such agreement; or (3) Modifies or impairs the relationship between the employer and the designated employee organization with respect to such agreement.
H-02 Non-Discrimination & Bias Assessment · H-02.1 · Deployer · Employment · Automated Decisionmaking
Sec. 18(b)(1)(A) (amending § 46a-60(b)(1)(A))
Plain Language
This amendment to Connecticut's antidiscrimination statute (§ 46a-60) makes it a discriminatory employment practice to use an automated employment-related decision process in any manner that has the effect of causing discrimination on the basis of any protected characteristic. This is a disparate impact standard; intent is not required. Notably, the provision also creates an evidentiary consideration: in any discrimination action involving an automated process, the commission or court must consider evidence, or the lack of it, of anti-bias testing or similar proactive efforts, including the quality, efficacy, recency, and scope of such testing. This makes bias testing relevant as evidence, and so effectively incentivizes it, but does not create a safe harbor.
Statutory Text
(A) For an employer, by the employer or the employer's agent, except in the case of a bona fide occupational qualification or need, to refuse to hire or employ or to bar or to discharge from employment any individual or to discriminate against any individual in compensation or in terms, conditions or privileges of employment because of, or to use an automated employment-related decision process in any manner that has the effect of causing the employer to refuse to hire or employ or to bar or to discharge from employment any individual or to discriminate against any individual in compensation or in terms, conditions or privileges of employment on the basis of, the individual's race, color, religious creed, age, sex, gender identity or expression, marital status, national origin, ancestry, present or past history of mental disability, intellectual disability, learning disability, physical disability, including, but not limited to, blindness, status as a veteran, status as a victim of domestic violence, status as a victim of sexual assault or status as a victim of trafficking in persons. In any action for a discriminatory practice in violation of this subparagraph involving an automated employment-related decision process, the commission or the court shall consider any evidence, or lack of evidence, of anti-bias testing or similar proactive efforts to avoid such discriminatory practice, including, but not limited to, the quality, efficacy, recency and scope of such testing or efforts, the results of such testing or efforts and the response thereto.
H-01 Human Oversight of Automated Decisions · H-01.3 · Deployer · Employment · Automated Decisionmaking
Sec. 18(b)(1)(B) (new § 46a-60(b)(1)(B))
Plain Language
Under amended § 46a-60, employers must provide advance written notice to individuals before using an automated employment-related decision process in any employment decision affecting them. The notice must disclose at minimum: that an automated process will be used, the trade name of the system, and the types and sources of personal information the system will process or analyze. Failure to provide this notice is a discriminatory employment practice enforceable through the Commission on Human Rights and Opportunities. This creates a separate notice obligation within the antidiscrimination framework, distinct from but overlapping with the Section 5 pre-decision notice.
Statutory Text
(B) For an employer, by the employer or the employer's agent, to fail to provide to any individual advance written notice disclosing, at a minimum, that an automated employment-related decision process will be used to make, to assist in making or in the course of making a decision to hire or employ or to bar or to discharge from employment, or concerning the compensation or terms, conditions or privileges of employment, of such individual. Such notice shall, at a minimum, disclose the trade name of the automated employment-related decision process and the types and sources of personal information concerning the individual that the automated employment-related decision process will process or analyze.