Overview
Key takeaways
- Beginning October 1, 2025, California employers must comply with new Fair Employment and Housing Act (FEHA) regulations on the use of artificial intelligence (AI) and automated decision systems (ADS) in hiring and employment.
- The rules primarily pertain to three compliance areas: bias testing, recordkeeping, and vendor liability.
- Employers should start preparing now to avoid exposure once the rules take effect.
In Depth
New regulations become effective October 1, 2025
As a follow-up to our May 8, 2025, alert regarding the current legal landscape of AI in the workplace, California’s Civil Rights Council has, following a notice and public comment period, approved and finalized new rules focused on the use of AI tools in the workplace. The rules update existing antidiscrimination provisions in California’s FEHA to address the use of technology in employment decisions, including by adding:
- The definition of “agent” of an employer (e.g., staffing agencies, third-party vendors),
- The definition of “proxy” as a characteristic or category closely correlated with protected categories under the FEHA,
- Various examples of ADS programs (computer-based assessments or tests, targeting job ads to specific groups, screening resumes for particular terms or patterns, and analyzing facial expressions), and
- Clarification that antibias testing (including the “quality, efficacy, recency, and scope of such effort”) is relevant evidence in support of defenses to discrimination claims.
The regulations specifically apply to the use of “automated decision systems” (ADS), meaning any computational process – including AI, machine learning, or other algorithms – that makes or helps make decisions regarding employees or job applicants. Examples include tools used for resume screening, interview scoring, skill or trait assessments, and promotion recommendations. Basic IT tools such as email, firewalls, word processing software, map navigation software, and spreadsheets are not included.
Bias testing
The new regulations provide that bias audits and similar proactive measures can be used as evidence in discrimination cases when ADS are used in connection with employment decisions like hiring, firing, or promotion. Regulators and courts will gauge how recently companies audited ADS used in conjunction with employment decisions, how thorough the employer’s testing was, what the results of audits showed, and whether the employer made corrections in line with the audits’ findings. In practice, this makes regular testing and documentation essential to defending discrimination claims that implicate AI tools.
Recordkeeping requirements
Employers must now keep ADS-related records for at least four years. This includes retaining the data used to run ADS tools, the outputs generated (such as scores or rankings), the criteria applied to job or promotion candidates, and the results of any testing or evaluations. If a complaint is filed, an employer must hold these records even longer. Employers should review and update their data retention policies to conform with these new rules.
Vendor and third-party liability
The new regulations explicitly state that liability can extend to an employer’s vendors or other third-party entities (including, for example, staffing agencies that use AI). If an employer’s staffing partner or AI software provider uses an ADS tool on the employer’s behalf that ends up having a disparate impact, the employer may still be held responsible. Employers should review their vendor agreements, require transparency around any testing and updates, and allocate responsibility for compliance and liability in contracts.
Next steps for employers
- Take inventory of all AI or algorithmic tools you currently use in connection with employment decisions such as hiring, firing, or promotion.
- Partner with your employment counsel to implement and document bias testing for each tool (consider conducting such testing under attorney-client privilege), and be prepared to show how you addressed any issues uncovered during testing.
- Update your data retention schedule to ensure ADS-related data is preserved for at least four years.
- Review your vendor contracts and add obligations related to testing, transparency, and compliance.
Bottom line
California has made it clear that AI, even with its new-age appeal, must conform to long-standing discrimination laws under FEHA. Starting in October 2025, California employers using AI tools in employment decisions should be prepared to engage in bias audits, enhanced recordkeeping, and close oversight of vendors and staffing agencies. Now is the time for employers to position themselves to avoid the risk of costly compliance disputes.
Our experienced McDermott Will & Schulte lawyers are available to help you evaluate existing policies and design compliant frameworks that protect your business and remain practical for day-to-day operations.