
Discrimination Lawsuit Over Workday’s AI Hiring Tools Can Proceed as Class Action: 6 Things Employers Should Do After Latest Court Decision

Insights

5.20.25

A federal judge just allowed a job applicant’s lawsuit against Workday to move forward as a nationwide class action, ruling that the company’s AI-powered hiring tools may have had a discriminatory impact on applicants over age 40. The May 16 decision is a major development in Mobley v. Workday, one of the country’s most closely watched legal challenges to the use of artificial intelligence in employment decisions. While this age discrimination case is still in its early stages, the ruling serves as a warning to employers and AI vendors alike: they can be held accountable for algorithmic screening tools that disproportionately harm protected groups – even if the bias wasn’t intentional. What do you need to know about last week’s critical ruling?

👉 Catch up on our original article when the lawsuit was first filed.
👉 Read our update when the plaintiff sought to expand it to a nationwide class action.

What Happened?

The court’s May 16 order granted preliminary certification of a collective action under the Age Discrimination in Employment Act (ADEA), allowing the lead plaintiff to notify other job seekers age 40+ who applied through Workday’s system and were allegedly denied employment “recommendations.” To quickly recap how we got here:

  • Derek Mobley is a Black man over the age of 40 who self-identifies as having anxiety and depression. He says he applied to more than 100 jobs with companies that use Workday’s AI-based hiring tools over the course of several years – and says he was rejected every single time.
  • Thousands of companies use Workday’s AI-based applicant screening tools, which include personality and cognitive tests. These tools interpret a candidate’s qualifications through advanced algorithmic methods and can automatically reject candidates or advance them in the hiring process.
  • After Mobley filed a claim in a California federal court, Workday asked to have the case dismissed since it wasn’t the employer making the employment decisions. After over a year of procedural wrangling, a California federal judge gave the green light for Mobley to continue his lawsuit in July 2024.
  • In February, Mobley sought permission to expand his age discrimination claim to a national action, so that millions of other applicants over the age of 40 could also join.

In Friday’s decision, Judge Rita Lin of the US District Court for the Northern District of California found that the allegations cleared the legal bar to proceed collectively, noting that the case centers on a common question: “Whether Workday’s AI recommendation system has a disparate impact on applicants over forty.” Judge Lin noted that the applicants’ claims will “rise and fall together.”

Her decision is largely based on the disparate impact theory, which allows claims to proceed without proof of intentional discrimination – a crucial distinction as this area of law comes under political pressure.

But Isn’t Disparate Impact Under Attack?

Yes – and that’s part of the reason this ruling is so important for employers to monitor. Just last month, President Trump signed an Executive Order directing federal agencies, including the EEOC, to eliminate enforcement based on disparate impact theory. That move will almost certainly reduce government-led investigations into algorithmic discrimination for the foreseeable future – but it doesn’t affect private litigation like the Workday case. It might even spur state EEO agencies to take up the charge and seek out more disparate impact claims on their own. That means we’re likely to see:

  • Fewer (or no) new cases filed by the EEOC or DOJ under this theory
  • More state agency claims, private class actions, and opt-in lawsuits targeting AI tools on disparate impact grounds

Why This Ruling Matters

This lawsuit is one of the first major court challenges to the use of algorithmic hiring tools under federal employment discrimination laws. It highlights several risks for employers using AI-driven systems:

  • Vendor tools may create legal exposure if they disproportionately reject applicants in protected classes.
  • Courts may treat screening systems as a “unified policy” even when different employers use the tools differently.
  • Individualized defenses (e.g., qualifications or interview rates) won’t prevent collective certification at this early stage.

Judge Lin was unswayed by Workday’s argument that the size or complexity of the proposed class – potentially reaching millions of applicants – should block the case from moving forward: “Allegedly widespread discrimination is not a basis for denying notice.”

What’s Next in the Case?

  • The parties must meet and confer by May 28 to propose a plan for identifying and notifying class members.
  • A case management conference is scheduled for June 4.
  • Workday can seek to “decertify” the class later in the litigation, after discovery.

The court also suggested targeted notice via social media or electronic platforms if traditional methods fail – a modern twist fitting for a lawsuit about algorithmic screening.

Employer Takeaways: What Should You Do Now?

This decision underscores the urgent need for AI compliance diligence. Here’s what employers should consider:

1. Audit Your Vendors

Ask for documentation showing how their systems are tested for bias, and require contractual assurances around nondiscrimination and data transparency. If you need guidance on how to approach this dynamic, this Insight reviews the questions to ask at the outset of the relationship and along the way.

2. Retain Human Oversight

Ensure critical decisions aren’t made solely by automated tools. Train your HR teams on when to override algorithmic rankings, and audit results to confirm they produce the desired (and non-discriminatory) outcomes.

3. Document Criteria and Justifications

Maintain clear records of hiring decisions and the rationale behind them. Watch out for tools that rely on vague “fit scores” or unexplainable metrics.

4. Monitor for Disparate Impact

Despite the executive order sidelining disparate impact enforcement, regularly analyze hiring outcomes across age, race, gender, and other protected categories where possible. Treat any significant disparities as red flags.
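One common starting point for this kind of analysis is the “four-fifths” (80%) guideline from the EEOC’s Uniform Guidelines on Employee Selection Procedures: a group’s selection rate below 80% of the highest group’s rate is generally treated as evidence of adverse impact. Here is a minimal sketch of that check using entirely hypothetical applicant data – it illustrates the arithmetic, not a legal standard for liability:

```python
# Minimal sketch of the EEOC "four-fifths" screen for disparate impact.
# All applicant records below are hypothetical illustration data.

def selection_rates(records):
    """Compute the selection rate (advanced / applied) for each group."""
    totals, advanced = {}, {}
    for group, was_advanced in records:
        totals[group] = totals.get(group, 0) + 1
        advanced[group] = advanced.get(group, 0) + int(was_advanced)
    return {g: advanced[g] / totals[g] for g in totals}

def four_fifths_flags(rates, threshold=0.8):
    """Flag groups whose selection rate falls below 80% of the top group's rate."""
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

# Hypothetical screening outcomes: (age_group, advanced_past_screening)
records = (
    [("under_40", True)] * 60 + [("under_40", False)] * 40
    + [("40_plus", True)] * 35 + [("40_plus", False)] * 65
)

rates = selection_rates(records)   # under_40: 0.60, 40_plus: 0.35
flags = four_fifths_flags(rates)   # 0.35 / 0.60 ≈ 0.58 < 0.8, so 40_plus is flagged
```

A flag from a screen like this is not proof of discrimination, but it is exactly the kind of disparity that should trigger a closer look at the tool – ideally under attorney-client privilege.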

5. Get Your Governance House in Order 

If you haven’t established an AI governance program to set guardrails and establish best practices for implementing AI in your organization, the time to do so is now. This will help you monitor the outcomes of your AI hiring tools on a regular basis and ensure you have the human touch where necessary.

6. Stay Tuned to Legal Shifts

Political winds are shifting, but courts – not agencies – will likely define AI liability for now. Make sure you are subscribed to Fisher Phillips’ Insight System to get regular updates about this case, the ongoing ACLU action against Aon Consulting for its use of AI screening platforms, and other relevant developments.

Conclusion

We will continue to monitor these developments and provide the most up-to-date information directly to your inbox, so make sure you are subscribed to Fisher Phillips’ Insight System. If you have questions, contact your Fisher Phillips attorney, the authors of this Insight, or any attorney in our AI, Data, and Analytics Practice Group.

Related People

  1. Anne Yarovoy Khan, Of Counsel – 949.798.2162
  2. John M. Polson, Chairman & Managing Partner – 949.798.2130
  3. David J. Walton, CIPP/US, Partner – 610.230.6105
  4. Erica G. Wilson, Partner – 412.822.6624

Service Focus

  • AI, Data, and Analytics
  • Litigation and Trials
  • Employment Discrimination and Harassment

©2025 Fisher & Phillips LLP. All Rights Reserved. Attorney Advertising.