AI in Hiring Under Legal Scrutiny: The Workday Lawsuit and Its Implications for Employers

A breakdown of the Workday AI hiring lawsuit and what it means for employers using AI in recruitment. Learn the facts behind the case, where it stands now, and the compliance steps your organization should take to avoid discrimination claims.

EMPLOYMENT LAW

6/10/2025 · 2 min read

In a landmark case that underscores the complexities of integrating artificial intelligence (AI) into recruitment processes, Workday Inc. faces a collective action lawsuit alleging that its AI-powered hiring tools discriminate against job applicants based on age, race, and disability.

The lawsuit, initiated by Derek Mobley in 2023, claims that Workday's AI-driven applicant screening system disproportionately rejected candidates over the age of 40, Black candidates, and candidates with disabilities. Mobley, a Black man over 40 with a disability, alleges he was denied employment opportunities because of biases embedded in Workday's algorithms.

In May 2025, U.S. District Judge Rita Lin granted preliminary certification for the lawsuit to proceed as a collective action under the Age Discrimination in Employment Act (ADEA), allowing other affected individuals to join the case.

Workday contends that it is not liable under federal anti-discrimination laws, arguing that it does not make hiring decisions and is not an employment agency. However, the court found that Workday could be considered an "agent" of employers, as its AI tools perform functions traditionally handled by human recruiters, such as screening and rejecting applicants.

The Equal Employment Opportunity Commission (EEOC) supports this perspective, emphasizing that delegating hiring functions to AI does not absolve companies or their vendors from compliance with anti-discrimination laws.

This case serves as a critical reminder for employers to:

  • Assess AI Tools for Bias: Regularly evaluate AI-driven recruitment tools for potential biases against protected classes.

  • Maintain Human Oversight: Ensure that automated decisions are reviewed by human personnel to mitigate unintended discriminatory outcomes.

  • Stay Informed on Legal Standards: Keep abreast of evolving legal interpretations regarding AI in employment to ensure compliance.

  • Document Decision-Making Processes: Maintain thorough records of hiring decisions and the rationale behind them, especially when AI tools are involved.
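For the first step above, one widely used starting point is the EEOC's "four-fifths rule" from the Uniform Guidelines on Employee Selection Procedures: a selection rate for any protected group that falls below 80% of the rate for the highest-selected group is generally regarded as evidence of adverse impact. The sketch below shows how an HR analytics team might run that check; the group names, counts, and function names are illustrative, not a prescribed method or real data.

```python
# Illustrative adverse-impact check using the EEOC's four-fifths rule.
# Input: {group_name: (applicants, hires)}; all names and numbers are hypothetical.

def selection_rates(outcomes):
    """Selection rate (hires / applicants) for each group."""
    return {g: hired / applicants for g, (applicants, hired) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag any group whose selection rate is under `threshold` times the top rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {
        g: {
            "rate": round(r, 3),
            "impact_ratio": round(r / top, 3),  # group's rate vs. highest group's rate
            "flag": r / top < threshold,        # True = potential adverse impact
        }
        for g, r in rates.items()
    }

if __name__ == "__main__":
    # Hypothetical applicant/hire counts for two groups
    audit = four_fifths_check({
        "group_a": (200, 60),  # 30% selection rate
        "group_b": (150, 30),  # 20% selection rate -> ratio 0.667, flagged
    })
    for group, stats in audit.items():
        print(group, stats)
```

A flagged ratio is not a legal conclusion; it is a signal to investigate the tool, document the findings, and involve counsel, which is exactly the kind of ongoing audit and record-keeping the steps above describe.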

The Workday lawsuit highlights the necessity for employers to critically examine the tools they use in hiring processes. As AI becomes increasingly prevalent in recruitment, ensuring these technologies operate within the bounds of anti-discrimination laws is paramount. Employers must balance the efficiencies offered by AI with the imperative to uphold equitable hiring practices.

About the Author
Latisha Newby, Esq. is an attorney, HR strategist, and founder of Cultivate HR Consulting. She helps businesses and HR professionals build equitable, legally sound workplaces where both people and policies thrive. Whether you're navigating AI in hiring, rethinking your DEI strategy, or responding to new compliance risks—Latisha brings real talk, practical tools, and deep expertise to help you lead with clarity and confidence.

Need a second set of eyes on your hiring tech or compliance process?
Let’s make sure your practices are legally aligned and people-centered. Contact me here to schedule a consultation or explore how Cultivate HR can support your team.