AI in HR | February 17th, 2026

When AI Meets the Courtroom: What the Workday Hiring Lawsuit Means for HR Tech

By Riona Simpson

AI has been quietly shaping HR for a while now. CV screening, candidate matching, automated shortlists – it’s all become fairly normal.

What’s new is this: AI in HR is now being tested in court.

A recent U.S. legal case involving Workday has become one of the first real examples of an AI-powered HR system being examined through an employment law lens. It’s not a verdict yet – but it is an important moment for HR, technology, and accountability.

What’s happening in the Workday case?

The case centres on claims that automated hiring tools unfairly screened out older job applicants, a protected class under U.S. discrimination law. A U.S. court has now allowed people who believe they were affected to formally opt in to the case under the Age Discrimination in Employment Act (ADEA).

At this stage, it’s crucial to be clear:

  • This does not mean Workday has been found at fault.
  • It does mean the court thinks the questions being raised are worth properly examining.

And that alone makes this case significant.

Why this case feels like a turning point

This is one of the first times we’re seeing a major HR platform’s AI tools questioned in a legal setting.

For years, HR tech conversations around AI have focused on speed, efficiency, and scale. This case shifts the spotlight to something else entirely: impact.

It raises questions many HR teams are already quietly asking:

  • If AI helps filter candidates, who’s responsible for the outcome?
  • How much human oversight is enough?
  • And what happens when technology influences decisions at massive scale?

The bigger picture: AI laws are catching up, fast

What makes this case particularly timely is that it’s not happening in isolation. Courts aren’t the only ones paying attention. Regulators in several countries are starting to look much more closely at how AI is used in hiring and employment decisions.

Here’s what that looks like in practice:

EU
Under the EU AI Act, recruitment AI is classed as “high risk.” In practical terms, that means clear expectations around human oversight, proper documentation, and being able to explain how automated tools influence outcomes.

US
At state and city level (New York City and Colorado, for example), new rules are emerging around automated screening. These include requirements for bias audits, greater transparency, and clearer accountability when AI plays a role in hiring.

UK
Rather than introducing one single AI law, the UK is taking a principles-based approach. AI in hiring is likely to be scrutinised through existing fairness, equality, and data protection legislation.

Canada
While dedicated AI legislation has stalled, the direction of travel is clear: increased focus on governance, oversight, and responsible use of high-impact AI systems.

This reinforces the same message the Workday case highlights: AI in HR isn’t just about efficiency anymore. It’s about responsibility, oversight, and being able to explain how decisions are shaped.

AI doesn’t replace responsibility – it redistributes it

One of the most interesting elements of this case is that it challenges the idea that technology is just a “neutral tool”.

Even when humans make the final hiring decision, AI systems often decide:

  • who gets seen
  • who gets ranked higher
  • and who never makes it past the first filter

That doesn’t automatically make AI dangerous, but it does mean organisations need to understand what their tools are doing, not just what they promise to do.

Our take at Lemon Platypus

We don’t see this case as a warning to avoid AI in HR.

AI can be incredibly powerful when it’s:

  • Well-designed
  • Properly monitored
  • Used with clear human oversight

But this case is a reminder that “set and forget” isn’t good enough, especially in areas like recruitment where fairness, access, and opportunity really matter.

A few practical takeaways for organisations:

  • Ask more questions of your vendors – How does the tool work? What data does it rely on? How are risks managed?
  • Keep humans in the loop – Automation should support decisions, not silently make them.
  • Document your processes – Transparency is becoming a legal and ethical expectation, not a nice-to-have.
  • Map where you hire, and which rules apply – Some requirements may apply based on candidate location, not just company location (New York City for example).

This is exactly the kind of work we support at Lemon Platypus – asking the right questions of vendors, understanding how tools actually behave, and helping organisations implement HR tech in a way that’s practical, responsible, and defensible.

What happens next?

The outcome of this case could go a number of ways. It may confirm existing practices are sound, or it may push the industry towards clearer standards and guardrails.

Either way, it marks a shift.

AI in HR is no longer just a product feature. It’s becoming a responsibility, one that employers and tech providers share.

And we expect this won’t be the last time AI hiring tools end up under legal scrutiny.

If you’re thinking about how AI fits into your hiring or wider HR tech stack, or want a second pair of eyes on how your current tools are set up, we’re always happy to talk.

Find Your Perfect HR System
Today, Contact Us!

We specialise in helping businesses like yours find, implement, and sustain the right HR solutions, ensuring a smooth process and lasting results.


