Digital Discrimination: Why Penalizing AI-Assisted Candidates Is Hurting Your Hiring Strategy

Overview

Employers want AI skills. So why are they rejecting AI-savvy candidates?

There’s a contradiction quietly playing out in hiring teams across the country. On one hand, AI is showing up in every conversation about future-of-work skills. Business leaders are investing in AI tools and training their people to use them. Roles that require AI skills are growing three times faster than those that don’t.

But on the other hand, a candidate who uses AI to write a cover letter or tailor a résumé might be rejected on the spot.

This disconnect is leading to a growing but largely unspoken bias: digital discrimination. And it’s keeping great candidates from getting in the door.


Job Seekers Are Using AI—Creating Both Opportunities and Volume Problems

Nearly half of job seekers have already turned to tools like ChatGPT for help with their applications. Many use it to overcome writer’s block, optimize for keywords, or improve the tone of their writing. And the results are hard to ignore:

  • 78% of those who used ChatGPT in their application process got an interview
  • 59% ended up landing the job

But AI’s impact extends beyond just writing better materials. Job seekers are increasingly using AI and automation tools to apply to dozens or even hundreds of positions in a fraction of the time it would have taken previously. This efficiency for candidates has created a significant challenge for recruiters—application volumes have surged, overwhelming many talent acquisition teams with applications that may not all be relevant or qualified.
This volume challenge has led some companies to implement technical barriers that detect and block AI-assisted applications entirely.

The irony? The same companies that are seeking tech-savvy, AI-fluent talent are rejecting candidates who demonstrate those very skills in their job search.
AI Use Signals Adaptability, Not Laziness

For years, hiring leaders have asked for candidates who are adaptable. Curious. Tech-savvy. Able to learn and apply new tools.

Now that job seekers are actually doing that (using AI to improve their work), we’re penalizing them.

That’s a problem.

AI isn’t going anywhere. According to a recent Microsoft/LinkedIn survey:

  • 66% of business leaders said they wouldn’t hire someone without AI skills
  • 71% said they’d prefer a less-experienced candidate who did have AI skills over a more experienced one who didn’t

And yet only 12% of hiring managers say they’re actively screening for AI proficiency.

The result? We’re telling job seekers one thing and rewarding another. We’re asking for 2025-ready candidates while still evaluating them like it’s 2010.

The Reality: Thoughtful AI Use Is a Skill

Using AI well takes judgment. Knowing what to ask, how to edit, what to keep, and what to throw away—that’s a skill. And it’s becoming a signal of the kind of thinking employers want on their teams.


Yes, there are plenty of candidates using AI poorly. Copy-pasted, generic content is still generic, even if it was generated with the world’s most powerful tool. But smart use of AI—especially when it’s used to improve communication, not mask ability—should be viewed as a positive, not a red flag.


Some employers are already adjusting. IBM is investing in training 2 million people in AI fluency by 2026. HubSpot created an “AI Toolkit for Job Seekers,” sharing 60+ use cases for AI in the application process. Moves like these signal where the market is heading.

What This Means for Hiring Teams

This doesn’t mean we need to let go of standards. We still need to hire thoughtfully. But we can—and should—shift how we evaluate applications in an AI-enabled world.


Here’s where to start:


1. Focus on outcomes, not inputs.
Instead of asking “Did they use AI to write this?”, ask “Does this show the kind of thinking, experience, and communication we’re looking for?” Judge the quality of the work, not the tools used to create it.

2. Encourage responsible AI use.
If your company values AI fluency, make that clear in your job descriptions. Ask candidates about how they use new tools to improve their work. Normalize the idea that AI is part of the modern job toolkit.


3. Update your evaluation process.
If you’re concerned that a résumé or cover letter doesn’t reflect the candidate’s real abilities, create better ways to assess them. Live problem-solving, work samples, and scenario-based interviews can reveal far more than a static document ever could.


4. Coach your hiring managers.
Most digital discrimination isn’t intentional. It comes from a lack of clarity. Give your teams guidance on what effective AI use looks like—and how to separate genuine red flags from unfamiliar formatting or word choice.


5. Develop smarter screening strategies instead of blocking AI.
Rather than implementing technical barriers to AI-assisted applications, develop more sophisticated screening methods that can identify qualified candidates regardless of how they applied. This might include more precise skills-based assessments or using AI tools of your own to help identify promising candidates in large applicant pools.
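To make the last point concrete, here is a minimal sketch of what outcome-focused screening could look like in code. Everything in it (the skill names, the candidates, the keyword-matching logic) is invented for illustration, and a real screening pipeline would rely on structured assessments rather than keyword matching alone:

```python
# Hypothetical sketch: score applications by required-skill coverage,
# not by how the application materials were produced.
# All skills and application text below are invented for illustration.

REQUIRED_SKILLS = {"python", "sql", "stakeholder communication"}

def skill_coverage(application_text: str, required=frozenset(REQUIRED_SKILLS)) -> float:
    """Return the fraction of required skills mentioned in the application."""
    text = application_text.lower()
    return sum(skill in text for skill in required) / len(required)

applications = {
    "candidate_a": "Built Python ETL pipelines, SQL dashboards, and owned stakeholder communication.",
    "candidate_b": "Led marketing campaigns; strong writer and presenter.",
}

# Rank candidates by coverage of required skills, highest first.
ranked = sorted(applications, key=lambda name: skill_coverage(applications[name]), reverse=True)
```

The design choice this illustrates: the score is a function of what the application demonstrates, so an AI-assisted application and a hand-written one are judged by the same yardstick.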

For job seekers, the flip side of this advice applies: be ready to talk about how you used AI if you’re asked. Explain how it helped you work more efficiently, communicate more clearly, or improve your materials. That doesn’t make you less authentic; it shows that you’re proactive and skilled at using the tools available to you.

If you’re applying to a company that still discourages AI use in hiring, be thoughtful about how you present your process. And know that the companies that do value your ability to use these tools well might be a better long-term fit.

Hiring Is Changing. So Should Our Filters

The gap between how job seekers are applying and how employers are evaluating those applications is growing wider by the month. If we don’t address it, we’ll continue to lose out on smart, adaptive candidates—not because they’re unqualified, but because they’re ahead of the curve.

It’s time to close that gap.

That starts by dropping our assumptions about what a “genuine” application looks like. It means designing hiring processes that value outcomes, skills, and adaptability. It means acknowledging that a candidate who uses AI to get their foot in the door might also be the one who helps your team move faster, think bigger, and work smarter.

You can’t stop people from using AI in their job search—and you shouldn’t try. Instead, develop strategies that help you identify quality applicants regardless of how they applied. The tools have changed. Our expectations should, too.

Julie Calli
Julie Calli is Chief Marketing Officer at Lensa. A seasoned marketing professional with over 20 years of experience, she specializes in recruitment marketing and digital advertising. She has held senior leadership positions for over a decade, showcasing her ability to drive growth in startups and established companies alike.
