Why only 26% of candidates trust AI hiring tools (and what to do about it)

Written by
Kiku
10 minute read

A March 2025 Gartner survey found that only 26% of candidates trust AI to evaluate them fairly. At the same time, more than half believe AI is already being used to screen their applications.

That combination is the defining tension in hiring right now.  

Most organisations are already using AI, but most candidates don't trust it. And we're seeing an uptick in drop-offs, complaints, and candidates who self-select out before they've even started.

To tackle trust issues, we need to understand what's driving the distrust in the first place.  

Three reasons behind the distrust

Candidate scepticism about AI isn't random. It clusters around three specific concerns.

1. Fear of invisible decisions

The most common concern is that candidates can't see what AI is doing. When a human rejects you, there's at least the possibility of an explanation or a conversation. When an algorithm rejects you, it feels final and unexplained.

The fix isn't to hide that AI is involved. In fact, it's the opposite. Be clear about what AI does in your hiring process from the get-go.  

2. Concern about bias

Only 8% of US job seekers believe AI makes hiring more fair, according to 2025 research. That's a striking number given that structured AI screening is one of the most effective tools available for reducing the four main types of bias in traditional hiring.

We need to address this disconnect.  

AI screening can reduce affinity bias, confirmation bias, attribution bias, and recency bias, all of which are products of human inconsistency. But because candidates associate "AI" with high-profile cases of algorithmic discrimination, they assume AI makes bias worse rather than better.

Changing this perception requires explaining how your AI is designed to be fair based on what questions it asks, how scoring works, and who reviews the outputs.

3. Feeling dehumanised

Hiring is personal. Candidates are making significant life decisions and they want to feel seen.  

An AI screening process that offers no human touchpoint or acknowledgement that a person reviewed their application creates a candidate experience that feels transactional at best and dismissive at worst.

Make it clear where AI ends and people begin so candidates know a human is in the loop.  

Why does it matter?

Candidate distrust has measurable operational consequences:

  • Higher drop-off rates: Candidates who distrust AI are more likely to abandon the process mid-screening, even when they are well qualified. You're losing people before you've had a chance to evaluate them.
  • Weaker offer acceptance: Gartner's 2025 research shows candidate acceptance rates dropped from 74% in 2023 to 51% in 2025. Trust in the hiring process is a significant factor in that decline.
  • Employer brand damage: Candidates talk. One who feels their application was handled unfairly or without transparency will share that experience across their networks.

How do you close the trust gap in hiring?

Closing the trust gap doesn't require removing AI from your hiring process. Here are four ways to approach it.  

1. Lead with transparency

There's a difference between disclosing that you use AI and being transparent about it.

  • Disclosure: "We use AI in our hiring process."
  • Transparency: "Here's what it does and doesn't do, and here's how a human reviews the result."

While disclosure is a legal requirement in some jurisdictions, it is transparency that builds trust. Candidates need to know:

  • What role AI plays in the process — screening, scoring, shortlisting
  • What data is collected and how it's used
  • Where a human is involved and what decisions remain with people
  • What happens if they want to raise a concern or appeal a result

2. Design AI screening around structured fairness

One of the most powerful things you can tell a candidate is: "Everyone gets the same questions, evaluated against the same criteria."  

That's both the promise and the design principle of structured interviews. It's what separates AI screening that reduces bias from AI that merely automates existing bias.

For example, Kiku's AI interviews are built around three components:  

  • Eligibility questions that apply consistent requirements to every candidate
  • Realistic job previews that give candidates honest context about the role
  • Behavioural questions that assess actual capability rather than credential presentation

This structure is the foundation of bias mitigation in high-volume hiring.

3. Make human oversight visible

Communication is key to making the human oversight of the process visible. It doesn't need to be elaborate: a sentence in your screening invite or a line on your careers FAQ is enough.

The point is that candidates shouldn't have to wonder whether anyone looked at their application or whether it was all decided by AI.

Teams using Kiku see an increase in candidate satisfaction, partly because the process is fast and convenient, but also because it's designed to feel fair.

See how candidates experience the process.

 

4. Close the feedback loop

One of the most cited sources of candidate frustration is never hearing back or receiving a generic rejection with no indication of why. AI makes it possible to give every candidate personalised feedback based on their actual interview responses.

When a candidate receives feedback referencing something they said, it signals the process was genuine. This is also one of the clearest demonstrations of how AI agents differ from simple automation tools — the agent doesn't just filter, it engages.

 

The trust gap is a competitive advantage  

Most hiring teams are focused on what AI can do for them. Very few are focused on what AI transparency can do for their employer brand.

Candidates who encounter a hiring process that explains how AI is used, assures them of human oversight, and treats them as individuals respond differently. They complete the process at higher rates, accept offers more often, and speak positively about the experience regardless of the outcome.

The 74% of candidates who don't currently trust AI hiring tools aren't a lost cause either. They're an audience waiting to be convinced by how you:

  • Design your process
  • Communicate how AI fits into your hiring process
  • Consistently deliver on what you promise

Remember: when AI exists in most hiring processes, the ability to build trust becomes your differentiator.

FAQ: Candidate trust and AI in hiring

Why do candidates distrust AI in hiring?

The distrust stems from four main concerns:  

  • Fear of unfair decision-making
  • Concern that AI perpetuates or amplifies bias
  • Feeling dehumanised by a process that removes human contact
  • Uncertainty about data privacy  

Most of these are addressable through transparency and process design. For practical guidance, see: The Recruiter's Guide to Explaining AI to Candidates.

 

Does AI in hiring actually reduce bias?

Structured AI screening reduces bias by applying consistent criteria to every candidate. Whether AI makes hiring more or less fair depends entirely on how it's designed.  

Poorly designed AI can automate existing bias. Well-designed AI with structured questions, transparent scoring, and human oversight is more consistent than most unstructured interview processes.  

Read more: Four types of bias in screening processes.

 

How do you measure candidate trust in your hiring process?

Post-screening candidate satisfaction surveys are the most direct method. Also track drop-off rates (candidates who start but don't finish), offer acceptance rates, and voluntary feedback.
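If you want to put numbers behind these signals, the funnel metrics above reduce to simple ratios. A minimal sketch, with hypothetical counts and function names (not a Kiku API):

```python
def completion_rate(started: int, finished: int) -> float:
    """Share of candidates who finish screening after starting it."""
    return finished / started if started else 0.0

def offer_acceptance_rate(offers_made: int, offers_accepted: int) -> float:
    """Share of extended offers that candidates accept."""
    return offers_accepted / offers_made if offers_made else 0.0

# Example: 200 candidates started screening, 154 finished;
# 35 offers went out, 18 were accepted.
print(f"Completion: {completion_rate(200, 154):.0%}")        # 77%
print(f"Acceptance: {offer_acceptance_rate(35, 18):.0%}")    # 51%
```

Tracked over time, a rising completion rate after a transparency change (for example, adding a human-oversight note to your screening invite) is direct evidence that the trust gap is closing.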

 

Ready to build a hiring process candidates trust?

Book a demo and we'll walk you through how Kiku's structured AI screening is designed and how to communicate it to your candidates in a way that builds confidence rather than concern.

Explore Kiku's AI-driven products

Ready to recruit smarter?

Let us show you how AI can lighten your everyday workload and free up your time.