
Want to make recruitment more human? Make it ‘trustless’

Last week, our smart interviewer technology was featured in a glowing piece by the Australian Financial Review. The story was picked up by LinkedIn News Australia, which ran a poll asking users whether they were “comfortable being interviewed by a bot”.

The poll garnered more than 6,500 responses. Perhaps unsurprisingly, 50% of respondents chose “No – it’s a job for humans.” Just under a third of LinkedIn users said they believe chatbot interviewing is “the future”, while 21% said it’s appropriate only for certain roles.
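
A quick aside on why a sample this size supports meaningful analysis: under a simple random-sampling assumption (which a LinkedIn poll only approximates), the 95% margin of error on each reported share works out to about one percentage point. A back-of-the-envelope sketch in Python:

    import math

    n = 6500  # poll responses
    for share in (0.50, 0.29, 0.21):  # reported option shares
        # 95% margin of error for a proportion from a simple random sample
        moe = 1.96 * math.sqrt(share * (1 - share) / n)
        print(f"share {share:.0%}: margin of error +/- {moe:.1%}")
    # Each share is accurate to roughly +/- 1 percentage point
    # under this idealised sampling assumption.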

When you have over 6,500 responses, you can do some meaningful analysis. In this case, “it’s a job for humans” was the prevailing opinion. But in the comments section attached to the poll, we discovered more about how people feel toward AI, both as a technological construct and as a tool for recruitment. We bucketed the comments into five recurring themes:

  1. We can’t trust the people who make AI
  2. AI can never remove bias
  3. AI aims to replace humans
  4. AI is dangerous
  5. People don’t like chatbots because they aren’t human

AI hasn’t made a good name for itself lately – Amazon’s recent facial recognition debacle is a prime example – so it’s easy to see why people are resistant to the prospect of AI moving into a space historically handled by humans. Take a bird’s-eye view, and the notion certainly looks preposterous: How could a machine, asking just five questions, ever hope to recreate the capabilities of a seasoned recruiter or talent acquisition specialist?

That is the problem, though: The more ‘human’ aspects of the recruitment process are ruining the game. Ghosting is rampant among candidates and recruiters alike. Ineradicable biases create inequities that permeate organisations from top to bottom. The Great Resignation is putting immense pressure on hirers to move quickly, excluding thousands of applicants based on arbitrary criteria that shift from month to month. Consider, too, these sobering statistics:

  • According to a recent global survey by CoderPad, 65% of tech recruiters believe their hiring process is biased
  • Mentions of ‘ghosting’ in Glassdoor interview reviews are up 450% since the start of the pandemic (Business Insider, 2021)
  • A toxic corporate culture is 10.4 times more powerful than compensation at predicting employee churn (the point being that hiring poorly can gut an organisation in no time flat)
  • 78% of job seekers have admitted to lying on their CVs

AI is held to an impossible standard

For AI to qualify as a usable, reliable tool, we expect it to be perfect. We compare it, unfairly, against some ultimate human ideal: The chirpy, well-rested recruiter on their best day. The kind of recruiter who has never ghosted anyone, who has no biases whatsoever, and who finds the right person for the right job, no matter what. Here’s the issue with this comparison: That kind of human doesn’t exist.

For AI to be a valid and useful tool, and an everyday part of the human recruiter’s toolset, it doesn’t need to be flawless; it only needs to be better than the alternative. Can’t be done? Our Smart Interviewer, for one, eliminates the problem of ghosting completely: Each of your candidates gets an interview, and every single person receives feedback. Even better? 98% of the candidates who use our platform find that feedback useful.

(That is to say nothing of the way it removes bias, as if that weren’t enough on its own.)

We need to make recruitment ‘trustless’

AI has a way to go before it earns the trust of the majority. Again, this is totally understandable. We believe there is a better, and quicker, way to get there.

To borrow a concept commonly associated with cryptocurrency and blockchain technology, we want to create a trustless environment for our AI and its activities. Not an environment without trust, but one in which trust is a foregone conclusion. In a trustless environment, dishonesty, whether by commission or omission, is impossible. Just as you cannot forge blockchain entries, you cannot hide the workings and algorithms that make our AI what it is.
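
To make the analogy concrete, here is a toy Python sketch (purely illustrative, not our actual implementation) of why chained records are tamper-evident: each entry’s hash depends on everything before it, so a forged entry breaks every hash that follows.

    import hashlib

    def chain(records):
        """Link each record to the hash of everything that precedes it."""
        prev, ledger = "genesis", []
        for record in records:
            digest = hashlib.sha256((prev + record).encode()).hexdigest()
            ledger.append((record, digest))
            prev = digest
        return ledger

    entries = ["interview completed", "feedback sent", "fairness audit logged"]
    for record, digest in chain(entries):
        print(record, digest[:12])
    # Altering any earlier record changes every later digest, so anyone
    # re-running the chain can detect the tampering immediately.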

That is the essence of our FAIR Framework. For hiring managers and organisations, it provides both assurance and a template for querying the fairness-related metrics of AI recruitment tools. For candidates, FAIR ensures that they are using a system built with fairness as a key performance metric. For us, transparency on fairness is standard operating procedure.

Finally, think about this: When we say we want a ‘human’ recruitment process, what are we really saying? That we want something fallible, prone to biases, subject to the decisions of people who have bad days? What if a trustless AI companion could help remove all that, without replacing the person? Is that not more human?



What’s More Ethical: Measuring Skills or Guessing Them?

Barb Hyman, CEO & Founder, Sapia.ai

Why skills data matters for HR and CHROs

Every CHRO I speak to wants clarity on skills:

  • What skills do we have today?

  • What skills do we need tomorrow?

  • How do we close the gap?

The skills-based organisation has become HR’s holy grail. But not all skills data is created equal. The way you capture it has ethical consequences.

Two very different approaches to skills analysis

1. Skills inference from digital traces

Some vendors mine employees’ “digital exhaust” by scanning emails, CRM activity, project tickets and Slack messages to guess what skills someone has.


It is broad and fast, but fairness is a real concern.

2. Skills measurement through structured conversations

The alternative is to measure skills directly. Structured, science-backed conversations reveal behaviours, competencies and potential. This data is transparent, explainable and given with consent.

It takes longer to build, but it is grounded in reality.

The risks of skills inference HR leaders must confront

  • Surveillance and trust: Do your people know their digital trails are being mined? What happens when they find out?

  • Bias: Who writes more Slack updates, introverts or extroverts? Who logs more Jira tickets, engineers or managers? Behaviour is not the same as skills (see the toy sketch after this list).

  • Explainability: If an algorithm says, “You are good at negotiation” because you sent lots of emails, how can you validate that?

  • Agency: If a system builds a skills profile without consent, do employees have control over their own career data?
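
To see why behaviour makes a poor proxy, consider a toy example (hypothetical names and numbers, not any vendor’s actual model): two equally skilled colleagues with different working styles receive very different scores when message volume stands in for skill.

    # Toy example: inferring "communication skill" from Slack message volume.
    employees = {
        "alex":  {"slack_msgs": 340, "actual_skill": 0.8},  # posts constantly
        "blake": {"slack_msgs": 40,  "actual_skill": 0.8},  # posts rarely
    }

    max_msgs = max(e["slack_msgs"] for e in employees.values())
    for name, e in employees.items():
        inferred = e["slack_msgs"] / max_msgs  # volume-based proxy score
        print(f"{name}: inferred {inferred:.2f} vs actual {e['actual_skill']:.2f}")
    # alex:  inferred 1.00 vs actual 0.80
    # blake: inferred 0.12 vs actual 0.80
    # The proxy measures working style, not skill.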

A more human approach: skills measurement

Skills define careers. They shape mobility, pay and opportunity. That makes how you measure them an ethical choice as well as a technical one.

At Sapia.ai, we have shown that structured, untimed, conversational AI interviews restore dignity in hiring and skills measurement. Over 8 million interviews across 50+ languages prove that candidates prefer transparent and fair processes that let them share who they are, in their own words.

Skills measurement is about trust, fairness and people’s futures.

Questions every HR and CHRO should ask

When evaluating skills solutions, ask:

  • Is this system measuring real skills, or only inferring them from proxies?

  • Would I be comfortable if employees knew exactly how their skills profile was created?

  • Does this process give people agency over their data, or take it away?

The real test of ethics in the skills-based organisation

The choice is between skills data that is guessed from digital traces and skills data that is earned through evidence, reflection and dialogue.

If you want trust in your people decisions, choose measurement over inference.

To see how candidates really feel about ethical skills measurement, check out our latest research report: Humanising Hiring, the largest-scale analysis of the candidate experience of AI interviews – ever.


FAQs

What is the most ethical way to measure skills?
The most ethical method is to use structured, science-backed conversations that assess behaviours, competencies and potential with consent and transparency.

Why is skills inference problematic?
Skills inference relies on digital traces such as emails or Slack activity, which can introduce bias, raise privacy concerns and reduce employee trust.

How does ethical AI help with skills measurement?
Ethical AI, such as structured conversational interviews, ensures fairness by using consistent data, removing demographic bias and giving every candidate or employee a voice.

What should HR leaders look for in a skills platform?
Look for transparency, explainability, inclusivity and evidence that the platform measures skills directly rather than guessing from digital behaviour.

How does Sapia.ai support ethical skills measurement?
Sapia.ai uses structured, untimed chat interviews in over 50 languages. Every candidate receives feedback, and fairness is measured and monitored using the FAIR™ framework.


Mirrored diversity: why retail teams should look like their customers

Walk into any store this festive season and you’ll see it instantly. The lights, the displays, the products are all crafted to draw people in. Retailers spend millions on campaigns to bring customers through the door. 

But the real moment of truth isn’t the emotional TV ad, or the shimmering window display. It’s the human standing behind the counter. That person is the brand.


The missing link in retail hiring

Most retailers know this, yet their hiring processes tell a different story. Candidates are often screened by rigid CV reviews or psychometric tests that force them into boxes. Neurodiverse candidates, career changers, and people from different cultural or educational backgrounds are often the ones who fall through the cracks.

And yet, these are the very people who may best understand your customers. If your store colleagues don’t reflect the diversity of the communities you serve, you create distance where there should be connection. You lose loyalty. You lose growth.

Closing this gap is what we call mirrored diversity.


What mirrored diversity looks like

When retailers achieve mirrored diversity, their teams look like their customers:

  • A grocery store team that reflects the cultural mix of its neighbourhood.
  • A fashion store with colleagues who understand both style and accessibility.
  • A beauty retailer whose teams reflect every skin tone, gender, and background that walks through the door.

Customers buy where they feel seen – making this a commercial imperative. 


How to recruit seasonal employees with mirrored diversity

The challenge for HR leaders is that most hiring systems are biased by design. CVs privilege pedigree over potential. Multiple-choice tests reduce people to stereotypes. And rushed festive hiring campaigns only compound the problem.

That’s where Sapia.ai changes the equation: Every candidate is interviewed automatically, fairly, and in their own words.

  • Bias is measured and monitored using Sapia.ai’s FAIR™ framework.
  • Outcomes are validated at scale: 7+ million candidates, 52 countries, average candidate satisfaction 9.2/10.
  • Diversity can be measured: with the Diversity Dashboard, you can track DEI capture rates, candidate engagement, and diversity hiring outcomes across every stage of the funnel.

With the right HR hiring tools, mirrored diversity becomes a data point you can track, prove, and deliver on. It’s no longer just a slogan.


Retail recruiting strategies in action: the David Jones example

David Jones, Australia’s premium department store, put this into practice:

  • 40,000 festive applicants screened automatically
  • 80% of final hires recommended by Sapia.ai
  • Recruiters freed up 4,000 hours in screening time
  • Candidate experience rated 9.1/10

The result? Store teams that belong with the brand and reflect the customers they serve.

Read the David Jones Case Study here.


Recruiting ideas for retail leaders this festive season

As you prepare for festive hiring in the UK and Europe, ask yourself:

  • How much will you spend on marketing this Christmas?
  • And how much will you invest in ensuring the colleagues who deliver that brand promise reflect the people you want in your stores?

Because when your colleagues mirror your customers, you achieve growth – and, by design, inclusion.

See how Sapia.ai can help you achieve mirrored diversity this festive season. Book a demo with our team here. 

FAQs on retail recruitment and mirrored diversity

What is mirrored diversity in retail?

Mirrored diversity means that store teams reflect the diversity of their customer base, helping create stronger connections and loyalty.

Why is diversity important in seasonal retail hiring?

Seasonal employees often provide the first impression of a brand. Inclusive teams make customers feel seen, improving both experience and sales.

How can retailers improve their hiring strategies?

Adopting tools like AI structured interviews, bias monitoring, and data dashboards helps retailers hire fairly, reduce screening time, and build more diverse teams.


The Diversity Dashboard: Proving your DEI strategy is working

Why measuring diversity matters

Organisations invest heavily in their employer brand, career sites, and EVP campaigns, especially to attract underrepresented talent. But without the right data, it’s impossible to know if that investment is paying off.

Representation often varies across functions, locations, and stages of the hiring process. Blind spots allow bias to creep in, meaning underrepresented groups may drop out long before offer.

Collecting demographic data is only step one. Turning it into insight you can act on is where real change and better hiring outcomes happen.

What is the Diversity Dashboard?

The Diversity Dashboard in Discover Insights, Sapia.ai’s analytics tool, gives you real-time visibility into representation, inclusion, and fairness at every stage of your talent funnel. It helps you connect the dots between your attraction strategies and actual hiring outcomes.

Key features include:

  • Demographic filters – Switch between gender, ethnicity, English as an additional language, First Nations status, disability, and veteran status. View age and ethnicity in standard or alternative formats to match regional reporting needs.
  • Representation highlights – Identify the top five represented sub-groups for each demographic, plus the three fastest-growing among underrepresented groups.
  • Track trends over time – See month-by-month changes in representation over the past 12 months, compare to earlier periods, and connect the data back to your EVP and attraction spend.
  • Candidate experience metrics – Measure CSAT (satisfaction) and engagement rates by demographic to ensure your hiring process works for everyone. Inclusion is measurable.
  • Hiring fairness – Compare representation in your applied, recommended, and hired pools to spot drop-offs. Understand not just who applies, but who progresses – and why (see the sketch below).
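
As an illustration of the kind of check the dashboard automates, here is a minimal Python sketch (hypothetical group names and counts, not Sapia.ai’s actual API) that compares stage-to-stage pass-through rates and applies the common four-fifths heuristic used in adverse-impact analysis:

    # Hypothetical funnel counts per demographic group.
    funnel = {
        "group_a": {"applied": 1200, "recommended": 480, "hired": 120},
        "group_b": {"applied": 1100, "recommended": 330, "hired": 66},
    }

    stages = [("applied", "recommended"), ("recommended", "hired")]
    for src, dst in stages:
        rates = {g: s[dst] / s[src] for g, s in funnel.items()}
        best = max(rates.values())
        for group, rate in rates.items():
            # Four-fifths heuristic: flag rates below 80% of the best group's.
            flag = " <- investigate" if rate < 0.8 * best else ""
            print(f"{src}->{dst} {group}: {rate:.0%}{flag}")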


From insight to action

With the Diversity Dashboard, you can pinpoint where inclusion is thriving and where it’s falling short.

  • See if your EASL candidates are applying in high numbers but not progressing to live interview.
  • Spot if candidates with a disability report high satisfaction but have lower offer rates.
  • Track the impact of targeted campaigns month-by-month and adjust quickly when something isn’t working.

It’s also a powerful tool to tell your success story. Celebrate wins by showing which underrepresented groups are making the biggest gains, and share that progress with boards, executives, and regulators.

Built on science, backed by trust

Powered by explainable AI and the world’s largest structured interview dataset, your insights are fair, auditable, and evidence-based.

Measuring diversity is the first step. Using that data to take action is where you close the Diversity Gap. With the Diversity Dashboard, you can prove your strategy is working and make changes where it isn’t.

Book a demo to see the Diversity Dashboard in action.
