Blog


Written by Nathan Hewitt

Recruitment metrics: How and why to track your candidate abandonment rate


In the candidate-short market we’re in, it’s absolutely critical to keep talent engaged throughout the entire application process. You simply cannot afford to lose the talent that you’ve spent time and money attracting. This sounds obvious, of course, but abandonment is a key problem – and few companies know where, when, and why it is happening. 

Let’s start with the metric, and then talk about how we apply it to your wider talent acquisition journey.

Overall candidate abandonment rate = the total number of candidates who landed on your careers page, minus the number of candidates still in the process at shortlist stage, divided by that total number again. Or:

C = (x1 − x2) / x1, where x1 is the total number of visitors to your careers page and x2 is the number of candidates remaining at shortlist.

At the very minimum, this is the metric you need to start tracking, because it is a generalized diagnostic for the health of your recruitment process. If you know that you had 100 visitors to your careers (or job ad) page, but your shortlist has only 10 candidates in it, you’ve lost 90% of your possible talent pool at one stage or another. Simple math, yes, but in our experience, many recruiters and talent acquisition managers don’t look at what their starting pool of candidate interest was – and therefore, what their theoretical talent pool might have been – and look only at actual applicants.

This poses another, related question: How do I know what my abandonment rate is at each stage of the application process? 

Let’s say, like the example above, that you had 100 visitors to your careers (or job ad) page, and 20 of them completed the first-step application form on that page. You’ve lost 80% of your possible pool right there. Not great, but at least you know – now you can examine that page to uncover possible issues preventing conversion. Is the page too long? Does it have too much text? Is the ‘apply’ button clearly shown? Is the form too long, requiring too much information to fill out? Are your perks/EVP attributes clearly displayed?

Without examining stage progression in isolation, you might never know why people aren’t sticking around.

To reiterate: As well as an overall abandonment rate, you need to measure the drop out rates at each of the stages of your talent acquisition journey. The next section can help show you what to focus on.
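The arithmetic above is simple enough to sketch in a few lines. Here is a minimal illustration of tracking both the overall rate and the per-stage drop-out rates; the stage names and counts are hypothetical example data, not benchmarks:

```python
# Hypothetical funnel: candidates remaining at each stage of the journey.
funnel = {
    "careers page visit": 100,
    "application form completed": 20,
    "screening passed": 14,
    "interview completed": 11,
    "shortlist": 10,
}

stages = list(funnel.items())
total = stages[0][1]

# Overall abandonment: (visitors - shortlisted) / visitors
overall = (total - stages[-1][1]) / total
print(f"Overall abandonment rate: {overall:.0%}")  # 90%

# Per-stage abandonment: the share of the previous stage lost at each step
for (prev_name, prev_n), (name, n) in zip(stages, stages[1:]):
    drop = (prev_n - n) / prev_n
    print(f"{prev_name} -> {name}: {drop:.0%} dropped out")
```

Running a breakdown like this makes the leakage spots obvious at a glance: in this made-up funnel, the careers page itself is losing 80% of the pool before anyone even applies.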

Where, when and why do candidates drop out of the application process?

Conventional wisdom tells us that the longer your application and interview process goes on, the higher your dropout rate will be. But that’s a generalized issue – it tells you nothing about how to fix the problem, beyond simply making it shorter. You need specific, localized data to diagnose and fix your leakage spots.

Data from a 2022 Aptitude Research report on key interviewing trends found that candidates tend to drop out at the following stages, in the following proportions:

  • 22% of candidates drop out at the application stage
  • 24% at the screening stage
  • 25% at the interview stage
  • 18% at the assessment stage
  • 9% at the offer stage


Good to know, right? If you audit your own journey, looking at these stages and using these numbers as benchmarks, you can quickly identify your weak areas. 

For example: You might be proud of your four-step culture-building interview process, in which candidates have a coffee meet-and-greet with the team they’re hoping to join. But if it’s cumbersome for the applicant and relies on several stakeholders to orchestrate, it may be dragging your process out unnecessarily, and doing more harm than good.

The most common job application abandonment stage: The interview

25% of candidates drop out here. Shouldn’t really be a surprise, should it? Job interviews are long, numerous, and in many cases, ineffective. According to Aptitude Research, 33% of companies aren’t confident in how they interview; 50% believe they’ve lost talent due to poor interviewing.

When asked about their top interviewing challenges, surveyed HR and TA leaders responded:

  • Our interview process is too long (52%)
  • We make candidates go through too many interviews (39%)
  • Our interview process is inconsistent (38%)
  • We don’t use objective data to drive decisions (32%)
  • We have bias in the interview process (21%)

Let’s focus on that second-last challenge: lack of objective data. Almost a third of companies are approaching their interview and application process with assumptions and gut feelings, and half of them believe their interview process is too long. 

Despite this, 68% of companies say they have not made any improvements surrounding candidate experience this year. How many, then, are looking seriously at their entire talent acquisition journey to see where it’s failing? 

This is why we’re focusing on candidate abandonment rate in this post: It is a simple metric to show the health of your application process, easier to measure than many of the other recruitment metrics for which you’re responsible (the ever-nebulous quality-of-hire being a prime example). As the saying goes, what gets measured, gets managed.

Start here today, and see what you learn.

(P.S. Sapia’s Ai Smart Chat Interviewer combines the first four stages of your process – application, screening, interviewing, and assessment – together, resulting in an application process that can secure top talent in as little as 24 hours.

Because it’s a chat-based interview with a smart little Ai, your team doesn’t need to do anything – everyone who applies gets an interview, immediately. That maximizes your talent pool right from the get-go.

What’s more, our candidate dropout rate is just 15%, on average. That means that 85% of your starting talent pool will stick around.

Why do our candidates stick around? More than 90% of them love the experience. See how we can help you here, today.)



Give me data to make me smarter please

Big decisions on ‘gut feel’

It scares me sometimes when I think about the big decisions I’ve made on gut feel and will probably continue to make relying on my instincts.

Personally, I would love to be armed with meaningful data and insights whenever I make important life decisions: what’s the maximum price I should pay for that house on the weekend, who to partner with, who to work for, and who to hire into my team. Data that helped me see a bigger picture or another perspective would be very valuable. For most of those decisions there is so much information asymmetry, which makes them feel even riskier. Sure, I could check out Glassdoor when choosing my next job, but it comes with huge sample bias and not much science behind it.

So why is there still an (almost) universal blind acceptance that these decisions are best entrusted to gut feel? Especially given the facts show we are pretty crap at making good ‘gut’ based decisions.

Dialling down subjectivity

I’m one of those people who believe in the power of AI — to remove that asymmetry, to dial down the bias, to empower me with data to make smarter decisions!

At a recent HR conference, a quick pulse around the room confirmed there is high curiosity and appetite to understand AI. What we’re missing is clarity about the opportunities and what success looks like from using it. And the change management exercise that comes with introducing data and technology into a previously entirely human-driven process can feel daunting.

The best human resources AI is not about taking the human out of hiring and culture decisions. Far from it. It’s about providing meaningful data to help us make better decisions faster.

Having worked in the ‘People and Culture’ space for a while, I know building trust in how the organisation makes decisions — especially people decisions — is hard in the absence of data. Yet we all know that transparency builds trust. So how can you build that trust through transparency when the decision-maker is a human — and humans make decisions in closed rooms and private discussions?

Human decision making is the ultimate black box.

Remember that feeling when the recruiter calls up and says you weren’t a good fit — who feels great about that call? A total black-box cop-out response!

It doesn’t have to be this way, and the faster we can get to better decision making the better. Seven months ago, I joined a team of data scientists who had spent the prior three years building technology that relies on AI to work its magic and equip recruiters with meaningful and actionable insights when hiring.

I’m no data scientist. I have had to learn the ins and outs of our AI pretty fast. And because our technology is at work in the people space, I’m learning how to ensure the AI is safe, fair and our customers trust it and us to do the right thing with it.

If we reduce it to its core process, a machine learning algorithm is trying to improve the performance of an outcome based on the input data it receives. In some instances, such as in deep learning algorithms, it’s trying to simulate the functioning of the human brain’s neural networks, to figure out the patterns between the data inputs and data outputs.
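That core loop — adjust the model to shrink the gap between predicted and actual outcomes — can be shown in a toy sketch. This is not Sapia’s actual model, just a single-weight illustration of a machine learning from input/output pairs by repeatedly correcting its own error:

```python
# Toy training data: each input x is paired with the outcome y it produced.
# The hidden pattern is simply y = 2x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]

w = 0.0    # the model's one adjustable parameter (its "weight")
lr = 0.05  # learning rate: how big each correction step is

for _ in range(200):            # repeat over the data many times
    for x, y in data:
        error = w * x - y       # how wrong the current prediction is
        w -= lr * error * x     # nudge the weight to shrink that error

print(round(w, 2))  # converges towards 2.0, the pattern hidden in the data
```

Deep learning stacks many such adjustable parameters into layered networks, but the underlying idea — improve an outcome by iteratively correcting against the data — is the same.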

A machine is infinitely more obliging than a human, and far easier to understand and direct.

Because it has no feelings, it’s going to be free of the biases humans bring to these critical decisions. Plus machines are more malleable to learning and way faster at it. This is more critical these days when roles are changing dynamically and swiftly as industries are disrupted.

Our team plays in the predictive-analytics-for-recruitment space. What this means is our AI seeks out the lead indicators of job success: the correlating factors between values, personality and job performance. We all intuitively know that behaviours drive those indicators, but we struggle to assess for them consistently well.

Our job is to augment your intelligence and ability to make the right decision. By knowing how people treat others, what drives them, and their values, you become better informed about the real DNA of a person and how they might function in your team.

All of our customers are looking for a slightly different kind of worker and use AI to help.

A powerful motivator to use AI is to build confidence and trust in the process from both candidates and people leaders by dialling down the human element (getting rid of the bias) and revealing the patterns for success. Less room for bias = more fairness for candidates = more diverse hiring. Key to this is we don’t look at any personal information — the machine doesn’t know or care about your age, gender, colour or educational background.

For our customers having this data is empowering and helps them make smart decisions. For all the people who are affected by those decisions, they can feel relieved that they were considered on their merits, not based on someone’s gut feel.

There is no perfect method for sure to make the right decision.

But if I have to choose between trusting biased humans and (a sometimes) biased machine they create, I know which one I would trust more. At least with a machine, you can actually test for the bias, remove it, and re-train it.
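Testing a machine for bias can be as concrete as an arithmetic check. One common approach — a hedged sketch, not Sapia’s methodology — is the “four-fifths rule”: compare each group’s selection rate against the highest group’s rate, and flag any group whose ratio falls below 0.8. The group names and counts below are invented for illustration:

```python
# Hypothetical applicant and selection counts by group.
applicants = {"group_a": 200, "group_b": 150}
selected   = {"group_a": 40,  "group_b": 18}

# Selection rate per group, and the best rate to compare against.
rates = {g: selected[g] / applicants[g] for g in applicants}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best
    flag = "OK" if ratio >= 0.8 else "review for adverse impact"
    print(f"{group}: selection rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")
```

Run against human interview outcomes, the same check is rarely possible, because the selection criteria sit inside someone’s head. Run against a model, you can measure, retrain, and re-measure.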


Suggested reading:

https://sapia.ai/blog/hr-job-manage-risk/

 


Algorithmic Hiring to Improve Social Mobility

It is a widely held belief that diversity brings strength to the workplace through different perspectives and talents.

The value is greatest when companies harness the differences between employees from multiple demographic backgrounds to understand and appeal to a broad customer base. But true diversity relies on social mobility and therein lies the problem: the rate of social mobility in the UK is the worst in the developed world.

The root cause of the UK’s lack of social mobility can be found in the very place that it can bring the most value – the workplace. Employers’ recruiting processes often suffer from unconscious human bias that results in involuntary discrimination. As a result, the correlation between what an employee in the UK earns today and what his or her father earned is more apparent than in any other major economy.

This article explores the barriers to occupational mobility in the UK and the growing use of predictive analytics or algorithmic hiring to neutralise unintentional prejudice against age, academic background, class, ethnicity, colour, gender, disability, sexual orientation and religion.

The government wants to promote equal opportunity

The UK government has highlighted the fact that ‘patterns of inequality are imprinted from one generation to the next’ and has pledged to make their vision of a socially mobile country a reality. At the recent Conservative party conference in Manchester, David Cameron condemned the country’s lack of social mobility as unacceptable for ‘the party of aspiration’. Some of the eye-opening statistics quoted by Cameron include:

  • 7% of the UK population has been privately educated.
  • 22% of FTSE 350 chief executives have been privately educated.
  • 44% within the creative industries have been privately educated.
  • By the age of three, children from disadvantaged families are already nine months behind their upper middle class peers.
  • At sixteen, children receiving school meals will on average achieve 1.7 grades lower in their GCSEs.
  • For A levels, the school one attends has a disproportionate effect on A* level achievement; 30% of A* achievers attend an independent school, while children attending such schools make up merely 7% of the general population.
  • Independent school graduates make up 32% of MPs, 51% of medics, 54% of FTSE 100 chief executives, 54% of top journalists and 70% of High Court judges.
  • By the age of 42, those educated privately will earn on average £200,000 more than those educated at state school.

Social immobility is an economic problem as well as a social problem

The OECD claims that income inequality cost the UK 9% in GDP growth between 1990 and 2010. Fewer educational opportunities for disadvantaged individuals had the effect of lowering social mobility and hampering skills development. Those from poor socio-economic backgrounds may be just as talented as their privately educated contemporaries, and perhaps the missing link in bridging the skills gap in the UK. Various industry sectors have hit out at the government’s immigration policy, claiming this widens the country’s skills gap still further.

Besides immigration, there are other barriers to social mobility within the UK that need to be lifted. Research by Deloitte has shown that 35% of jobs over the next 20 years will be automated. These are mainly unskilled roles that will impact people from low incomes. Rather than relying too heavily on skilled immigrants, the country needs to invest in training and development to upskill young people and provide home-grown talent to meet the future needs of the UK economy. Countries that promote equal opportunity for everyone from an early age are those that will grow and prosper.

How are employers supporting the government’s social mobility policy?

The UK government’s proposal to tackle the issue of social mobility, both in education and in the workplace, has to be greatly welcomed. Cameron cited evidence that people with white-sounding names are more likely to get job interviews than equally qualified people with ethnic names, a trend that he described as ‘disgraceful’. He also referred to employers discriminating against gay people and the need to close the pay gap between men and women. Some major employers – including Deloitte, HSBC, the BBC and the NHS – are combatting this issue by introducing blind-name CVs, where the candidate’s name is blocked out on the CV and the initial screening process. UCAS has also adopted this approach in light of the fact that 36% of ethnic minority applicants from 2010-2012 received places at Russell Group universities, compared with 55% of white applicants.

Although blind-name CVs avoid initial discriminatory biases in an attempt to improve diversity in the workforce, recruiters may still be subject to similar or other biases later in the hiring process. Some law firms, for example, still insist on recruiting Oxbridge graduates, when in fact their skillset may not correlate positively with the job or company culture. While conscious human bias can only be changed through education, lobbying and a shift in attitude, a great deal can be done to eliminate unconscious human bias through predictive analytics or algorithmic hiring.

How can algorithmic hiring help?

Bias in the hiring process not only thwarts social mobility but is detrimental to productivity, profitability and brand value. The best way to remove such bias is to shift reliance from humans to data science and algorithms. Human subjectivity relies on gut feel and is liable to passive bias or, at worst, active discrimination. If an employer genuinely wants to ignore a candidate’s schooling, racial background or social class, these variables can be hidden. Algorithms can have a non-discriminatory output as long as the data used to build them is also of a non-discriminatory nature.

Predictive analytics is an objective way of analysing relevant variables – such as biodata, pre-hire attitudes and personality traits – to determine which candidates are likely to perform best in their roles. By blocking out social background data, informed hiring decisions can be made that have a positive impact on company performance. The primary aim of predictive analytics is to improve organisational profitability, while a positive impact on social mobility is a healthy by-product.

An example of predictive analytics at work

A recent study in the USA revealed that the dropout rate at university will lead to a shortage of qualified graduates in the market (3 million deficit in the short term, rising to 16 million by 2025). Predictive analytics was trialled to anticipate early signs of struggle among students and to reach out with additional coaching and support. As a result, within the state of Georgia student retention rates increased by 5% and the time needed to earn a degree decreased by almost half a semester. The programme ascertained that students from high-income families were ten times more likely to complete their course than those from low-income households, enabling preventative measures to be put in place to help students from socially deprived backgrounds to succeed.

What can be done to combat the biases that affect recruitment?

Bias and stereotyping are in-built physiological behaviours that help humans identify kinship and avoid dangerous circumstances. Such behaviours, however, cloud our judgement when it comes to recruitment decisions. More companies are shifting from a subjective recruitment process to a more objective process, which leads to decision making based on factual evidence. According to the CIPD, on average one-third of companies use assessment centres as a method to select an employee from their candidate pool. This no doubt helps to reduce subjectivity but does not eradicate it completely, as peer group bias can still be brought to bear on the outcome.

Two of the main biases which may be detrimental to hiring decisions are ‘Affinity bias’ and ‘Status Quo bias’. ‘Affinity bias’ leads to people recruiting those who are similar to themselves, while ‘Status Quo bias’ leads to recruitment decisions based on the likeness candidates have with previous hires. Recruiting on this basis may fail to match the selected person’s attributes with the requirements of the job.

Undoubtedly it is important to get along with those who will be joining the company. The key is to use data-driven modelling to narrow down the search in an objective manner before selecting based on compatibility. Predictive analytics can project how a person will fare by comparing candidate data with that of existing employees deemed to be high performers, relying on metrics that are devoid of the type of questioning that could lead to the discriminatory biases that inhibit social mobility.
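To make the idea concrete, here is a deliberately simplified sketch of comparing candidates against an averaged high-performer trait profile using cosine similarity. The trait names, scores, and candidate labels are all invented for illustration; real predictive models are far more involved than a single similarity score:

```python
import math

# Hypothetical averaged trait profile of existing high performers.
high_performer_profile = {"conscientiousness": 0.8, "teamwork": 0.9, "adaptability": 0.7}

def cosine(a, b):
    """Cosine similarity between two trait dictionaries sharing the same keys."""
    keys = a.keys()
    dot = sum(a[k] * b[k] for k in keys)
    norm_a = math.sqrt(sum(a[k] ** 2 for k in keys))
    norm_b = math.sqrt(sum(b[k] ** 2 for k in keys))
    return dot / (norm_a * norm_b)

# Hypothetical candidate trait scores (note: no age, gender, school, or class).
candidates = {
    "candidate_1": {"conscientiousness": 0.7, "teamwork": 0.85, "adaptability": 0.6},
    "candidate_2": {"conscientiousness": 0.3, "teamwork": 0.4, "adaptability": 0.9},
}

# Rank candidates by how closely their traits match the high-performer profile.
for name, traits in sorted(candidates.items(),
                           key=lambda kv: cosine(kv[1], high_performer_profile),
                           reverse=True):
    print(name, round(cosine(traits, high_performer_profile), 3))
```

The point of the sketch is what is absent: because schooling, ethnicity, and social class never enter the feature set, they cannot enter the ranking.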


Bias works on many levels of consciousness

‘Heuristic bias’ is another example of normal human behaviour that influences hiring decisions. Also known as ‘Confirmation bias’, it allows us to quickly make sense of a complex environment by drawing upon relevant known information to substantiate our reasoning. Since it is anchored on personal experience, it is by default arbitrary and can give rise to an incorrect assessment.

Other forms of bias include ‘Contrast bias’, when a candidate is compared with the previous one instead of comparing his or her individual skills and attributes to those required for the job. ‘Halo bias’ is when a recruiter sees one great thing about a candidate and allows that to sway opinion on everything else about that candidate. The opposite is ‘Horns bias’, where the recruiter sees one bad thing about a candidate and lets it cloud opinion on all their other attributes. Again, predictive analytics precludes all these forms of bias by sticking to the facts.

https://sapia.ai/blog/workplace-unconscious-bias/

Age is firmly on the agenda in the world of recruitment, yet it has been reported that over 50% of recruiters who record age in the hiring process do not employ people older than themselves. Disabled candidates are often discriminated against because recruiters cannot see past the disability. Even these fundamental stereotypes and biases can be avoided through data-driven analytics that cut to the core in matching attitudes, skills and personality to job requirements.

Once objective decisions have been made, companies need to have the confidence not to overturn them and revert to reliance on one-to-one interviews, which have low predictive power. The CIPD cautions against this and advocates a pure, data-driven approach: ‘When it comes to making final decisions, the more data-driven recruiting managers can be, the better’.

The government’s strategy for social mobility states that ‘tackling the opportunity deficit – creating an open, socially mobile society – is our guiding purpose’ but that ‘by definition, this is a long-term undertaking. There is no magic wand we can wave to see immediate effects.’ Being aware of bias is just the first step in minimising its negative effect in the hiring process. Algorithmic hiring is not the only solution but, if supported by the government and key trade bodies, it can go a long way towards remedying the inherent weakness in current recruitment practice. Once the UK’s leading businesses begin to witness the benefits of a genuinely diverse workforce in terms of increased productivity and profitability, predictive hiring will become a self-fulfilling prophecy.


How Iceland won best in-house innovation in recruitment

We’re thrilled to announce that along with our customer Iceland Foods, we won the award for Best In-House Innovation in Recruitment at the 2021 Recruiter Awards in London.

Established in 2002, the Recruiter Awards gala is the UK’s largest event for the entire recruitment community recognising outstanding achievements by agencies and in-house recruiters.

The award recognises the partnership between Iceland and Sapia that saved their store leaders 24,000 hours a year by implementing transformational change – during a pandemic.

Iceland receives a high volume of applicants – more than 120,000 per month – and faced a crisis in 2020: increased trade and Covid-19 absence meant that surge hiring needed to be automated, without losing the personal touch.

Automation was critical to increase the time store managers had to trade in their stores.

It had to be a simple solution that store managers would understand quickly and trust. The candidate experience had to be fast, inclusive and human. 

The tool needed to work for the candidate market which is as diverse as the general population. The team settled on Smart Interviewer as their solution of choice. 

Candidates have reacted well to the technology, with 99% positive sentiment towards the process and 77% of candidates more likely to recommend Iceland as an employer of choice.

There was a 5x payback in four months, giving back 8,000 hours to the business at a cost of less than £1 per applicant.

On top of this, there was zero gender and race bias, ensuring hires are as diverse as the applicant group.

The judges’ comments were that “this simple, straightforward submission ticked every box by demonstrating the contribution the recruitment function played to the success of their overall business. They also clearly demonstrated thoughtful consideration to the fact that many candidates would be applying for jobs at Iceland following the decimation of their previous career paths, for example, aviation industry employees.”

Read the case study of how Iceland and Sapia innovated during the pandemic here.
