Most candidates aren’t being rejected – they’re being ghosted. Mentions of ghosting on Glassdoor are up 450% since the start of COVID. We know that it’s bad for employer brand and long-term prospecting, so why does silent candidate rejection happen so often?
Here are some of the causes most commonly cited for ghosting candidates:
We’re not out to bash recruiters, talent acquisition professionals, or hiring managers. Finding talent during the Great Resignation is difficult. Time is precious. Offering everyone a high-touch candidate experience, therefore, seems far beyond scope.
Problem is, candidates expect feedback. At the very least, they need closure. Rejection by silence has a unique sting. Consider the following responses, offered by people who applied for jobs and were either ghosted, or received a templated rejection:
“Discarded. Treated like number.”
“Crushed. Doubted my competence and value.”
“Depressed, unsure of reasons, uncertainty with quality of CV and skills or experience.”
These responses come from a new study, What Type of Explanation Do Rejected Job Applicants Want? Implications for Explainable AI, by researchers at UNSW, Australia. It aims to prescribe an ideal framework for positive candidate rejection.
Here is a snapshot of some of the findings.
This point may sound obvious, but here it is: 53% of study respondents wanted to know why they did not make the cut. Just under a third wanted to know how they might improve, and 12% of respondents wanted to find out more about the competition – including whether or not the successful candidate was an inside hire.
Based on what candidates want, it is recommended that a rejection letter focus on at least one of these factors:
If possible, err on the side of extra transparency. If it was an inside hire, say so. If the losing candidate was neck-and-neck with the winner, tell them. People want the truth, it seems, and without sugar-coating.
This is interesting: as part of the study, respondents were asked how much they would pay for a tailored explanation of their rejection. 44% said they wouldn’t pay anything for feedback, while 25% said they might pay more than $20.
We might first surmise from this result that applicants don’t place value on feedback, but this isn’t the case; for the most part, they believe they have already paid for it. Said one respondent, “The idea about paying for feedback is idiotic and I beg you not to put it into the universe. If I take the time to apply for a job they should have the courtesy to provide feedback. Job hunting is hard enough and expensive don’t add more cost to excuse inexcusable conduct.” Fair enough.
Phai, our smart interviewing AI, gives every single one of your candidates an interview. It also provides tailored personality insights and coaching tips to each of them, whether or not they are successful.
We do this because we can quickly and accurately analyse how people align to the HEXACO personality inventory. It’s high-tech stuff, but the result is what matters: more than 98% of candidates love the feedback they receive and rate it as useful. We help people understand themselves better, and equip them to approach job applications with the techniques best suited to their personalities.
If you’re using Phai, that means you’re really helping people. If that wasn’t enough of a reward on its own, know that good candidate feedback is also helping your employer brand immeasurably. It’s a dream solution for volume hiring.
The value is greatest when companies harness the differences between employees from multiple demographic backgrounds to understand and appeal to a broad customer base. But true diversity relies on social mobility and therein lies the problem: the rate of social mobility in the UK is the worst in the developed world.
The root cause of the UK’s lack of social mobility can be found in the very place that it can bring the most value – the workplace. Employers’ recruiting processes often suffer from unconscious human bias that results in involuntary discrimination. As a result, the correlation between what an employee in the UK earns today and what his or her father earned is more apparent than in any other major economy.
This article explores the barriers to occupational mobility in the UK and the growing use of predictive analytics or algorithmic hiring to neutralise unintentional prejudice against age, academic background, class, ethnicity, colour, gender, disability, sexual orientation and religion.
The UK government has highlighted the fact that ‘patterns of inequality are imprinted from one generation to the next’ and has pledged to make their vision of a socially mobile country a reality. At the recent Conservative party conference in Manchester, David Cameron condemned the country’s lack of social mobility as unacceptable for ‘the party of aspiration’. Some of the eye-opening statistics quoted by Cameron include:
The OECD claims that income inequality cost the UK 9% in GDP growth between 1990 and 2010. Fewer educational opportunities for disadvantaged individuals had the effect of lowering social mobility and hampering skills development. Those from poor socio-economic backgrounds may be just as talented as their privately educated contemporaries, and are perhaps the missing link in bridging the UK’s skills gap. Various industry sectors have hit out at the government’s immigration policy, claiming this widens the country’s skills gap still further.
Besides immigration, there are other barriers to social mobility within the UK that need to be lifted. Research by Deloitte has shown that 35% of jobs over the next 20 years will be automated. These are mainly unskilled roles, and their loss will hit people on low incomes hardest. Rather than relying too heavily on skilled immigrants, the country needs to invest in training and development to upskill young people and provide home-grown talent to meet the future needs of the UK economy. Countries that promote equal opportunity for everyone from an early age are those that will grow and prosper.
The UK government’s proposal to tackle the issue of social mobility, both in education and in the workplace, has to be greatly welcomed. Cameron cited evidence that people with white-sounding names are more likely to get job interviews than equally qualified people with ethnic names, a trend that he described as ‘disgraceful’. He also referred to employers discriminating against gay people and the need to close the pay gap between men and women. Some major employers – including Deloitte, HSBC, the BBC and the NHS – are combatting this issue by introducing blind-name CVs, where the candidate’s name is blocked out on the CV during the initial screening process. UCAS has also adopted this approach in light of the fact that 36% of ethnic minority applicants from 2010-2012 received places at Russell Group universities, compared with 55% of white applicants.
Although blind-name CVs avoid initial discriminatory biases in an attempt to improve diversity in the workforce, recruiters may still be subject to similar or other biases later in the hiring process. Some law firms, for example, still insist on recruiting Oxbridge graduates, when in fact their skillset may not correlate positively with the job or company culture. While conscious human bias can only be changed through education, lobbying and a shift in attitude, a great deal can be done to eliminate unconscious human bias through predictive analytics or algorithmic hiring.
Bias in the hiring process not only thwarts social mobility but is detrimental to productivity, profitability and brand value. The best way to remove such bias is to shift reliance from humans to data science and algorithms. Human subjectivity relies on gut feel and is liable to passive bias or, at worst, active discrimination. If an employer genuinely wants to ignore a candidate’s schooling, racial background or social class, these variables can be hidden. Algorithms can have a non-discriminatory output as long as the data used to build them is also of a non-discriminatory nature.
Predictive analytics is an objective way of analysing relevant variables – such as biodata, pre-hire attitudes and personality traits – to determine which candidates are likely to perform best in their roles. By blocking out social background data, informed hiring decisions can be made that have a positive impact on company performance. The primary aim of predictive analytics is to improve organisational profitability, while a positive impact on social mobility is a healthy by-product.
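To make the idea of blocking out social background data concrete, here is a minimal sketch in Python. The field names and record are hypothetical and this is not any vendor’s actual implementation – just an illustration of removing protected attributes before any scoring model sees a candidate record:

```python
# Hypothetical set of protected / social-background fields to withhold
# from the scoring model. Real deployments would define this per jurisdiction.
PROTECTED = {"name", "age", "gender", "ethnicity", "school", "postcode"}

def blind(candidate: dict) -> dict:
    """Return a copy of the candidate record with protected attributes removed."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED}

# Illustrative candidate record
candidate = {
    "name": "A. Example",
    "school": "Anytown High",
    "years_experience": 4,
    "assessment_score": 82,
}
print(blind(candidate))  # only the job-relevant fields remain
```

The point of the sketch is simply that the exclusion happens before modelling: whatever predictive technique is applied downstream, it never receives the withheld variables.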
A recent study in the USA revealed that the dropout rate at university will lead to a shortage of qualified graduates in the market (3 million deficit in the short term, rising to 16 million by 2025). Predictive analytics was trialled to anticipate early signs of struggle among students and to reach out with additional coaching and support. As a result, within the state of Georgia student retention rates increased by 5% and the time needed to earn a degree decreased by almost half a semester. The programme ascertained that students from high-income families were ten times more likely to complete their course than those from low-income households, enabling preventative measures to be put in place to help students from socially deprived backgrounds to succeed.
Bias and stereotyping are in-built psychological behaviours that helped humans identify kinship and avoid dangerous circumstances. Such behaviours, however, cloud our judgement when it comes to recruitment decisions. More companies are shifting from a subjective recruitment process to a more objective process, which leads to decision making based on factual evidence. According to the CIPD, on average one-third of companies use assessment centres as a method to select an employee from their candidate pool. This no doubt helps to reduce subjectivity but does not eradicate it completely, as peer group bias can still be brought to bear on the outcome.
Two of the main biases which may be detrimental to hiring decisions are ‘Affinity bias’ and ‘Status Quo bias’. ‘Affinity bias’ leads to people recruiting those who are similar to themselves, while ‘Status Quo bias’ leads to recruitment decisions based on the likeness candidates have with previous hires. Recruiting on this basis may fail to match the selected person’s attributes with the requirements of the job.
Undoubtedly it is important to get along with those who will be joining the company. The key is to use data-driven modelling to narrow down the search in an objective manner before selecting based on compatibility. Predictive analytics can project how a person will fare by comparing candidate data with that of existing employees deemed to be high performers, relying on metrics free of the kind of questioning that leads to the discriminatory biases that inhibit social mobility.
‘Heuristic bias’ is another example of normal human behaviour that influences hiring decisions. Also known as ‘Confirmation bias’, it allows us to quickly make sense of a complex environment by drawing upon relevant known information to substantiate our reasoning. Since it is anchored in personal experience, it is by nature subjective and can give rise to an incorrect assessment.
Other forms of bias include ‘Contrast bias’, when a candidate is compared with the previous one instead of comparing his or her individual skills and attributes to those required for the job. ‘Halo bias’ is when a recruiter sees one great thing about a candidate and allows that to sway opinion on everything else about that candidate. The opposite is ‘Horns bias’, where the recruiter sees one bad thing about a candidate and lets it cloud opinion on all their other attributes. Again, predictive analytics precludes all these forms of bias by sticking to the facts.
Age is firmly on the agenda in the world of recruitment, yet it has been reported that over 50% of recruiters who record age in the hiring process do not employ people older than themselves. Disabled candidates are often discriminated against because recruiters cannot see past the disability. Even these fundamental stereotypes and biases can be avoided through data-driven analytics that cut to the core in matching attitudes, skills and personality to job requirements.
Once objective decisions have been made, companies need to have the confidence not to overturn them and revert to reliance on one-to-one interviews, which have low predictive power. The CIPD cautions against this and advocates a pure, data-driven approach: ‘When it comes to making final decisions, the more data-driven recruiting managers can be, the better’.
The government’s strategy for social mobility states that ‘tackling the opportunity deficit – creating an open, socially mobile society – is our guiding purpose’ but that ‘by definition, this is a long-term undertaking. There is no magic wand we can wave to see immediate effects.’ Being aware of bias is just the first step in minimising its negative effect in the hiring process. Algorithmic hiring is not the only solution but, if supported by the government and key trade bodies, it can go a long way towards remedying the inherent weakness in current recruitment practice. Once the UK’s leading businesses begin to witness the benefits of a genuinely diverse workforce in terms of increased productivity and profitability, predictive hiring will become a self-fulfilling prophecy.
In my career, I have been involved in either leading or managing countless restructures. The driving force behind these restructures ranged from offshoring capability to migrating to new or emerging skill requirements, right-sizing a particular function, or the basic need to reduce operating cost.
While each of these projects delivered on their outcomes, I would argue they had varying levels of success in preserving the organisational talent and corporate memory required.
Why?
More often than not, organisations will work towards a target – this could be FTE, headcount, or cost reduction, for example. Choices to achieve this target are often made in isolation from critical information.
I have seen situations where decisions about who kept and who lost their jobs were made on relationship merit, not on skills, capability or cultural alignment. I have also seen examples where it is purely a numbers game: the end goal is simply to hit the target. This leaves some organisations scrambling for contractors and consultants to backfill their critical skill gaps.
With the world in economic freefall (cue the dramatic music), we are going to see more and more organisations looking to right-size their workforce in line with consumer confidence and spending.
So, with this in mind, how can an organisation make choices that are informed by data and underpinned by fairness, while still being efficient?

The answer: personality and behaviour.
Personality and behavioural assessments have long been used in the recruitment and promotion of individuals. However, they are rarely used when right-sizing or restructuring an organisation. This seems like a glaring and obvious opportunity: in right-sizing, you are effectively making hiring and promotion decisions within your existing workforce.
When comparing apples with apples, personality and behavioural assessments allow you to focus on the attributes that will differentiate your workforce for the next phase of your organisational journey. Depending on what that journey is, you then have the opportunity to codify the critical capabilities required.
Sapia has been partnering with many organisations locally and globally. Together, we not only re-imagine how our partners use personality and behavioural assessments in recruitment, but also how they apply them across the entire HR lifecycle.
Unlike traditional personality and behavioural profiling, Sapia is powered by AI, using conversational text to assess individuals. Because no protected attributes are captured, the process removes the opportunity for bias to creep in, allowing you to make informed, data-driven decisions about your future workforce based on culture and values alignment.
Yes, personality is widely accepted as an indicator of job performance.
Until now, the only way to accurately measure personality was through long and repetitive personality tests. The Sapia team breaks new ground, disrupting decades of assessment practice by showing that answers to standard interview questions, collected through a text-based mobile interview, can be used to reliably infer personality traits.
Get the published research here
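To give a flavour of the idea – and only a flavour, since the published method relies on trained language models rather than word lists – here is a toy sketch of extracting a trait signal from a free-text interview answer. The lexicon and function are hypothetical illustrations, not Sapia’s model:

```python
# Toy illustration only: a tiny hand-picked lexicon standing in for what a
# trained model would learn. Real trait inference is far more sophisticated.
CONSCIENTIOUSNESS_WORDS = {"plan", "organised", "deadline", "checklist", "prepare"}

def trait_signal(answer: str) -> float:
    """Fraction of words in the answer drawn from a trait-related lexicon."""
    words = answer.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w.strip(".,!?") in CONSCIENTIOUSNESS_WORDS)
    return hits / len(words)

print(trait_signal("I plan every deadline with a checklist."))
```

The real research question is whether signals like this, learned at scale from interview text, track established inventories such as HEXACO – which is what the paper linked above examines.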
To keep up to date on all things “Hiring with AI”, subscribe to our blog!
You can try out Sapia’s Chat Interview right now, or leave us your details here to get a personalised demo.
As humans, we often don’t trust what we can’t see and we can’t trust what we don’t understand.
Transparency and explainability are fundamental ingredients of trust, and there is plenty of research to show that high trust relationships create the most productive relationships and cultures.
We are committed to building ethical and engaging assessments. This is why we have taken the path of a text chat with no time pressure: we allow candidates to take their own time, reflect, and submit answers in text format. Apart from the known errors associated with analysing facial expressions, we believe that technologies such as voice-to-text can add an extra layer of error. We also refrain from scraping publicly available data such as LinkedIn profiles, and we do not use behavioural data such as how quickly a candidate completes the interview or how many corrections they make. Lastly, we strictly use the candidates’ final submitted answers and nothing else.
Our approach has led to candidates loving the text experience, as measured by the feedback they leave and NPS.
No demographic details are collected from candidates, nor used to influence their ranking. Only the candidate’s answers to relevant interview questions are analysed by our scientifically validated algorithm to assess their fit for the role.
Biases can occur in many different forms. Algorithms and AI learn according to the profile of the data we feed them: if that data is taken from CVs, the model will only amplify our existing biases. Only clean data, such as the answers to specific job-related questions, can give us a truly bias-free outcome. Potential biases in the data can be tested for and measured, so we continuously test the data that trains the machine for known biases, such as differences between gender and race groups, and correct even the slightest bias found. These tests can be extended to other groups of interest where the groupings are available, such as English as a Second Language (EASL) users.
Here are a few examples:
Sapia uses all of these tests and more.
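As an illustration of what such a fairness test can look like, here is a minimal sketch of the widely used four-fifths (adverse impact) rule in Python. The group names and numbers are hypothetical, and this is a simplified example rather than Sapia’s actual test suite:

```python
def selection_rates(outcomes: dict) -> dict:
    """Selection rate (selected / total applicants) per group."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def four_fifths_check(outcomes: dict) -> dict:
    """Adverse-impact check: each group's selection rate should be at least
    80% of the highest group's rate (the 'four-fifths rule')."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {group: rate / best >= 0.8 for group, rate in rates.items()}

# Hypothetical outcomes: group -> (selected, total applicants)
outcomes = {"group_a": (40, 100), "group_b": (24, 100)}
print(four_fifths_check(outcomes))  # group_b fails: 0.24 / 0.40 = 0.6 < 0.8
```

A check like this can be run routinely over training data and model outputs, with the grouping variable swapped for gender, race, language background, or any other attribute of interest.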
Finally, you can try out Sapia’s SmartInterview right now, or leave us your details here to get a personalised demo.