I work with a team building a product driven by AI that is used to inform decisions about people. This means I am often approached on social media or in person by people who have a point of view about that, often with fear or frustration about being picked (or rejected) by a machine.
This week I received an email from a commerce/law graduate who had recently applied for a role at one of the big ‘accounting’ professional services firms. This student, let’s call him Dan, had to complete an online game in order to qualify for the next step, which was a video interview.
To give himself the maximum chance of doing well in the game, Dan created a dummy profile, ‘Jason’, to see what the experience was like and get an inside read of the questions, so that when he did it for real he would really nail it. On this trial run he fudged the test and left most answers blank. When Dan did it for real, he was conscientious of course: he wrote thoughtful answers and tried to pick the right behaviour in the balloon-popping game!
Jason, who scored 44%, received a video interview. Jason does not exist.
Dan, who scored 75%, did not progress to the next round.
The machine picked the wrong guy.
Every business like ours that works in this space recognises that this is new technology, still very much in the early stages of development. Like humans, machines will make mistakes. In our business, we call them false positives (people recommended who just aren’t right) and false negatives (people missed by the machine who could be right for the role).
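To make those two terms concrete, here is a minimal sketch in Python. It is illustrative only (not Sapia’s scoring code): it simply counts false positives and false negatives for a handful of hypothetical candidates.

```python
# Minimal sketch: counting false positives and false negatives in a set of
# machine recommendations. The candidates and outcomes below are hypothetical.

def confusion_counts(recommended, suitable):
    """Compare machine recommendations with who was actually right for the role."""
    false_positives = sum(1 for r, s in zip(recommended, suitable) if r and not s)
    false_negatives = sum(1 for r, s in zip(recommended, suitable) if not r and s)
    return false_positives, false_negatives

# Five hypothetical candidates: True = recommended by the machine / right for the role
recommended = [True, True, False, False, True]
suitable    = [True, False, False, True,  True]

fp, fn = confusion_counts(recommended, suitable)
print(f"False positives (recommended, but not right): {fp}")   # 1
print(f"False negatives (missed, but could be right): {fn}")   # 1
```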
Dan’s questions are legitimate…
When you are rejected by humans, you either hear nothing or you get an explanation like ‘you aren’t a good culture fit’. Machines may give you a score.
For me, what this reveals is that any business using AI and ML for candidate selection must have empathy for the person experiencing it; in this case, empathy for the candidate experience.
Machines can make better selection decisions about people because they have access to a larger, more comprehensive set of data, can process that data faster, and, if built with the right objective data, can be far less biased than humans.
When used in recruitment, they need to work for both parties — the organisation and the candidate. Building trust in these technologies is critical in our space. It can’t all be about the organisation getting their efficiency gains.
This means:
Recruitment wants to rise above being a mere process, and AI in recruitment should enable that if it’s to be trusted by candidates.
Suggested Reading:
https://sapia.ai/blog/culture-surveys-no-use/
Predictive Talent Analytics turns the imaginary into reality, presenting a variety of businesses, including contact centres, with the opportunity to improve hiring outcomes and raise the performance bar. With only a minor tweak to existing business processes, predictive talent analytics addresses challenges faced by many contact centres.
Recruitment typically involves face-to-face or telephone interviews and psychometric or situational awareness tests. However, there’s an opportunity to make better hires and to achieve better outcomes through the use of Predictive Talent Analytics.
Many organisations are already using analytics to help with their talent processes. Crucially, these are descriptive analytical tools. They report the past and the present; they aren’t looking forward to tomorrow, and that’s key. If the business is moving forward, your talent tools should be pointing in the same direction.
Consider a call-waiting display board showing missed and waiting calls. This is reporting.
Alternatively, consider a board that does the same but also accurately predicts significant increases in call volumes, providing you with enough time to increase staffing levels appropriately. That’s predictive.
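To make the distinction concrete, here is a minimal, illustrative sketch in Python: the same call-volume data can be reported as it stands (descriptive) or extrapolated with a simple linear trend to forecast the next hour (predictive). The figures and the trend model are hypothetical, not any vendor’s actual forecasting method.

```python
# Minimal sketch: descriptive reporting vs a simple predictive forecast.
# The call-volume figures and the linear-trend model are illustrative only.

hourly_call_volumes = [120, 135, 150, 160, 180, 195]  # last six hours (hypothetical)

# Descriptive: report what is happening right now
print(f"Calls this hour: {hourly_call_volumes[-1]}")

# Predictive: fit a simple linear trend and extrapolate to the next hour
n = len(hourly_call_volumes)
xs = list(range(n))
mean_x = sum(xs) / n
mean_y = sum(hourly_call_volumes) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, hourly_call_volumes)) \
        / sum((x - mean_x) ** 2 for x in xs)
forecast = mean_y + slope * (n - mean_x)  # predicted volume for the next hour (x = n)

print(f"Forecast for next hour: {forecast:.0f} calls -> time to add staff")
```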
Descriptive analytical tools showing the path to achievement taken by good performers within the business can add value. But does that mean that every candidate within a bracketed level of academic achievement, from a particular socio-economic background, from a certain area of town or from a particular job board is right for your business? It’s unlikely! Psychometric tests add value but does that mean that everyone within a pre-set number of personality types will be a good fit for your business? That’s also unlikely.
The simple truth is that, even with psychometric testing and rigorous interviews, people are still cycling out of contact centres and the same business challenges remain.
With only a minor tweak to talent processes, predictive talent analytics presents an opportunity to harness existing data and drive the business forward by making hiring recommendations based on somebody’s future capability.
But wait, it gets better!
Pick the right predictive talent analytics tool and this can be done in an interesting, innovative and intriguing way, taking approximately five minutes.
Once the tool’s algorithm knows what good looks like, crucially within your business (because every company is different!), your talent acquisition team can approach the wider talent market armed with a new tool that will drive up efficiency and performance.
Picking the right hires, first time.
Consider this. Candidate A has solid, recent, relevant experience and good academic grades, ticking all the right hiring boxes, but cycles out of the business within a few months of being hired.
Candidate B is a recent school-leaver with poor grades and no work history, but receives a high-performance prediction and, once trained, becomes an excellent employee for many years to come.
On paper, candidate A is the better prospect, but in the fullness of time, candidate B, identified using predictive talent analytics, is the better hire.
Instead of using generic personality bandings to make hiring decisions, use a different solution.
Use predictive talent analytics to rapidly identify people who will generate more sales or any other measured output. Find those who will be absent less or those who will help the business achieve a higher NPS. Bring applicants into the recruitment pipeline knowing the data is showing they will be a capable, or excellent, performer for your business.
Now that’s an opportunity worth grasping!
Steven John worked within contact centres whilst studying at university, was a recruiter for 13 years and is now Business Development Manager at Sapia, a leading workforce science business providing a data-driven prediction with every hire. This article was originally written for the UK Contact Centre Forum.
You can try out Sapia’s FirstInterview right now, or leave us your details to book a demo
Our friends at eArcu are passionate about empowering recruiters to drive the hiring experience. They help recruiters engage and excite candidates from the first touch to the first day. Now you can take full advantage of the eArcu ATS by using Sapia’s interview automation platform, for fairer, faster and better hiring results. Integration is easy, and it makes getting ahead of your competitors even easier!
A lot is expected from recruiters, from screening thousands of applicants to attracting candidates of diverse backgrounds and delivering a great candidate experience. But technology has advanced a lot and can now better support recruiters.
The great news is that when you integrate Sapia artificial intelligence technology with the powerful eArcu ATS, you will have a faster, fairer and more efficient recruitment process that candidates love.
Gone are the days of screening CVs followed by phone screens to find the best talent. The number of people applying for each job has recently grown 5-10 times. Reading each CV is simply no longer an option. In any case, the attributes that mark out a high performer often aren’t in CVs, and the risk of increasing bias is high.
You can now streamline your recruitment process by integrating Sapia’s interview automation with eArcu.
By sending out one simple interview link, you nail speed, quality and candidate experience in one hit.
Sapia’s award-winning chat Ai is available to all eArcu users. You can automate interviewing, screening, ranking and more, with a minimum of effort! Save time, reduce bias and deliver an outstanding candidate experience.
As unemployment rates rise, it’s more important than ever to show empathy for candidates and add value when we can. Using Sapia, every single candidate gets a FirstInterview through an engaging text experience on their mobile device, whenever it suits them. Every candidate receives personalised MyInsights feedback, with helpful coaching tips which candidates love.
“I have never had an interview like this in my life and it was really good to be able to speak without fear of judgment and have the freedom to do so.
The feedback is also great. This is a great way to interview people as it helps an individual to be themselves.
The response back is written with a good sense of understanding and compassion.
I don’t know if it is a human or a robot answering me, but if it is a robot then technology is quite amazing.”
Take it for a 2-minute test drive here >
Recruiters love the TalentInsights that Sapia surfaces in eArcu as soon as each candidate finishes their interview.
See Recruiter Reviews here >
Well-intentioned organisations have been trying to shift the needle on the bias that impacts diversity and inclusion for many years, without significant results.
In recent years, we have all wised up to the risk of using CVs to assess talent. A CV as a data source is well known to amplify the unconscious biases we have. A highly referenced study from 2003, “Are Emily and Greg More Employable than Lakisha and Jamal?”, found that white-sounding names receive 50 per cent more callbacks for interviews.
However, during COVID, we reverted to old ways in a different guise.
This isn’t a step forward.
Video hiring productises bias. It actually enables bias at scale.
It leads to mirror hiring – hiring those who look and sound most like us. Instead of screening CVs in 30 seconds, your team is now watching 3-minute videos, so recruiting takes longer, and it’s exhausting.
Video platforms are being challenged in the US (EPIC Files Complaint with FTC about Employment Screening Firm HireVue) for concerns over invisible biases that may be affecting candidate fairness given the opaque nature of those algorithms. Facial recognition systems are worse at identifying the gender of women and people of colour than at classifying male, white faces. This year IBM openly pulled out of facial recognition, fearing racial profiling and discriminatory use, partly due to the questionable performance of the underlying AI.
We get that at some point you and the candidate need to meet, although no rule says you need to see someone to hire them. That’s just a bias, much like the pre-Covid bias that you need to see someone at work to know they are doing the work.
Blind hiring means interviewing a candidate without seeing them, and without knowing what school they went to or what jobs they have had. It’s a real meritocracy: it’s fair for the candidate and also smart for your organisation.
If you are hanging your hat on the fact that you just finished bias training: research has consistently shown that unconscious bias training does not work.
While we have all been dutifully attending it for years, the truth is that the change it produces is effectively zero.
At a recent event attended by academics and data-loving professionals – whilst there was a welcome recognition that humans are more biased than Ai, and despite hearing that Wikipedia lists more than 150 biases we humans have – the majority of the audience still believed the impossible: that we can be trained out of our unconscious biases.
The Nobel Prize winner Dr Daniel Kahneman prescribes an algorithmic approach to decision-making as a way to remove unconscious biases. He claims, “Algorithms are noise-free. People are not. When you put some data in front of an algorithm, you will always get the same response at the other end.” Also, see why machines are a great assistive tool in making hiring a fair process, here.
We know your inbox is flooded with Ai tools, each proclaiming to remove bias and deliver amazing results, and it’s tough to discriminate between what’s puffery, what’s real and what you can trust.
If your role requires you to know the difference between puffery and science, then read the Buyers Guide: 8 Questions You Must Ask.
The 30-second due-diligence test that every HR professional should be asking when presented with one of these whizz-bang Ai tools is this:
It’s critical (in fact, it’s a duty of care you have to your candidates and your organisation) to be curious and to investigate deeply the technology you are bringing into the organisation.
We have to be careful not to think that all AI is biased. AI is based on data, and that data can be tested for bias. ‘Data-driven’ also means transparent. Testing for bias, fairness and explainability of AI models is an active area of research and has advanced a lot. If built with best practices, AI can be used to challenge human decisions and interrupt potential biases. In the end, hiring is a human activity, and the final outcome should always be owned by a human.
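As one concrete illustration of how data can be tested for bias, here is a minimal sketch in Python of a common group-level check, the four-fifths adverse-impact ratio, run on hypothetical recommendation data. It is a simplified example, not Sapia’s actual bias-testing methodology.

```python
# Minimal sketch: a group-level adverse-impact check (the "four-fifths" rule).
# The recommendation data and the 0.8 threshold are illustrative only.

def selection_rate(outcomes):
    """Share of candidates in a group who were recommended (1 = recommended)."""
    return sum(outcomes) / len(outcomes)

# Hypothetical model recommendations for two candidate groups
group_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # selection rate 0.7
group_b = [1, 0, 0, 1, 0, 1, 0, 0, 1, 1]   # selection rate 0.5

rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

print(f"Adverse-impact ratio: {impact_ratio:.2f}")
if impact_ratio < 0.8:
    print("Below the four-fifths threshold: investigate the model for bias.")
```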
If you want to know more about the research that defines the Sapia approach, look here.
If you want to know more about our bias testing, look here.
Our Inclusivity e-Book offers a pathway to fairer hiring in 2021.
Download Inclusivity Hiring e-Book Here >
Get diversity and inclusion right whilst hiring on time and on budget.