Sapia labs, our R&D department, has developed a world-first innovation that will help us deepen our understanding of the contextual meaning of words in written job interviews. Called InterviewBERT, this algorithm combines Google’s model for Natural Language Processing (NLP) with our proprietary dataset of more than 330 million words. BERT, meet Smart Interviewer. Together, they’ll usher in a new generation of pre-employment assessment tools and recruitment software solutions.
Put simply, InterviewBERT makes Smart Interviewer the most sophisticated conversational Ai in the world. Ours is no simple chatbot – already, Smart Interviewer can discover personality traits and communication skills, accurately and reliably, from a candidate’s written responses. With InterviewBERT, Smart Interviewer will learn more about candidates than ever before, faster than ever before. And with this speed and accuracy come reductions in the unfairness and bias that plague the hiring process.
Through sound Ai infrastructure, we have been able to accumulate a vast and accurate dataset. This dataset grows by the minute – we interview a new candidate every 30 seconds – and, coupled with the expertise of our Sapia labs team, it lets us assess a candidate’s suitability for a role in milliseconds.
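Sapia has not published InterviewBERT’s internals, but the general idea of scoring written answers against trait signals can be sketched in miniature. The Python below uses plain word-count vectors and cosine similarity as a crude stand-in for the contextual embeddings a BERT-style model would produce; the trait-marker phrases are hypothetical and purely illustrative, not Sapia’s actual markers.

```python
from collections import Counter
import math

def vectorize(text):
    """Lowercased word counts -- a crude stand-in for the contextual
    embeddings a BERT-style model would produce."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical trait-marker phrases -- illustrative only.
TRAIT_MARKERS = {
    "teamwork": "we team together collaborate support colleagues",
    "initiative": "i started launched proposed improved led",
}

def score_response(response):
    """Score one written answer against each trait-marker set."""
    v = vectorize(response)
    return {trait: cosine(v, vectorize(markers))
            for trait, markers in TRAIT_MARKERS.items()}

scores = score_response("We collaborate as a team and support our colleagues")
```

A contextual model like BERT improves on this kind of bag-of-words matching precisely because it reads words in context – “led” in “led a project” and “led paint” would get different representations.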
“The smartest companies know that the fairest and most accurate way to assess someone’s suitability for a role is through a structured interview,” our CEO, Barb Hyman, said. “Text increases accuracy and speed of assessing candidates, while removing biases that come through voice or video interviews.”
Dr Buddhi Jayatilleke, Chief Data Scientist and head of Sapia labs, said the team was excited to find that InterviewBERT had such a profound impact on trait accuracy.
“Written language encodes personality signals predictive of ‘fit’,” Dr Jayatilleke said. “The ability to understand people through language has limitless applications, and we are excited to keep inventing more ways to use language data for our customers.”
Dr Jayatilleke said decades of research have established language as a source of truth for personality.
“What our R&D team has proven is just how powerful language data is when you combine it with enormous data volumes and scientific rigour,” he said. “This capability can be used for assessment and for offering personalised career coaching – a game changer for job seekers, universities, and employers.”
Sapia labs will present its findings from a new research paper, Identifying and Mitigating Gender Bias in Structured Interview Responses, at a SIOP symposium in April.
At Sapia we are attuned to research and stories around bias – for most of us, it’s the reason we work here.
Our team has observed how quickly the blame for Coronavirus has been directed at an entire ethnicity.
In this case, I’ve heard some say, “it’s not racism, people are genuinely scared of the spread of the virus. It’s a deadly virus. As it originated in China people naturally worry about anyone from China”.
Unfortunately, this is the very definition of bias.
It is flawed logic that seems sensible on the surface but is nothing more than stereotyping underneath. Simply put, not everyone who looks Chinese is a recent traveller from China.
Australia is home to 1.2 million Australians of Chinese origin, according to the 2016 Census. Should we worry about all of them? Bias has no place in fighting any problem, even a deadly virus. It only creates stress and disharmony.
At the beginning of this week, one of our team who had come down with a cold shared he would work from home, to keep the team safe from his contagion.
We laughed at the time about him being a carrier of Coronavirus. By the end of the week, members of our team who had booked holidays to visit family and travel in China during the Easter break had cancelled their trips.
They did this before Qantas stopped its direct flights and before the Australian government announced that travellers from China would not be allowed back into Australia.
The team member who had a cold this week is Sri Lankan by birth. I guess that means we would have all been safe if he turned up to work as he is the ‘right’ ethnicity.
As a white immigrant myself, I don’t experience those prejudices. I have had career and life opportunities beyond my dreams, unfettered by racial bias.
Building a technology that gives everyone equivalent access to such career opportunities is why we work for our company. Some of our team have been screened out of job openings. Maybe they had the wrong name, went to the wrong school, or just didn’t look like a cultural fit?
Not all AI is equal. HireVue, an AI-driven recruitment company, recently had a complaint filed against it with the US Federal Trade Commission by a prominent rights group, which claims unfair and deceptive trade practices in HireVue’s use of face-scanning technology to assess job candidates’ “employability.”
Video is an obvious problem as a data source, for reasons of race and gender and their associated biases, but you might be surprised to know that CVs can be just as flawed – and they are in much broader use as a first pass for algorithms.
At Sapia, we rely on a simple, open, transparent interview via a text conversation to evaluate someone for a role. No visuals, no CV data. No voice data, as that too carries the risk of bias. Nor do we take data from Facebook. We use nothing that the candidate does not know about.
Bottom line: testing for bias and removing it from algorithms is possible. For humans, it’s not.
No amount of bias training will make you less biased. Maybe that’s one reason why using machines to augment and challenge decisions is fast becoming mainstream.
It certainly helps to reduce the impact of unconscious bias in hiring decisions.
To find out how to improve candidate experience using recruitment automation, we also have a great eBook on the topic.
New insights from Aptitude Research suggest recruitment automation can play a much greater role in talent acquisition than just improving efficiency for hiring managers: it can also make the interview process more human for candidates.
The research shows that when you shift the focus from an employer-driven view to a candidate-first view, then it is possible to reduce bias in hiring and improve the overall human element of talent acquisition.
For most companies, the value of automation is perceived through the recruiter and hiring manager experience, with the benefits to the candidate often ignored. However, recruitment automation has to be about more than simply moving candidates through the process quickly to have any significant benefit to a company.
When you focus on the impact and experience of the candidate, the benefits to both recruiters and candidates can significantly improve through recruitment automation. This approach has given rise to a movement called humanistic automation technology.
But humanistic automation sounds like an oxymoron, right? Is it even possible?
The Aptitude Research shows that not only is this possible, but that when Ai is used this way it creates personal connection at scale and works to reduce bias – something no other technology, or even human-centred solution, can deliver.
So, how exactly does it do this?
There have been some slight improvements in building connections through the hiring process recently, but only 50% of companies offer a single point of contact for communication, which leaves many candidates feeling neither engaged nor valued through the process.
Recruitment automation with a candidate focus means that communication is personalised for high engagement, with the ability for the conversation to adapt almost immediately to what it learns about a candidate.
As a candidate, finding out that you were not successful is tough – and worse, most companies simply ghost those they don’t wish to move ahead with. Automation can ensure that every candidate is engaged and cared for even when they are not moving forward in the process, and that doesn’t mean a standard rejection email. Ai can deliver highly personalised communication that builds connection, even for those unsuccessful in their application.
Although some companies have made efforts to remove bias from resumes, companies still have a lot of work to do on inclusion. For starters, many are relying on training programs, which have been shown to be largely ineffective in delivering long-term change.
It’s true that recruitment automation can amplify bias, but automation that works to reduce bias continually tests for biases in the system, and has been shown to be effective in reducing the impact of bias in hiring decisions. Something humans cannot do (we’re inherently biased, whether we like it or not).
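One common form such continual bias testing takes (not necessarily Sapia’s exact method) is an adverse-impact check like the “four-fifths rule” used in US employment-selection guidance: compare selection rates across demographic groups and flag any ratio below 0.8. A minimal sketch, with made-up data:

```python
def selection_rates(outcomes):
    """Selection rate (hired / applied) per group.
    outcomes: list of (group, hired_bool) tuples."""
    applied, hired = {}, {}
    for group, was_hired in outcomes:
        applied[group] = applied.get(group, 0) + 1
        hired[group] = hired.get(group, 0) + int(was_hired)
    return {g: hired[g] / applied[g] for g in applied}

def adverse_impact_ratio(outcomes):
    """Ratio of the lowest group selection rate to the highest.
    The 'four-fifths rule' flags ratios below 0.8."""
    rates = selection_rates(outcomes)
    return min(rates.values()) / max(rates.values())

# Illustrative data: group A hired 4 of 10, group B hired 2 of 10.
sample = [("A", True)] * 4 + [("A", False)] * 6 \
       + [("B", True)] * 2 + [("B", False)] * 8
ratio = adverse_impact_ratio(sample)  # 0.2 / 0.4 = 0.5 -> flagged
```

Running a check like this over every model release, and retraining when the ratio falls below threshold, is the kind of automated audit loop that human interviewers simply cannot apply to themselves.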
When you have the right data inputs – gathered through blind screening and blind interviews that don’t rely on CV data – you can give every candidate an equal and fair experience.
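A minimal illustration of what blind screening means in practice: strip identity-revealing fields before any scoring logic sees the data. The field names and candidate record here are hypothetical, not Sapia’s schema.

```python
# Fields that could reveal protected attributes -- illustrative list only.
IDENTIFYING_FIELDS = {"name", "photo", "age", "gender", "school", "address"}

def blind(candidate):
    """Return only the fields a blind, text-first screen should see."""
    return {k: v for k, v in candidate.items()
            if k not in IDENTIFYING_FIELDS}

candidate = {
    "name": "Jane Doe",                          # hypothetical data
    "school": "Example University",
    "interview_answers": ["I led a project that improved onboarding."],
}
screened = blind(candidate)  # only 'interview_answers' survives
```

The point of the design is that downstream models physically cannot condition on attributes they never receive.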
Inclusive hiring is not limited to gender and race. Companies need a broader view of diversity, equity, and inclusion that includes individuals with disabilities and neurodiversity. This requires the right digital tools and technology to ensure that candidates have a positive experience. In many cases, chat and text are more inclusive over video or even phone screening and interviews for these candidates.
Most companies see feedback as risky and something they cannot deliver in a fair and timely manner. Essentially, this is a lost opportunity for learning and development.
When you see feedback as part of your employer brand’s value proposition, its power to transform your TA strategy becomes clear. Recruitment automation allows companies to deliver personalized feedback, building trust and strengthening your employer brand.
Personalized feedback with tangible action items means that candidates feel empowered even if they are rejected. Technology can deliver these action items in a human way, at a scale and consistency that humans cannot match.
These insights are only made possible through natural language processing and machine learning working in the background to reveal important information about the candidate. When a candidate feels ‘seen’, that can be a transformational moment in their career path.
Only recruitment automation can deliver individual feedback to everyone who takes the time to do a job interview.
In an era of growing awareness around data privacy, only 1 in 4 candidates trust how their data will be used to drive hiring decisions. As companies look at recruitment automation through a candidate-centric lens, they must consider both the quality of the data they use and how to build trust between employers and candidates.
The biggest mistake that most companies make is using the wrong data. Resume data is not necessarily an indicator of performance or quality of hire.
Ethical Ai is something that hiring managers need to understand and use to evaluate providers. Providers using ethical Ai operate transparently, back their claims with explanations, describe their methodology, and publish their data frequently.
Aptitude Research found that data transparency increases trust among talent acquisition leaders, hiring managers, and senior leaders: with transparent data, 84% of talent acquisition leaders and 78% of senior leaders said they trust the data.
55% of companies are increasing their investment in recruitment automation this year. These companies recognise that automation can improve efficiency, lift the administrative burden, reduce costs, and enable data-driven decisions.
This report focuses on a new look at automation through the eyes of the candidate.
After all, automation is more than moving candidates through a process quickly. It should also enable companies to communicate in a meaningful and inclusive way and build trust between candidates and employers.