Written by Nathan Hewitt

What can HR learn from The Social Dilemma?

Objective data and truth.  

Netflix’s latest documentary “The Social Dilemma” tells a story of data gone mad, of it being used to personalise ‘the truth’ so that everyone’s truth is their own. The idea of ‘objective truth’ doesn’t exist anymore.

The combination of hyper-connectivity at scale that comes from social media, the addictive habits of engaging with it, and the incredible ability to personalise what we see, listen to, and believe, creates a feeling of satisfaction at best (think Spotify) and at its worst, a fractured society.

So what’s the relevance of that for HR?

HR has been on this journey to do the opposite: to introduce an objective standard of truth, especially given the risks that come from personalised decision-making in hiring. Making hiring decisions based on personal views means hiring gets influenced by unconscious biases – something that can be easier to identify than to fix. The result is ‘mirror hiring’, and companies that hire for “culture fit”.

Consider the decline of so many legacy Fortune 500 companies over the last 50 years.

Do you think Kodak and its ilk would have crashed as quickly if they had a genuinely diverse set of opinions and experiences at their leadership level?

It’s no coincidence that in The Social Dilemma, the protagonists (if that’s the right word) sharing their regrets and insights on “how the heck did we get here?” were mostly young white men.

From my own experience of being HRD at a leading digital tech company, engineers were hired based on two data inputs:  their coding ability, and their ‘fit’ with the team.

The former is readily tested using objective tools, but the latter is largely tested through having coffee chats with the team.

Or to put it another way – 100% subjective; 0% objective data.

Is it any wonder, then, that you end up with more of the same when you use the personal opinions of humans to drive these decisions? People are scared of data amplifying bias, yet humans are remarkably good at amplifying it themselves.

Bias in the recruiting process has been an issue for as long as modern-day hiring practices have existed. To address some of these concerns, the idea of “blind applications” became popular a few years ago, with companies simply removing names from applications and thinking that would remove any gender or racial profiling. It made a difference, but bias still crept in through the schools that people attended, as well as the past experience they had. Interestingly, these are two things that have since been shown to be poor predictors of a person’s ability to do a job.

Away from computer screens and smartphone addictions, when it comes to hiring, HR needs to do the very thing that social media has rendered moot.

It has to ensure that there is objective truth on every candidate. It has to do this for every new hire, every promotion.

Ironically, it is the very thing social media weaponised – data – that can truly help us achieve this. I talk often about “objective data”, that is, data that has been collected without input bias, and it is only this kind of data that helps us disrupt the bias that comes from putting humans in the decision-making seat. Objective data builds a truly holistic picture of an individual to inform hiring decisions – decisions that will shape a company’s culture, and its future.

(Read: Why Machines Make Better Decisions Than Humans Do)

The data seeks to understand who you are: not the school you went to or the degree you hold, but how you think and behave, and most of all your intrinsic traits. It was Facebook’s homogenous culture, one that encouraged technical brilliance over ethical thinking, that ultimately created the issues discussed in The Social Dilemma. If only they had used their skills to invest in objective data that set aside their technical bias and hired for humanity, we might not be questioning them in the way we are.

Here’s the next article in this series:

To keep up to date on all things “Hiring with Ai” subscribe to our blog!

You can try out Sapia’s Chat Interview right now, or leave us your details to get a personalised demo


All disruption has to fight against fear

Depending on which media you read, technology, and specifically Artificial Intelligence, will create or destroy thousands of jobs. It is already radically changing many, as well as how we apply and hire for them.

Fear of New Invention

Back in the day when cars were first released, there was such fear about the danger they presented to society that drivers arriving at a junction were required to stop the car, get out, and fire a warning shot so that people in the surrounding area would be safe from unexpected danger.

I was reminded of this when reading the commentary around Amazon and its use of AI to screen talent.

Amazon’s AI Experiment Went Awry

In case you missed it, Amazon ran an experiment. It analysed 10 years of CV data to build a predictive model to help filter through what I am sure is hundreds of thousands of applications to work at the company. Because the sample group was mostly male, the CVs naturally skewed towards male ‘traits’, if there is such a thing. The model built on this training data ended up mirroring that sample group, which meant it preferred male CVs to female ones.

CVs Trigger Bias

It is pretty obvious to all of us that if you create a product off one homogenous group, you will end up flavouring it with the characteristics of that group. YouTube found this when the team building its iOS app didn’t consider left-handed users when adding mobile uploads: videos recorded from a left-handed person’s view of the landscape came out upside down, presumably because the team building the feature was comprised entirely of right-handed people.

Suggested reading: A CV Tells You Nothing

Humans are heavily prone to unconscious bias 

In fact, we rely on biases to survive.

While these biases help us not go insane, they have unfortunately led us to the point where they are having a very significant effect in the workforce. There are many serious forms of bias, but the best known is gender bias. A recent study showed that simply by changing the name on an application from a woman’s to a man’s, with every other detail kept the same, the ‘male’ applicant was more likely to progress to an interview. The exact same CV.

When humans do screening, they are prone to making snap judgements based on superficialities, ignoring the many factors that can actually help predict whether a candidate will perform. This is where data platforms have an advantage: by doing ‘blind screening’, they make the process both faster and fairer. However, this only works when the data that goes into the model accounts for human frailties.

When it comes to using data to build predictive models to inform and guide decision-making, it is important to really dig deep on the input data.

And if you think unconscious bias training is the answer … read this first. 

Think carefully about input data

The key insight from Amazon’s experiment is that relying on CVs to assess talent is inherently flawed. This is accentuated even more when you accept that what differentiates talent now, and will become even more acute in the future, is not hard skills, not which uni someone went to or which degree they hold, but soft skills. Jeff Weiner, who has the benefit of this kind of rich data from 600m users, attested to that this week.

At Sapia, working with dozens of companies across the world to blind screen thousands of candidates, we know that it’s the behaviours and values of a potential coworker that influence their performance and tenure. Values such as commitment, and attitudes, are invisible in a CV. They’re not easy to see in an interview either. But they’re easily tested using well-crafted data platforms.

So let’s try to look beyond the news grab, the headline which naturally attracts attention when it has Amazon in the first line.

  • Algorithms will be biased if the data they are built with is biased.
  • Algorithms can be tested for bias. Humans can’t be.
  • Algorithms can be trained to remove bias. Humans, truthfully, can’t be.
  • Algorithms are blind to your gender, skin colour and age. Humans are very sensitive to these, especially when it comes to hiring.
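The claim that algorithms can be tested for bias is concrete and checkable. A minimal sketch, using entirely hypothetical screening outcomes and group labels, of the kind of selection-rate audit (here, the well-known “four-fifths rule” heuristic for adverse impact) that can be run against any screening model’s historical decisions:

```python
from collections import Counter

# Hypothetical screening outcomes: (candidate_group, passed_screen).
# In a real audit these would be the model's historical decisions.
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]

def selection_rates(records):
    """Pass rate per group -- the quantity compared in a parity test."""
    passed, total = Counter(), Counter()
    for group, ok in records:
        total[group] += 1
        passed[group] += ok  # True counts as 1, False as 0
    return {g: passed[g] / total[g] for g in total}

def four_fifths_check(rates):
    """Flag any group whose selection rate is below 80% of the highest rate."""
    best = max(rates.values())
    return {g: r / best >= 0.8 for g, r in rates.items()}

rates = selection_rates(outcomes)
print(rates)
print(four_fifths_check(rates))
```

No equivalent audit can be run over the snap judgements inside a human screener’s head, which is the point of the bullet above.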

We have a once-in-a-millennium opportunity to extend and enable better, fairer thinking through careful and conscious AI-assisted decisions.

The algorithms we build aren’t sentient beings or unmanageable acts of nature; they are built by humans. When we recognise that and are conscious of those risks, we can start to counteract these biases through technology, helping humans see what’s in front of us more clearly, without the filters of bias.



The evidence is in: AI can determine your personality through text chat

MELBOURNE, July 2020: Australian AI recruitment start-up Sapia has published peer-reviewed research validating a new AI-based approach to talent assessment that determines personality and job suitability through text.

The research was published by IEEE.

Personality assessments have long been used to supplement CV data, and it is widely accepted that one’s personality can be a predictor of job performance and suitability. Sapia therefore uses structured text-based interviews, natural language processing (NLP), and machine learning to identify personality traits by analysing free-text answers to questions related to the job being applied for.

Every candidate gets a “chat-based smart interview”. Because no demographic data is gathered from other sources such as CVs, the process is blind to gender, race and other characteristics that are not relevant to candidate selection. The research validates the accuracy of Sapia’s AI approach. It also signals a huge improvement over traditional personality tests, where the candidate experience is underwhelming.
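The pipeline described above (free-text answer → linguistic features → trait score) can be illustrated in miniature. This is not Sapia’s model: the word lists, feature set and weights below are all invented for illustration; a real system would use validated lexicons and weights learned from labelled data.

```python
import re

# Hypothetical mini-lexicons -- real systems use validated dictionaries.
POSITIVE_WORDS = {"enjoy", "love", "great", "happy", "team", "help"}
FIRST_PERSON = {"i", "me", "my", "we", "our"}

def extract_features(answer: str) -> dict:
    """Turn a free-text interview answer into simple linguistic features."""
    words = re.findall(r"[a-z']+", answer.lower())
    n = max(len(words), 1)
    return {
        "word_count": len(words),
        "first_person_rate": sum(w in FIRST_PERSON for w in words) / n,
        "positive_rate": sum(w in POSITIVE_WORDS for w in words) / n,
    }

# Illustrative weights only -- a trained model would learn these.
AGREEABLENESS_WEIGHTS = {"first_person_rate": 0.4, "positive_rate": 1.2}

def score_trait(features: dict, weights: dict, bias: float = 0.2) -> float:
    """Linear scoring step of a trait model; output clamped to [0, 1]."""
    raw = bias + sum(weights[k] * features[k] for k in weights)
    return min(max(raw, 0.0), 1.0)

answer = "I love working in a team and enjoy helping my colleagues."
feats = extract_features(answer)
print(round(score_trait(feats, AGREEABLENESS_WEIGHTS), 3))
```

Note that nothing in the input is demographic: the features are computed only from the words of the answer, which is what makes the screening step blind by construction.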

Personality AI refers to the use of artificial intelligence technologies to analyse and understand human personality traits, tendencies, and behaviour patterns. The field has gained significant attention in recent years as businesses and organisations seek to better understand their customers, employees, and other stakeholders.

From the leadership at Sapia

Barbara Hyman, CEO of Sapia (formerly PredictiveHire), says chat-based interviews address the three big failures of current assessments: ghosting, bias and trust.

“Recruiters are the ultimate ghosters,” Ms Hyman says. “With Sapia, the fact that every single candidate receives a personalised learning profile is gold for candidates and your employer brand. Using text to analyse fit that’s blind to gender, race, age and any personal factors is a must-have in today’s climate, and means every company can introduce bias interruption for every hire and promotion. Imagine what that will do to diversity in hiring.”

Principal Data Scientist Buddhi Jayatilleke says: “Language has long been seen as a source of truth for personality: it defines who we are. This technology offers a direct way to understand personality from language, through an experience that is human and empowering. The same capability can be used for assessment and for personalised career coaching, and it could be a game changer for job seekers, universities, and employers.”

Candidates across 34 countries have experienced Sapia’s unique chat-based interviews. More insight into how the technology works can be found here.

About Sapia

Sapia (formerly PredictiveHire) is a team of data scientists, engineers and HR professionals. Together we have built a product suite that is based on science and built to humanise hiring. Sapia believes that relying on data to drive your most important decisions, such as who you hire and promote, enhances trust and confidence that those decisions are fair. We serve customers in the UK, South Africa, India, Australia, and New Zealand.




Research shows that automation can make recruitment more human

To find out how to use recruitment automation to ‘hire with heart’, see our eBook on recruitment automation with humanity.

New insights from Aptitude Research suggest AI can play a much greater role in talent acquisition than just improving efficiency for hiring managers: it can also make the interview process more human for candidates, something Sapia has long advocated.

Aptitude Research has published a new paper showing that when you shift the focus of automated talent acquisition from an employer-driven view to a candidate-first one, it is possible to reduce bias in hiring and improve the overall human element of recruitment.

The research, sponsored by Sapia, an Australian technology company that has pioneered transparent AI-assisted hiring solutions, shows that humanistic automation creates a personal connection at scale and works to reduce bias, something no other technology, or even human-centred solution, can deliver.

Madeline Laurano, CEO of Aptitude, comments: “The misperception that candidates do not want automation and prefer to keep the current talent acquisition process is one of the most significant misperceptions in talent acquisition. Candidates want a fair recruitment process and consistency in communication. Automation can support all of these initiatives and enhance the humanity of the experience.”

There are four main ways that automation makes talent acquisition more human when the candidate is the focus, rather than simply moving candidates through the process:

  1. Automation can understand what candidates want: AI considers the unique expectations and experiences of candidates and adapts to them as it learns. Collecting feedback about the recruitment experience and continually improving the candidate journey can help candidates feel connected and heard.
  2. The right data can interrupt bias early in the process: By creating a consistent and fair experience for candidates early in the process, and not relying on CV data as the determinant of job suitability, companies are more successful at reducing bias and increasing inclusivity.
  3. Trust can be built through transparent data: Both employers and candidates need to trust the data and methodologies behind the technology they are using, something that can be achieved through transparency. Companies looking at automation should consider providers that will partner with them and provide that transparency.
  4. Provide feedback to every candidate: Through natural language processing, every candidate receives personalised feedback and messaging. Leaving unsuccessful candidates feedback they can use in future job searches is empowering, and a big leap from ‘ghosting’ or a standard rejection email.

The research can be downloaded here.

About Aptitude Research
Aptitude Research Partners is a research-based analyst and advisory firm focused on HCM technology. We conduct quantitative and qualitative research on all aspects of Human Capital Management to better understand the skills, capabilities, technology, and underlying strategies required to deliver business results in today’s complex work environment. 

About Sapia

Sapia has become one of the most trusted mobile-first AI recruitment platforms, used by companies across Australia, India, South Africa, the UK and the US, with a candidate engaging with its unique AI chatbot, Smart Interviewer, every two minutes.

What makes their approach unique is its disruption of three paradigms in recruitment: candidates being ghosted, biased hiring, and the false notion that automation diminishes the human experience.

The end result for companies: bias is interrupted at the top of the funnel, hiring managers make more objective decisions empowered by Smart Interviewer as their co-pilot, inclusivity is enhanced, and your hired profile starts to look more like your applicant profile.

Media contacts
Barb Hyman, CEO Sapia

Madeline Laurano, Researcher, Aptitude Research
