Written by: Team PredictiveHire
When unconscious bias becomes a conscious bias
A perfect example of unconscious bias manifesting in a conscious and damaging way.
At Sapia we are attuned to research and stories around bias – for most of us, it’s the reason we work here.
Our team has observed how quickly blame for the Coronavirus has been directed at an entire ethnicity.
In this case, I’ve heard some say, “it’s not racism, people are genuinely scared of the spread of the virus. It’s a deadly virus. As it originated in China people naturally worry about anyone from China”.
Unfortunately, this is the very definition of bias.
It is flawed logic that seems sensible on the surface but is pure stereotyping underneath. Simply put, not everyone who looks Chinese has recently travelled from China.
According to the 2016 Census, Australia is home to 1.2 million people of Chinese origin. Should we worry about all of them? Bias has no place in fighting any problem, even a deadly virus. It only creates stress and disharmony.
The irony is this: the virus itself is entirely even-handed. It has no racial preferences.
At the beginning of this week, a team member who had come down with a cold shared that he would work from home to avoid infecting the rest of us.
We laughed at the time about him being a carrier of Coronavirus. By the end of the week, members of our team with holidays booked to visit family and travel in China during the Easter break had cancelled their trip.
They did this before Qantas suspended its direct flights and before the Australian government announced that travellers from mainland China would not be allowed into Australia.
The team member with the cold is Sri Lankan by birth. I guess we would all have been safe if he had turned up to work, since he is the 'right' ethnicity.
We are on a mission: To solve issues of bias in hiring
As a white immigrant myself, I don’t experience those prejudices. I have had career and life opportunities beyond my dreams, unfettered by racial bias.
Building technology that gives everyone equivalent access to such career opportunities is why we work for our company. Some of our team have been screened out of job openings. Maybe they had the wrong name, went to the wrong school, or just didn't look like a cultural fit.
Unfortunately, AI hiring tools can be biased
Not all AI is equal. HireVue, an AI-driven recruitment company, has recently been the subject of a complaint to the US Federal Trade Commission, with a prominent rights group alleging unfair and deceptive trade practices in HireVue's use of face-scanning technology to assess job candidates' "employability."
Video is an obviously problematic data source, given the racial and gender biases it can carry, but you might be surprised to learn that CVs can be just as flawed, and they are in much broader use as a first pass for algorithms.
How does AI solve the issues of discrimination and bias in recruitment?
At Sapia, we rely on a simple, open, transparent interview via a text conversation to evaluate someone for a role. No visuals, no CV data, and no voice data, as that too carries a risk of bias. Nor do we take data from Facebook. We use nothing the candidate does not knowingly provide.
Bottom line: testing for bias and removing it from algorithms is possible. For humans, it is not.
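To make "testing for bias" concrete, here is a minimal sketch of one widely used check: the "four-fifths rule" (adverse impact ratio), which compares selection rates across candidate groups. This is an illustrative example of the general technique, not a description of Sapia's actual method or data; all names and numbers are made up.

```python
def selection_rates(outcomes):
    """Compute the selection rate per group from (group, selected) records."""
    totals, selected = {}, {}
    for group, was_selected in outcomes:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(was_selected)
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratio(rates):
    """Ratio of the lowest group selection rate to the highest.

    A value below 0.8 is the conventional red flag for adverse impact
    (the "four-fifths rule")."""
    return min(rates.values()) / max(rates.values())

# Illustrative screening outcomes: (group label, was the candidate shortlisted?)
outcomes = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", True),
]
rates = selection_rates(outcomes)        # {"A": 0.75, "B": 0.5}
ratio = adverse_impact_ratio(rates)      # 0.5 / 0.75 ≈ 0.67 -> below 0.8, flag it
```

Because an algorithm's decisions can be logged and replayed like this at scale, a check of this kind can be run on every model version before it is used, which is what makes auditing an algorithm tractable in a way that auditing an individual interviewer's judgment is not.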
If you want to learn more about how we test for bias and why bias testing is critical to an AI screening tool get in touch here.
No amount of bias training will make you less biased. Maybe that’s one reason why using machines to augment and challenge decisions is fast becoming mainstream.
It certainly helps to reduce the impact of unconscious bias in hiring decisions.