Identifying and Mitigating Ethnicity Bias in Structured Interview Responses
Study Details
- Data Sample: Over 633,000 candidates from large retail companies in the UK and Australia.
- Analysis Process: Machine learning models were used to predict ethnicity from two representations of candidates’ answers to open-ended interview questions: the raw text itself, and structured features derived from that text.
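The analysis above is a form of proxy-information audit: if a model can recover a protected attribute from a representation, that representation leaks signal a biased process could exploit. The sketch below is illustrative only, not Sapia.ai's actual pipeline; the data is synthetic, and the names `predict_from_text` and `predict_from_score` are assumptions introduced for the example.

```python
# Illustrative proxy-information audit (synthetic data; not the paper's pipeline).
# Culture-specific vocabulary ("footy" vs "soccer") stands in for the
# ethnicity-linked language cues found in raw text, while the derived
# competency score is deliberately similar across groups.

# Each candidate: (raw interview text, derived competency score, group label)
CANDIDATES = [
    ("i lead the footy club committee", 0.71, "A"),
    ("ran the footy fundraiser this year", 0.68, "A"),
    ("captain of my local footy side", 0.74, "A"),
    ("i coached a youth soccer squad", 0.70, "B"),
    ("organised soccer training sessions", 0.69, "B"),
    ("led my soccer team to the finals", 0.73, "B"),
]

def predict_from_text(text: str) -> str:
    """Guess group membership from a single culture-specific word."""
    return "A" if "footy" in text else "B"

def predict_from_score(score: float) -> str:
    """Guess group membership by thresholding the derived score."""
    return "A" if score >= 0.705 else "B"

def accuracy(preds, labels):
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

labels = [group for _, _, group in CANDIDATES]
text_acc = accuracy([predict_from_text(t) for t, _, _ in CANDIDATES], labels)
score_acc = accuracy([predict_from_score(s) for _, s, _ in CANDIDATES], labels)

print(f"group recoverable from raw text:      {text_acc:.2f}")   # 1.00
print(f"group recoverable from derived score: {score_acc:.2f}")  # 0.67
```

In this toy setup the raw text gives away group membership perfectly, while the derived score predicts it only slightly better than chance, mirroring the paper's finding that derived features carry less ethnicity information than raw responses.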
Key Findings
- Ethnicity can be detected in raw text through ethnicity-specific language cues.
- Derived features reduce ethnicity bias as they carry less ethnicity information.
- Fairer hiring practices can be achieved by assessing a candidate’s features, like competencies, instead of their raw text responses.
Key Takeaway
By scoring specific features derived from interview responses rather than the raw text itself, companies can significantly reduce ethnicity bias, promoting fairer and more objective hiring practices. Sapia.ai’s approach demonstrates that AI can not only improve recruitment outcomes without introducing bias, but actually help to eliminate it. For more information, read the full paper here.