
Written by Nathan Hewitt

8 critical questions to ask when selecting your ‘Ai for Hiring’ technology

 

Interrupting bias in people decisions

We hope that the debate over the value of diverse teams is now over. There is plenty of evidence that diverse teams lead to better decisions and, therefore, better business outcomes for any organisation.

This means that CHROs today are being charged with interrupting the bias in their people decisions and are expected to manage bias as closely as the CFO manages the financials.

But the use of Ai tools in hiring and promotion requires careful consideration to ensure the technology does not inadvertently introduce bias or amplify any existing biases.

To assist HR decision-makers in navigating these decisions confidently, we invite you to consider these 8 critical questions when selecting your Ai technology.

You will find not only the key questions to ask when testing the tools, but also why these questions are critical and how to differentiate between the answers you are given.

Question 1  

What training data do you use?

Another way to ask this is: what data do you use to assess someone’s fit for a role?

First up: why is this an important question to ask…

Machine-learning algorithms use statistics to find and apply patterns in data. Data can be anything that can be measured or recorded, e.g. numbers, words, images, clicks etc. If it can be digitally stored, it can be fed into a machine-learning algorithm.

The process is quite basic: find the pattern, apply the pattern.

This is why the data you use to build a predictive model, called training data, is so critical to understand.
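
If it helps to picture this, here is a minimal, purely illustrative sketch of “find the pattern, apply the pattern” (the features and outcomes are invented; this is not any vendor’s actual model):

```python
# Minimal sketch of "find the pattern, apply the pattern".
# All features and outcomes below are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Training data: two measured features per past candidate, plus an outcome label.
X_train = [[3.2, 0.8], [1.1, 0.2], [2.9, 0.9], [0.7, 0.1]]
y_train = [1, 0, 1, 0]  # e.g. 1 = performed well in the role, 0 = did not

model = LogisticRegression()
model.fit(X_train, y_train)          # find the pattern in the training data
print(model.predict([[2.5, 0.7]]))   # apply the pattern to a new candidate
```

Whatever the model learns, it learns from that training data, which is exactly why the next question matters.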

In HR, the kinds of data that could be used to build predictive models for hiring and promotion are:

  • CV data and cover letters
  • Games built to measure someone’s memory capacity and processing speed
  • Behavioural data, e.g. how you engage in an assessment
  • Video Ai data capturing how you act in an interview: your gestures, pose and lean, as well as your tone and cadence
  • Your text or voice responses to structured interview questions
  • Public data sources such as your social media profile, your tweets, and other social media activity

If you consider the range of data that can be used as training data, not all data sources are equal. On its surface, you can certainly see how some carry the risk of amplifying existing biases and the risk of alienating your candidates.

Consider the training data through these lenses:

> Is the data visible or opaque to the candidate?

Using data that is invisible to the candidate may impact your employer brand. And relying on behavioural data such as how quickly a candidate completes the assessment, social data, or any data that is invisible to the candidate might expose you not only to brand risk but also to legal risk. Will your candidates trust an assessment that uses data that is invisible to them, scraped about them, or which can’t be readily explained?

Increasingly, companies are measuring the business cost of poor hiring processes that contribute to customer churn. 65% of candidates with a positive experience would be a customer again even if they were not hired, and 81% will share their positive experience with family, friends and peers (Source: Talent Board).

Visibility of the data used to generate recommendations is also linked to explainability, which is an attribute now commonly demanded by both governments and organisations in the responsible use of Ai.

Video Ai tools have been legally challenged on the basis that they fail to comply with baseline standards for AI decision-making, such as the OECD AI Principles and the Universal Guidelines for AI.

Or that they perpetuate societal biases and could end up penalising non-native speakers, visibly nervous interviewees or anyone else who doesn’t fit the model for look and speech.

If you are keen to attract and retain applicants through your recruitment pipeline, you may also care about how explainable and trustworthy your assessment is. When the candidate can see the data that is used about them and knows that only the data they consent to give is being used, they may be more likely to apply and complete the process. Think about how your own trust in a recruitment process could be affected by different assessment types.

> Is the data 1st party data or 3rd party data?

1st party data is data such as the interview responses written by a candidate to answer an interview question. It is given openly, consensually and knowingly. There is full awareness about what this data is going to be used for and it’s typically data that is gathered for that reason only.

3rd party data is data that is drawn from or acquired through public sources about a candidate, such as their Twitter profile or wider social media presence. It is data that is not created for the specific use case of interviewing for a job, but which is scraped, extracted and applied for a different purpose. It is self-evident that an Ai tool that combines visible data and 1st party data is likely to be both more accurate when applied to recruitment and more likely to produce outcomes trusted by the candidate and the recruiter.


Trust matters to your candidates and to your culture …

At PredictiveHire, we are committed to building ethical and engaging assessments. This is why we have taken the path of a text chat with no time pressure. We allow candidates to take their own time, reflect and submit answers in text format.

We strictly do not use any information other than the candidate responses to the interview questions (i.e. fairness through unawareness: the algorithm knows nothing about sensitive attributes).

For example, there is no explicit use of race, age, name or location; no use of candidate behavioural data such as how long they take to complete, how fast they type or how many corrections they make; and no use of information scraped from the internet. While these signals may carry information, we do not use any such data.
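
To make “fairness through unawareness” concrete, here is a minimal illustrative sketch; the field names are hypothetical, not our actual schema. Only the free-text interview answers are passed on to the model, and everything else is ignored.

```python
# Sketch of "fairness through unawareness": only the candidate's interview
# answers ever reach the model. Field names here are hypothetical.
candidate_record = {
    "name": "redacted",              # never used
    "age": 34,                       # never used
    "location": "redacted",          # never used
    "typing_speed_wpm": 62,          # behavioural metadata, never used
    "seconds_to_complete": 1180,     # behavioural metadata, never used
    "answers": [
        "A time I showed resilience was when ...",
        "I would handle an unhappy customer by ...",
    ],
}

def extract_model_input(record: dict) -> str:
    # Only the free-text interview responses are passed to the model.
    return " ".join(record["answers"])

model_input = extract_model_input(candidate_record)
```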


2. Can you explain why ‘person y’ was recommended by the Ai and not ‘person z’?

Another way to ask this is: can you explain how your algorithm works? And does your solution use deep learning models?

This is an interesting question especially given that we humans typically obfuscate our reasons for rejecting a candidate behind the catch-all explanation of “Susie was not a cultural fit”.

For some reason, we humans have a higher-order need and expectation to unpack how an algorithm arrived at a recommendation. Perhaps it is because there is not much you can say to a phone call telling you that you were rejected for cultural fit.

This is probably the most important aspect to consider, especially if you are the change leader in this area. It is fair to expect that if an algorithm affects someone’s life, you should be able to see how that algorithm works.

Transparency and explainability are fundamental ingredients of trust, and there is plenty of research to show that high trust relationships create the most productive relationships and cultures.

This is also one substantial benefit of using Ai at the top of the funnel to screen candidates. Depending on what kind of Ai you use, it enables you to explain why a candidate was screened in or out.

This means recruitment decisions become consistent and fairer with Ai screening tools.

But if Ai solutions are not clear about why some inputs (called “features” in machine learning jargon) are used and how they contribute to the outcome, explainability becomes impossible.

For example, when deep learning models are used, you are sacrificing explainability for accuracy, because no one can explain how a particular data feature contributed to the recommendation. This can further erode candidate trust and impact your brand.
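
By way of contrast, here is a minimal, hypothetical sketch of how an interpretable (linear) model lets you read off which features pushed a recommendation up or down; the feature names and data are invented for illustration only.

```python
# Sketch: with a linear model, each feature's contribution to a recommendation
# can be read directly from coefficient * feature value. The feature names and
# data are invented for illustration; a deep network offers no such breakdown.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["teamwork_language", "customer_focus", "answer_completeness"]
X = np.array([[0.8, 0.6, 0.9],
              [0.2, 0.1, 0.3],
              [0.7, 0.9, 0.8],
              [0.1, 0.3, 0.2]])
y = np.array([1, 0, 1, 0])           # hypothetical past outcomes

model = LogisticRegression().fit(X, y)

candidate = np.array([0.9, 0.4, 0.7])
contributions = model.coef_[0] * candidate
for name, value in sorted(zip(feature_names, contributions),
                          key=lambda pair: -abs(pair[1])):
    print(f"{name}: {value:+.2f}")   # which features drove the recommendation
```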

The most important thing is that you know what data is being used and then, ultimately, it’s your choice as to whether you feel comfortable explaining the algorithm’s recommendations to both your people and the candidate.

3. What assumptions and scientific methods are behind the product? Are they validated?

Assessments should be underpinned by validated scientific methods and, like all science, the proof is in the research that underpins that methodology.

This raises another question for anyone looking to rely on Ai tools for human decision-making: where is the published and peer-reviewed research that gives you confidence that a) it works and b) it’s fair?

This is an important question given the novelty of AI methods and the pace at which they advance.

At PredictiveHire, we have published our research to ensure that anyone can investigate for themselves the science that underpins our AI solution.



4. What are the bias tests that you use and how often do you test for bias?

It’s probably self-evident why this is an important question to ask. You can’t have much confidence in the algorithm being fair to your candidates if no one is testing for that regularly.

Many assessment providers report on studies they have conducted to test for bias. While this is useful, it does not guarantee that the assessment will not demonstrate biases in new candidate cohorts it is applied to.

The notion of “data drift” discussed in machine learning highlights how changing patterns in data can cause models to behave differently than expected, especially when the new data is significantly different from the training data.

Therefore, ongoing monitoring of models is critical to identifying and mitigating the risk of bias.
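
One simple way to monitor for drift, shown purely as an illustration rather than any vendor’s actual pipeline, is to compare the distribution of an input or score between the original training cohort and a new candidate cohort, for example with a two-sample Kolmogorov-Smirnov test:

```python
# Illustrative drift check: compare a score's distribution in the original
# training data against a new candidate cohort with a two-sample KS test.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
training_scores = rng.normal(loc=0.62, scale=0.10, size=5000)    # historical cohort
new_cohort_scores = rng.normal(loc=0.55, scale=0.12, size=800)   # recent applicants

statistic, p_value = ks_2samp(training_scores, new_cohort_scores)
if p_value < 0.01:
    print(f"Possible data drift (KS={statistic:.3f}, p={p_value:.1e}): review the model.")
```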

Potential biases in data can be tested for and measured.

These include assumed biases, such as those between gender and race groups, which can be added to a suite of tests. These tests can be extended to include other groups of interest where those group attributes are available, such as English As a Second Language (EASL) users.

On bias testing, look out for at least these 3 tests, and ask to see the tech manual and an example bias testing report; a rough sketch of how each might be computed follows the list.

  • Proportional Parity Test. This is the standard EEOC measure for adverse impact in selection and recommendations.
  • Score Distribution Test. This measures whether the assessment score distributions are similar across groups of interest.
  • Fairness Test. This measures whether the assessment is making the same rate of errors across groups of interest.
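
As referenced above, here is a rough, illustrative sketch of how each of the three checks might be computed on a labelled evaluation set; the variable names, data and thresholds are assumptions for illustration, not any vendor’s exact method.

```python
# Rough sketch of the three checks on a labelled evaluation set.
# Data, names and thresholds are assumptions made for illustration only.
import numpy as np
from scipy.stats import ks_2samp

def proportional_parity(recommended, group):
    # Selection rate per group; the EEOC 4/5ths rule compares lowest to highest.
    rates = {g: recommended[group == g].mean() for g in np.unique(group)}
    return min(rates.values()) / max(rates.values()), rates

def score_distribution_gap(scores, group, g1, g2):
    # Are assessment score distributions similar across two groups of interest?
    return ks_2samp(scores[group == g1], scores[group == g2])

def error_rate_by_group(predicted, actual, group):
    # Fairness test: does the assessment make errors at the same rate per group?
    return {g: (predicted[group == g] != actual[group == g]).mean()
            for g in np.unique(group)}

group = np.array(["A", "A", "B", "B", "A", "B"])
scores = np.array([0.71, 0.64, 0.69, 0.58, 0.80, 0.66])
recommended = (scores >= 0.65).astype(int)
actual = np.array([1, 1, 1, 0, 1, 1])      # hypothetical on-the-job outcomes

ratio, rates = proportional_parity(recommended, group)
print("adverse impact ratio:", round(ratio, 2), rates)   # want >= 0.8
print(score_distribution_gap(scores, group, "A", "B"))
print(error_rate_by_group(recommended, actual, group))
```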



At PredictiveHire, we conduct all the above tests. We conduct statistical tests to check for significant differences between groups in feature values, model outcomes and recommendations. Tests such as t-tests, effect sizes, ANOVA, the 4/5ths rule, Chi-Squared etc. are used for this. We consider this standard practice.
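
For the significance tests named above, a hedged sketch using scipy might look like this (the numbers are invented and the group labels generic):

```python
# Sketch of the significance tests mentioned above, on invented data.
import numpy as np
from scipy.stats import ttest_ind, chi2_contingency

scores_group_a = np.array([0.71, 0.64, 0.80, 0.69, 0.75])
scores_group_b = np.array([0.69, 0.58, 0.66, 0.72, 0.61])

# t-test: do mean assessment scores differ significantly between the groups?
print(ttest_ind(scores_group_a, scores_group_b, equal_var=False))

# Chi-squared: are recommendation outcomes independent of group membership?
#                recommended, not recommended
contingency = [[120, 80],    # group A
               [105, 95]]    # group B
chi2, p, dof, expected = chi2_contingency(contingency)
print(chi2, p)
```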

We go beyond the above standard proportional and distribution tests on fairness and adhere to stricter fairness considerations, especially at the model training stage, on error rates. These include following guidelines set by IBM’s AI Fairness 360 Open Source Toolkit (reference: https://aif360.mybluemix.net/) and the Aequitas project at the Center for Data Science and Public Policy at the University of Chicago.

We continuously analyse the data used to train models for latent patterns that reveal insights for our customers and inform improvements to outcomes.

5. How can you remove bias from an algorithm?

We all know that, despite best intentions, we cannot be trained out of our biases, especially our unconscious biases.

This is another reason why using data-driven methods to screen candidates is fairer than using humans.

Biases can occur in many different forms. Algorithms and Ai learn according to the profile of the data we feed them. If the data they learn from is taken from a CV, they will only amplify our existing biases. Only clean data, like the answers to specific job-related questions, can give us a truly bias-free outcome.

If any biases are discovered, the vendor should be able to investigate and highlight the cause of the bias (e.g. a feature or the definition of fitness) and take corrective measures to mitigate it.

6. On which minority groups have you tested your products?

If you care about inclusivity, then you want every candidate to have an equal and fair opportunity at participating in the recruitment process.

This means taking account of minority groups such as those with autism, dyslexia and English as a second language (EASL), as well as the obvious need to ensure the approach is inclusive for different ethnic groups, ages and genders.

At PredictiveHire, we test the algorithms for bias on gender and race. Tests can be conducted for almost any group in which the customer is interested. For example, we run tests on “English As a Second Language” (EASL) vs. native speakers.

7. What kind of success have you had in terms of creating hiring equity?

If one motivation for introducing Ai tools to your recruitment process is to deliver more diverse hiring outcomes, it’s natural to expect the provider to have demonstrated this kind of impact for its customers.

If you don’t measure it, you probably won’t improve it. At PredictiveHire, we provide you with tools to measure equality. Multiple dimensions are measured through the pipeline, from those who applied, to those who were recommended, to those who were ultimately hired.

8. What is the composition of the team building this technology?

Thankfully, HR decision-makers are much more aware of how human bias can creep into technology design. Think of how the dominance of one trait among the human designers and builders has created inadvertently unfair outcomes.

In 2012, YouTube noticed something odd.

About 10% of the videos being uploaded were upside down.

When designers investigated the problem, they found something unexpected: left-handed people picked up their phones differently, rotating them 180 degrees, which led to upside-down videos being uploaded.

The issue here was a lack of diversity in the design process. The engineers and designers who created the YouTube app were all right-handed, and none had considered that some people might pick up their phones differently.

In our team at PredictiveHire, from the top down, we look for diversity in its broadest definition.

Gender, race, age, education, immigrant vs native-born, personality traits, work experience. It all adds up to ensure that we minimise our collective blind spots and create a candidate and user experience that works for the greatest number of people and minimises bias.

What other questions have you used to validate the fairness and integrity of the Ai tools you have selected to augment your hiring and promotion processes?

We’d love to know!

 


 

To keep up to date on all things “Hiring with Ai” subscribe to our blog!

You can try out PredictiveHire’s FirstInterview right now, or leave us your details to get a personalised demo



Sapia named top performing HR vendor

Top performing HR vendor

The HR Service Provider Awards 2020, hosted by HRD Mag, set out to find the best HR vendors in Australia.

Sapia Awarded a Silver Medal in the Category of Recruitment Systems & Technology

Taken from the HRD website:

The winners are selected from a pool of submissions from vendors providing an overview of their business or product, insight into their point of difference in the industry, statistics around market share and growth over the past 12 months, and other relevant information such as industry accolades, client testimonials, and the like.

Finding a dependable service provider can be quite a daunting task for HR professionals. From an impressive array of vendors offering their expertise, HR professionals need to choose the one that suits their company’s unique needs.

To assist HR professionals with this challenging task, HRD’s annual Service Provider Awards recognises the industry’s top performers. The Sapia submission was judged by a panel of HR leaders who determined the top performers in this category.


 Thank you HRD for recognising Sapia as a top-performing HR vendor in your fourth annual Service Provider Awards report! 


About Sapia

Sapia automates interviews so that every applicant is interviewed in depth and at scale for you, all through a text chat, so that you can get to the best people fast.

Much faster: Candidates are assessed, scored and ranked using Ai, dramatically reducing recruiter time and effort. 90% recruiter time savings, against standard recruiting processes.

Improves candidate experience: An accessible, mobile-first, familiar text experience that candidates enjoy, with no confronting video interviews or questionnaires. 99% candidate satisfaction and 90% completion rates.

Inclusive and fair: Blind screening at its best, using Ai with the same structured behavioural interview for every candidate. Gender/ethnicity/indigeneity mixes are preserved through recruitment stages due to Ai objectively assessing performance and personality, not background.

See Solutions Here > 


To keep up to date on all things “Hiring with Ai” subscribe to our blog!

You can try out Sapia’s Chat Interview right now, or leave us your details to get a personalised demo


Four ways recruitment automation is giving candidates a more human experience

To find out how to improve candidate experience using Recruitment Automation, we also have a great eBook on candidate experience.

New insights from Aptitude Research suggest recruitment automation can play a much greater role in talent acquisition than just improving efficiency for hiring managers: it can also make the interview process more human for candidates.

The research shows that when you shift the focus from an employer-driven view to a candidate-first view, then it is possible to reduce bias in hiring and improve the overall human element of talent acquisition.

For most companies, the value of automation is perceived through the recruiter and hiring manager experience, with the benefits to the candidate often ignored. However, recruitment automation has to be about more than simply moving candidates through the process quickly to have any significant benefit to a company.

When you focus on the impact and experience of the candidate, recruitment automation can significantly improve outcomes for both recruiters and candidates. This approach has given rise to a movement called humanistic automation technology.

But humanistic automation sounds like an oxymoron, right? Is it even possible?

The Aptitude Research showed that not only is this possible, but that when Ai is used this way, it creates personal connection at scale and works to reduce bias, something no other technology or even human-centred solution can deliver.

So, how exactly does it do this?

Here are four main areas of talent acquisition that candidate-focussed recruitment automation improves on, and how it achieves this:

1. Connection

There have been some slight improvements in building connections through the hiring process recently, but only 50% of companies have a single point of contact for communication, which results in candidates not feeling engaged or valued through the process.

Recruitment automation with a candidate focus means that communication is personalised for high engagement, with the ability for the conversation to adapt to what it learns about a candidate almost immediately.

As a candidate, finding out that you have not been successful is tough, and worse, most companies just ghost those they don’t wish to move ahead with. Automation can ensure that every candidate is engaged and cared for even when they are not moving forward in the process – and that doesn’t mean a standard rejection email. Ai can deliver highly personalised communication that builds connection even for those unsuccessful in their application.

2. Inclusivity

Although some companies have made efforts to remove bias from resumes, companies still have a lot of work to do on inclusion. For starters, many are relying on training programs, which have shown to be largely ineffective in delivering long-term change.

It’s true that recruitment automation can amplify bias, but automation that works to reduce bias is continually tested against biases in the system and has been shown to be effective in reducing the impact of bias in hiring decisions. Something humans cannot do (we’re inherently biased, whether we like it or not).

When you have the right data input, gathered through blind screening and blind interviews that don’t rely on CV data, you can help companies achieve an equal and fair experience for all candidates.

Want to remove bias from recruitment and not just talk about it?

Download the Inclusive Hiring e-Book here

Inclusive hiring is not limited to gender and race. Companies need a broader view of diversity, equity, and inclusion that includes individuals with disabilities and neurodiversity. This requires the right digital tools and technology to ensure that candidates have a positive experience. In many cases, chat and text are more inclusive than video or even phone screening and interviews for these candidates.

3. Feedback

Most companies see feedback as a risky area and something they have no ability to deliver in a fair and timely manner. Essentially, this is a lost opportunity for learning and development.

When you see feedback as a value proposition of an employer brand, its power in transforming your TA strategy becomes clear. Recruitment automation allows companies to deliver personalized feedback, building trust and strengthening your employer brand.

Personalized feedback with tangible action items means that candidates feel empowered even if they are rejected. Technology can help deliver these action items in a human way, something even humans are not able to do at scale, or even very well.

These insights are only made possible through natural language processing and machine learning that work in the background to reveal important information about the candidate. When a candidate feels like they are ‘seen’, that can be a transformational moment in their career path.

Only recruitment automation can deliver individual feedback to everyone who takes time to do a job interview.

4. Trust

In an era of growing awareness around the privacy of data, only 1 in 4 candidates trust the data that will be used to drive hiring decisions. As companies look at recruitment automation through a candidate-centric lens, they must consider both the quality of the data they use and how to build trust between employers and candidates.

The biggest mistake that most companies make is using the wrong data. Resume data is not necessarily an indicator of performance or quality of hire.

Ethical Ai is something that hiring managers need to understand and use to evaluate providers. Providers using ethical Ai operate transparently, are backed by explanations, describe their methodology, and frequently publish their data.

Aptitude Research found that when data is transparent, it increases the trust in talent acquisition leaders, hiring managers, and senior leaders. With data transparency, 84% of talent acquisition leaders stated that they trust the data, and 78% of senior leaders trust the data.

Talent acquisition transformation has accelerated the demand for recruitment automation.

55% of companies are increasing their investment in recruitment automation this year. These companies recognise that automation can improve efficiency, lift the administrative burden, reduce costs, and enable data-driven decisions.

This report focuses on a new look at automation through the eyes of the candidate.

After all, automation is more than moving candidates through a process quickly. It should also enable companies to communicate in a meaningful and inclusive way and build trust between candidates and employers.

Download the full report here.


Lever + Sapia = Faster, fairer hiring

Interested in a demo of our Lever integration? Fill out the form below!

Like Sapia, the team at Lever likes to make life easy for recruiters. Lever streamlines the hiring experience, helping recruiters source, engage, and hire from a single platform. Now you can supercharge your Lever ATS by seamlessly integrating interview automation from Sapia. Integrating is easy and delivers fairer, faster, and better hiring results. In the war for talent, you’ll pull ahead of your competitors even faster with Sapia + Lever.

Hiring is more complex than ever

There’s a lot expected of recruiters these days. Attracting candidates from diverse backgrounds and delivering exceptional candidate care whilst selecting from thousands of candidates isn’t easy.

Recruiters are expected to:

  • Find the right people, ensuring a diversity of candidates
  • Fill roles quickly and efficiently
  • Interrupt bias in hiring and promotion
  • Ensure every person hired amplifies the organisation’s values
  • Create a candidate experience that is engaging and rewarding

Simplify Lever ATS hiring

A lot is expected from recruiters, from screening thousands of applicants to attracting candidates of diverse backgrounds and delivering a great candidate experience. But technology has advanced a lot and can now better support recruiters.  

The great news is that when you integrate Sapia artificial intelligence technology with the powerful Lever ATS, you will have a faster, fairer and more efficient recruitment process that candidates love.

You can now:

  • Reduce your screening time by up to 90%
  • Increase your candidate satisfaction to 99%
  • Achieve interview completion rates over 90% and
  • Reduce screening bias for good!

Sapia + Lever

Gone are the days of screening CVs, followed by phone screens, to find the best talent. The number of people applying for each job has grown 5-10 times recently. Reading each CV is simply no longer an option. In any case, the attributes that are markers of a high performer often aren’t in CVs, and the risk of increasing bias is high.

You can now streamline your Lever process by integrating Sapia interview automation with Lever.

We’ve created a quick, easy and fair hiring process that candidates love.

  1. Create a vacancy in Lever, and a Sapia interview link will be created. 
  2. Candidates receive the link. Every candidate will have an opportunity to complete a FirstInterview via chat, automatically sent to them after completing their application in Lever.
  3. Candidates receive feedback. Every candidate will automatically receive a personalised personality profile and coaching tip after completing their first interview! No more candidate ghosting.
  4. See results as soon as candidates complete their interview. Each candidate’s scores, rank, personality assessment, role-based traits and communication skills are available as soon as they complete the interview inside Lever. 

By sending out one simple interview link, you nail speed, quality and candidate experience in one hit.

Integrate Lever and get ahead.

Sapia’s award-winning chat Ai is available to all Lever users. You can automate interviewing, screening, ranking, and more, with a minimum of effort! Save time, reduce bias and deliver an outstanding candidate experience.

Experience a Sapia FirstInterview for yourself

 

The interview that all candidates love.

As unemployment rates rise, it’s more important than ever to show empathy for candidates and add value when we can. Using Sapia, every single candidate gets a FirstInterview through an engaging text experience on their mobile device, whenever it suits them. Every candidate receives personalized insights, with helpful coaching tips that candidates love.

Together, Sapia and Lever deliver an approach that is: 

  • Relevant Move beyond the CV to the attributes that matter most to you: grit, curiosity, accountability, critical thinking, agility, and communication skills
  • Respectful Give every single person an interview and never ghost a candidate again
  • Dignified Show you value people’s time by providing every single applicant with personal feedback
  • Fair Avoid video in the first round of interviews and take an approach that’s 100% blind to gender, age, ethnicity, and other irrelevant attributes
  • Familiar Text chat interviewing is not only highly efficient, but it’s also familiar to people of all ages  

There are thousands of comments just like this…

  • “I have never had an interview like this in my life and it was really good to be able to speak without fear of judgment and have the freedom to do so.”
  • “The feedback is also great. This is a great way to interview people as it helps an individual to be themselves.”
  • “The response back is written with a good sense of understanding and compassion.”
  • “I don’t know if it is a human or a robot answering me, but if it is a robot then technology is quite amazing.”

Test drive it for yourself here (it takes 2 minutes!)  

Recruiters love using artificial intelligence in hiring.

Recruiters love that Sapia TalentInsights surface in Lever as soon as each candidate finishes their interview.

Together, Sapia and Lever deliver an approach that is: 

  • Fast Ai-powered scores and rankings make shortlisting candidates quicker
  • Insightful Deep dive into the unique personality and other traits of each candidate 
  • Fair Candidates are scored and ranked on their responses. The system is blind to other attributes and regularly checked for bias.
  • Streamlined Our stand-alone LiveInterview mobile app makes arranging assessment centres easy. Automated record-keeping reduces paperwork and ensures everyone is fairly assessed.
  • Time-saving Automating the first interview screening process and second-round scheduling delivers 90% time savings against a standard recruiting process.

HR Directors and CHROs love reliable bias tracking.

Well-intentioned organizations have been trying to shift the needle on the bias that impacts diversity and inclusion for many years, without significant results.

Together, Sapia and Lever deliver an approach that is: 

  • Measurable DiscoverInsights, our operations dashboard, provides clear reporting on recruitment, including pipeline shortlisting, candidate experience and bias tracking.
  • Competitive The Sapia and Lever experience is loved by candidates, ensuring you’ll attract the best candidates, and hire faster than competitors.
  • Scalable Whether you’re hiring one hundred people, or one thousand, you can hire the best person for the job, on time, every time.
  • Best-in-class Sapia easily integrates with Lever to provide you with a best-in-class AI-enabled HRTech stack. 

Getting started is easy!

Let’s chat about getting you started – book a time here
