Posted: 27 Jun. 2016 · 5 min. read

Using digital recruitment platforms to remove unconscious bias

Does the theory stack up in practice?

‘Applied’ is a digital platform developed by the Behavioural Insights Team (‘BIT’) that uses behavioural science to remove bias from the recruitment process. BIT tested Applied against a traditional CV sift as part of their 2015/6 graduate recruitment round to see if digital recruitment tools really can make hiring smart, fair and easy in practice.

Applied draws on the latest empirical evidence about what works to overcome unconscious bias in the hiring process, with the aim of making recruitment smarter, fairer and easier. But does the behavioural science translate into better recruitment outcomes in a real-world setting? A validation exercise undertaken by BIT put the science to the test.

The benefits of a truly diverse workforce are now widely recognised, with an ever-increasing number of companies aware that diversity can pay serious commercial dividends, particularly where tasks require innovation, creativity and depth of analysis. However, recent research by Edgley et al. (2016) highlights that, despite this recognition, companies continue to lack the required level of diversity. A key reason is the unconscious bias present in the early stages of recruitment, which reduces the chances of diverse applicants and hinders a company’s ability to find the right candidate for the job.

With respect to CVs in particular, research argues that they typically contain information that is largely irrelevant to a candidate’s performance on the job, yet this information has the potential to prey on the unconscious biases of the assessor. In an effort to reduce unconscious bias in recruitment practices, an increasing array of automated screening tools is being developed and rolled out across the recruitment industry.

In line with this trend, the Applied platform developed by BIT includes four key features to remove common biases and improve recruitment decisions. These features are anonymisation, chunking, collective intelligence and predictive assessment. The meanings of these four terms in the context of this study are introduced below.

  • Anonymisation: anonymised recruitment is considered good practice because it allows reviewers to concentrate on assessing quality rather than getting distracted by irrelevant information like gender and ethnicity.

  • Chunking: we are all affected by the order in which we read things. By chunking the answers to each question together and structuring the review horizontally, hiring managers can better compare candidates and avoid the ‘halo effect’.

  • Collective intelligence: harnesses the collective intelligence of a team by gathering three independent assessments of candidate materials, which helps hiring managers to make more objective decisions and reduces the risk of mistakes caused by cognitive overload or decision fatigue.

  • Predictive assessment: focusses the assessment on what is genuinely predictive of performance on the job, including encouraging the use of situational work tests and structured interviews. (A brief illustrative sketch of how these features might fit together follows this list.)
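To make these features concrete, here is a minimal sketch (in Python, with invented candidate IDs, questions and reviewer scores) of how an anonymised, chunked, multi-reviewer screen might be organised. It is an illustration of the ideas above, not BIT’s implementation of Applied.

```python
import random
import statistics

# Hypothetical applications: each candidate answers the same work-sample questions.
applications = {
    "cand-001": {"Q1": "answer text", "Q2": "answer text"},
    "cand-002": {"Q1": "answer text", "Q2": "answer text"},
}

# Anonymisation: reviewers only ever see an opaque ID, never a name, gender or ethnicity.
# Chunking: regroup answers by question so each question is read 'horizontally'
# across candidates, rather than reading one candidate's whole application at a time.
def chunk_by_question(apps):
    chunks = {}
    for cand_id, answers in apps.items():
        for question, answer in answers.items():
            chunks.setdefault(question, []).append((cand_id, answer))
    for batch in chunks.values():
        random.shuffle(batch)  # vary ordering to dampen order effects
    return chunks

# Collective intelligence: three independent reviewers score each candidate;
# the scores are averaged into a single merit score per candidate.
def aggregate_scores(review_scores):
    return {cand: statistics.mean(scores) for cand, scores in review_scores.items()}

chunks = chunk_by_question(applications)
merit = aggregate_scores({"cand-001": [4, 5, 4], "cand-002": [3, 4, 3]})
print(sorted(merit, key=merit.get, reverse=True))  # merit list, best first
```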


Aim

The objective of the study was to test the Applied digital platform against a traditional CV sift, using BIT’s 2015/6 graduate recruitment round.

Three criteria were developed to enable the researchers to evaluate the outcomes of the study. These criteria were:

  • Effectiveness (‘smart’ recruitment): which process was more predictive of candidates’ performance in later hiring rounds?
  • Diversity (‘fair’ recruitment): which process resulted in less biased outcomes with regard to candidates’ gender, ethnicity, disability and educational background?
  • Efficiency (‘easy’ recruitment): which process took the least amount of time?


Method

In 2015/6, BIT ran a graduate recruitment round and received over 700 applications. It was not possible for BIT to subject candidates for the same job to different recruitment processes. Instead, every candidate who passed the initial multiple-choice exam was scored on their CV and on their Applied performance in parallel, effectively giving each candidate two routes to the next round.

For the Applied screen, candidates were assessed on their responses to work sample questions using the features described above (anonymous, chunked, multiple independent reviews). These scores were aggregated into one merit list. For the CV screen, candidates’ CVs were put through an initial sift by a senior member of the team. Next, the CVs were evaluated in more detail by two BIT employees (independently) and given a granular score. These were averaged to yield a second merit list.

Candidates who advanced to the assessment centre fell into three groups: (1) candidates who passed only the Applied screen; (2) candidates who passed only the CV screen; and (3) candidates who passed both (unfortunately, these groups did not have equal numbers). By comparing the success of these three groups at the assessment centre, final interview and job-offer stages, BIT could infer which screening method was more predictive of performance on later tasks and which produced greater diversity. To ensure objectivity, assessors in later rounds were not aware of the CV or Applied scores given.
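As a rough illustration of how this comparison could be run, the sketch below uses a small set of hypothetical candidates, flags for which screen(s) they passed, and invented assessment-centre scores and offer outcomes; it simply compares the groups’ average outcomes and is not BIT’s actual analysis.

```python
from statistics import mean

# Hypothetical candidates: which screen(s) they passed and their later outcomes.
candidates = [
    {"id": "c1", "passed_applied": True,  "passed_cv": False, "ac_score": 4.2, "offer": True},
    {"id": "c2", "passed_applied": False, "passed_cv": True,  "ac_score": 3.1, "offer": False},
    {"id": "c3", "passed_applied": True,  "passed_cv": True,  "ac_score": 4.6, "offer": True},
    {"id": "c4", "passed_applied": True,  "passed_cv": False, "ac_score": 3.8, "offer": False},
]

def group_of(candidate):
    if candidate["passed_applied"] and candidate["passed_cv"]:
        return "both"
    return "applied_only" if candidate["passed_applied"] else "cv_only"

# Compare mean assessment-centre score and offer rate across the three groups.
for group in ("applied_only", "cv_only", "both"):
    members = [c for c in candidates if group_of(c) == group]
    if members:
        print(group,
              "mean AC score:", round(mean(c["ac_score"] for c in members), 2),
              "offer rate:", round(mean(c["offer"] for c in members), 2))
```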

Findings

Effectiveness (‘smart’ recruitment): Candidates’ Applied scores were strongly correlated with their performance in the assessment centre and at the final interview. Candidates who scored one point higher in the Applied process went on to score 1.13 points higher in the assessment centre, all else equal. The study highlighted that a candidate’s CV score was not predictive of interview performance. The implication of these results is that a CV sift would have to progress three times as many candidates as Applied to identify the same number of top performers at the assessment centre.

Candidates who had an Applied score one point above the average were 16.7% more likely to receive a job offer, whereas higher CV scores did not correlate with receiving a job offer. Of the 10 candidates ultimately offered jobs, 60% would have been discarded under the CV sift, as would three of the top five performers.
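The headline figure of 1.13 extra assessment-centre points per Applied point is the kind of estimate a simple linear regression would produce. Below is a minimal sketch using invented scores and an ordinary least-squares fit via numpy; it illustrates how such a coefficient could be estimated, not the model BIT actually used.

```python
import numpy as np

# Hypothetical screening and assessment-centre scores (not the study's data).
applied_score = np.array([2.0, 3.0, 3.5, 4.0, 4.5, 5.0])
ac_score = np.array([2.5, 3.6, 4.1, 4.9, 5.3, 5.9])

# Fit ac_score = intercept + slope * applied_score by ordinary least squares.
X = np.column_stack([np.ones_like(applied_score), applied_score])
(intercept, slope), *_ = np.linalg.lstsq(X, ac_score, rcond=None)

# 'slope' plays the role of the reported coefficient: the extra assessment-centre
# points associated with one extra point on the Applied screen.
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
```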


[Figure: assessment centre scores]

A possible explanation for these findings is that Applied was able to pick up on high-quality candidates in ways that a CV sift cannot, by focussing on work tasks that predict performance on the job. In addition, the digital platform helped reviewers ignore characteristics that were not correlated with assessment centre performance.

Diversity (‘fair’ recruitment): There was no evidence of discrimination on the basis of gender, ethnicity or disability in either the CV sift or Applied. Overall, these characteristics appear less important in Applied than in the CV screen, but the difference was not statistically significant. When assessing diversity, it is worth bearing in mind that BIT might not be a representative employer. The team has a 50% gender split and staff from over 15 different countries, and many team members work on policy issues related to social mobility. This may have resulted in a heightened awareness of potential bias during the CV screen.

Efficiency (‘easy’ recruitment): Aside from the quality of these sifting processes, the results indicate that Applied is also faster. The CV sift took approximately 7.7 minutes per candidate, while Applied took approximately 6.7 minutes per candidate. Applied also enabled time to be spent more productively, with 75% of the time spent reviewing candidates; in contrast, 61% of the time in the CV sift was spent on administration.
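As a back-of-the-envelope illustration of what the per-candidate timings imply at scale, the sketch below assumes 400 candidates reach the screening stage (an invented figure, not one reported in the study):

```python
# Assumed: 400 candidates reach the screening stage (illustrative, not from the study).
candidates_screened = 400
cv_minutes_per_candidate = 7.7
applied_minutes_per_candidate = 6.7

saving_hours = candidates_screened * (cv_minutes_per_candidate
                                      - applied_minutes_per_candidate) / 60
print(f"Reviewer time saved: {saving_hours:.1f} hours")  # about 6.7 hours
```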

Implications

Hiring the wrong candidate is costly and time consuming for companies. As the results above show, adopting behaviourally informed digital recruitment tools gives businesses the opportunity to remove unconscious bias from their recruitment decisions. This leaves hiring managers free to offer the job to the best candidate, regardless of their personal or professional background.

The study results indicate that, by removing unconscious bias, digital recruitment platforms such as Applied are more effective than a CV screen at predicting a candidate’s performance in later recruitment rounds. While the sample size was not large enough to robustly correlate this with performance on the job, the recruitment mapping process was designed to align with key tasks expected in the actual work environment. From a business perspective, this insight has the potential to make the recruitment process more efficient and, in turn, to save recruiters time and money. A process that removes bias can also help recruiters avoid missing out on high-quality candidates at the CV sift stage.

This blog was originally authored by Kate Glazebrook, Samantha Steele and Janna Ter Meer. For further information contact Kate Glazebrook.

References:

(1) The Behavioural Insights Team (BIT) is a social purpose company based in the UK with offices globally. It started as the world’s first government institution dedicated to the application of behavioural science, and it redesigns public services with an empirical approach grounded in the behavioural sciences.
