This is the fifth in a series of TalentRISE tips to help businesses secure an edge over competitors by evolving their HR and recruitment practices in the digital era.
Key Takeaways
- In today’s hyper-competitive market for talent, bias (intended or not, perceived or actual) can and will limit your ability to engage and hire great talent
- While the human factor is real, bias often occurs pre-interview, before a candidate ever interacts with any single person at your organization
- The tips below will help you minimize bias when sourcing and screening candidates
In recruitment, both machines and humans play a large role. Much of the pre-interview process these days, in fact, is automated. This trend is bound to accelerate as even smaller organizations gain access to inexpensive, sophisticated recruitment tools such as chatbots, video screening, AI-based data-mining sourcing tools and the like. All of this means that candidates may never speak to, or meet face-to-face with, a human at your organization until they are a finalist for a particular job.
Most of us recognize that there is always the possibility of bias when humans engage with one another. HR – and the law – have been focused on that for years. Less attention, however, is paid to potential biases within the digital systems that run our hiring processes. Below are a few essential tips to mitigate that issue:
First of all, be sure that you adhere to the law. Conduct a periodic review (we recommend every 1-2 years) of all questions posed in the sourcing and application process, and confirm you are compliant with the latest regulations, including those governing questions about candidates’ salary histories (click here for a running list of the states and local municipalities banning the salary question).
Beyond the legal aspects, be sure to audit your electronic applications for biases that aren’t illegal but may nonetheless knock out great candidates. This may mean, for example, eliminating drop-down menus that allow the applicant to ONLY select educational institutions in the U.S. This is particularly important for global organizations or those seeking talent with global experience. The same goes for asking about GPAs: not every educational system uses the standard U.S. 4-point scale. If a system change or update is not possible, at minimum be sure NOT to require those fields for the candidate to move forward in the process, and add a text box for “OTHER” so candidates can provide an appropriate response. A “required field” that demands information in a format a candidate can’t provide forces that candidate to either fudge the question or self-select out of the process.
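To make this concrete, here is a minimal sketch of what a more forgiving validation rule might look like. It assumes a simple dict-based form payload; the field names (education_institution, education_other, gpa) and dropdown values are hypothetical, not drawn from any particular ATS.

```python
# A sketch of a more forgiving validation rule. The form payload is a plain
# dict; field names and dropdown values are hypothetical, not from a real ATS.

US_INSTITUTIONS = {"University of Illinois", "Ohio State University"}  # example dropdown values

def validate_education(response: dict) -> list:
    """Return validation errors; an empty list means the answer passes."""
    errors = []
    institution = response.get("education_institution")
    other_text = (response.get("education_other") or "").strip()

    # Accept either a dropdown selection OR a free-text "OTHER" entry, so
    # candidates educated outside the U.S. aren't forced to fudge or drop out.
    if institution not in US_INSTITUTIONS and not other_text:
        errors.append("Select an institution or describe yours under 'Other'.")

    # GPA stays optional: not every system uses the U.S. 4-point scale.
    gpa = response.get("gpa")
    if gpa is not None and not 0.0 <= float(gpa) <= 4.0:
        errors.append("If you provide a GPA, convert it to the U.S. 4.0 scale.")

    return errors

# A candidate with a non-U.S. degree and no GPA still moves forward.
print(validate_education({"education_other": "IIT Bombay"}))  # -> []
```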
On a larger scale, take a close look at how your organization uses data from the past to hire people today. Some of the newest, coolest technologies, such as chatbots and data-mining sourcing tools, use Artificial Intelligence (AI) to automate candidate sourcing, engagement and pre-screening. These tools are programmed using existing data sets in your ATS, or information found online that may be out of date. So even if you build a hiring decision-tree with state-of-the-art AI, stale data or outdated candidate pre-screening criteria (e.g., current or past compensation) programmed into the bot logic may produce results that are less than optimal. Old data sets may carry forward unintentional biases related to gender or age. (To read more on how this can backfire, go here.)
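As a hypothetical illustration of how such rules can quietly encode bias, consider the sketch below. The thresholds and field names are invented for the example; the point is the contrast between screening on proxies (compensation history, graduation year) and screening on job-relevant skills.

```python
# Hypothetical pre-screening rules; thresholds and field names are invented.

def prescreen_stale(candidate: dict) -> bool:
    # Anti-pattern: knocks candidates out on compensation history and
    # graduation year, proxies that can quietly encode gender and age bias.
    return (candidate.get("current_salary", 0) >= 90_000
            and candidate.get("grad_year", 0) >= 2010)

def prescreen_relevant(candidate: dict, required_skills: set) -> bool:
    # Better: screen only on validated, job-relevant competencies.
    return required_skills <= set(candidate.get("skills", []))

candidate = {"skills": ["python", "sql"], "current_salary": 55_000, "grad_year": 1998}
print(prescreen_stale(candidate))                        # False: rejected on proxies
print(prescreen_relevant(candidate, {"python", "sql"}))  # True: qualifies on skills
```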
When it comes to removing gender bias from your recruiting process, focus first on the words used to describe a particular job in your job advertisements and job descriptions. This will help you attract people who possess the experiences, competencies and personality traits of your ideal candidate. A terrific example is cited by Amelia Weitrak, author of this blog post, where she writes that “words used to typically describe a CEO often paint a picture that many people typically associate with a man versus a woman — even though 32% of the top performing 500 U.S. companies are run by female CEOs!” Here is a free tool to get you started in assessing your current job descriptions for gender bias.
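Under the hood, tools like this perform a word-level scan against research-derived lists of gender-coded language. Here is a minimal sketch of that idea; the word lists below are abbreviated illustrations, not the validated lists a real tool uses.

```python
import re

# Abbreviated, illustrative word lists; real tools use validated research lists.
MASCULINE_CODED = {"aggressive", "competitive", "dominant", "ninja", "rockstar"}
FEMININE_CODED = {"collaborative", "supportive", "nurturing", "interpersonal"}

def scan_job_ad(text: str) -> dict:
    """Flag gender-coded words so they can be reviewed before the ad goes live."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine_coded": sorted(words & MASCULINE_CODED),
        "feminine_coded": sorted(words & FEMININE_CODED),
    }

ad = "We need an aggressive, competitive rockstar who is also collaborative."
print(scan_job_ad(ad))
# {'masculine_coded': ['aggressive', 'competitive', 'rockstar'],
#  'feminine_coded': ['collaborative']}
```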
Weitrak goes on to point out that the key to fair gender hiring lies in the systems and tools you use to assess a candidate’s relevant COMPETENCIES and QUALIFICATIONS, and in how you make your hiring decisions. Team-based decisions, for instance, are less likely to be biased. It is therefore important to periodically evaluate your existing competency assessment tools, interview guides and hiring decision practices, or implement new, validated, unbiased ones. In some cases, you may even need to train hiring teams to ensure that male and female candidates are assessed fairly and that the interview team’s hiring decisions are based on the best match in qualifications.
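One hedged sketch of what team-based, competency-anchored scoring can mean in practice: each interviewer independently rates the same predefined competencies, and the decision rests on the aggregate against a pre-set bar. The competencies and threshold below are illustrative, not a validated instrument.

```python
from statistics import mean

# Illustrative competencies and bar; a real instrument should be validated.
COMPETENCIES = ["problem_solving", "communication", "domain_knowledge"]
HIRE_BAR = 3.5

def aggregate_scores(independent_ratings: list) -> dict:
    """Average each predefined competency across interviewers."""
    return {c: round(mean(r[c] for r in independent_ratings), 2)
            for c in COMPETENCIES}

ratings = [  # each interviewer scores 1-5 without seeing the others' sheets
    {"problem_solving": 4, "communication": 5, "domain_knowledge": 3},
    {"problem_solving": 5, "communication": 4, "domain_knowledge": 4},
    {"problem_solving": 4, "communication": 4, "domain_knowledge": 4},
]
scores = aggregate_scores(ratings)
print(scores)  # {'problem_solving': 4.33, 'communication': 4.33, 'domain_knowledge': 3.67}

# The decision rests on the aggregate against a pre-set bar, not one gut call.
print(all(score >= HIRE_BAR for score in scores.values()))  # True
```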
The same holds true for employee referral systems. These can be highly effective, particularly when your people are engaged, happy in your workplace, and rewarded financially for referring their friends to come work for you. The downside is that people tend to associate with people like themselves, so in some cases you may further limit the diversity of your referral candidate pool. Continually tapping into the same pools of talent through your employees’ networks may result in hiring, over and over, the same types of people (by race and gender) who dominate those networks. Don’t get me wrong: there is, beyond a doubt, tremendous value in these networks, but our point is to be aware of the potential for bias, especially if your employee base is not very diverse to begin with.
Taking that a step further, recognize that there is also inherent bias in hiring from the same colleges. That’s not to say that hiring from the boss’s alma mater, for instance, is necessarily a bad thing, particularly if it’s among the best business schools in the country. But placing too much emphasis on hiring from a handful of schools does tend to produce groupthink when teams are required to innovate or make critical decisions. Innovation stemming from diverse opinions and experiences is the impetus for success in today’s highly competitive, global business world, so hiring people with a broad array of educational backgrounds just makes sense.
Another often-overlooked bias in the pre-interview process is against people with disabilities, such as visual impairment. Be sure your candidate website and your online and mobile application processes are ADA compliant and can be read by browser-based screen readers. Start with a FREE scan of your website’s accessibility using this online tool.
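A full audit covers labels, contrast, keyboard navigation and more, but here is a minimal sketch of one common check, flagging images that lack the alt text screen readers depend on, using only Python’s standard library.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collect the src of every <img> tag that has no alt text."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and not attributes.get("alt"):
            self.violations.append(attributes.get("src", "<unknown>"))

page = '<form><img src="logo.png"><img src="hint.png" alt="GPA is optional"></form>'
checker = MissingAltChecker()
checker.feed(page)
print(checker.violations)  # ['logo.png']: this image is invisible to a screen reader
```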
Lastly, while it’s become much more commonplace, video interviewing presents a whole host of bias-related challenges because it gives the viewer visual cues about race, sex, and potentially even disabilities. That’s why legal experts recommend establishing protocols around video interviews that use a fact-based approach to measure candidates against pre-defined competencies. Click here for more tips on avoiding bias when using these tools, which present such obvious visual openings for biased interview and selection decisions.
In sum, hiring bias is prevalent at many junctures in the recruitment process, from sourcing to interviewing. What’s often overlooked, however, are the built-in biases in the tools we commonly use, whether AI-powered sourcing and video interview tools or our job descriptions, websites, interview questions and ATS systems. One simple question, such as asking an applicant for a GPA before they can progress in the process, can screen out a great candidate. That’s why it’s critical to examine every step of your process and systems to eliminate as many of these biases as possible.
Need more info? Contact Carl Kutsmode, Partner, TalentRISE.