“AI is based on data, and data is a reflection of our history,” said MIT Media Lab researcher Joy Buolamwini in the documentary Coded Bias. The 2020 documentary explores how artificial intelligence (AI) is not exempt from bias, and it advocates for legislation to protect US citizens from the potential negative impacts of bias and prejudice in algorithms and code. Bias against people of color is one such danger, and it could hurt the recruitment process in the many companies and staffing firms that use AI.
For many years, artificial intelligence has made the recruitment process more efficient. From resume screening and cover letter checks to automated interviews, AI has made the slog of sifting through thousands of applications more bearable. Companies and job seekers alike benefit from this efficiency, since AI makes it quick to find the best candidates.
Artificial intelligence has streamlined processes, reducing both time and labor costs. However, this convenience comes with the challenge of preserving the credibility and validity of the recruitment process. What are these little-known dangers that recruitment staff can overlook? Here are some of them:
1. Parameters That Can Cause Candidate Oversight
Depending on the AI application you use to track candidates, its parameters may need to be scrutinized for effectiveness. AI recruitment software is designed to automatically reject candidates who don’t fit the parameters set. For example, project managers with logistics, coordination, and communication skills may be disregarded for operational analyst positions, even though the required skills, such as processing and interpreting data trends, are areas in which candidates with project management experience in multinational operations could excel.
The challenge for recruiters is to tune AI programs for the specific skillsets or job experience that could help them spot diamonds in the rough. Excessive reliance on the convenience of AI, however, may cause the company to lose track of the talents that best match the vacancy.
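To make the failure mode concrete, here is a minimal sketch of the kind of rigid keyword matching described above. The keywords, skill lists, and threshold are all hypothetical, and real applicant tracking systems are far more sophisticated, but the underlying risk is the same: a candidate who describes equivalent experience in different words never reaches a human reviewer.

```python
# Hypothetical job parameters an operational analyst posting might require.
REQUIRED_KEYWORDS = {"operational analysis", "data trends", "reporting"}

def passes_screen(resume_skills: set, required: set, min_matches: int = 2) -> bool:
    """Pass a resume only if it matches enough required keywords verbatim."""
    return len(resume_skills & required) >= min_matches

# A project manager whose experience covers the same underlying work,
# described in different terms, versus a candidate who echoes the posting.
pm_resume = {"logistics", "coordination", "communication", "multinational operations"}
analyst_resume = {"operational analysis", "data trends", "sql"}

print(passes_screen(pm_resume, REQUIRED_KEYWORDS))       # rejected despite relevant experience
print(passes_screen(analyst_resume, REQUIRED_KEYWORDS))  # passes the screen
```

The fix suggested in the text, tuning parameters for transferable skills, would amount to widening or re-weighting that keyword set rather than relying on exact matches.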
2. Coded Bias and Prejudice in the Software
Nothing invented can be entirely objective. The documentary Coded Bias is a great example of how AI is not free of bias and prejudice. As an algorithm learns from its programmers, parameter inputs, and subsequent user interactions, it may also absorb the underlying prejudice embedded in human communication.
In 2018, Reuters reported that Amazon had to stop using an AI program that preferred male candidates over female ones. The dataset used in the program’s development consisted mostly of CVs from male candidates, since the tech industry is male-dominated. This example shows that gender equality must be deliberately accounted for in AI programs to ensure diversity in recruitment.
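A toy sketch can show how this happens. The CVs below are invented, and this word-counting scorer is nothing like Amazon’s actual system, but it illustrates the mechanism: when past hires skew male, gendered terms become proxy signals, and two otherwise identical CVs get different scores.

```python
from collections import Counter

# Invented historical data: the "hired" pile skews male, as in a
# male-dominated industry, so male-associated terms dominate it.
hired_cvs = [
    "software engineer men's chess club captain",
    "software engineer men's rugby team",
    "software engineer database design",
]
rejected_cvs = [
    "software engineer women's coding society founder",
]

hired_words = Counter(w for cv in hired_cvs for w in cv.split())
rejected_words = Counter(w for cv in rejected_cvs for w in cv.split())

def score(cv: str) -> int:
    """Score a CV by how often its words appeared in past hires vs. rejections."""
    return sum(hired_words[w] - rejected_words[w] for w in cv.split())

# Identical qualifications, one gendered word changed:
print(score("software engineer men's chess club"))    # scores higher
print(score("software engineer women's chess club"))  # penalized for "women's"
```

The model never sees a “gender” field; it simply learns that words correlated with past hires predict hiring, which is exactly how historical imbalance becomes coded bias.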
A candidate’s appearance during a recorded interview can also affect their rating from an AI recruitment program. Some AI programs may rate candidates lower on conscientiousness simply for wearing headscarves or glasses during the interview recording. Voice analysis can automatically reject applicants whose accents make them seem less professional or less skilled than their fluent counterparts. And candidates with physical or facial disabilities may receive low ratings for posture and facial expressions.
3. Inaccessible AI Recruitment Process
Not everyone has access to a personal computer. Some people who have recently lost their jobs and must start anew may lack the laptop needed to go through the application process.
Apart from limited access to the digital world, some applicants may be discouraged from submitting their initial application, or from continuing, when they encounter an AI-driven feature. Compatibility issues might frustrate applicants used to a different operating system than the one the AI program runs on. Candidates might also be confused when an application fails to run on their device despite multiple attempts. People unfamiliar with camera angles and new software might find the online assessments and interviews inconvenient.
The same may happen to recruitment staff who struggle to learn the intricacies of AI programs. This difficulty may delay the recruitment process, since the learning curve takes time to overcome.
Candidates and recruitment staff who find themselves in such an uncomfortable situation may reconsider or ultimately abandon the application process altogether. Either way, it reflects poorly on the recruitment experience with your company.
Be Proactive with Your AI Recruitment Process
While the convenience of AI recruitment programs comes with these little-known dangers, humans remain in control of preventing AI’s downsides. Here are some ways:
1. Research thoroughly the program’s development.
Before you purchase a subscription to an AI program, research how it was developed and what dataset it used. If your IT team or a consultant knows AI, ask them to vet the integrity of the program you’re planning to get. Better yet, appoint an IT lead who will work with you from procurement through maintenance of the AI program.
2. Set better parameters and codes in your hiring process.
The function of any technology depends on, and reflects, its users. If you notice that your hiring process lacks diversity, chances are the parameters and coding are flawed. In that case, conduct quality assurance on your own recruitment process. You might be surprised by the rewards you could reap from a diversified employee pool.
3. Keep humans in the recruitment loop and beyond.
Human resource processes should always involve humans. The benefits of having your staff oversee your AI-supported recruitment process outweigh the convenience of leaving it to AI alone. Your HR staff can handle any bump in the road and may even save you from wasting money on subscriptions.
4. Ask for help outside the company.
Getting help from consultants, marketers, and other outside experts can provide better insight. Learn about the experiences of other companies, then consolidate the information you have gathered and discuss it with company stakeholders and recruitment staff.
Using AI in the recruitment process involves a learning curve for programmers and users alike. Since AI recruitment is still in its infancy, more work must be done to eliminate bias and inefficiencies. The goal is to develop a sustainable system that marries efficient human-based and AI-supported processes.
CULTIVATE THE HUMAN SIDE OF HUMAN RESOURCES WITH ALLIED INSIGHT.
Allied Insight, a B2B fractional CMO and growth marketing delivery agency, was created especially for staffing firms like yours. We offer services that integrate your whole digital strategy with your business’s technological environment and architecture. Our organization was established on the principles of accessibility, impact, and development.
To help preserve trademarks, offers, and market positions, Allied Insight works with clients to develop brand-centric competitive advantages. Learn more about how Allied Insight can help you by contacting us today!