Nov 8, 2022 | 6 Min Read

Why Businesses Must Overcome AI Bias in Hiring — and How They Can Do It

Posted By
Arran Stewart

Technological advancements have done much to make filling job openings easier for businesses. In particular, the growing use of artificial intelligence has had a profound effect not only on the logistics of hiring but also on efforts to increase diversity within companies and industries. However, AI bias, which is often unintentionally programmed into algorithms, can negatively impact companies’ sincere efforts to increase diversity.

The unconscious bias that can exist in AI matching is something that must be overcome to use the technology more ethically. What do companies need to know to do so?

AI Has Changed How Businesses Find Talent

When AI was integrated into hiring and recruiting, it alleviated a longtime pain point: the human struggle to determine which job candidates to interview out of a large group of applicants.

AI simplifies the most laborious part of the hiring process by automating the way businesses find and shortlist candidates.

For example, AI technology can scour databases like LinkedIn for keywords to find candidates that match job descriptions. With the right information to work with, AI can help identify candidates whose personalities are good matches for a company’s culture, or those who seem more open to leaving their current jobs.
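
To make that concrete, here is a minimal sketch of keyword-based matching, roughly the kind of screening described above. The job keywords, candidate profiles and scoring function are all hypothetical, and production systems are far more sophisticated.

```python
# A hypothetical sketch of keyword-based candidate matching: score each
# profile by the fraction of the job's keywords it mentions.
JOB_KEYWORDS = {"python", "aws", "microservices", "agile"}

candidates = [
    {"name": "Candidate A", "profile": "Senior engineer: Python, AWS, built microservices"},
    {"name": "Candidate B", "profile": "Project manager: agile delivery, stakeholder reporting"},
]

def keyword_score(profile_text: str, keywords: set) -> float:
    """Fraction of job keywords that appear in the profile text."""
    words = set(profile_text.lower().replace(":", " ").replace(",", " ").split())
    return len(keywords & words) / len(keywords)

# Rank candidates by overlap with the job description's keywords.
for c in sorted(candidates, key=lambda c: keyword_score(c["profile"], JOB_KEYWORDS), reverse=True):
    print(c["name"], round(keyword_score(c["profile"], JOB_KEYWORDS), 2))
```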

All of this has proved beneficial to companies, HR employees and recruiters. It has, however, come at the expense of another group of people.

How Does AI Bias Happen?

As a technology, AI has no inherent concept of race or gender, but it does look for patterns, statistical data and trends. That can lead to AI bias when pervasive societal and institutional inequality shapes that data.

For example, where people receive their education can influence, among other things, their writing style, and AI can recognize that. This might lead the technology to screen out candidates who are otherwise perfectly qualified but never had the opportunities that produce what the AI considers a "winning" writing style.

The problem of AI bias can be seen clearly in the IT industry, where, according to one estimate, more than 77 percent of professionals are men and 59 percent are white. If you give an AI system every resume that resulted in a successful hire for a particular IT role and allow the AI to learn from that database, it will seek out the same sort of candidates.

Using those biased criteria, a hiring AI can shortlist or remove candidates from diverse backgrounds before a human ever sees their resumes, reinforcing a biased status quo instead of supporting efforts toward diversity, equity and inclusion.
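
A toy example makes the dynamic easier to see. The sketch below uses entirely hypothetical groups, features and numbers: it "learns" from a skewed set of past hires and then shortlists a balanced applicant pool, and the shortlist ends up skewed even though group membership is never given to the model, only a correlated proxy.

```python
# Toy illustration (hypothetical data) of how learning from past hires
# reproduces the existing workforce via a proxy feature.
import random

random.seed(0)

def make_candidate(group: str) -> dict:
    # Hypothetical proxy: the dominant group among past hires is more likely
    # to carry the "preferred_school" signal the model ends up rewarding.
    preferred_school = random.random() < (0.8 if group == "majority" else 0.3)
    return {"group": group, "preferred_school": preferred_school}

# Historical hires skew 80/20 toward the majority group.
past_hires = [make_candidate("majority") for _ in range(80)] + \
             [make_candidate("minority") for _ in range(20)]

# "Training": measure how common the proxy signal is among past hires.
signal_rate = sum(c["preferred_school"] for c in past_hires) / len(past_hires)

def shortlist(candidate: dict) -> bool:
    # The "model" requires whatever pattern dominated the past hires.
    if signal_rate > 0.5:
        return candidate["preferred_school"]
    return True

# New applicant pool: evenly split between the two groups.
applicants = [make_candidate("majority") for _ in range(500)] + \
             [make_candidate("minority") for _ in range(500)]

for group in ("majority", "minority"):
    pool = [a for a in applicants if a["group"] == group]
    rate = sum(shortlist(a) for a in pool) / len(pool)
    print(f"{group} shortlist rate: {rate:.2f}")
```

With these made-up numbers, the majority group is shortlisted at roughly 80 percent and the minority group at roughly 30 percent, mirroring the historical skew rather than the applicant pool.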

"What can companies do? For one, they can use machine learning tools built on data sets with equal representation of backgrounds, races and genders."

Arran Stewart, co-founder and chief visionary officer, Job.com

The Challenge of Course Correcting AI Bias

What can companies do? For one, they can use machine learning tools built on data sets with equal representation of backgrounds, races and genders. If a tech company is looking for staff, for example, it should build an AI training database with an equal spread of genders and races among people who have worked in technology. That would undoubtedly reduce the amount of unconscious bias.
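
One simple way to approximate that, assuming the historical records carry the relevant demographic fields, is to downsample every group to the size of the smallest one before the model learns from the data. The field names and counts below are hypothetical; reweighting examples instead of discarding them is a common alternative.

```python
# Minimal sketch: build a training set with equal representation per group.
import random
from collections import defaultdict

random.seed(0)

def balance_by_group(records: list, group_key: str) -> list:
    """Return a training set with an equal number of records per group."""
    by_group = defaultdict(list)
    for r in records:
        by_group[r[group_key]].append(r)
    n = min(len(v) for v in by_group.values())  # size of the smallest group
    balanced = []
    for group_records in by_group.values():
        balanced.extend(random.sample(group_records, n))
    random.shuffle(balanced)
    return balanced

# Hypothetical historical hires, heavily skewed toward one group.
history = [{"gender": "male"}] * 770 + [{"gender": "female"}] * 230
training_set = balance_by_group(history, "gender")
print(len(training_set), "records,",
      sum(r["gender"] == "female" for r in training_set), "female")
# -> 460 records, 230 female
```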

However, it’s not a foolproof approach. If you continuously allow AI to learn from every placement that’s made, and the lion’s share of the labor force remains white men, the AI will start to go down that rabbit hole again.

To really fight AI bias, businesses must fundamentally change their hiring processes to steer their AI systems toward diverse and inclusive hiring, closely monitoring their systems' progress to ensure they aren't drifting back toward biased patterns. Companies that don't build their own software can do that by requiring their suppliers to provide independent validation that their AI is doing everything it can to maximize the throughput of diverse candidates.

A company can request feedback regarding the entire hiring funnel. That constant checking is important, because the best way to find leaks in a bucket is to fill it with water.
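
In practice, that feedback can take the form of a stage-by-stage audit of selection rates. The sketch below uses hypothetical stage names and counts and flags any stage where a group's pass rate falls below four-fifths of the best-performing group's rate, a widely used rule of thumb for spotting adverse impact.

```python
# Audit a hypothetical hiring funnel for adverse impact, stage by stage.
# funnel[stage][group] = (entered_stage, passed_stage)
funnel = {
    "resume screen": {"group_a": (1000, 400), "group_b": (1000, 250)},
    "phone screen":  {"group_a": (400, 200),  "group_b": (250, 120)},
    "onsite":        {"group_a": (200, 60),   "group_b": (120, 35)},
}

def audit(funnel: dict, threshold: float = 0.8) -> None:
    for stage, groups in funnel.items():
        rates = {g: passed / entered for g, (entered, passed) in groups.items()}
        best = max(rates.values())
        for group, rate in rates.items():
            ratio = rate / best
            flag = "FLAG" if ratio < threshold else "ok"
            print(f"{stage:14s} {group}: pass rate {rate:.2f}, impact ratio {ratio:.2f} [{flag}]")

audit(funnel)
```

Run on these made-up numbers, only the resume screen gets flagged, which is exactly the kind of leak the bucket metaphor points at.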

Despite some progress in diverse hiring, the problem of bias is less about the AI than it is about the people who do the hiring. Human bias begets AI bias.

We can all agree that technology is great at doing tasks on behalf of humans and making us more efficient. What matters most is driving toward more equal, fair workplaces that provide opportunity for all. AI has a huge part to play in that, but businesses should ensure they don't allow the technology to perpetuate negative stereotypes. It is about driving continuous improvement.



Author: Arran Stewart, CVO, Job.com

Original Article Published: Biztech Magazine

Topics: Technology
