Tech Newsday

AI tools fail to reduce hiring bias

After a two-year study, researchers concluded that AI-powered hiring tools do not reduce bias or improve diversity in recruitment.

In 2020, an international survey of 500 human resources professionals found that nearly a quarter were already using AI to automate parts of talent acquisition.

Dr Kerry Mackereth, a postdoctoral fellow at Cambridge University’s Centre for Gender Studies, argues that using these tools to minimise prejudice is counterproductive.

“These tools can’t be trained to only identify job-related characteristics and strip out gender and race from the hiring process, because the kinds of attributes we think are essential for being a good employee are inherently bound up with gender and race,” Mackereth said.

According to the Cambridge University researchers, the use of AI in hiring is becoming more widespread, but relying on it to analyse job application videos or written applications is pseudoscientific and problematic, despite companies’ growing interest in new ways to address issues such as interviewer bias.

In a related exercise, six computer science students built their own simplified AI recruitment tool to rate candidates’ photos for the “big five” personality traits: agreeableness, extroversion, openness, conscientiousness, and neuroticism. The ratings were skewed by many irrelevant variables, suggesting that such tools are not fully equipped and ready for real-world use.

The sources for this piece include an article from the BBC.
