New research from Monash Business School has found that, in the job application process, women believe artificial intelligence assessments reduce bias, while men fear they take away an advantage.
Professor Andreas Leibbrandt, from the Department of Economics, explored how artificial intelligence recruitment tools affect existing recruitment biases and whether there is a way to break down the barriers that prevent underrepresented groups from reaching their full potential and securing their desired roles.
“People in minority groups have inferior market outcomes, they earn less, they have a harder time finding and keeping a job. It is important to understand why this is the case so that we can identify and remove the barriers,” said Professor Leibbrandt.
One major hurdle lies in the recruitment process itself, which is undergoing a shift with the rise of AI. “We know that a large majority of organizations are now using AI in their recruitment process,” he said.
To uncover recruitment barriers, the research, the first of its kind, focused on two key areas: applicant behavior and recruiter bias.
In one field experiment, more than 700 applicants for a web designer position were told whether their application would be reviewed by AI or by a human.
“Women were significantly more likely to complete their application if they knew AI would be involved, while men were less likely to complete an application,” he said.
A second experiment focused on the behavior of 500 tech recruiters.
“We found that when recruiters knew the applicant’s gender, they consistently scored women lower than men. However, these biases disappeared completely when the applicant’s gender was hidden,” he said.
When recruiters had access to both the AI score and the applicant’s gender, there was also no gender difference in the score.
“This finding shows us that they are using AI as a tool and anchor – it helps remove gender bias in assessment.”
Professor Leibbrandt said a crucial aspect of the research was that, unlike the vast majority of current research, it focused on the human interaction with AI, rather than the algorithm behind it.
“My research isn’t just about dismantling bias, it’s about building a future of work where everyone has the opportunity to thrive,” he said.
Professor Leibbrandt is also exploring other frontiers in the fight for workplace inclusion.
One project will test the impact of informing applicants being assessed by AI about the potential bias in AI training data.
He also plans to tackle the concept of “narrative discrimination,” where unconscious stereotypes influence hiring decisions in the tech industry, and explore the potential for bias in remote work.
Provided by Monash University