Source: https://www.theguardian.com/australia-news/2025/may/14/people-interviewed-by-ai-for-jobs-face-discrimination-risks-australian-study-warns
People interviewed by AI for jobs face discrimination risks, Australian study warns
Data used to train artificial intelligence may not ‘reflect the demographic groups we have in Australia’, says researcher
Job candidates being interviewed by AI recruiters risk being discriminated against if they don’t have American accents, or are living with a disability, a new study has warned.
This month, videos of job candidates interacting with at-times faulty AI video interviewers as part of the recruitment process have been widely shared on TikTok.
The use of AI video recruitment has grown in recent years. HireVue, an AI recruitment software company used by many employers, reported in February that, among 4,000 employers surveyed worldwide, AI use in hiring had risen from 58% in 2024 to 72% in 2025.
Australian research published this month estimates the use is significantly lower – about 30% in Australian organisations – but expected to grow in the next five years.
However, the paper, by Dr Natalie Sheard, a University of Melbourne law school researcher, warns the use of AI hiring systems to screen and shortlist candidates risks discriminating against applicants, due to biases introduced by the limited datasets the AI models were trained on.
In her research, Sheard interviewed 23 human resources professionals in Australia on their use of AI in recruitment. Of these, 13 had used AI recruitment systems in their companies, with the most common tool being CV analysis systems, followed by video interviewing systems.
Datasets built on limited information that often favours American data over international data present a risk of bias in those AI systems, Sheard said. One AI systems company featured in Sheard’s research, for example, has said only 6% of its job applicant training data came from Australia or New Zealand, and 33% of the job applicants in the training data were white.
The same company has said, according to the paper, that its word error rate for transcription of English-language speakers in the US is less than 10%, on average. However, for non-native English speakers with accents from other countries, the error rate rises to between 12% and 22%, with the highest figure recorded for non-native English speakers from China.
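For readers unfamiliar with the metric, word error rate is conventionally defined as the word-level edit distance (substitutions, deletions and insertions) between a transcript and what was actually said, divided by the number of words spoken. A minimal sketch of that standard definition (not the vendor’s own implementation, which is not public):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: (substitutions + deletions + insertions) / reference word count."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Levenshtein distance over words, via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all remaining reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all remaining hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + cost, # substitution (or match)
            )
    return d[len(ref)][len(hyp)] / len(ref)

# One misheard word in a five-word answer is a 20% error rate:
print(wer("i have five years experience", "i have hive years experience"))  # 0.2
```

At the error rates cited in the paper, roughly one word in five of some candidates’ answers could be transcribed wrongly before the recruitment algorithm ever scores them.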
“The training data will come from the country where they’re built – a lot of them are built in the US, so they don’t reflect the demographic groups we have in Australia,” Sheard said.
Research participants told Sheard that non-native English speakers or those with a disability affecting their speech could find their words not being transcribed correctly, and would then not be rated highly by the recruitment algorithm.
This prompted two of the participants to seek reassurance from their software vendor that it did not disadvantage candidates with accents. Sheard said they were given reassurances that the AI was “really good at understanding accents”, but no evidence was provided to support this.
Sheard said there was little to no transparency about the AI interview systems used – for potential recruits, recruiters or employers alike.
“This is the problem. In a human process, you can go back to the recruiter and ask for feedback, but what I found is recruiters don’t even know why the decisions have been made, so they can’t give feedback,” she said.
“That’s a problem for job seekers … It’s really hard to pick where liability lies, but absolutely vendors and employers are legally liable for any discrimination by these systems.”
No case of AI discrimination has yet reached the courts in Australia, Sheard said; any instance of discrimination must first go to the Australian Human Rights Commission.
In 2022, the federal merit protection commissioner revealed 11 promotion decisions in Services Australia in the previous year had been overturned, after the agency outsourced the process to a recruitment specialist that used AI automated selection techniques, including psychometric testing, questionnaires and self-recorded video responses.
It was found that the selection process “did not always meet the key objective of selecting the most meritorious candidates”.
Sheard said the returned Albanese Labor government should consider a specific AI act to regulate the use of AI, and potentially strengthen existing discrimination laws to guard against AI-based discrimination.