AI bias can affect HR, just ask Amazon


Amazon’s experimental AI recruiting tool has been shuttered for having a strong bias against female candidates.

Machine learning is great at taking lots of data and finding correlations humans don’t see. In doing so, it can also reveal biases we didn’t realise we had. Case in point: Amazon has shut down its experimental recruitment tool because it was biased against female candidates.

According to a Reuters article, in 2014 an Amazon team began crafting software to review resumes. “Everyone wanted this holy grail,” one of Reuters’ sources said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

In 2015, Amazon first realised its ‘tech-savvy’ approach to recruitment wasn’t working out as it had hoped: for software developer positions and other technical roles, the tool was favouring male candidates.

“That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry,” writes Jeffrey Dastin, the author of the article. In 2017, Amazon’s leadership disbanded the team responsible for the recruitment tool.

The company told Reuters that the tool was never used to evaluate candidates, though it didn’t deny that those in charge of hiring looked at its recommendations.

How did the technology favour male candidates?

The system was trained on 10 years of resumes submitted to Amazon. By assessing patterns in that data, it learned to give little weight to technical credentials – such as coding skills, probably because they were common across applicants – and greater weight to other data points in candidates’ resumes. For instance, here are some of the signals through which it penalised women and favoured men for technical roles (a toy sketch of the mechanism follows the list below):

  • The inclusion of the word “women’s” – for example, when listing extracurricular activities like “women’s drama club leader”
  • Listing attendance at one of two all-women’s colleges, perhaps because no past Amazon employees had gone to either of them
  • The use of language more commonly found on men’s CVs, such as “executed” and “captured”, which the system favoured.
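
To make that mechanism concrete, here is a minimal, hypothetical sketch in Python – invented data and a generic classifier, not Amazon’s actual system, whose details were never published – showing how a screener trained to imitate biased historical hiring decisions ends up penalising the token “women”:

    # Hypothetical illustration only: a toy resume screener that imitates
    # historical hiring decisions. All data is invented; Amazon's real
    # model and features were never made public.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Past resumes and the historical decision the model will learn to copy.
    resumes = [
        "executed backend services, captured market data, java coding",
        "led women's chess club, python coding, strong test coverage",
        "executed migration plan, captured requirements, systems design",
        "women's drama club leader, c++ coding, shipped two products",
    ]
    hired = [1, 0, 1, 0]  # biased historical outcomes

    vec = CountVectorizer()
    X = vec.fit_transform(resumes)
    model = LogisticRegression().fit(X, hired)

    # The token "women" (from "women's") ends up with a negative weight,
    # and "executed"/"captured" with positive ones, purely because of the
    # patterns in the biased labels above.
    weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
    for token in ("women", "executed", "captured", "coding"):
        print(token, round(weights[token], 3))

Nothing in the code mentions gender as a feature; the bias arrives entirely through the labels the model is asked to imitate.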

Machine learning our mistakes

That last point dovetails with 2016 research [paywalled] from Princeton University focused on a machine learning technique called word embedding. This statistical approach tries to home in on the meaning of a word by analysing the words that commonly sit near it. For example, the researchers were able to replicate a previous study showing that word embeddings ‘discover’ that humans find flowers significantly more pleasant than insects.

As the researchers wrote: “Word embeddings ‘know’ these properties… with no direct experience of the world and no representation of semantics other than the implicit metrics of words’ co-occurrence statistics with other nearby words.”
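
A rough sketch of that idea, with invented co-occurrence counts: treat each word’s pattern of co-occurrence with nearby context words as its vector, and cosine similarity between those vectors then mirrors human associations such as the flowers/insects finding. Real embeddings like word2vec or GloVe are learned from billions of words, but the principle is the same:

    # Illustrative sketch of the co-occurrence idea with made-up counts.
    import numpy as np

    context_words = ["garden", "sting", "lovely", "crawl"]
    # Rows: how often each word appears near each context word (invented).
    cooccurrence = {
        "flower":   np.array([90.0, 2.0, 70.0, 1.0]),
        "insect":   np.array([10.0, 60.0, 3.0, 80.0]),
        "pleasant": np.array([60.0, 1.0, 85.0, 2.0]),
    }

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # "flower" sits in contexts much more like "pleasant" than "insect" does.
    print(cosine(cooccurrence["flower"], cooccurrence["pleasant"]))  # high
    print(cosine(cooccurrence["insect"], cooccurrence["pleasant"]))  # low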

More relevant to HR, the researchers found that female names were linked with “family” and male names with “career”. Similarly, maths and science were linked with male terms, and the arts with female terms. It didn’t stop at gender bias, either: the research also showed that machine learning is likely to associate young people’s names with “pleasant” and old people’s names with “unpleasant”.
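
The researchers quantified these links with a word-embedding association test: a target word’s average similarity to one attribute set (say, “family” words) minus its average similarity to another (“career” words). A simplified version of that score – with random placeholder vectors standing in for real pretrained embeddings – looks like this:

    # Simplified word-embedding association score in the spirit of the
    # Princeton study's test. Vectors here are random placeholders; the
    # real test uses pretrained embeddings such as GloVe.
    import numpy as np

    rng = np.random.default_rng(0)
    vocab = ["amy", "john", "home", "parents", "salary", "office"]
    emb = {w: rng.normal(size=50) for w in vocab}  # placeholder vectors

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    def association(word, attrs_a, attrs_b):
        # Mean cosine to set A minus mean cosine to set B; a positive
        # value means the word leans towards A.
        sim_a = np.mean([cosine(emb[word], emb[a]) for a in attrs_a])
        sim_b = np.mean([cosine(emb[word], emb[b]) for b in attrs_b])
        return sim_a - sim_b

    family, career = ["home", "parents"], ["salary", "office"]
    print(association("amy", family, career))   # in real embeddings: positive
    print(association("john", family, career))  # in real embeddings: negative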

“A lot of people are saying this is showing that AI is prejudiced. No. This is showing we’re prejudiced and that AI is learning it,” Joanna Bryson, a computer scientist at the University of Bath and one of the study’s authors, told The Guardian.

Amazon seems to have found the same thing in recruitment that the researchers warned about in language comprehension. They wrote: “If we build an intelligent system that learns enough about the properties of language to be able to understand and produce it, in the process it will also acquire historical cultural associations, some of which can be objectionable.”

Credit where credit isn’t due?

Writing for Slate, Jordan Weissmann said, “Amazon deserves some credit for realizing its tool had a problem, trying to fix it, and eventually moving on (assuming it didn’t have a serious impact on the company’s recruiting over the last few years).”

What Weissmann doesn’t point out is that years of biased hiring across the tech industry seem to be the root of the AI’s problem with women. The tool picked up on a clear pattern of bias against women and tried to replicate it.

Nor is the company now beyond tech’s bias against women and a model of inclusion. Just this year, Amazon had to navigate a potential PR disaster after it was reported that its board had opposed a proposal requiring it to shortlist qualified minority and women candidates for new board positions.

If Amazon – a growing provider of machine learning services and a company that is trying to leverage AI across all aspects of its business – can’t get this right, then maybe automating this particular HR function will not happen anytime soon.

Photo credit: Matan Zedev via Pexels


Identify ways to manage and reduce the likelihood of bias through effective HR policies and practices with the Ignition Training in-house course ‘Managing unconscious bias’.

Comments
Adrian Kaminski
6 years ago

Machines and humans have a lot of work to do in the recruitment space. It’s not about speed, it’s about accuracy. Recruitment is also not a “sales” role.

Joanna
6 years ago

Great article – thank you Girard
