Amazon machine-learning recruitment tool


Amazon Accidentally Made an AI Recruitment Tool That Didn't Like Women


E-commerce giant Amazon reportedly had to scrap an AI recruitment tool because it was biased against female candidates.

According to a report from Reuters, Amazon began building the platform in 2014 to speed up the recruitment process by making it automated and more efficient. It rated resumes out of five, automatically ranking candidates by suitability.

Five people familiar with the effort told Reuters that the team had been building computer programs to review job applicants' resumes, with the aim of mechanising the search for top talent.

[Image: Amazon logo on a Samsung Galaxy smartphone. Photo by Christian Wiediger on Unsplash]

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars – much like shoppers rate products on Amazon, some of the people said.

“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”

However, by 2015 the company realised the tool was consistently ranking resumes that featured the word "women's" lower than those that didn't.

That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

Any resume that mentioned involvement in women's clubs or sports teams, for example, was downgraded.

Any artificial intelligence or machine learning technology is only as smart as the information it’s learning from. Amazon’s tool was designed to analyse hiring patterns over a ten-year period — a period when the vast majority of hires, especially engineers and data scientists, were male.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.
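The pattern described above can be illustrated with a toy scoring model. This is a minimal sketch on entirely made-up data, not Amazon's actual system: a naive per-token "hire rate" computed from a male-skewed hiring history ends up assigning a low score to any token, such as "womens", that rarely appeared in past hires, even though the token says nothing about ability.

```python
from collections import Counter

# Hypothetical, simplified hiring history: (resume tokens, hired?).
# Skewed the way ten years of mostly-male hires would be.
history = [
    (["software", "chess", "club"], True),
    (["software", "engineering"], True),
    (["data", "science"], True),
    (["software", "womens", "chess", "club"], False),
    (["womens", "college", "data"], False),
    (["engineering", "data"], True),
]

# For each token, count how often it appeared and how often the
# resume containing it led to a hire. This is the "pattern" the
# model learns from the data it is given.
seen, hired = Counter(), Counter()
for tokens, was_hired in history:
    for t in set(tokens):
        seen[t] += 1
        if was_hired:
            hired[t] += 1

def token_score(t):
    # Fraction of past resumes with this token that were hired.
    return hired[t] / seen[t] if seen[t] else 0.5

def rank(tokens):
    # Average token score: a crude stand-in for a resume rating.
    return sum(token_score(t) for t in tokens) / len(tokens)

# An otherwise identical resume is scored lower purely because it
# contains "womens", a token that past (skewed) hires rarely had.
print(rank(["software", "chess", "club"]))            # higher
print(rank(["software", "womens", "chess", "club"]))  # lower
```

No fairness constraint or malicious rule is involved anywhere: the skew in the historical data alone is enough to produce the penalty.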

This trend led to the tool ‘learning’ that Amazon did not want to hire female applicants.

According to Reuters, Amazon recruiters never relied on the tool entirely and stopped using it once the issue was raised.

As of 2017, Amazon's workforce was 40% female — a more equal gender split than Facebook (36%), Apple (32%), Google (31%) and Microsoft (26%).

The company managed to salvage some of what it learned from its failed AI experiment. It now uses a "much-watered down version" of the recruiting engine to help with some rudimentary chores, including culling duplicate candidate profiles from databases, one of the people familiar with the project said.
