
Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women (Video)


Amazon software engineers uncovered a big problem: their new recruiting engine did not like women. Reuters’ Jeffrey Dastin explains.

Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.

The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars – much like shoppers rate products on Amazon, some of the people said.

“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.”


But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.

That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

Amazon edited the programs to make them neutral to these particular terms. But that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.
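The Reuters report does not describe how Amazon's models were built, but the failure mode is a general one. The minimal Python sketch below, using synthetic data and scikit-learn with all names hypothetical, illustrates how a model trained on historically male-dominated hiring outcomes can learn a negative weight for the token "women's", and why zeroing that single weight is no guarantee of neutrality.

```python
# Illustrative sketch only: Reuters does not describe Amazon's actual system.
# A tiny synthetic example of how a resume model trained on historically
# male-dominated hiring outcomes can learn to penalize the word "women's",
# and why editing out that one term does not guarantee neutrality.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: label 1 = hired in the historical (mostly male)
# applicant pool, 0 = not hired.
resumes = [
    "software engineer chess club captain",
    "software engineer robotics team lead",
    "software engineer women's chess club captain",
    "software engineer women's coding society organizer",
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The learned coefficient for the token "women" (the vectorizer drops the "'s")
# comes out negative: the model penalizes resumes containing it.
idx = vectorizer.vocabulary_["women"]
print("weight for 'women':", model.coef_[0, idx])

# "Editing the program to be neutral" to that term removes the explicit
# penalty, but tokens that co-occur with it in the training data (here,
# "society" or "organizer") still carry the same historical signal.
model.coef_[0, idx] = 0.0
```

Removing the explicit term still leaves correlated features in place, which is the "no guarantee" the people familiar with the project described to Reuters.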

Continue reading at Reuters: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G

Reuters, excerpt posted on SouthFloridaReporter.com, Oct. 10, 2018

Video by Reuters TV/Jeffrey Dastin

