Item: Amazon scraps secret AI recruiting tool that showed bias against women (Thanks to Mark Charters for the tip.)

Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women. The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters… “Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.” But by 2015, the company realized its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way. That is because Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted to the company over a 10-year period. Most came from men, a reflection of male dominance across the tech industry.

We’ve seen before how “algorithms” come to be called “racist.” Read it.

Feed the algorithm, the curve-fitting AI, measures such as purple hair dye use, tampon purchases, video game purchases, and so forth, and it will, for the painfully obvious reasons, pick out men from women. Not with perfection, of course, but it will be pretty good.
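To make that concrete, here is a toy sketch with wholly invented purchase rates. A hand-written scoring rule stands in for the curve-fitter; the point is only that a rule which never sees a “sex” column, only sex-correlated purchases, still separates men from women pretty well:

```python
import random

random.seed(1)

# Hypothetical toy data: three purchase flags that correlate with sex.
# The rates below are invented for illustration, not real statistics.
def applicant():
    male = random.random() < 0.5
    if male:
        game, tampons, dye = (random.random() < 0.70,
                              random.random() < 0.05,
                              random.random() < 0.10)
    else:
        game, tampons, dye = (random.random() < 0.30,
                              random.random() < 0.60,
                              random.random() < 0.40)
    return male, game, tampons, dye

data = [applicant() for _ in range(2000)]

# A crude scoring rule -- no "sex" column anywhere -- of the kind a
# curve-fitter would rediscover on its own from the correlations.
def predict_male(game, tampons, dye):
    return (int(game) - int(tampons) - int(dye)) >= 0

accuracy = sum(predict_male(g, t, d) == m for m, g, t, d in data) / len(data)
print(round(accuracy, 3))
```

Not perfect, as the post says, but far better than the coin flip you would get if the purchases carried no information about sex.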

Then, since, as everybody knows but many don’t like knowing, men at the extremes are better at analytic tasks than non-men, an algorithm built to maximize a candidate’s predicted coding ability, fed not sex itself but measures highly predictive of sex, will still pick out more men than women. The algorithm will be “biased” (toward reality).

There are only two ways to keep the algorithm from suggesting more men than women: (1) feed it only measures that are in no way predictive of sex; but, since men (at the extremes) are better than non-men at coding, the algorithm will then do a lousy job of predicting coding success; or (2) instruct the algorithm to spit out Equality, which will also force it to do a rotten job.
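Option (2) can be sketched with a toy simulation under the post’s premise: equal mean “skill” by sex but wider male spread, so the extreme right tail is male-heavy. Every number here (variances, pool size, the cutoff k) is invented for illustration only:

```python
import random

random.seed(42)

# Toy pool: same mean skill for both sexes, but wider male spread,
# so the far right tail is male-heavy (the post's assumed premise).
N, k = 10_000, 100
pool = []
for _ in range(N):
    sex = "M" if random.random() < 0.5 else "F"
    spread = 1.3 if sex == "M" else 1.0
    pool.append((sex, random.gauss(0.0, spread)))

# Unconstrained algorithm: take the top k by skill.
top = sorted(pool, key=lambda p: p[1], reverse=True)[:k]
men_in_top = sum(1 for sex, _ in top if sex == "M")
mean_top = sum(skill for _, skill in top) / k

# Option (2), "Equality": force a 50/50 split by taking the top k/2
# from each sex separately.
by_sex = {"M": [], "F": []}
for sex, skill in pool:
    by_sex[sex].append(skill)
quota = (sorted(by_sex["M"], reverse=True)[:k // 2]
         + sorted(by_sex["F"], reverse=True)[:k // 2])
mean_quota = sum(quota) / k

print(men_in_top, round(mean_top, 2), round(mean_quota, 2))
```

The unconstrained top-k comes out mostly male, and the quota group’s average skill is lower, since the women ranked below the cut displace higher-scoring men: the “rotten job” the post describes.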

Equality is defined as the hope, in the absence of all evidence, that men and women are equal. But if men and women were equal, we would not even know to say “men” and “women.”

Bias is defined as a politically unacceptable result.
