Amazon AI Hiring System is Sexist

#1
Grand Vol
Joined Nov 23, 2012
We don't have an applicable thread, so figured it'd be good on its own.

Amazon scraps secret AI recruiting tool that showed bias against women | Reuters

Amazon Killed an AI Recruitment System Because It Couldn't Stop the Tool from Discriminating Against Women

According to a new Reuters report, Amazon spent years working on a system for automating the recruitment process. The idea was for this AI-powered system to be able to look at a collection of resumes and name the top candidates. To achieve this, Amazon fed the system a decade’s worth of resumes from people applying for jobs at Amazon.

The tech industry is famously male-dominated and, accordingly, most of those resumes came from men. So, trained on that selection of information, the recruitment system began to favor men over women.

According to Reuters’ sources, Amazon’s system taught itself to downgrade resumes with the word “women’s” in them, and to assign lower scores to graduates of two women-only colleges. Meanwhile, it decided that words such as “executed” and “captured,” which are apparently deployed more often in the resumes of male engineers, suggested the candidate should be ranked more highly.
 
#3
I think the term AI gets tossed around WAY too much.

Interesting that the AI bases its rankings on raw counts rather than percentages. I don't know whether that would actually help women, but going off raw counts is definitely going to skew things toward whichever group dominates the historical data.
 
#4
This looks like a bigger issue with the training data set than with this particular AI (or AI in general). If the examples used to train the AI don't specify which properties of the application/resume actually mattered to the hiring decision (read: if there's no context), then the AI has to infer importance from the raw data. That gives it no way to distinguish properties that merely reflect historical trends from properties that are genuinely relevant to the job, whenever either kind happens to correlate strongly with the outcomes.

Example: If most of the people hired over the last decade used a certain font on their resumes, and the training set includes this information without specifying in some way that it's irrelevant to the hiring decision, then we can expect the algorithm to discriminate against people who used a different font on their resume. It's meaningless noise in the training data set that happens to have a strong correlation with the outcomes, which is confusing to an AI trying to determine the context on its own.
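The font example above can be sketched in a few lines of code. This is a hypothetical toy model, not Amazon's system: it trains a plain logistic regression on simulated "resumes" where a `font_a` feature is irrelevant to the hiring label but correlated with it in the historical data, and shows the model learning a nonzero weight on the font anyway.

```python
import math
import random

random.seed(0)

# Toy dataset (hypothetical, for illustration). Each "resume" has:
#   skill  -- genuinely relevant: it fully determines the hire label
#   font_a -- causally irrelevant, but historically correlated with hiring
rows = []
for _ in range(2000):
    skill = random.random()
    hired = 1 if skill > 0.5 else 0
    # Suppose 90% of past hires happened to use font A, vs 30% of rejects.
    font_a = 1 if random.random() < (0.9 if hired else 0.3) else 0
    rows.append((skill, font_a, hired))

# Fit a plain logistic regression by batch gradient descent (no libraries).
w_skill, w_font, b = 0.0, 0.0, 0.0
lr, n = 0.5, len(rows)
for _ in range(300):
    g_s = g_f = g_b = 0.0
    for skill, font_a, hired in rows:
        p = 1.0 / (1.0 + math.exp(-(w_skill * skill + w_font * font_a + b)))
        err = p - hired
        g_s += err * skill
        g_f += err * font_a
        g_b += err
    w_skill -= lr * g_s / n
    w_font -= lr * g_f / n
    b -= lr * g_b / n

print(f"weight on skill:  {w_skill:.2f}")
# Positive weight on the font: the model rewarded meaningless noise that
# happened to correlate with past outcomes, exactly the failure described above.
print(f"weight on font A: {w_font:.2f}")
```

Nothing in the training data tells the model the font is irrelevant, so it treats the correlation as signal. Swap `font_a` for a gendered word like "women's" and you get the behavior in the Reuters report.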
 
#6
Guess I need to add executed and captured to my resume someplace. Probably not together though. That might not look very good.
 
#7
Doesn't have much to do with the topic, but I would leave my Gov job to work for Amazon. The hours are long, but they pay incredibly well at the management level.
 
