Amazon scraps secret AI recruiting tool that showed bias against women:

"Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.”

Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said."

This is the kind of nonsense that inspired my colleagues at Scrapinghub to write ELI5:
With common linear models it can directly explain the per-word scoring and even highlight passages of text in a Jupyter notebook. It can also do black-box analysis. You can inspect the scores for a whole lexicon and pick up irrational biases from the dataset. I have used it and would never go without it now. :)
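The idea behind per-word explanations for a linear model is simple enough to sketch in a few lines. This is not the eli5 API itself, just an illustration with made-up weights: each word in the vocabulary has a learned weight, and a resume's score is the sum of the weights of its words, so you can report each word's contribution and spot a biased entry like "women's" immediately.

```python
# Sketch of per-word explanation for a linear bag-of-words scorer.
# The weights here are invented for illustration, not from a real model.
weights = {"executed": 1.2, "captured": 0.9, "women's": -1.5, "chess": 0.1}

def explain(resume_text):
    """Return each known word's contribution to the total score."""
    contributions = {}
    for token in resume_text.lower().split():
        word = token.strip(",.")
        if word in weights:
            contributions[word] = contributions.get(word, 0.0) + weights[word]
    return contributions

contribs = explain("women's chess club captain, executed projects")
# A large negative contribution on "women's" exposes the learned bias.
```

Scanning the whole weight dictionary sorted by weight is the "whole lexicon" view mentioned above: the most negative entries are exactly where dataset biases surface.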
