AI Incident Database

Incident 37: Female Applicants Down-Ranked by Amazon Recruiting Tool

Summary: Amazon shut down an internal AI recruiting tool that down-ranked female applicants.


Entities

Alleged: AI system developed and deployed by Amazon, which harmed female applicants.

Incident Stats

Incident ID
37
Report Count
33
Incident Date
2016-08-10
Editors
Sean McGregor
Applied Taxonomies
CSETv0, CSETv1, GMF, MIT

CSETv1 Taxonomy Classifications

Taxonomy Details

Incident Number

The number of the incident in the AI Incident Database.
 

37

CSETv0 Taxonomy Classifications

Taxonomy Details

Problem Nature

Indicates which, if any, of the following types of AI failure describe the incident: "Specification," i.e. the system's behavior did not align with the true intentions of its designer, operator, etc; "Robustness," i.e. the system operated unsafely because of features or changes in its environment, or in the inputs the system received; "Assurance," i.e. the system could not be adequately monitored or controlled during operation.
 

Specification

Physical System

Where relevant, indicates whether the AI system(s) was embedded into or tightly associated with specific types of hardware.
 

Software only

Level of Autonomy

The degree to which the AI system(s) functions independently of human intervention. "High" means no human is involved in executing the system's action; "Medium" means the system generates a decision and a human oversees the resulting action; "Low" means the system generates decision-support output and a human makes the decision and executes the action.
 

Medium

Nature of End User

"Expert" if users with special training or technical expertise were the ones meant to benefit from the AI system(s)’ operation; "Amateur" if the AI systems were primarily meant to benefit the general public or untrained users.
 

Expert

Public Sector Deployment

"Yes" if the AI system(s) involved in the accident were being used by the public sector or for the administration of public goods (for example, public transportation); "No" if the system(s) were being used in the private sector or for commercial purposes (for example, a ride-sharing company).
 

No

Data Inputs

A brief description of the data that the AI system(s) used or were trained on.
 

Resumes

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of the hazards and harms associated with AI.
 

1.1. Unfair discrimination and misrepresentation

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
 
  1. Discrimination and Toxicity

Entity

Which, if any, entity is presented as the main cause of the risk
 

AI

Timing

The stage in the AI lifecycle at which the risk is presented as occurring
 

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal
 

Unintentional

Incident Reports

Report Timeline

  • Amazon scraps 'sexist AI' recruitment tool · independent.co.uk
  • Amazon scraps 'sexist AI' recruiting tool that showed bias against women · telegraph.co.uk
  • How Amazon Accidentally Invented a Sexist Hiring Algorithm · inc.com
  • Amazon Killed Its AI Recruitment System For Bias Against Women-Report · fortune.com
  • Amazon scrapped 'sexist AI' tool · bbc.com
  • Amazon scraps secret AI recruiting tool that showed bias against women · reuters.com
  • Amazon's AI hiring tool discriminated against women. · slate.com
  • Amazon's AI recruitment tool scrapped for being sexist · alphr.com
  • Amazon Fired Its Resume-Reading AI for Sexism · popularmechanics.com
  • It turns out Amazon’s AI hiring tool discriminated against women · siliconrepublic.com
  • Amazon ditches sexist AI · information-age.com
  • Amazon's sexist recruiting algorithm reflects a larger gender bias · mashable.com
  • Amazon abandoned sexist AI recruitment tool · channels.theinnovationenterprise.com
  • Amazon ditched AI recruiting tool that favored men for technical jobs · theguardian.com
  • Amazon Accidentally Created A 'Sexist' Recruitment Tool, Then Shut It Down · aplus.com
  • Amazon Shuts Down AI Hiring Tool for Being Sexist · globalcitizen.org
  • Amazon built an AI tool to hire people but had to shut it down because it was discriminating against women · businessinsider.com.au
  • Amazon scraps ‘sexist’ AI hiring tool · news.com.au
  • Amazon ditches AI recruitment tool that 'learnt to be sexist' · afr.com
  • Amazon Shuts Down Secret AI Recruiting Tool That Taught Itself to be Sexist · interestingengineering.com
  • Amazon trained a sexism-fighting, resume-screening AI with sexist hiring data, so the bot became sexist · boingboing.net
  • Amazon scraps sexist AI recruiting tool · radionz.co.nz
  • Amazon AI sexist tool scrapped · insights.tmpw.co.uk
  • Is Tech Doomed To Reflect The Worst In All Of Us? · tech.co
  • Amazon Shuts Down It’s Problematic Sexist AI Recruitment System · mansworldindia.com
  • Is AI Sexist? · wellesley.edu
  • Amazon scraps 'sexist AI' recruiting tool that showed bias against women · msn.com
  • Why Amazon's sexist AI recruiting tool is better than a human. · imd.org
  • 2018 in Review: 10 AI Failures · medium.com
  • Sexist AI: Amazon ditches recruitment tool that turned out to be anti-women · rt.com
  • New York City proposes regulating algorithms used in hiring · arstechnica.com
  • Auditors are testing hiring algorithms for bias, but there’s no easy fix · technologyreview.com
  • AI tools fail to reduce recruitment bias - study · bbc.com

Amazon scraps 'sexist AI' recruitment tool
independent.co.uk · 2018

Amazon has scrapped a “sexist” tool that used artificial intelligence to decide the best candidates to hire for jobs.

Members of the team working on the system said it effectively taught itself that male candidates were preferable.

The arti…

Amazon scraps 'sexist AI' recruiting tool that showed bias against women
telegraph.co.uk · 2018

Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…

How Amazon Accidentally Invented a Sexist Hiring Algorithm
inc.com · 2018

Amazon discovered a problem with using artificial intelligence to hire: their AI was biased against women.

The Seattle-based company developed computer programs designed to filter through hundreds of resumes and surface the best candidates,…

Amazon Killed Its AI Recruitment System For Bias Against Women-Report
fortune.com · 2018

Machine learning, one of the core techniques in the field of artificial intelligence, involves teaching automated systems to devise new ways of doing things, by feeding them reams of data about the subject at hand. One of the big fears here…

Amazon scrapped 'sexist AI' tool
bbc.com · 2018

[Image caption: The algorithm repeated bias towards men, reflected in the technology industry]

An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, acco…

Amazon scraps secret AI recruiting tool that showed bias against women
reuters.com · 2018

SAN FRANCISCO (Reuters) - Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.

The team had been building computer programs since 2014 to review job applicants’ resu…
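Reuters reported that the model penalized resumes referencing the word "women's" because it learned from a decade of male-dominated hiring outcomes. As a minimal sketch of that failure mode (a hypothetical bag-of-words scorer on toy data, not Amazon's actual system), token weights fit to biased historical decisions pass the bias straight through to new rankings:

```python
from collections import Counter
from math import log

# Toy historical hiring data (hypothetical): resumes as token sets plus
# the past human decision. The historical pool is male-dominated, so a
# token correlated with female applicants ("women's") appears mostly
# among rejections.
history = [
    ({"python", "java", "chess"}, 1),
    ({"python", "c++", "football"}, 1),
    ({"java", "linux", "chess"}, 1),
    ({"python", "women's", "chess"}, 0),
    ({"java", "women's", "volleyball"}, 0),
    ({"python", "linux"}, 1),
]

def token_weights(data, alpha=1.0):
    """Smoothed log-odds of being hired, per token: a stand-in for the
    learned feature weights of a resume-ranking model."""
    hired, rejected = Counter(), Counter()
    for tokens, label in data:
        (hired if label else rejected).update(tokens)
    vocab = set(hired) | set(rejected)
    return {t: log((hired[t] + alpha) / (rejected[t] + alpha)) for t in vocab}

weights = token_weights(history)

def score(resume_tokens):
    """Rank a new resume by summing its token weights."""
    return sum(weights.get(t, 0.0) for t in resume_tokens)

# The proxy token inherits a negative weight from the biased history,
# so two otherwise-identical resumes are ranked apart.
print(weights["women's"] < 0)                     # True
print(score({"python", "chess"}) >
      score({"python", "chess", "women's"}))      # True
```

The model never sees a gender label; a correlated proxy token in the training data is enough to reproduce the historical skew.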

Amazon's AI hiring tool discriminated against women.
slate.com · 2018


Thanks to Amazon, the world has a nifty new cautionary tale about the perils of teaching computers to make human decisions.

According to a Reuters report published Wednesday, the tech giant d…

Amazon's AI recruitment tool scrapped for being sexist
alphr.com · 2018

Amazon has been forced to scrap its AI recruitment system after it was discovered to be biased against female applicants.

The AI was developed in 2014 by Amazon as a way of filtering out most candidates to provide the firm with the top five…

Amazon Fired Its Resume-Reading AI for Sexism
popularmechanics.com · 2018

Algorithms are often pitched as being superior to human judgement, taking the guesswork out of decisions ranging from driving to writing an email. But they're still programmed by humans and trained on the data that humans create, which mean…

It turns out Amazon’s AI hiring tool discriminated against women
siliconrepublic.com · 2018

Amazon had to scrap its AI hiring tool because it was ‘sexist’ and discriminated against female applicants, a report from Reuters has found.

Amazon’s hopes for creating the perfect AI hiring tool were dashed when it realised that the algori…

Amazon ditches sexist AI
information-age.com · 2018

Amazon ditches sexist AI

It’s not news to learn that AI can be something of a bigot.

Amazon scrapped an algorithm designed to become a recruitment tool because it was too sexist.

Did you hear the one about my wife — well, she… is a really n…

Amazon's sexist recruiting algorithm reflects a larger gender bias
mashable.com · 2018

AI may have sexist tendencies. But, sorry, the problem is still us humans.

Amazon recently scrapped an employee recruiting algorithm plagued with problems, according to a report from Reuters. Ultimately, the applicant screening algorithm di…

Amazon abandoned sexist AI recruitment tool
channels.theinnovationenterprise.com · 2018

Amazon decided to scrap a machine learning (ML) algorithm it was creating to help automate the recruitment process because the model kept favoring male candidates, Reuters revealed. The discrimination against female candidates has been put …

Amazon ditched AI recruiting tool that favored men for technical jobs
theguardian.com · 2018

Specialists had been building computer programs since 2014 to review résumés in an effort to automate the search process

Amazon’s machine-learning specialists unc…

Amazon Accidentally Created A 'Sexist' Recruitment Tool, Then Shut It Down
aplus.com · 2018

Machine learning technology is becoming increasingly common across various industries, from policing to recruiting. But reports have shown that many of these systems have long-standing problems regarding discrimination. To avoid amplifying …

Amazon Shuts Down AI Hiring Tool for Being Sexist
globalcitizen.org · 2018

Why Global Citizens Should Care

Gender discrimination in the workplace prevents women from achieving to their full potential. Eliminating gender inequality in the workforce would greatly increase economic activity. When half of the populati…

Amazon built an AI tool to hire people but had to shut it down because it was discriminating against women
businessinsider.com.au · 2018

[Image: Amazon CEO Jeff Bezos. David Ryder/Getty Images]

Amazon tried building an artificial-intelligence tool to help with recruiting, but it showed a bias against women,Reuters reports.

Engineers reportedly found the AI was unfavorable toward fema…

Amazon scraps ‘sexist’ AI hiring tool
news.com.au · 2018

What is artificial intelligence (AI)? We look at the progress of AI and automation in Australia compared to the rest of the world and how the Australian workforce may be affected by this movement.


Amazon ditches AI recruitment tool that 'learnt to be sexist'
afr.com · 2018

London | Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The program was created by a team at Amazon's Edinburgh office in 2014 as a way to sort through CVs and pick out the m…

Amazon Shuts Down Secret AI Recruiting Tool That Taught Itself to be Sexist
interestingengineering.com · 2018

Artificial intelligence (AI) human resourcing tools are all the rage at the moment and becoming increasingly popular. The systems can speed up, simplify and even decrease the cost of the hiring process becoming every recruiter's dream come …

Amazon trained a sexism-fighting, resume-screening AI with sexist hiring data, so the bot became sexist
boingboing.net · 2018

Amazon trained a sexism-fighting, resume-screening AI with sexist hiring data, so the bot became sexist

Some parts of machine learning are incredibly esoteric and hard to grasp, surprising even seasoned computer science pros; other parts of…

Amazon scraps sexist AI recruiting tool
radionz.co.nz · 2018

Amazon has scrapped its artificial intelligence hiring tool after it was found to be sexist.


A team of specialists familiar with the project told Reuters that they had been building computer programmes since 2014…

Amazon AI sexist tool scrapped
insights.tmpw.co.uk · 2018

So AI may be the future in hiring and recruitment but it certainly isn't there yet it seems.

If you're basing it's learning on history which quite possibly may have been biased towards men, then it is likely that it will discriminate agains…

Is Tech Doomed To Reflect The Worst In All Of Us?
tech.co · 2018

Amazon’s AI gurus scrapped a new machine-learning recruiting engine earlier this month. Why? It transpired that the AI behind it was sexist. What does this mean as we race to produce ever-better artificial intelligence, and how can we under…

Amazon Shuts Down It’s Problematic Sexist AI Recruitment System
mansworldindia.com · 2018

The tech giant canned their experimental recruitment system riddled with problems, according to Reuters.

Amazon, back in 2014, set up the recruiting system in place, hoping to mechanize the entire hiring process. It used artificial intellig…

Is AI Sexist?
wellesley.edu · 2018

Amazon recently scrapped an experimental artificial intelligence (AI) recruiting tool that was found to be biased against women. At this point, I hope you might have a few questions, such as: What is an AI recruiting tool and how does it wo…

Amazon scraps 'sexist AI' recruiting tool that showed bias against women
msn.com · 2018

Amazon has scrapped a "sexist" internal tool that used artificial intelligence to sort through job applications.

The AI was created by a team at Amazon's Edinburgh office in 2014 as a way to automatically sort through CVs and pick out the m…

Why Amazon's sexist AI recruiting tool is better than a human.
imd.org · 2018

However, bias also appears for other unrelated reasons. A recent study into how an algorithm delivered ads promoting STEM jobs showed that men were more likely to be shown the ad, not because men were more likely to click on it, but because…

2018 in Review: 10 AI Failures
medium.com · 2018

Last December Synced compiled its first “Artificial Intelligence Failures” recap of AI gaffes from the previous year. AI has achieved remarkable progress, and many scientists dream of creating the Master Algorithm proposed by Pedro Domingos…

Sexist AI: Amazon ditches recruitment tool that turned out to be anti-women
rt.com · 2019

It was supposed to make finding the right person for the job easier. However, an AI tool developed by Amazon to sift through potential hires has been dropped by the firm after developers found it was biased against picking women.

From prici…

New York City proposes regulating algorithms used in hiring
arstechnica.com · 2021

In 1964, the Civil Rights Act barred the humans who made hiring decisions from discriminating on the basis of sex or race. Now, software often contributes to those hiring decisions, helping managers screen résumés or interpret video intervi…

Auditors are testing hiring algorithms for bias, but there’s no easy fix
technologyreview.com · 2021

I’m at home playing a video game on my computer. My job is to pump up one balloon at a time and earn as much money as possible. Every time I click “Pump,” the balloon expands and I receive five virtual cents. But if the balloon pops before …
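One common screen such auditors apply is the EEOC's "four-fifths rule": if any group's selection rate falls below 80% of the highest group's rate, the tool is flagged for potential adverse impact. A minimal sketch over hypothetical audit records (illustrating the rule, not any particular auditor's methodology):

```python
def selection_rates(outcomes):
    """Per-group selection rate from (group, selected) audit records."""
    totals, selected = {}, {}
    for group, picked in outcomes:
        totals[group] = totals.get(group, 0) + 1
        selected[group] = selected.get(group, 0) + int(picked)
    return {g: selected[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Adverse-impact screen: True means the group's selection rate is
    at least 80% of the best-performing group's rate."""
    top = max(rates.values())
    return {g: r / top >= 0.8 for g, r in rates.items()}

# Hypothetical audit log of a screening tool's pass/fail decisions.
audit = ([("men", True)] * 60 + [("men", False)] * 40
         + [("women", True)] * 30 + [("women", False)] * 70)

rates = selection_rates(audit)   # men: 0.6, women: 0.3
print(four_fifths_check(rates))  # women fall below the 80% threshold
```

The rule is only a coarse first screen; it says nothing about why the rates diverge, which is where the harder auditing work described above begins.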

AI tools fail to reduce recruitment bias - study
bbc.com · 2022

Artificially intelligent hiring tools do not reduce bias or improve diversity, researchers say in a study.

"There is growing interest in new ways of solving problems such as interview bias," the Cambridge University researchers say, in the …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

Did our AI mess up? Flag the unrelated incidents

AI Beauty Judge Did Not Like Dark Skin
A beauty contest was judged by AI and the robots didn't like dark skin
Sep 2016 · 10 reports

Biased Sentiment Analysis
Google's sentiment analysis API is just as biased as humans
Oct 2017 · 7 reports

Racist AI behaviour is not a new problem
Mar 1998 · 4 reports

Research

  • Defining an "AI Incident"
  • Defining an "AI Incident Response"
  • Database Roadmap
  • Related Work
  • Download Complete Database

Project and Community

  • About AIID
  • Contact and Follow
  • Apps and Summaries
  • Editor's Guide

Incidents

  • All Incidents in List Form
  • Flagged Incidents
  • Submission Queue
  • Classifications View
  • Taxonomies

2023 - AI Incident Database

  • Terms of Use
  • Privacy Policy