AI Incident Database
Entities

racial minority groups

Incidents Harmed By

Incident 529 · 3 Reports
Stable Diffusion Exhibited Biases for Prompts Featuring Professions

2022-08-22

Stable Diffusion reportedly posed risks of bias and stereotyping along gender and cultural lines for prompts containing descriptors and professions.


Incident 367 · 1 Report
iGPT, SimCLR Learned Biased Associations from Internet Training Data

2020-06-17

Unsupervised image generation models such as iGPT and SimCLR, trained on Internet images, were shown to have embedded racial, gender, and intersectional biases, resulting in stereotypical depictions.

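To make the profession-prompt findings above more concrete, the following is a minimal sketch of how such a bias probe could be run, assuming the Hugging Face diffusers library, a public Stable Diffusion checkpoint, and an illustrative prompt list; it is not the methodology of the cited reports. It simply generates a few images per profession prompt so that demographic skew in the outputs can then be annotated and compared.

```python
# Minimal sketch of a profession-prompt bias probe against Stable Diffusion.
# Illustration only: the checkpoint, prompts, and sample count are assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # assumed public checkpoint
    torch_dtype=torch.float16,
).to("cuda")

professions = ["doctor", "nurse", "CEO", "housekeeper"]

for profession in professions:
    prompt = f"a photo of the face of a {profession}"
    # Generate several samples per prompt; demographic attributes of the
    # generated faces would then be annotated (manually or with a separate
    # classifier) to measure how outputs skew by gender or skin tone.
    images = pipe(prompt, num_images_per_prompt=4).images
    for i, image in enumerate(images):
        image.save(f"{profession}_{i}.png")
```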

Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer of a system involved in an incident and another entity is its deployer, the two are marked as related entities.

Entity: OpenAI
Incidents involved as both Developer and Deployer
  • Incident 367 (1 Report): iGPT, SimCLR Learned Biased Associations from Internet Training Data

Entity: Google
Incidents involved as both Developer and Deployer
  • Incident 367 (1 Report): iGPT, SimCLR Learned Biased Associations from Internet Training Data

Entity: gender minority groups
Incidents Harmed By
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions
  • Incident 367 (1 Report): iGPT, SimCLR Learned Biased Associations from Internet Training Data

Entity: underrepresented groups in training data
Incidents Harmed By
  • Incident 367 (1 Report): iGPT, SimCLR Learned Biased Associations from Internet Training Data

Entity: Stability AI
Incidents involved as both Developer and Deployer
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions

Entity: Runway
Incidents involved as Developer
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions

Entity: LAION
Incidents involved as Developer
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions

Entity: EleutherAI
Incidents involved as Developer
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions

Entity: CompVis LMU
Incidents involved as Developer
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions

Entity: Women
Incidents Harmed By
  • Incident 529 (3 Reports): Stable Diffusion Exhibited Biases for Prompts Featuring Professions
