Children
Affected by Incidents
Incident 624 · 18 Reports
Child Sexual Abuse Material Taints Image Generators
2023-12-20
The LAION-5B dataset (a widely used dataset of more than 5 billion image-description pairs) was found by researchers to contain child sexual abuse material (CSAM), which increases the likelihood that downstream models will produce CSAM imagery. The discovery taints models built with the LAION dataset, requiring many organizations to retrain them. Additionally, LAION must now scrub the dataset of the imagery.
Incident 55 · 16 Reports
Alexa Plays Pornography Instead of Kids Song
2016-12-30
An Amazon Echo Dot running Amazon's Alexa software played pornographic content when a child asked it to play a song.
Incident 1 · 14 Reports
Google’s YouTube Kids App Presents Inappropriate Content
2015-05-19
YouTube’s content filtering and recommendation algorithms exposed children to disturbing and inappropriate videos.
Incident 958 · 11 Reports
Europol Operation Cumberland Investigates at Least 273 Suspects in 19 Countries for AI-Generated Child Sexual Abuse Material
2025-02-26
Europol’s Operation Cumberland uncovered a global network distributing AI-generated child sexual abuse material (CSAM). The operation has led to 25 arrests and 273 identified suspects across 19 countries. AI tools allow criminals to create exploitative content at scale with minimal technical expertise.
Related Entities
Other entities related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.
Incidents involved as both developer and deployer
- Incident 788 · 1 Report
Instagram's Algorithm Reportedly Recommended Sexual Content to Teenagers' Accounts
- Incident 583 · 1 Report
Instagram Algorithms Allegedly Promote Accounts Facilitating Child Sex Abuse Content