AI Incident Database

Incident 957: Alleged Instagram Algorithm Malfunction Floods Users’ Reels Feeds with Violent and Graphic Content

Description: An alleged Instagram algorithm malfunction flooded users’ Reels feeds with violent and distressing content. Many users reported seeing deaths, extreme brutality, and other graphic material in rapid succession, often without any prior engagement with similar content. The sudden exposure caused psychological distress, with minors and vulnerable individuals particularly affected. Meta confirmed an AI failure was responsible and apologized.


Entities

Alleged: Meta, Instagram recommendation algorithm, and Instagram Reels developed and deployed an AI system, which harmed Meta users, Instagram users, and minors.
Alleged implicated AI systems: Instagram recommendation algorithm and Instagram Reels

Incident Stats

Incident ID: 957
Report Count: 1
Incident Date: 2025-02-28

Incident Reports

Reports Timeline

Meta apologises over flood of gore, violence and dead bodies on Instagram
theguardian.com · 2025

Mark Zuckerberg’s Meta has apologised after Instagram users were subjected to a flood of violence, gore, animal abuse and dead bodies on their Reels feeds.

Users reported the footage after an apparent malfunction in Instagram’s algorithm, w…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity


YouTube's Algorithms Failed to Remove Violating Content Related to Suicide and Self-Harm
Feb 2019 · 3 reports

Images of Black People Labeled as Gorillas
Jun 2015 · 24 reports

TikTok’s Content Moderation Allegedly Failed to Adequately Take down Videos Promoting Eating Disorders
Dec 2020 · 1 report

2023 - AI Incident Database

  • Terms of use
  • Privacy Policy
  • Open twitterOpen githubOpen rssOpen facebookOpen linkedin
  • 30ebe76