Instagram users
Incidents Harmed By
Incident 267 · 10 Reports
Clearview AI Algorithm Built on Photos Scraped from Social Media Profiles without Consent
2017-06-15
Clearview AI's face-matching algorithm was built using images scraped from social media sites such as Instagram and Facebook without user consent, violating those platforms' policies and, allegedly, privacy regulations.
Incident 324 · 6 Reports
GAN Faces Deployed by The BL's Fake Account Network to Push Pro-Trump Content on Meta Platforms
2019-11-12
A large network of pages, groups, and fake accounts with GAN-generated profile photos, associated with The BL, a US-based media outlet, reportedly bypassed Facebook's moderation systems to push pro-Trump narratives on Facebook and Instagram.
Incident 190 · 4 Reports
ByteDance Allegedly Trained "For You" Algorithm Using Content Scraped without Consent from Other Social Platforms
2017-01-15
ByteDance allegedly scraped short-form videos, usernames, profile pictures, and account descriptions from Instagram, Snapchat, and other sources, and uploaded them without consent to Flipagram, TikTok’s predecessor, to improve its “For You” algorithm's performance for American users.
Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women's Bodies
2006-02-25
Automated content moderation tools used to detect sexual explicitness or "raciness" reportedly exhibited bias against women's bodies, suppressing the reach of images that did not break platform policies.
Incidents involved as Deployer
Incident 797 · 5 Reports
Teenager Makes Deepfake Pornography of 50 Girls at Bacchus Marsh Grammar School in Australia
2024-06-07
At Bacchus Marsh Grammar, a school in the Australian state of Victoria, a teenage boy made deepfake pornography of 50 girls in years 9 to 12. He then allegedly uploaded the images to Instagram, and others subsequently shared them on Snapchat.
Related Entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, the other entity is marked as a related entity.
Incidents involved as both Developer and Deployer
- Incident 343 · 2 Reports
Facebook, Instagram, and Twitter Failed to Proactively Remove Targeted Racist Remarks via Automated Systems
- Incident 359 · 1 Report
Facebook, Instagram, and Twitter Cited Errors in Automated Systems as Cause for Blocking pro-Palestinian Content on Israeli-Palestinian Conflict
Incidents involved as Deployer
- Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women's Bodies
- Incident 758 · 1 Report
Teen's Overdose Reportedly Linked to Meta's AI Systems Failing to Block Ads for Illegal Drugs
Incidents implicated systems
Incidents involved as both Developer and Deployer
- Incident 343 · 2 Reports
Facebook, Instagram, and Twitter Failed to Proactively Remove Targeted Racist Remarks via Automated Systems
- Incident 331 · 2 Reports
Bug in Instagram’s “Related Hashtags” Algorithm Allegedly Caused Disproportionate Treatment of Political Hashtags
Incidents involved as Deployer
- Incident 469 · 3 Reports
Automated Adult Content Detection Tools Showed Bias against Women's Bodies
- Incident 723 · 2 Reports
Instagram Algorithms Reportedly Directed Children's Merchandise Ad Campaign to Adult Men and Sex Offenders
Incidents implicated systems
Meta
Incidents involved as both Developer and Deployer
- Incident 723 · 2 Reports
Instagram Algorithms Reportedly Directed Children's Merchandise Ad Campaign to Adult Men and Sex Offenders
- Incident 885 · 2 Reports
Meta AI Characters Allegedly Exhibited Racism, Fabricated Identities, and Exploited User Trust