Description: Kate Isaacs, a London-based activist and founder of the #NotYourPorn campaign, was targeted in a deepfake incident. Her face was allegedly digitally manipulated onto a pornographic video using AI and shared online. The video, tagged with her name, reportedly led to streams of abuse, doxing, and threats of violence. The attack reportedly followed her efforts to pressure PornHub to remove unverified content.
Editor Notes: A note on the timeline of this event: The specific timing of this incident remains unclear. The #NotYourPorn campaign is reported to have succeeded in pressuring PornHub to remove 10 million unverified videos in 2020. It is reported that the deepfaked video itself appeared sometime in 2020 as well. The initial reporting on the deepfake incident is dated 10/21/2022.
Entities
Alleged: Unknown deepfake technology developer developed an AI system deployed by Unknown Twitter user, which harmed Kate Isaacs.
Alleged implicated AI system: Unknown deepfake app
Incident Stats
Incident ID
904
Report Count
8
Incident Date
2022-10-21
Editors
Dummy Dummy
Incident Reports
Reports Timeline
Imagine if your face had been digitally edited into a porn video without your consent and then shared on the internet. One woman reveals the horror of it happening to her.
Scrolling through her Twitter feed one evening, Kate Isaacs stumbled …
Scrolling through her phone in 2020, Kate Isaacs opened up her Twitter to see a post that consumed her with sheer terror.
Someone had publicly tweeted an explicit video of what looked like her having sex.
With no recollection of ever being …