AI Incident Database

Incident 601: AI-Generated Fake Audio Depicting Verbal Abuse by British Labour Leader Keir Starmer Circulates

Status: Responded
Description: An AI-generated audio clip, purporting to capture UK opposition leader Keir Starmer verbally abusing staff, was debunked as fake. The clip, which circulated on social media, was analyzed and found to be likely manipulated, with background noise added to evade detection.

Entities

Alleged: An unknown actor developed and deployed an AI system, which harmed the UK Labour Party and Keir Starmer.

Incident Stats

  • Incident ID: 601
  • Report Count: 10
  • Incident Date: 2023-10-08
  • Editors:
  • Applied Taxonomies: MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain
A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.
4.3. Fraud, scams, and targeted manipulation

Risk Domain
The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.
Malicious Actors & Misuse

Entity
Which, if any, entity is presented as the main cause of the risk.
Human

Timing
The stage in the AI lifecycle at which the risk is presented as occurring.
Post-deployment

Intent
Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.
Intentional

Incident Reports

  • Top Tory defends Keir Starmer over scary 'deepfake' video ahead of AI summit (express.co.uk, post-incident response)
  • UK opposition leader targeted by AI-generated fake audio smear (therecord.media)
  • Deepfake Audio Is a Political Nightmare (wired.com)
  • 'Deepfake' Starmer clips posted during Labour conference in democracy 'threat' (mirror.co.uk)
  • Deepfake audio of Sir Keir Starmer released on first day of Labour conference (news.sky.com)
  • Keir Starmer suffers UK politics’ first deepfake moment. It won’t be the last (politico.eu)
  • Deepfakes warning after false video emerges of Keir Starmer at Labour conference (standard.co.uk)
  • The Starmer deepfake affair - letter to the editor (westcountryvoices.co.uk)
  • No evidence that audio clip of Keir Starmer supposedly swearing at his staff is genuine (fullfact.org)
  • Political Deepfake: Keir Starmer (resemble.ai)

Top Tory defends Keir Starmer over scary ‘deepfake' video ahead of AI summit
express.co.uk · 2023
Christian Calgie (post-incident response)

Keir Starmer appeared in a totally doctored AI clip going viral on Twitter this morning (Image: Leo Hutz Twitter / Getty)

Politicians were dealt a warning this morning about the potential disruptive power of AI in the next general election.…

UK opposition leader targeted by AI-generated fake audio smear
therecord.media · 2023

An audio clip posted to social media on Sunday, purporting to show Britain’s opposition leader Keir Starmer verbally abusing his staff, has been debunked as being AI-generated by private-sector and British government analysis.

The audio of …

Deepfake Audio Is a Political Nightmare
wired.com · 2023

As members of the UK’s largest opposition party gathered in Liverpool for their party conference—probably their last before the UK holds a general election—a potentially explosive audio file started circulating on X, formerly known as Twitt…

'Deepfake' Starmer clips posted during Labour conference in democracy 'threat'
mirror.co.uk · 2023

"Deepfake" clips of Keir Starmer released during Labour conference have sparked warnings over the threat to democracy from artificial intelligence (AI).

An AI-generated audio clip of the Labour leader appearing to berate a staff member has …

Deepfake audio of Sir Keir Starmer released on first day of Labour conference
news.sky.com · 2023

Deepfake videos of Sir Keir Starmer have been posted on the first day of Labour Party conference in a move that underlines the threat posed by deepfake technology and AI in UK politics.

The fake video of the Labour leader emerged on X, form…

Keir Starmer suffers UK politics’ first deepfake moment. It won’t be the last
politico.eu · 2023

LONDON — The United Kingdom wants to lead the world on AI safety, but at home it is struggling with its most urgent threat.

Fears over the proliferation of AI-generated media, known as deepfakes, intensified this weekend as an audio clip ap…

Deepfakes warning after false video emerges of Keir Starmer at Labour conference
standard.co.uk · 2023

Deepfake videos of Sir Keir Starmer have been shared online at the beginning of the Labour party conference in a move that emphasises the threat posed to UK politics by artificial intelligence.

Labour campaigners are set to be trained to fl…

The Starmer deepfake affair - letter to the editor
westcountryvoices.co.uk · 2023

Dear Editor,

An element of doubt is insidious, I thought, having read about the ‘recording’ posted on X (formerly twitter), purporting to be a tirade from Sir Keir Starmer, effing and blinding at his staff because they forgot to bring his t…

No evidence that audio clip of Keir Starmer supposedly swearing at his staff is genuine
fullfact.org · 2023

There is no evidence that an audio clip which has gone viral on X (formerly Twitter), allegedly of Labour leader Sir Keir Starmer swearing at a member of his staff, is genuine. 

The clip, which has over 1.5 million views at the time of writ…

Political Deepfake: Keir Starmer
resemble.ai · 2023

A recent incident has thrust the dangers of deepfake technology into the spotlight again. An audio clip surfaced on Twitter last week allegedly capturing British opposition leader Sir Keir Starmer swearing at staffers. But evidence suggests…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

Selected by our editors

  • Deepfake Recordings Allegedly Influence Slovakian Election (Oct 2023 · 11 reports)

By textual similarity

Did our AI mess up? Flag the unrelated incidents

  • Deepfake Obama Introduction of Deepfakes (Jul 2017 · 29 reports)
  • Wikipedia Vandalism Prevention Bot Loop (Feb 2017 · 6 reports)
  • Hackers Break Apple Face ID (Sep 2017 · 24 reports)
