AI Incident Database

Incident 310: High False Positive Rate by SWP's Facial Recognition Use at Champions League Final

Description: South Wales Police (SWP)’s automated facial recognition (AFR) deployment at the Champions League final football match in Cardiff wrongly identified innocent people as potential matches, at an extremely high false positive rate of more than 90%.


Entities

Alleged: NEC developed an AI system deployed by South Wales Police, which harmed and falsely accused Finals attendees.

Incident Stats

  • Incident ID: 310
  • Report Count: 8
  • Incident Date: 2017-06-03
  • Editors: Khoa Lam
  • Applied Taxonomies: MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.

7.3. Lack of capability or robustness

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.

1. AI system safety, failures, and limitations

Entity

Which, if any, entity is presented as the main cause of the risk.

AI

Timing

The stage in the AI lifecycle at which the risk is presented as occurring.

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.

Unintentional

Incident Reports


NEC provides facial recognition system to South Wales Police in the UK
nec.com · 2017

Tokyo & London, July 11, 2017 - NEC Corporation (NEC; TSE: 6701) today announced that it has provided a facial recognition system for South Wales Police in the UK through NEC Europe Ltd. The system utilizes NeoFace® Watch, NEC's flagship fa…

2,000 wrongly matched with possible criminals at Champions League
bbc.com · 2018

More than 2,000 people were wrongly identified as possible criminals by facial scanning technology at the 2017 Champions League final in Cardiff.

South Wales Police used the technology as about 170,000 people were in Cardiff for the Real Ma…

Facial recognition wrongly identified 2,000 people as possible criminals when Champions League final came to Cardiff
walesonline.co.uk · 2018

Facial recognition software wrongly identified more than 2,000 people as potential criminals as police patrolled the Champions League final in Cardiff.

The technology provided hundreds of “false positives” wrongly marking out innocent peopl…

UK police say 92% false positive facial recognition is no big deal
arstechnica.com · 2018

A British police agency is defending (this link is inoperable for the moment) its use of facial recognition technology at the June 2017 Champions League soccer final in Cardiff, Wales—among several other instances—saying that despite the sy…

British police defend their new criminal facial recognition technology – even though it's failing at a rate of 92%
businessinsider.com · 2018
  • Police in South Wales have been relying on facial recognition technology for 12 months.

  • An FOI request has revealed that the technology provides a "false positive" ID in more than 90% of cases.

  • The police have admitted that "of course…

EWCA Civ 1058 – R (Bridges) v. CC South Wales
judiciary.uk · 2020

Sir Terence Etherton MR, Dame Victoria Sharp PQBD and Lord Justice Singh:

This appeal concerns the lawfulness of the use of live automated facial recognition technology (“AFR”) by the South Wales Police Force (“SWP”) in an ongoing trial usi…

UK Court of Appeal Finds Automated Facial Recognition Technology Unlawful in Bridges v South Wales Police
huntonprivacyblog.com · 2020

On August 11, 2020, the Court of Appeal of England and Wales overturned the High Court’s dismissal of a challenge to South Wales Police’s use of Automated Facial Recognition technology (“AFR”), finding that its use was unlawful and violated…

UK Police Use of Facial Recognition Fails to Meet 'Legal And Ethical Standards'
pcmag.com · 2022

Use of live facial recognition technology by UK police fails to meet “minimum ethical and legal standards” and should be banned from application in public spaces, say researchers from the University of Cambridge.

A team of researchers at th…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

ETS Used Allegedly Flawed Voice Recognition Evidence to Accuse and Assess Scale of Cheating, Causing Thousands to be Deported from the UK
Jan 2014 · 1 report

UK passport photo checker shows bias against dark-skinned women
Oct 2020 · 1 report

Passport checker Detects Asian man's Eyes as Closed
Dec 2016 · 22 reports

2023 - AI Incident Database
