AI Incident Database

Incident 395: Amazon Forced Deployment of AI-Powered Cameras on Delivery Drivers

Description: Amazon delivery drivers were forced to consent to algorithmic collection and processing of their location, movement, and biometric data through AI-powered cameras, or be dismissed.


Entities

Alleged: Netradyne developed an AI system deployed by Amazon, which harmed Amazon delivery drivers.

Incident Stats

Incident ID: 395
Report Count: 4
Incident Date: 2021-03-02
Editors: Khoa Lam
Applied Taxonomies: MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.

2.1. Compromise of privacy by obtaining, leaking or correctly inferring sensitive information

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.

1. Privacy & Security

Entity

Which, if any, entity is presented as the main cause of the risk.

Human

Timing

The stage in the AI lifecycle at which the risk is presented as occurring.

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.

Intentional

Incident Reports


Surveilling Drivers Can't Fix Amazon's Road Safety Problem
vice.com · 2021

On the Clock is Motherboard's reporting on the organized labor movement, gig work, automation, and the future of work.

Amazon is planning on installing surveillance cameras inside vehicles in its delivery fleet to watch delivery drivers, Th…

Senators question Amazon about using cameras to monitor delivery drivers
cnbc.com · 2021

Five senators are calling on Amazon CEO Jeff Bezos to provide more information on the company's recent deployment of "surveillance cameras" in vehicles used by contracted delivery drivers.

In a letter Wednesday, Sens. Ed Markey of Massachus…

For this Amazon van driver, AI surveillance was the final straw
news.trust.org · 2021

March 19 (Thomson Reuters Foundation) – When Vic started delivering packages for Amazon in 2019, he enjoyed it - the work was physical, he liked the autonomy, and it let him explore new neighborhoods in Denver, Colorado.

But Vic, who asked …

Amazon Delivery Drivers Forced to Sign ‘Biometric Consent’ Form or Lose Job
vice.com · 2021

On the Clock is Motherboard's reporting on the organized labor movement, gig work, automation, and the future of work.

Amazon delivery drivers nationwide have to sign a "biometric consent" form this week that grants the tech behemoth permis…

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

By textual similarity

Kronos Scheduling Algorithm Allegedly Caused Financial Issues for Starbucks Employees
Aug 2014 · 10 reports

Amazon's AI Cameras Incorrectly Penalized Delivery Drivers for Mistakes They Did Not Make
Sep 2021 · 2 reports

Amazon Flex Drivers Allegedly Fired via Automated Employee Evaluations
Sep 2015 · 5 reports
