AI Incident Database

Incident 306: Tesla on Autopilot TACC Crashed into Van on European Highway

Description: A Tesla Model S operating on Autopilot's Traffic-Aware Cruise Control (TACC) feature crashed into a van parked on a European highway in heavy traffic, damaging the front of the car; the driver captured and published video of the crash.


Entities

Alleged: Tesla developed and deployed an AI system, which harmed an unnamed Tesla owner and Tesla drivers.

Incident Stats

Incident ID: 306
Report Count: 3
Incident Date: 2016-05-26
Editors: Khoa Lam
Applied Taxonomies: MIT

MIT Taxonomy Classifications

Machine-Classified
Taxonomy Details

Risk Subdomain

A further 23 subdomains create an accessible and understandable classification of hazards and harms associated with AI.

7.3. Lack of capability or robustness

Risk Domain

The Domain Taxonomy of AI Risks classifies risks into seven AI risk domains: (1) Discrimination & toxicity, (2) Privacy & security, (3) Misinformation, (4) Malicious actors & misuse, (5) Human-computer interaction, (6) Socioeconomic & environmental harms, and (7) AI system safety, failures & limitations.

7. AI system safety, failures, and limitations

Entity

Which, if any, entity is presented as the main cause of the risk.

AI

Timing

The stage in the AI lifecycle at which the risk is presented as occurring.

Post-deployment

Intent

Whether the risk is presented as occurring as an expected or unexpected outcome from pursuing a goal.

Unintentional
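The machine-classified labels above amount to a small structured record. As a sketch only, the classification for this incident could be represented as follows; the field names are hypothetical and do not reflect the database's actual export schema:

```python
# Illustrative record of Incident 306's MIT taxonomy classification.
# Field names are assumptions for illustration; the AIID's real schema may differ.
classification = {
    "incident_id": 306,
    "taxonomy": "MIT",
    "risk_domain": "7. AI system safety, failures, and limitations",
    "risk_subdomain": "7.3. Lack of capability or robustness",
    "entity": "AI",               # presented as the main cause of the risk
    "timing": "Post-deployment",  # stage in the AI lifecycle
    "intent": "Unintentional",    # unexpected outcome from pursuing a goal
}

# A consumer might filter incidents by subdomain, e.g.:
is_robustness_failure = classification["risk_subdomain"].startswith("7.3")
```

Such a record makes the classification easy to filter or aggregate across incidents, which is how taxonomy views like this one are typically queried.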

Incident Reports

Tesla Model S adaptive cruise control crashes into Van
youtu.be · 2016

Just to make it clear: The Tesla Model S is the absolute best car in the world at the moment. Nothing comes close.

But, in this case there was a problem with the driving aids and also security systems: None of the safety-systems worked corr…

Tesla Model S driver crashes into a van while on Autopilot
electrek.co · 2016

A Tesla Model S driver published a video of his car crashing into a van while on Autopilot, which acts as a great PSA reminding Tesla drivers not to always rely on the Autopilot and to be ready to take control at all times. In this particular ca…

Tesla Model S on Autopilot crashes into van parked on highway
cnet.com · 2016

Within a week of Tesla releasing Autopilot to the masses last fall, we started seeing some generally scary videos of people putting a little too much trust in the system. Well, we're continuing to see them, and in the latest video we sadly …

Variants

A "variant" is an incident that shares the same causative factors, produces similar harms, and involves the same intelligent systems as a known AI incident. Rather than index variants as entirely separate incidents, we list variations of incidents under the first similar incident submitted to the database. Unlike other submission types to the incident database, variants are not required to have reporting in evidence external to the Incident Database. Learn more from the research paper.

Similar Incidents

Selected by our editors
Self-driving cars in winter
Feb 2016 · 4 reports

Tesla Owner Activated "Smart Summon" Feature, Causing a Collision with an Aircraft in a Washington Airport
Apr 2022 · 8 reports

Tesla Autopilot Allegedly Malfunctioned in a Non-Fatal Collision in Greece
May 2018 · 4 reports

2023 - AI Incident Database
