AI Incident Database

Legal system

Incidents Harmed By

Incident 960 (11 Reports)
Plaintiffs' Lawyers Admit AI Generated Erroneous Case Citations in Federal Court Filing Against Walmart

2025-02-06

Lawyers Rudwin Ayala, T. Michael Morgan (Morgan & Morgan), and Taly Goody (Goody Law Group) were fined a total of $5,000 after their filing in a Wyoming federal lawsuit against Walmart cited fake cases "hallucinated" by AI. Judge Kelly Rankin sanctioned the attorneys, removed Ayala from the case, and noted that attorneys must verify AI-generated sources. After Walmart's legal team flagged the fabricated citations, the filing was withdrawn and an internal review was launched.


Incident 615 (4 Reports)
Colorado Lawyer Filed a Motion Citing Hallucinated ChatGPT Cases

2023-06-13

Colorado Springs attorney Zachariah Crabill mistakenly cited hallucinated, ChatGPT-generated legal cases in court documents. The false citations led to the denial of a motion and legal repercussions for Crabill, highlighting the risks of relying on AI for legal research.


Incident 704 (2 Reports)
Study Highlights Persistent Hallucinations in Legal AI Systems

2024-05-23

Stanford University's Human-Centered AI Institute (HAI) conducted a study using a "pre-registered dataset of over 200 open-ended legal queries" to test AI products by LexisNexis (creator of Lexis+ AI) and Thomson Reuters (creator of Westlaw AI-Assisted Research and Ask Practical Law AI). The researchers found that these legal models hallucinated on at least 1 in 6 benchmark queries.


Related Entities
Other entities that are related to the same incident. For example, if the developer of an incident is this entity but the deployer is another entity, they are marked as related entities.

Incident 615 (4 Reports): Colorado Lawyer Filed a Motion Citing Hallucinated ChatGPT Cases
  • Zachariah Crabill (harmed by; deployer)
  • OpenAI (developer)
  • ChatGPT (developer)
  • Zachariah Crabill's client (harmed by)

Incident 704 (2 Reports): Study Highlights Persistent Hallucinations in Legal AI Systems
  • Legal professionals (harmed by; deployer)
  • Law firms (deployer)
  • Organizations requiring legal research (deployer)
  • Thomson Reuters (developer)
  • LexisNexis (developer)
  • Clients of lawyers (harmed by)

Incident 960 (11 Reports): Plaintiffs' Lawyers Admit AI Generated Erroneous Case Citations in Federal Court Filing Against Walmart
  • Taly Goody (harmed by; deployer)
  • T. Michael Morgan (harmed by; deployer)
  • Rudwin Ayala (harmed by; deployer)
  • Morgan & Morgan (deployer)
  • Goody Law Group (deployer)
  • Unspecified large language model developer (developer)
  • Plaintiffs in Wyoming Walmart Hoverboard Lawsuit (harmed by)
  • Judicial integrity (harmed by)
  • Clients of Morgan & Morgan (harmed by)
  • Clients of Goody Law Group (harmed by)
  • Unspecified large language model (implicated system)

2023 - AI Incident Database