Entities
Legal system
Incidents Harmed By
Incident 615: 4 Reports
Colorado Lawyer Filed a Motion Citing Hallucinated ChatGPT Cases
2023-06-13
A Colorado Springs attorney, Zachariah Crabill, mistakenly cited hallucinated, ChatGPT-generated legal cases in court documents. The AI provided false case citations, leading to the denial of a motion and legal repercussions for Crabill and highlighting the risks of using AI for legal research.
Incident 704: 2 Reports
Study Highlights Persistent Hallucinations in Legal AI Systems
2024-05-23
Stanford University’s Human-Centered AI Institute (HAI) conducted a study in which researchers designed a "pre-registered dataset of over 200 open-ended legal queries" to test AI products by LexisNexis (creator of Lexis+ AI) and Thomson Reuters (creator of Westlaw AI-Assisted Research and Ask Practical Law AI). The researchers found that these legal models hallucinate in at least 1 out of 6 benchmarking queries.
Related Entities
Other entities involved in the same incidents. For example, if this entity is the developer in an incident but another entity is the deployer, the latter is marked as a related entity.