Documented · AI Hallucination / Misinformation · March 2023 to June 22, 2023 · United States

Mata v. Avianca: Lawyers Sanctioned for ChatGPT-Fabricated Cases

In a personal-injury suit against Avianca, attorney Steven Schwartz used ChatGPT for legal research, and it fabricated six nonexistent judicial opinions (Varghese v. China Southern Airlines, Shaboon v. Egyptair, Petersen v. Iran Air, etc.). When Schwartz asked ChatGPT directly whether 'Varghese' was a real case, it confirmed the case existed and claimed it could be found on Westlaw and LexisNexis. Judge Castel could not locate any of the cited decisions.

Sanction: US$5,000 · AI system: ChatGPT (OpenAI)

Impact

First internationally recognized case of AI-hallucinated legal citations resulting in sanctions. Schwartz testified he was 'operating under the false perception that ChatGPT could not possibly be fabricating cases on its own.'

Outcome

Judge Castel imposed US$5,000 sanctions each against Schwartz, LoDuca, and their firm on June 22, 2023. Mata's underlying suit was dismissed. The ruling has been cited in dozens of subsequent AI-hallucination decisions worldwide, and state bars (CA, FL, NY, WA) have issued AI ethics guidance.

Sources

  1. CNN — www.cnn.com/2023/05/27/business/chat-gpt-avianca-mata-lawyers
  2. CNBC — www.cnbc.com/2023/06/22/judge-sanctions-lawyers-whose-ai-written-filing-contained-fake-citations.html
  3. Legal Dive — www.legaldive.com/news/chatgpt-lawyer-fake-cases-lawyer-uses-chatgpt-sanctions-generative-ai/653925/

Related incidents

Same category, country, or harm tier.

United States·2023-10 (launch); 2024-03 (scandal)
NYC MyCity Chatbot Advising Businesses to Break the Law
The MyCity AI chatbot (launched October 2023 as part of NYC's AI Plan) told landlords they could refuse Section 8 vouchers (illegal since 2008), told employers they could take workers' tips (illegal under NY Labor Law 196-d), told businesses they could go cashless (illegal since November 2020), claimed there were 'no restrictions on rent' amounts (ignoring rent stabilization), and suggested landlords could lock out tenants.
Canada·2022-11 (incident); 2024-02-14 (ruling)
Moffatt v. Air Canada AI Chatbot Hallucination
After his grandmother's death, Jake Moffatt consulted Air Canada's website chatbot, which fabricated a policy allowing retroactive bereavement fares to be claimed within 90 days. He bought full-fare tickets based on the bot's advice. Air Canada refused the refund, arguing before the BC Civil Resolution Tribunal that the chatbot was 'a separate legal entity responsible for its own actions.'
United States·April 2026
AI Voice-Clone Cartel Virtual Kidnapping of Las Vegas Mother
In April 2026, a Las Vegas mother received what she believed was a kidnapping call from her daughter, sobbing on the line. A male caller then claimed cartel custody and demanded $15,000. The 'daughter's voice' was an AI clone built from publicly available social-media audio. The caller kept the mother on the phone for nearly six hours while she drove between banks, grocery stores, and a Walmart, wiring money to Toluca, Mexico.