Documented · AI Hallucination / Misinformation · 2022-11 (incident); 2024-02-14 (ruling) · Canada

Moffatt v. Air Canada AI Chatbot Hallucination

After his grandmother's death, Jake Moffatt consulted Air Canada's website chatbot, which fabricated a policy allowing him to claim a bereavement fare retroactively within 90 days of purchase. Relying on that advice, he bought full-fare tickets. When Air Canada refused the refund, it argued before the BC Civil Resolution Tribunal that the chatbot was 'a separate legal entity responsible for its own actions.'

Damages: CA$812 · AI system: Air Canada website chatbot

Impact

The Tribunal rejected this argument. Tribunal Member Rivers wrote: 'While a chatbot has an interactive component, it is still just a part of Air Canada's website.'

Outcome

First major legal precedent establishing that companies are liable for the false outputs of their AI chatbots. The decision has been cited internationally. Air Canada removed the chatbot from its website.

Sources

  1. CBC News — www.cbc.ca/news/canada/british-columbia/air-canada-chatbot-lawsuit-1.7116416
  2. American Bar Association — www.americanbar.org/groups/business_law/resources/business-law-today/2024-february/bc-tribunal-confirms-companies-remain-liable-information-provided-ai-chatbot/
  3. McCarthy Tétrault — www.mccarthy.ca/en/insights/blogs/techlex/moffatt-v-air-canada-misrepresentation-ai-chatbot

Related incidents

Same category, country, or harm tier.

United States · 2023-10 (launch); 2024-03 (scandal)
NYC MyCity Chatbot Advising Businesses to Break the Law
The MyCity AI chatbot (launched October 2023 as part of NYC's AI Plan) told landlords they could refuse Section 8 vouchers (illegal since 2008), told employers they could take workers' tips (illegal under NY Labor Law 196-d), told businesses they could go cashless (illegal since November 2020), claimed there were 'no restrictions on rent' amounts (ignoring rent stabilization), and suggested landlords could lock out tenants.
United States · March 2023 to June 22, 2023
Mata v. Avianca: Lawyers Sanctioned for ChatGPT-Fabricated Cases
In a personal-injury suit against Avianca, attorney Steven Schwartz used ChatGPT for research, and it fabricated six nonexistent judicial opinions (Varghese v. China Southern Airlines, Shaboon v. Egyptair, Petersen v. Iran Air, etc.). When Schwartz asked ChatGPT directly whether 'Varghese' was real, it confirmed the case existed and claimed it could be found on Westlaw and LexisNexis. Judge Castel could not locate any of the citations.
United States · April 2026
AI Voice-Clone Cartel Virtual Kidnapping of Las Vegas Mother
In April 2026, a Las Vegas mother received what she believed was a kidnapping call from her daughter, who was sobbing on the line. A male caller then claimed cartel custody and demanded $15,000. The 'daughter's voice' was an AI clone built from publicly available social-media audio. The caller kept the mother on the phone for nearly six hours while she drove between banks, grocery stores, and a Walmart, wiring money to Toluca, Mexico.