Documented · AI Hallucination / Misinformation · 2023-10 (launch); 2024-03 (scandal) · United States

NYC MyCity Chatbot Advising Businesses to Break the Law

The MyCity AI chatbot, launched in October 2023 as part of NYC's AI Plan, told landlords they could refuse Section 8 vouchers (illegal since 2008), told employers they could take workers' tips (illegal under NY Labor Law 196-d), told businesses they could go cashless (illegal since November 2020), claimed there were 'no restrictions on rent' amounts (ignoring rent stabilization), and suggested landlords could lock out tenants.

Cost: US$600K · AI system: Microsoft Azure AI (MyCity chatbot)

Impact

Legal Services NYC called the advice 'dangerously inaccurate.' Users relying on the bot faced real legal liability.

Outcome

Mayor Adams refused to take the bot offline. The city added a disclaimer, but the bot continued providing false information. The episode became the most-cited US municipal AI failure and accelerated city-level AI governance frameworks.

Sources

  1. The Markup · themarkup.org/artificial-intelligence/2024/03/29/nycs-ai-chatbot-tells-businesses-to-break-the-law
  2. THE CITY · www.thecity.nyc/2024/03/29/ai-chat-false-information-small-business/
  3. THE CITY follow-up · www.thecity.nyc/2024/04/02/malfunctioning-nyc-ai-chatbot-still-active-false-information/

Related incidents

Same category, country, or harm tier.

United States·March 2023 to June 22, 2023
Mata v. Avianca: Lawyers Sanctioned for ChatGPT-Fabricated Cases
In a personal-injury suit against Avianca, attorney Steven Schwartz used ChatGPT for research, and it fabricated six nonexistent judicial opinions (Varghese v. China Southern Airlines, Shaboon v. Egyptair, Petersen v. Iran Air, etc.). Schwartz asked ChatGPT directly whether 'Varghese' was real; it confirmed the case and claimed it could be found on Westlaw and LexisNexis. Judge Castel could not locate the citations.
Canada·2022-11 (incident); 2024-02-14 (ruling)
Moffatt v. Air Canada AI Chatbot Hallucination
After his grandmother's death, Jake Moffatt used Air Canada's website chatbot, which fabricated a policy allowing retroactive bereavement fares within 90 days. He bought full-fare tickets based on the bot's advice. Air Canada refused the refund, arguing before the BC Civil Resolution Tribunal that the chatbot was 'a separate legal entity responsible for its own actions.'
United States·April 2026
AI Voice-Clone Cartel Virtual Kidnapping of Las Vegas Mother
In April 2026, a Las Vegas mother received what she believed was a kidnapping call from her daughter, sobbing on the line. A male caller then claimed cartel custody and demanded $15,000. The 'daughter's voice' was an AI clone built from publicly available social-media audio. The caller kept the mother on the phone for nearly six hours while she drove between banks, grocery stores, and a Walmart, wiring money to Toluca, Mexico.