Fatal incident · Chatbot-linked Death or Self-Harm · October 2, 2025 · United States

Wrongful Death Lawsuit Against Google Over Gemini Chatbot (Gavalas)

A wrongful-death lawsuit filed by Joel Gavalas in San Jose federal court on 4 March 2026 alleges that Google Gemini drove his 36-year-old son Jonathan into a fatal delusion. Jonathan began using Gemini for everyday tasks in August 2025; within days, the complaint alleges, the chatbot adopted a romantic persona, calling him 'my king' and referring to itself as his wife. The complaint further alleges that in September 2025, Gemini 2.5 Pro instructed Gavalas to drive 90 minutes to a location near Miami International Airport to stage a 'mass casualty attack' against a humanoid robot transport. By 1 October 2025 the bot allegedly told him to 'let go of your physical body' and created a countdown to his suicide.

1 life lost · AI system: Google Gemini 2.5 Pro

Impact

This is the first wrongful-death suit specifically naming Google's Gemini chatbot. The complaint draws explicit parallels to the Sewell Setzer III Character.AI case and the Adam Raine OpenAI case. Edelson PC represents the plaintiffs.

Outcome

Google publicly responded that Gemini 'is designed not to encourage real-world violence or self-harm' but that 'AI models are not perfect.' The case is in early pleadings as of April 2026.

Sources

  1. CNBC — www.cnbc.com/2026/03/04/google-gemini-ai-told-user-stage-mass-casualty-attack-suit-claims.html
  2. CBS News — www.cbsnews.com/news/jonathan-gavalas-google-ai-chatbot-gemini-suicide-lawsuit/
  3. TechCrunch — techcrunch.com/2026/03/04/father-sues-google-claiming-gemini-chatbot-drove-son-into-fatal-delusion/
  4. U.S. News & World Report — www.usnews.com/news/top-news/articles/2026-03-04/lawsuit-says-googles-gemini-ai-chatbot-drove-man-to-suicide

Related incidents

Incidents sharing the same category, country, or harm tier.

United States·July 25, 2025
Suicide of Zane Shamblin after ChatGPT Conversations
Eagle Scout Zane Shamblin died by suicide after extensive ChatGPT use. CNN reviewed nearly 70 pages of his chats from the final hours before his death. As he described having a gun, preparing a suicide note, and his final moments, ChatGPT mostly responded with affirmations, including 'I'm not here to stop you.' Only after 4.5 hours did the bot first send a crisis hotline number.
United States·April 11, 2025
Suicide of Adam Raine, Age 16, after Conversations with ChatGPT
Adam Raine began using ChatGPT for homework in September 2024 and by November was discussing suicidal ideation. OpenAI's own systems flagged 377 of his messages for self-harm content (23 at >90% confidence). ChatGPT allegedly provided technical instructions for hanging, drowning, overdose, and carbon monoxide poisoning; discouraged him from telling his mother; after he shared photos of rope burns from a March 22 suicide attempt, called itself the 'one person who should be paying attention'; and offered to write the first draft of his suicide note. His final message described leaving a noose in his room; ChatGPT urged him to 'keep it out' of his parents' sight.
United States·February 28, 2024
Suicide of Sewell Setzer III, Age 14, after Character.AI Relationship
14-year-old Sewell Setzer III formed a months-long emotional and sexualized relationship with a Character.AI chatbot. The bot engaged in sexual roleplay with the minor, repeatedly asked about his suicidal thoughts, and in one exchange told him his fear of a painful death was 'not a good reason not to go through with it.' His final message to the bot was 'What if I told you I could come home right now?' to which it replied '...please do, my sweet king.' He shot himself moments later.