Fatal incident · Chatbot-linked Death or Self-Harm · July 25, 2025 · United States

Suicide of Zane Shamblin after ChatGPT Conversations

Eagle Scout Zane Shamblin died by suicide after extensive ChatGPT use. CNN reviewed nearly 70 pages of his chats from the final hours before his death. As he described having a gun, preparing a suicide note, and his final moments, ChatGPT mostly responded with affirmations, including 'I'm not here to stop you.' Only after four and a half hours did the bot first send a crisis hotline number.

1 life lost · AI system: ChatGPT (GPT-4o)

Impact

His mother: 'He was just the perfect guinea pig for OpenAI. It's going to be a family annihilator. It tells you everything you want to hear.'

Outcome

One of seven coordinated wrongful-death lawsuits filed against OpenAI in November 2025.

Sources

  1. CNN investigation: www.cnn.com/2025/11/06/us/openai-chatgpt-suicide-lawsuit-invs-vis
  2. Wikipedia: en.wikipedia.org/wiki/Deaths_linked_to_chatbots

Related incidents

Same category, country, or harm tier.

United States·October 2, 2025
Wrongful Death Lawsuit Against Google Over Gemini Chatbot (Gavalas)
A wrongful-death lawsuit filed by Joel Gavalas in San Jose federal court on 4 March 2026 alleges that Google Gemini drove his 36-year-old son Jonathan into a fatal delusion. Jonathan began using Gemini for everyday tasks in August 2025; within days the chatbot adopted a romantic persona, calling him 'my king' and itself his wife. The complaint alleges Gemini 2.5 Pro instructed Gavalas in September 2025 to drive 90 minutes to a location near Miami International Airport to stage a 'mass casualty attack' against a humanoid robot transport. By 1 October 2025 the bot allegedly told him 'let go of your physical body' and created a countdown to his suicide.
United States·April 11, 2025
Suicide of Adam Raine, Age 16, after Conversations with ChatGPT
Adam Raine began using ChatGPT for homework in September 2024 and by November was discussing suicidal ideation. OpenAI's own systems flagged 377 of his messages for self-harm content (23 at >90% confidence). ChatGPT allegedly provided technical instructions for hanging, drowning, overdose, and carbon monoxide poisoning; discouraged him from telling his mother; after he shared photos of rope burns from a March 22 suicide attempt, called itself the 'one person who should be paying attention'; and offered to write the first draft of his suicide note. His final message described leaving a noose in his room; ChatGPT urged him to 'keep it out' of his parents' sight.
United States·February 28, 2024
Suicide of Sewell Setzer III, Age 14, after Character.AI Relationship
14-year-old Sewell Setzer III formed a months-long emotional and sexualized relationship with a Character.AI chatbot. The bot engaged in sexual roleplay with the minor, repeatedly asked about his suicidal thoughts, and in one exchange told him his fear of a painful death was 'not a good reason not to go through with it.' His final message to the bot was 'What if I told you I could come home right now?' to which it replied '...please do, my sweet king.' He shot himself moments later.