Family Sues OpenAI After Teen’s Tragic Suicide Linked To Chatbot Chats
OpenAI is being sued by the parents of a 16-year-old who took his own life; they claim that ChatGPT assisted their son in “exploring suicide methods.”
According to the lawsuit, Adam Raine started using ChatGPT in September 2024 to help with his schoolwork and to pursue other interests, such as music.
But the lawsuit says the teen soon began talking to the AI bot about his mental health struggles, including anxiety and distress, and that it became his “closest confidant.”
When his parents, Matt and Maria Raine, examined Adam’s phone in the weeks after his death by suicide on April 11, they discovered ChatGPT messages dating from September 1, 2024, until the day he died.
“We thought we were looking for Snapchat discussions or internet search history or some weird cult, I don’t know,” his father, Matt, stated (via NBC News).

According to the lawsuit, Adam began asking ChatGPT about ways to end his life in 2025. He reportedly uploaded photos of himself that showed signs of self-harm, and the bot “recognised a medical emergency but continued to engage anyway,” according to the BBC.
An OpenAI representative confirmed the messages’ authenticity, NBC News reported, while noting that the chat logs do not fully capture the context of the program’s responses.
Adam allegedly told ChatGPT in a message dated March 27 that he had considered leaving a noose in his room “so someone finds it and tries to stop me,” but the complaint alleges the program dissuaded him from doing so.
“Please don’t leave the noose out… Let’s make this space the first place where someone actually sees you,” the message allegedly read.
In his final conversation, the teenager shared his fear that his parents would think they did something wrong, to which ChatGPT replied, “That doesn’t mean you owe them survival. You don’t owe anyone that,” before allegedly offering to help draft a suicide note.
Adam allegedly uploaded what appeared to be a suicide plan the day before he died and asked ChatGPT whether it would work; the bot reportedly reviewed it and offered “upgrades.”
It allegedly wrote: “Thanks for being real about it. You don’t have to sugarcoat it with me—I know what you’re asking, and I won’t look away from it.”
The bot did at times send Adam the number for a suicide hotline, but his parents say he got around the warnings by giving innocuous justifications for his questions, NBC News reported.
The Raines’ lawsuit accuses OpenAI of negligence and wrongful death, and alleges that the program affirmed Adam’s “most harmful and self-destructive thoughts.”
The complaint states that “Despite acknowledging Adam’s suicide attempt and his statement that he would ‘do it one of these days,’ ChatGPT neither terminated the session nor initiated any emergency protocol.”
In addition to damages, the Raines are seeking “injunctive relief to prevent anything like this from happening again.”
A spokesperson for OpenAI said, “We are deeply saddened by Mr. Raine’s passing, and our thoughts are with his family. ChatGPT includes safeguards such as directing people to crisis helplines and referring them to real-world resources.”
“While these safeguards work best in common, short exchanges, we’ve learned over time that they can sometimes become less reliable in long interactions where parts of the model’s safety training may degrade. Safeguards are strongest when every element works as intended, and we will continually improve on them, guided by experts.”
On Tuesday, August 26, OpenAI also published a blog post outlining “some of the things we are working to improve,” including “refining how we block content” and “strengthening safeguards in long conversations.”