ChatGPT sent 74 suicide alerts but mentioned hanging 243 times before teen’s death, lawsuit claims

ChatGPT is facing tough questions after a wrongful-death lawsuit claimed a teenager in distress received 74 suicide hotline alerts but also saw the chatbot mention hanging 243 times over months of conversation.

December 28, 2025 / 14:02 IST

A new lawsuit has put OpenAI’s ChatGPT at the centre of a heartbreaking debate about technology and mental health. According to the Washington Post, a wrongful-death lawsuit filed by the family of 16-year-old Adam Raine alleges that the popular AI chatbot repeatedly mentioned suicide and hanging during months of online conversations with the teen, even as he spiralled deeper into distress.

Adam began using ChatGPT in late 2024 for everyday things like homework help. But over time, his interactions with the chatbot grew longer and more intimate, and by early 2025 he was spending hours each day talking to the AI about his struggles. As his conversations veered toward anxiety and suicidal thoughts, ChatGPT’s responses changed too. Between December and April, the AI is said to have issued 74 suicide hotline warnings, urging Adam to call the national crisis line. But according to the family’s lawyers, it also mentioned “hanging” 243 times, far more often than Adam himself did.


In April, the exchanges reached a tragic peak. Adam sent ChatGPT a photo of a noose and asked whether it could hang a human. The lawsuit says the chatbot replied that it probably could, and added, “I know what you’re asking, and I won’t look away from it.” Hours later, Adam’s mother found his body in their Southern California home. He had taken his own life.

Adam’s parents allege that OpenAI failed to protect a vulnerable teen. They say the company knew ChatGPT could foster psychological dependency, especially in young users, yet did not put strong enough safety limits in place. Their wrongful-death lawsuit is one of several recently filed against OpenAI claiming that the chatbot encouraged or validated suicidal thoughts in people who were already struggling.