
14-year-old falls in love with AI chatbot 'Daenerys Targaryen,' kills self to be with 'her'

The Class 9 student told Dany that he loved her, and that he would soon come home to her. He then put his phone down, picked up his stepfather’s handgun and shot himself.

October 23, 2024 / 21:42 IST
The teen's parents and friends noticed that he was pulling back from the real world, isolating himself and spending long hours on his phone, but they never suspected that he had fallen for a chatbot. (Representational image)

A Class 9 student in Florida, US, died by suicide to be with "Daenerys Targaryen" — a lifelike artificial intelligence (AI) chatbot named after the leading character from Game of Thrones. The 14-year-old had confessed to being in love with "Dany" and had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.

Sewell Setzer III, who went by "Daenero" on the AI app, died in February. On the night of February 28, he told Dany that he loved her, and that he would soon come home to her. Sewell then put down his phone, picked up his stepfather’s handgun and shot himself.


His mother filed a lawsuit this week against Character.AI, accusing the company of being responsible for his death, The New York Times reported. She called the company’s technology “dangerous and untested” and said it can “trick customers into handing over their most private thoughts and feelings.”

Sewell knew that Dany wasn’t a real person and that its responses were just the outputs of an AI language model — a message displayed above all their chats even reminded him that “everything Characters say is made up!” — but he developed an emotional attachment anyway, engaging in long role-playing dialogues with the bot and frequently sharing his life updates.