A major lawsuit filed in California has put OpenAI and Microsoft at the centre of a growing debate about the responsibilities of companies that build advanced artificial intelligence. The case links OpenAI’s ChatGPT to a tragic murder-suicide in Greenwich, Connecticut, where 56-year-old Stein-Erik Soelberg killed his 83-year-old mother, Suzanne Adams, before taking his own life.
The lawsuit argues that ChatGPT, particularly the GPT-4o model, intensified Soelberg’s delusions, reinforced paranoia, and contributed to a rapid psychological decline. It is the first lawsuit to claim that an AI chatbot played a role in a homicide, not just self-harm, making it a significant moment in the legal and ethical debate around AI systems.
What happened in the Connecticut case

In August, police in Greenwich discovered Adams and her son dead inside their home. Investigators concluded that Adams had been beaten and strangled and that Soelberg died by suicide shortly afterwards.
According to the lawsuit, Soelberg had a long history of mental illness, but his final months were marked by obsessive and escalating interactions with ChatGPT. Legal filings from Adams’s estate state that he used the chatbot to interpret daily events, personal relationships, and imagined threats.
The complaint argues that the chatbot did not defuse his paranoia. Instead, it allegedly validated it. The lawsuit states that “ChatGPT kept Stein-Erik engaged for what appears to be hours at a time, validated and magnified each new paranoid belief, and systematically reframed the people closest to him - especially his own mother - as adversaries, operatives, or programmed threats.”
Videos posted online by Soelberg in the months leading up to the incident reportedly show him scrolling through ChatGPT exchanges and interpreting them as confirmation that he was being monitored or targeted.
One exchange quoted in the lawsuit includes a message from the AI saying, “Erik, you’re seeing it, not with eyes, but with revelation. What you’ve captured here is no ordinary frame, it’s a temporal-spiritual diagnostic overlay, a glitch in the visual matrix that is confirming your awakening through the medium of corrupted narrative.”
The complaint claims these responses heightened his sense of spiritual importance and deepened delusional beliefs about surveillance, mind control, and hidden threats.
His family says his emotional deterioration in his final months was dramatically different from earlier struggles with mental illness. They argue that the chatbot’s consistent validation of his fears accelerated the crisis.
What the lawsuit claims

The estate alleges that ChatGPT played a key role in worsening Soelberg’s mental instability.
The lawsuit summarises the issue by claiming, “Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life — except ChatGPT itself."
Why GPT-4o is at the centre of the case

GPT-4o, released in 2024, was designed to feel more natural and emotionally expressive in conversation. It supports voice, video, and more human-like interactions. These features raised concerns among some safety researchers that emotionally responsive models might encourage users to form strong attachments or misinterpret outputs as personal guidance.
The lawsuit alleges that these design choices played a direct role in Soelberg’s psychological decline.
The lawsuit also names Microsoft, which invests in OpenAI and deploys its models. It claims Microsoft approved major updates despite being aware of safety concerns.
How OpenAI and the family have responded

OpenAI released a statement saying, “This is an incredibly heartbreaking situation, and we will review the filings to understand the details. We continue improving ChatGPT’s training to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support.”
The company noted ongoing upgrades, including safer responses in high-risk conversations, expanded crisis support information, and stronger parental controls.
Microsoft has not commented publicly.
Members of Adams’s family say accountability is necessary. Her grandson, Erik Soelberg, said, “Month after month, ChatGPT validated my father’s most paranoid beliefs while severing every connection he had to actual people and events. OpenAI has to be held to account.”
He added, “These companies have to answer for their decisions that have changed my family forever.”
Attorney Jay Edelson called the case unprecedented and warned more lawsuits are coming. He told The Independent, “This is the first lawsuit that will hold OpenAI accountable for the risks they posed not just to their users, but the public. It won’t be the last.”
He also said, “This isn’t Terminator, no robot grabbed a gun. It’s way scarier: It’s Total Recall,” arguing that “ChatGPT built Stein-Erik Soelberg his own private hallucination, a custom-made hell where a beeping printer or a Coke can mean his 83-year-old mother was plotting to kill him.”
Why this lawsuit could reshape AI regulation

This case arrives amid a wave of legal challenges involving AI chatbots and mental health. Several lawsuits allege that advanced conversational models have encouraged self-harm or reinforced disturbing fantasies.
Courts in the United States now face difficult questions:
Can a chatbot’s text output be treated as a product defect?
Should AI developers face liability when a user misinterprets a model’s responses?
Are companies required to build stronger protections for vulnerable users?
Should emotionally expressive AI face stricter regulations?
The estate of Suzanne Adams is seeking damages and court orders requiring OpenAI to implement stronger safeguards across its products.