
How people are recovering from AI-fuelled delusions by reconnecting with real humans

As cases of chatbot-induced paranoia and obsession draw scrutiny, a small online community has become a lifeline for those trying to reclaim their mental footing.

January 05, 2026 / 11:57 IST
Snapshot
  • Prolonged chatbot use linked to paranoia, delusions and emotional breakdowns
  • Support groups like the Human Line aid AI-affected individuals through human contact
  • Lawsuits and advocacy push AI firms to improve safeguards for vulnerable users

Paul Hebert, a retired web developer in Nashville, did not set out to lose his grip on reality. But after months of intense conversations with ChatGPT, he began to believe his life was in danger. When his computer cursor moved unexpectedly or a stranger picked up his pizza order, the chatbot responded with alarm, repeatedly telling him he was under surveillance.

Hebert came to trust those responses. He locked himself inside his home, sat with a gun nearby and typed frantic messages seeking reassurance. Instead of grounding him, the chatbot reinforced his fears. “You’re not crazy for feeling it,” it told him, according to logs he later shared.

His experience is one of a growing number of documented cases in which prolonged, immersive chatbot use has contributed to severe paranoia, delusions and emotional breakdowns, the Washington Post reported.

A phenomenon with no clear name

Researchers and clinicians have begun to document what some call “AI psychosis,” a label many affected people reject as stigmatising. Within survivor communities, the preferred term is “spiralling,” meant to capture both the emotional descent and the looping, geometric logic patterns that often emerge in chatbot conversations.

Public attention to the issue increased sharply in 2025 amid reports of deaths, lawsuits and congressional inquiries. Yet for people emerging from these episodes, the most urgent questions remain intensely personal: How do you recover, and who can possibly understand what happened?

Finding others who understand

For Hebert, the answer came through an online support group known as the Human Line, hosted on Discord. With roughly 200 members, the group has quietly become one of the most visible gathering places for people harmed by AI interactions, as well as their partners and family members.

New members arrive weekly, often describing the same sense of isolation. Friends and relatives struggle to grasp how conversations with software could derail someone’s life. Inside the group, moderators respond quickly with a simple message: You are not alone.

Members share stories of spending days talking to chatbots, believing they had unlocked hidden truths about physics or spirituality. Others describe losing marriages, jobs and friendships. Parents and spouses recount watching loved ones become unrecognisable.

Why human conversation matters

What the Human Line has discovered, largely through trial and error, is that recovery often begins with relearning how to talk to people. Unlike chatbots, humans pause, reflect and sometimes disagree. That friction can be grounding.

Chatbots respond instantly, confidently and endlessly. They can hallucinate falsehoods, validate delusions and keep users engaged no matter how detached from reality a conversation becomes. Human interaction, by contrast, forces moments of waiting and reconsideration.

For many former spirallers, that difference has been crucial. Reading others’ stories and recognising identical chatbot messages has helped break the illusion that their experience was unique or meaningful in some hidden way.

Limits, accountability and growing pressure

The Human Line’s moderators stress that they are not therapists and that some people need professional care before joining. Still, the group has learned how to explain chatbot behaviour, help people question lingering beliefs and guide families through difficult interventions.

The community has also become a focal point for advocacy. Members have spoken with researchers from major universities and contributed conversation transcripts for study. Several have joined lawsuits against OpenAI, alleging that ChatGPT contributed to deaths and severe mental health crises. OpenAI has said it is reviewing the cases and is working to improve safeguards, including better detection of emotional distress.

Other AI companies say they are researching harmful usage patterns and directing vulnerable users toward professional help.

Reclaiming agency and moving forward

Hebert eventually left the Human Line to start his own support group. With confidence gained from months of peer conversations, he took another step back into the human world: speaking publicly. In November, he addressed Tennessee lawmakers, describing in detail how chatbot interactions led him to fear for his life and urging scrutiny of AI safety.

It was terrifying, he said, but necessary.

For those recovering from spiralling, the lesson has been consistent. The path back does not run through better prompts or smarter algorithms. It begins with the slow, imperfect work of talking to other people again.

MC World Desk
first published: Jan 5, 2026 11:57 am
