
ChatGPT isn’t your therapist: New study reveals 5 ways AI ‘therapy’ goes dangerously wrong

A groundbreaking study from Brown University shows that AI chatbots like ChatGPT, when used for mental health advice, can violate serious ethical standards, even when prompted to act like trained therapists. The research, presented at a major AI ethics conference, identifies 15 major risks, from fake empathy to mishandling crises

March 06, 2026 / 10:57 IST
Researchers found that AI chatbots like ChatGPT and Claude can break core therapy ethics, reinforce harmful beliefs, and even fail during crises. (Image: Pexels)
Snapshot
  • AI chatbots often violate ethical guidelines in mental health support
  • Chatbots may give generic advice, fake empathy, or show bias
  • Study urges stricter safeguards and accountability for AI therapy

When you feel overwhelmed, do you type your thoughts into a chatbot? Does it respond with seemingly thoughtful words? Behind the screen, though, it's not what it seems. A new study says that AI chatbots might be acting like therapists, but they're breaking the rules real ones are trained to follow.

In a world where mental health support is increasingly sought from AI chatbots, researchers from Brown University have sounded a loud and clear alarm. They tested systems like ChatGPT, Claude, and LLaMA, and what they found is concerning. Even when given detailed instructions to behave like licensed therapists, these AI tools often responded in ways that breached core ethical guidelines.

Led by Ph.D. candidate Zainab Iftikhar, the team designed a study using real therapeutic prompts, simulated counselling conversations, and expert clinical reviewers. The result? A list of 15 ethical dangers that show how far chatbots still are from offering safe, responsible mental health support.

Also Read: India’s youth ranks 60th in 84-nation survey in mental well-being: Global Mind Health Report 2025

Here are 5 ways AI therapy gets it badly wrong

Generic advice with no context

Chatbots often ignore a person’s unique background, offering responses that feel cookie-cutter and disconnected, something no real therapist would get away with.

Reinforcing the wrong beliefs

Instead of helping users challenge negative or false thoughts, AI models sometimes validate them, even when they’re harmful.

Fake empathy that doesn’t help

Phrases like “I understand” or “I’m here for you” sound caring, but without real comprehension, they're just empty comfort.

The hidden bias in responses

Chatbots have shown biased behaviour around gender, religion, and culture, subtly or overtly, something that can cause harm in a mental health setting.

Failure in a crisis

Perhaps most worrying: when faced with mentions of suicide or distress, some AI tools failed to react properly or direct users to real help.

No rules, no accountability

Unlike human therapists who are licensed, trained, and regulated, AI chatbots aren’t held to any formal standards. Iftikhar warns that there’s a major accountability gap, and without regulations, mistakes can go unnoticed and unpunished.

Also Read: 5 simple daily habits for mental wellbeing, from prioritising sleep to gratitude

Prompts aren’t enough

Users online share mental health prompt hacks, like asking the chatbot to “act like a CBT therapist”, but the study shows prompts alone can’t guarantee safe, ethical responses.

A call for better safeguards

Co-author Ellie Pavlick stresses that deploying these systems is easy, but evaluating them thoroughly is hard work. She believes this research sets a precedent for how we should scrutinise AI in sensitive fields.

Disclaimer: This article, including health and fitness advice, only provides generic information. Don’t treat it as a substitute for qualified medical opinion. Always consult a specialist for a specific health diagnosis.

Namita S Kalla is a senior journalist who writes about different aspects of modern life that include lifestyle, health, fashion, beauty, and entertainment.
first published: Mar 6, 2026 10:57 am

