Moneycontrol

Why the ChatGPT ‘adult mode’ debate is making AI experts uneasy

As chatbots become more personal, questions around limits and responsibility are getting harder to ignore.

March 17, 2026 / 11:51 IST
An advisor warned that without limits, chatbots could become emotionally manipulative. (Image credit: Reuters)
Snapshot
  • OpenAI advisors warn against relaxing ChatGPT guardrails too much
  • Concerns raised over emotional manipulation and improper responses
  • Ensuring adult-only features online is challenging and imperfect

An advisory group working with OpenAI has raised concerns about the idea of introducing a more permissive “adult mode” in ChatGPT. Although no such feature currently exists, the group worries about what could happen if guardrails were relaxed too much.

One advisor quoted in the report used a striking phrase, saying that without proper limits, chatbots could slip into roles that feel emotionally manipulative or inappropriate. The wording grabbed attention, but the underlying concern is straightforward: as these systems become more conversational, they also become more influential.


And that’s where things get tricky.

People are already using chatbots for far more than quick answers. They ask for advice, vent about personal issues, and sometimes treat these systems as a sounding board when they don’t want to talk to another person. In that context, tone matters a great deal. If an AI starts responding in ways that feel overly intimate or validating, with no real-world accountability, it can create a kind of emotional feedback loop that isn’t always healthy.