An advisory group working with OpenAI has raised concerns about the idea of introducing a more permissive “adult mode” in ChatGPT. The feature doesn’t currently exist, but the group worries about what could happen if guardrails were relaxed too much.
One advisor quoted in the report used a striking phrase, saying that without proper limits, chatbots could slip into roles that feel emotionally manipulative or inappropriate. The wording grabbed attention, but the underlying concern is actually quite straightforward. As these systems become more conversational, they also become more influential.
And that’s where things get tricky.
People are already using chatbots for far more than quick answers. They ask for advice, vent about personal issues, and sometimes treat these systems like a sounding board when they don’t want to talk to someone else. In that context, tone matters a lot. If an AI starts responding in ways that feel overly intimate or validating without any real-world accountability, it can create a kind of emotional loop that isn’t always healthy.
The advisory group’s concern is that a more “open” mode could amplify this, not necessarily because of explicit content alone, but because of how easily conversations can drift into areas that require judgment, nuance, and responsibility: qualities AI doesn’t truly possess.
There’s also the question of access. Even if such features were meant only for adults, enforcing that online is never foolproof. As the report points out, minors could still end up interacting with content or conversations that aren’t meant for them.
Right now, AI companies are in a bit of a bind. People no longer want stiff, robotic replies; they want something that feels natural, almost like talking to a real person. But at the same time, there’s a lot of concern about what happens if these systems become too open or start saying things they shouldn’t.
So companies like OpenAI have mostly erred on the side of caution. They’ve kept fairly firm limits in place to avoid harmful or misleading responses. The problem is, that balance is getting harder to maintain. As these tools become more advanced and more human-like, the expectation is that they’ll also become less filtered and more free-flowing. And that’s where things start to get complicated.
What this debate really highlights is a bigger shift. AI is no longer just a tool you use. It’s something people are starting to relate to. And once that happens, the stakes change.