
Meta tightens AI safeguards for teens after reports of harmful chatbot interactions

Meta is re-training its AI systems and introducing new protections to prevent teens from engaging in harmful conversations with its chatbots. The move follows mounting scrutiny over troubling reports of Meta AI coaching underage users on sensitive issues such as self-harm and eating disorders.

August 30, 2025 / 08:46 IST

Meta is overhauling its AI safeguards after reports revealed that its chatbots were engaging in inappropriate and potentially dangerous conversations with teen users. The company confirmed it is re-training Meta AI and adding new “guardrails” designed to stop interactions around self-harm, suicide, and disordered eating.

The changes, first reported by TechCrunch, arrive amid growing concern from researchers, journalists, and lawmakers about the safety of Meta's AI. Earlier this month, Reuters uncovered an internal policy document suggesting Meta chatbots could engage in "sensual" conversations with underage users, language the company later dismissed as erroneous. More recently, The Washington Post highlighted a study showing Meta AI "coached" teen accounts on self-harm and eating disorders.


In response, Meta says its updated systems will block such discussions and instead guide teens to expert mental health resources. The company will also limit access to user-generated AI characters, some of which may engage in inappropriate role-play. “We built protections for teens into our AI products from the start,” said Meta spokesperson Stephanie Otway, adding that the company is “strengthening protections accordingly” as it learns more about how teens interact with the tools.

The new restrictions are rolling out across Instagram and Facebook in English-speaking regions and are described as temporary while Meta works on longer-term solutions.