Discord, the American VoIP and Instant Messaging (IM) platform, has updated its child safety policies and is banning AI-generated CSAM (Child Sexual Abuse Material) and servers dedicated to teen dating.
In a blog post, Discord said it has "a zero-tolerance policy for child sexual abuse, which does not have a place on our platform or anywhere in society".
It has updated its policies to encompass "any text or media content that sexualizes children, including drawn, photorealistic, and AI-generated photorealistic child sexual abuse material".
The platform came under scrutiny after an NBC investigation last month found that the platform was being used for child grooming, extortion and exploitation.
The investigation identified 165 cases, including crime rings dedicated to extorting money from children by threatening them with AI-generated graphic images, a practice commonly known as sextortion.
In one of the more extreme cases, a teen was lured across state lines after extensive grooming on Discord and was later found locked in a backyard shed. In another, a teen was kidnapped by a 22-year-old man who had groomed her after they met on Discord.
According to the National Center for Missing and Exploited Children (NCMEC), reports of CSAM on Discord increased by a staggering 474 percent from 2021 to 2022.
Besides updating its safety policies, Discord has also recently launched a new tool called Family Center that lets parents supervise their children's accounts.