AI-generated nudity and deepfake porn are a growing concern across various platforms. Google has updated its Inappropriate Content Policy to prohibit the promotion of synthetic content that has been altered or generated to be sexually explicit or to contain nudity. “We begin enforcing the policy update on May 30, 2024,” the company said.
Several apps have promoted deepfake porn and AI-generated nude content. Recently, Apple banned three such apps from the App Store for the same reason. “We take violations of this policy very seriously and consider them egregious. If we find violations of this policy, we will suspend your Google Ads accounts upon detection and without prior warning, and you will not be allowed to advertise with us again,” Google said on a support page.
The policy isn’t entirely new, as Google already disallows sexually explicit content on the Play Store. However, some apps found a loophole: they advertise “face swapping,” which isn’t sexually explicit in itself, while promoting nudity and deepfake porn on other platforms.
Google recently revealed that it blocked over 2.28 million apps from being published in 2023 due to policy violations, a significant increase from the 1.43 million apps rejected in 2022. Google has also implemented various measures to control the unknown background activities of such apps.
Additionally, in 2023 Google rejected more than 200,000 app submissions to ensure proper use of sensitive permissions such as background location and SMS access. The company also said it worked with providers of software development kits (SDKs) to limit sensitive data access and sharing and to strengthen privacy.