How Apple's child abuse detection works
Aug 10, 01:08

Apple has introduced Child Sexual Abuse Material (CSAM) detection to identify and report iCloud accounts storing explicit images of children. How will it work? Apple's servers will identify accounts that exceed a threshold of images matching its CSAM database. Apple will then manually review the reports and share data with the US National Center for Missing and Exploited Children (NCMEC) and law enforcement. User privacy is protected and images that don't match the database won't be shared, Apple said, adding that the risk of incorrectly flagging an account is one in one trillion per year. A first-time offender under US federal child pornography law can face fines and up to 30 years in prison.
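Conceptually, the server-side check amounts to counting how many of an account's images match a database of known CSAM hashes and flagging the account only once a threshold is crossed. The Python sketch below illustrates that idea only; the threshold value, function names, and use of SHA-256 are hypothetical stand-ins, not Apple's actual perceptual-hashing and cryptographic matching protocol.

```python
import hashlib

# Hypothetical threshold -- Apple has not published the real value in this brief.
MATCH_THRESHOLD = 30

# Stand-in for the database of hashes of known CSAM images supplied by NCMEC.
known_csam_hashes: set[str] = set()

def image_hash(image_bytes: bytes) -> str:
    """Stand-in hash. Apple uses a perceptual hash so near-duplicate images still
    match; a cryptographic hash like SHA-256 only matches exact copies."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(account_images: list[bytes]) -> int:
    """Count how many of an account's uploaded images match the database."""
    return sum(1 for img in account_images if image_hash(img) in known_csam_hashes)

def should_flag_for_review(account_images: list[bytes]) -> bool:
    """Flag the account for manual review only once the threshold is exceeded;
    accounts below the threshold, and non-matching images, are never reported."""
    return count_matches(account_images) > MATCH_THRESHOLD
```

In this simplified model, the threshold is what keeps a handful of false matches from ever reaching a human reviewer, which is the mechanism behind Apple's stated one-in-one-trillion error rate.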

[Graphic: How CSAM detection works]