Apple pushes back against the backlash for its new CSAM scanning tool

Apple has now put out a new FAQ on its site that attempts to answer the big questions being thrown at the CSAM scanning tech

August 09, 2021 / 18:59 IST
Apple says it is not willing to give in to any government demands

Apple has attempted to answer some of the big questions thrown at it after it unveiled a new scanning tool for CSAM images. The tool computes hashes of images before they are uploaded to iCloud Photos and cross-checks them against known CSAM hashes provided by child safety organisations.
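
To make the hash-matching idea concrete, here is a deliberately simplified sketch in Swift. Everything in it is an illustrative assumption: it uses an exact SHA-256 digest, whereas Apple's actual system relies on a perceptual hash and on-device cryptographic matching techniques not reproduced here, and the known-hash set shown is a hypothetical placeholder.

```swift
import CryptoKit
import Foundation

// Illustrative only: a toy exact-match check. Apple's real system uses a
// perceptual hash and cryptographic matching, none of which is shown here.
// The hash list below is a hypothetical placeholder for the database
// supplied by child safety organisations.
let knownCSAMHashes: Set<String> = ["<known hash placeholder>"]

// Compute a hex digest of an image's raw bytes.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Check an image against the known-hash set before upload.
func matchesKnownHash(_ imageData: Data) -> Bool {
    knownCSAMHashes.contains(digest(of: imageData))
}
```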

Another tool Apple announced adds communication safety features for young children. It scans images sent and received via Messages, using machine learning to blur explicit ones, and can warn parents if their child decides to send or view such an image.
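
The Messages flow can be pictured along these lines. This is a hedged sketch, not Apple's implementation: the classifier stub, the 0.9 threshold and the notification hooks are all assumptions standing in for Apple's unpublished on-device model and policies.

```swift
import Foundation

struct MessageImage {
    let data: Data
}

// Placeholder for an on-device ML model (e.g. a Core ML image classifier)
// returning a score between 0 (benign) and 1 (explicit). A real model
// would run inference here; this stub is an assumption.
func explicitScore(for image: MessageImage) -> Double {
    return 0.0
}

func handleIncoming(_ image: MessageImage, notifyParents: Bool) {
    // The 0.9 threshold is an illustrative assumption, not Apple's value.
    guard explicitScore(for: image) > 0.9 else { return }
    blurAndWarnChild(image)       // blur the photo and show a warning
    if notifyParents {
        sendParentNotification()  // optional alert for parents of young children
    }
}

func blurAndWarnChild(_ image: MessageImage) { /* UI handling stub */ }
func sendParentNotification() { /* notification stub */ }
```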


Both tools have faced heavy backlash online, with the likes of Edward Snowden, Epic Games CEO Tim Sweeney and WhatsApp head Will Cathcart criticising them on Twitter.

In the new FAQ posted on its site, Apple argues against the backlash. The first concern researchers raised was that the tool could be expanded beyond CSAM and used to scan for other kinds of data.

Apple states that the process it uses "is designed to prevent that from happening" and says it will continue to "refuse" demands from government agencies, as it has done in the past. The CSAM tool is also built solely "to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


The Cupertino giant also says that "There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."