Apple pushes back against the backlash for its new CSAM scanning tool

Apple has now put out a new FAQ on its site that attempts to answer the big questions being thrown at its CSAM scanning tech

August 09, 2021 / 18:59 IST
Apple says it is not willing to give in to any government demands

Apple has attempted to answer some of the big questions thrown at it after it unveiled a new scanning tool for CSAM images. The tool hashes images before they are uploaded to iCloud Photos and cross-checks those hashes against known CSAM hashes provided by child safety organisations.
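
To make the mechanism concrete, here is a minimal, hypothetical sketch in Python of matching photo hashes against a list of known hashes. It is only an illustration: Apple's actual system uses its NeuralHash perceptual hash and a privacy-preserving matching protocol, not the plain SHA-256 lookup or the made-up folder and hash value shown here.

    import hashlib
    from pathlib import Path

    # Hypothetical set of known CSAM hashes. In Apple's system these come from
    # NCMEC and other child safety organisations, and they are blinded so the
    # device never sees the raw values.
    KNOWN_CSAM_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def image_hash(path):
        # Apple uses a perceptual hash (NeuralHash) that survives resizing and
        # re-encoding; SHA-256 is used here only to keep the sketch runnable.
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def flagged_uploads(upload_queue):
        # Only photos queued for iCloud Photos upload are checked; images that
        # never leave the device are not scanned.
        return [p for p in upload_queue if image_hash(p) in KNOWN_CSAM_HASHES]

    if __name__ == "__main__":
        queue = list(Path("to_upload").glob("*.jpg"))  # hypothetical folder
        for photo in flagged_uploads(queue):
            print(photo.name, "matches a known hash; human review would follow")

Even in the real system, a match does not trigger an automatic report; as Apple's FAQ quoted below notes, matches are reviewed by a human before anything is sent to NCMEC.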

Another tool Apple announced adds communication safety protections for young children. It scans images sent and received via Messages, using machine learning to blur explicit ones, and can even warn parents if their child decides to send or view such an image.
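
As a rough illustration of that flow, the sketch below assumes a hypothetical on-device classifier score between 0 and 1 and a made-up threshold; Apple has not published its model or API, so every name and number here is a placeholder.

    from dataclasses import dataclass

    EXPLICIT_THRESHOLD = 0.9  # assumed cut-off, not Apple's actual value

    @dataclass
    class IncomingImage:
        sender: str
        explicit_score: float  # hypothetical score from the on-device ML model

    def handle_message_image(img, is_child_account, parental_alerts_enabled):
        # Decide whether to blur the image and whether parents get a warning.
        # The scanning described above happens on the device itself.
        blur = is_child_account and img.explicit_score >= EXPLICIT_THRESHOLD
        return {
            "blur": blur,            # image is shown blurred
            "warn_child": blur,      # child sees a warning before viewing
            "notify_parent": blur and parental_alerts_enabled,
        }

    # An explicit image arriving on a young child's account with parental
    # alerts enabled would be blurred and flagged to the parents.
    print(handle_message_image(IncomingImage("unknown", 0.97), True, True))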

Both tools have faced heavy backlash online, with the likes of Edward Snowden, Epic Games CEO Tim Sweeney and WhatsApp head Will Cathcart criticising them on Twitter.

In the new FAQ on its site, Apple argues against the backlash. The first concern researchers raised was that the tool could be expanded beyond CSAM and used to scan for other data.

Apple clearly states that the process it uses "is designed to prevent that from happening" and says it will continue to "refuse" such demands from government agencies, as it has done in the past. The CSAM tool is also built solely "to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

The Cupertino giant also says, "There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."

Apple also makes it very clear that it is not going to scan all the photos stored on a user's iPhone, stating, "By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device."

Moneycontrol News
first published: Aug 9, 2021 06:59 pm
