Apple has attempted to answer some of the big questions being thrown at it after it unveiled a new tool for scanning CSAM (Child Sexual Abuse Material) images. The tool generates hashes of images before they are uploaded to iCloud Photos and cross-checks them against a database of known CSAM hashes provided by child safety organisations.
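For illustration only, here is a minimal Swift sketch of the general "hash the image and compare it against a known database before upload" idea. Apple's actual system uses a perceptual hash (NeuralHash) matched through on-device cryptographic techniques rather than a plain SHA-256 lookup, and the database contents and function names below are assumptions made up for this example.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only: Apple's real pipeline uses a perceptual hash
// (NeuralHash) and cryptographic matching, not a plain SHA-256 set lookup.

// Hypothetical database of known hash digests (hex strings); in practice these
// would come from child safety organisations such as NCMEC.
let knownHashDatabase: Set<String> = [
    "0000000000000000000000000000000000000000000000000000000000000000" // dummy entry
]

// Hash an image's bytes and report whether it matches a known entry;
// conceptually, this is the check described above, run before upload to iCloud Photos.
func matchesKnownHash(imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashDatabase.contains(hex)
}

// Example usage with arbitrary bytes standing in for a photo.
let sample = Data([0x01, 0x02, 0x03])
print(matchesKnownHash(imageData: sample)) // prints "false" against the dummy database
```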
Another tool Apple announced is aimed at communication safety for young children. It scans images sent and received via Messages and uses machine learning to blur explicit ones; it can also warn parents if their child decides to send or view such an image.
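As a rough illustration of that flow (classify on device, blur if flagged, optionally notify a parent), here is a hedged Swift sketch. The classifier `isLikelyExplicit` is a hypothetical stand-in, since Apple has not published its model; the function and parameter names here are assumptions for this example, not Apple's API.

```swift
import Foundation
import CoreImage

// Hypothetical stand-in for the on-device ML model; a real implementation
// would run a Core ML / Vision model here. Always returns false in this sketch.
func isLikelyExplicit(_ image: CIImage) -> Bool {
    return false
}

// Blur an image with Core Image, mirroring the "blur explicit images" step.
func blurred(_ image: CIImage, sigma: Double = 30) -> CIImage {
    return image.applyingGaussianBlur(sigma: sigma)
}

// Handle an image arriving in Messages: pass it through unchanged, or blur it
// and invoke a parent-notification callback when the classifier flags it.
func handleIncomingImage(_ image: CIImage, notifyParent: (String) -> Void) -> CIImage {
    guard isLikelyExplicit(image) else { return image }
    notifyParent("A sensitive image was received and has been blurred.")
    return blurred(image)
}

// Example usage with a solid-colour placeholder image.
let placeholder = CIImage(color: .red).cropped(to: CGRect(x: 0, y: 0, width: 64, height: 64))
let result = handleIncomingImage(placeholder) { message in
    print("Parent notification:", message)
}
```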
Both tools have faced heavy backlash online, with the likes of Edward Snowden, Epic Games CEO Tim Sweeney and WhatsApp head Will Cathcart criticising them on Twitter.
No matter how well-intentioned, @Apple is rolling out mass surveillance to the entire world with this. Make no mistake: if they can scan for kiddie porn today, they can scan for anything tomorrow. They turned a trillion dollars of devices into iNarcs—*without asking.* https://t.co/wIMWijIjJk
— Edward Snowden (@Snowden) August 6, 2021
Addressing concerns that governments could compel it to expand the system to other content, Apple states that the process they use "is designed to prevent that from happening" and says they will continue to "refuse" such demands from government agencies, as they have in the past. The CSAM tool is built solely "to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."
The Cupertino giant also says, "There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities."
Apple also makes it very clear that they are not going to scan all the photos stored on a user's iPhone, stating, "By design, this feature only applies to photos that the user chooses to upload to iCloud Photos, and even then Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM. The system does not work for users who have iCloud Photos disabled. This feature does not work on your private iPhone photo library on the device."