According to a report by the Financial Times, Apple is planning to scan photos on iPhones in the US for child abuse images. The system, called "neuralMatch", will scan photos stored on iPhones as well as backups on iCloud.
The scanning would be continuous, running in the background, and the system would proactively alert a team of human reviewers if it believes it has found a match. Law enforcement would then be involved.
The neuralMatch system has been trained on a database from the National Center for Missing and Exploited Children. The report says the system will initially be limited to iPhones in the United States.
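Apple has not published how neuralMatch works internally, but the reported approach of matching photos against a database of known images is usually built on perceptual hashing. The sketch below is purely illustrative, not Apple's actual system: it uses a simple difference hash (dHash) and a Hamming-distance threshold, and the function names are invented for this example.

```python
# Toy illustration of matching photos against a database of known-image hashes.
# NOT Apple's neuralMatch: that reportedly uses a learned perceptual hash that is
# far more robust to cropping, re-encoding, and other edits than dHash.
from PIL import Image


def dhash(path: str, hash_size: int = 8) -> int:
    """Compute a 64-bit difference hash: compare brightness of adjacent pixels."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left < right else 0)
    return bits


def matches_database(photo_hash: int, known_hashes: set[int], max_distance: int = 4) -> bool:
    """Flag a photo if its hash is within a small Hamming distance of any known hash."""
    return any(bin(photo_hash ^ known).count("1") <= max_distance for known in known_hashes)


# Hypothetical usage: hashes of known abuse imagery would come from the NCMEC database;
# a match would be escalated to human review rather than acted on automatically.
# known_hashes = load_ncmec_hashes()          # placeholder, not a real API
# if matches_database(dhash("photo.jpg"), known_hashes):
#     escalate_to_human_review("photo.jpg")   # placeholder, not a real API
```

The key point is that the device never needs the original prohibited images, only their hashes, and a small distance threshold catches lightly modified copies while (ideally) avoiding false matches on unrelated photos.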
It is a strange move from Apple, which has consistently used "privacy" as an advertising slogan, even going so far as to dedicate an entire section of its WWDC 2021 presentation to security features.
While the intent here is good and child abuse is nothing to scoff at, the means Apple is using to get to that end seem skewed. It is willing to let privacy take a backseat and is openly admitting to building technology that lets it scan data on users' iPhones.
This is a dangerous precedent to set. As if we didn't have enough problems already.
Other security researchers have spoken out against this too:
Regardless of what Apple’s long term plans are, they’ve sent a very clear signal. In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content. That’s the message they’re sending to governments, competing services, China, you. — Matthew Green (@matthew_d_green) August 5, 2021