What if Apple gets hacked? Creating a backdoor into such a far-reaching detection system means Apple itself might not be aware of how its devices are being scanned and manipulated.
“Apple is inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner. To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.”
To date, Apple has defended its CSAM detection system by saying only that it was poorly communicated. But in recent weeks, researchers who worked on a similar system for two years concluded that “the technology was dangerous,” saying “we were baffled to see that Apple had few answers for the hard questions we’d surfaced.”
techcrunch.com
From another article:
Snowden points out that the entire system is easily bypassed, which undermines the stated aim behind its creation:
“If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the ‘Disable iCloud Photos’ switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand. As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.”
And, for those of you already thinking ahead, Snowden notes the obvious next step: governments compelling Apple to remove the option to disable photo uploads to iCloud.
“If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer. And yet an answer will come — and it will come from the worst lawmakers of the worst governments. This is not a slippery slope. It’s a cliff.”