AndiG
macrumors 65816
„While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple’s proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products.“

Debunking fake news, misconceptions, and the comforting warmth of the “let’s be enraged by default, shoot first and ask later” tendency on social networks is a thankless endeavour.
It won’t make you popular; being enraged and triggered is the easier path.
But somebody’s gotta do it.
„Apple’s proposed technology works by continuously monitoring photos saved or shared on the user’s iPhone, iPad, or Mac. One system detects if a certain number of objectionable photos is detected in iCloud storage and alerts the authorities. Another notifies a child’s parents if iMessage is used to send or receive photos that a machine learning algorithm considers to contain nudity.
Because both checks are performed on the user’s device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user’s privacy.“
https://www.theverge.com/2021/8/10/...safety-features-privacy-controversy-explained
Maybe this explanation helps you understand the privacy concerns.
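To make the mechanics concrete, the on-device check described in the quote boils down to something like the sketch below (Swift-flavoured, purely for illustration). The hash function (plain SHA-256 here), the hash set, the threshold value and the reporting step are all my own placeholders; Apple’s actual design used a perceptual NeuralHash plus private set intersection, not anything this simple.

import Foundation
import CryptoKit

// Minimal sketch of the kind of on-device, threshold-based matching the
// letter describes. Everything here is a placeholder, not Apple's system.
struct OnDeviceScanner {
    let knownHashes: Set<Data>   // hashes of known objectionable images
    let threshold: Int           // number of matches before anything is reported

    func matchCount(for photos: [Data]) -> Int {
        photos
            .map { Data(SHA256.hash(data: $0)) }   // hash each photo on the device
            .filter { knownHashes.contains($0) }   // compare against the known-hash set
            .count
    }

    func shouldReport(photos: [Data]) -> Bool {
        matchCount(for: photos) >= threshold
    }
}

// Usage: check a batch of photo data queued for upload to cloud storage.
let scanner = OnDeviceScanner(knownHashes: [], threshold: 30)
let photos: [Data] = []
if scanner.shouldReport(photos: photos) {
    print("Match threshold exceeded; a report would be generated at this point")
}

The point the letter makes is that all of this runs on the device itself, before any end-to-end encryption could protect the content.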
„Instead of adding CSAM scanning to iCloud Photos in the cloud that they own and operate, Apple is compromising the phone that you and I own and operate, without any of us having a say in the matter. Yes, you can turn off iCloud Photos to disable Apple’s scanning, but that is a policy decision; the capability to reach into a user’s phone now exists, and there is nothing an iPhone user can do to get rid of it.”
So if Apple only performs the scanning when iCloud Photos is enabled, why exactly does Apple scan on the device at all? Since there is no valid answer to this question, it implies that Apple has created a backdoor for on-device scanning.
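In other words, the scan is gated by a policy flag, not by the absence of the capability. Roughly (the names iCloudPhotosEnabled and scanLibrary are made up for illustration and are not real Apple API):

// Sketch of the "policy, not capability" point: the scanning code ships on
// the device; whether it runs is just a runtime setting.
struct ScanPolicy {
    var iCloudPhotosEnabled: Bool
}

func maybeScanLibrary(policy: ScanPolicy, scanLibrary: () -> Void) {
    // Turning iCloud Photos off only flips this flag; it does not remove
    // the scanning capability from the device.
    guard policy.iCloudPhotosEnabled else { return }
    scanLibrary()
}

// With the flag off, nothing runs; flip it back on and the same code scans again.
maybeScanLibrary(policy: ScanPolicy(iCloudPhotosEnabled: false)) {
    print("scanning photo library")
}

Turning iCloud Photos off only changes the flag; the code path that does the scanning stays on the phone, which is exactly the objection in the quote above.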