So will CSAM scanning be everywhere on iPhone, or only in iMessage, or only in iCloud?
Can you explain how Apple knows whether a picture contains nudity or not if they are not scanning each and every photo and looking at them?
Scanning photos on your phone is not the same thing as CSAM detection. CSAM (child sexual abuse material) detection is the process of matching photos on your phone against a database of known abuse images.
Apple already scans images on your phone when it categorises them for the Photos app.
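To make the distinction concrete, here's a toy sketch (not Apple's actual code: the real system uses a perceptual NeuralHash, blinded hashes and threshold secret sharing, and every name below is made up; a plain SHA-256 stands in purely to show the shape of a database match):

```swift
import CryptoKit
import Foundation

// Stand-in for the known-image database. In the real design this is a blinded set
// of perceptual hashes supplied by child-safety organisations; it's empty here.
let knownAbuseHashes: Set<String> = []

// CSAM detection asks "is this one of the known images?" -- a lookup, not a
// judgement about what the photo depicts. SHA-256 is used only for illustration.
func matchesKnownImage(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownAbuseHashes.contains(hex)
}

// By contrast, the Photos categorisation (and the separate iMessage nudity feature)
// runs an on-device ML classifier over the pixels -- conceptually more like
// "nudityScore(image) > threshold" than a database lookup.
```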
Secondly, the problem is not CSAM detection itself, which Apple already does server-side, along with Facebook, Microsoft and Google; the problem is Apple’s implementation, which runs the check on your device. Running it on-device essentially opens up a nice backdoor which governments will be looking to exploit.
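Roughly why that placement matters (again a hypothetical sketch, not Apple's implementation): the on-device matcher has no idea why a hash is in the list, so nothing on your phone has to change when the list itself changes.

```swift
// Hypothetical illustration: the client-side matcher is completely agnostic about
// what the database contains. The blinded hashes can't be read or audited on the
// device, so extending the list beyond abuse imagery is a data update, not a code change.
struct OnDeviceMatcher {
    let blindedHashes: Set<String>   // opaque blob shipped with the OS

    func shouldFlag(_ imageHash: String) -> Bool {
        blindedHashes.contains(imageHash)   // a match is a match, whatever the source
    }
}

// The same code, two very different databases -- the device can't tell them apart.
let childSafetyMatcher = OnDeviceMatcher(blindedHashes: ["<hashes from NCMEC>"])
let someOtherMatcher   = OnDeviceMatcher(blindedHashes: ["<hashes from whoever else>"])
```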
The British government says that it wants the ability to scan encrypted messages for CSAM, even where end-to-end encryption is used ...
9to5mac.com
When Priti Patel backs you, then you know you’re doing something hideously wrong.
And given what we now know about the shady money swaps going on between Apple and China, I’m pretty sure Apple will extend their on-device scanning to match whatever database the Party tells them to match against.
On a personal note, I had decided not to upgrade any of my devices, as I think Apple will switch this on as soon as they think the noise has died down. But now I’m also not upgrading because the latest round of OS upgrades looks like utter sh*t.
Ironically, the CSAM announcement provided a lucky escape: normally, I upgrade a day after the release. 🤷‍♂️
P.S.
I don’t believe Apple intentionally set out to build a backdoor into every device. I think they were so blinded by their desire to save money by shifting the cost of CSAM scanning onto their customers that they lost sight of their much-vaunted stance on privacy.