Just because they’re not referencing it right now doesn’t mean they aren’t going to use it. What a stupid conclusion.
And then they did it again.

> But we do remember how Apple deliberately broke FaceTime on iOS 6, right?
iOS 6 Users on Devices Able to Run iOS 7 Must Upgrade to Fix FaceTime
Apple today released a new support document, detailing the issue behind the recent problems some iOS 6 users have been experiencing with FaceTime. (www.macrumors.com)
The only difference is this would scan on the device, flag CP images, and leave ALL your other non-CP images encrypted. They're still doing scanning in the cloud; they have to, EVERYONE does. This was just going to be a way for those folks without CP on their devices to have ALL their images encrypted in the cloud. It would let Apple say to any governmental agency, "No, I don't have keys to decrypt any of those images, and, by the scanning done on the device, I can attest that person isn't harboring CP."

> The tech has been proven enough for Apple's purposes, though. I guess they could release it silently and make it available to the US government (or others)?
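For anyone who wants the mechanics, here is a minimal sketch of what that on-device step amounts to. The `OnDeviceScanner` type is made up, and SHA-256 stands in for a real perceptual hash (Apple's system used NeuralHash), so this illustrates hash-set matching, not Apple's actual implementation:

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of the on-device matching step described above.
// SHA-256 stands in for a perceptual hash (Apple's real system used
// NeuralHash); all names here are made up for illustration.
struct OnDeviceScanner {
    // Digests of known CSAM, shipped to the device as an opaque blob.
    let knownDigests: Set<Data>

    // True only if the photo matches the known-bad set. Everything that
    // does NOT match would stay encrypted end to end in the cloud; only
    // matches would be flagged for review.
    func isFlagged(_ imageBytes: Data) -> Bool {
        knownDigests.contains(Data(SHA256.hash(data: imageBytes)))
    }
}

let scanner = OnDeviceScanner(knownDigests: [])   // empty set: nothing flags
let photo = Data("example image bytes".utf8)
print(scanner.isFlagged(photo))                   // false -> upload stays encrypted
```

The point of doing this client-side is in the last line: a photo that doesn't match could be uploaded under keys Apple never holds.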
As far as I can see, they have only pulled the web site, in an unusually quiet way for Apple. That does not mean the idea of introducing CSAM scanning has been officially scrapped by the company (they're just trying to keep the option in the shadows for now ;-)

> This was a mess from the beginning, and pulling it was the only logical thing to do.
It's pretty much the difference between having ALL your non-CP images encrypted in the cloud or not.

> It's a reminder to stay vigilant against the many all around us who continue to run amok, using underdeveloped brains without error correction in their attempts to use fear to become brokers of power and control.
Well said. Users aren't standing still either. I dropped Apple and switched to System76 following Apple's blunder. As a lifelong Mac addict, no less.

> Trust will not be rebuilt easily.
No, that's just your lack of reading comprehension skills.

> Come on, I expected this from a commenter, but even a MacRumors writer is too lazy to add the word "scanning"? Now you're suggesting that Apple was planning to include child porn in an iOS update.
1) I am talking about staying logged out of iCloud entirely, not just Photos.

> Unlike Android, this is not too difficult, my dear:
> [screenshot attachment]
No government will allow itself to get lost from the outside in the hash world of a local photo database (on the Mac); that would be too intricate.
Dark are the souls of those private individuals who are aroused by images of naked, tortured children. But that's not really what this discussion is about.
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here; feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.
The code is already there in anything from iOS 14.3 up. What's left is activation and use, and in that realm whistleblowers at Apple will have to act, or it will become public when it's used in a prosecution somewhere.

> Lulz if you actually think Apple has killed this program. They tried to be transparent, and people freaked out.
> They're still gonna do this; they're just not going to tell you about it anymore.
> Cue 6 months from now, when somebody unearths the code for this buried in 15.7 and everybody freaks tf out.
Because it is literally common industry practice to do so. If a photo is uploaded to your servers, you fingerprint it for this. Apple was one of the last hosts not doing it (based on the comparative number of users reported annually for CSAM possession).

> Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here; feel free to bring facts.
> If you end up like Julian Assange, you have my sympathy.
> Courageous freedom lovers don't always have to win, but courage is something exhausting, strange, rare and great in mankind.
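That "fingerprint it on upload" practice is easy to picture. A rough sketch, with a hypothetical `UploadPipeline` class and SHA-256 again standing in for a perceptual fingerprint such as PhotoDNA:

```swift
import Foundation
import CryptoKit

// Rough sketch of the server-side practice described above: fingerprint
// every uploaded photo and compare against a known-bad list. Real hosts
// use perceptual fingerprints (e.g. PhotoDNA); SHA-256 stands in here,
// and UploadPipeline is a made-up name.
final class UploadPipeline {
    private let knownBadFingerprints: Set<Data>

    init(knownBadFingerprints: Set<Data>) {
        self.knownBadFingerprints = knownBadFingerprints
    }

    enum Verdict { case stored, reportedForReview }

    func receive(_ photoBytes: Data) -> Verdict {
        let fingerprint = Data(SHA256.hash(data: photoBytes))
        // Either way, the server has already seen the plaintext photo;
        // that is the trade-off against the on-device approach.
        return knownBadFingerprints.contains(fingerprint) ? .reportedForReview : .stored
    }
}
```

The trade-off is in the comment: server-side matching only works because the server can read every photo, matching or not.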
Until someone gets arrested and sings like a canary; then the whole world would know what Apple is doing and ditch Apple in droves. I don't believe Apple would take that chance. That said, I still won't trust Apple until they actually say the on-device scanning is gone and won't come back.

> Lulz if you actually think Apple has killed this program. They tried to be transparent, and people freaked out.
> They're still gonna do this; they're just not going to tell you about it anymore.
> Cue 6 months from now, when somebody unearths the code for this buried in 15.7 and everybody freaks tf out.
I gain the absence of surveillance on my device, which can potentially be abused! I don't mind them scanning on their own devices; those are theirs and their responsibility.

> 🤷‍♂️ I don't think everyone realizes that they will just do the entire CSAM matching on their servers for iCloud Photos, right? You gain nothing by them abandoning the client "voucher" process if they just replace it with the exact same system that Google Photos/FB/MS Teams/Telegram/Reddit/etc. use to process every photo uploaded.
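For contrast, the abandoned client-side "voucher" design was supposed to keep Apple blind until an account crossed a threshold of matches. A much-simplified sketch: the plain counter below stands in for the actual threshold secret sharing, `VoucherAccount` is a made-up name, and 30 is Apple's publicly stated threshold:

```swift
// Much-simplified sketch of the abandoned client-side "voucher" idea,
// for contrast with plain server-side matching. Apple's published design
// enforced the threshold cryptographically (threshold secret sharing);
// a plain counter is used here only to show the control flow, and
// VoucherAccount is a made-up name.
struct VoucherAccount {
    let threshold = 30                 // Apple's publicly stated figure
    private(set) var matchingVouchers = 0

    // Returns true once enough matching vouchers have accumulated that
    // Apple could decrypt the flagged vouchers; below the threshold,
    // the server can read nothing at all.
    mutating func upload(matchesKnownSet: Bool) -> Bool {
        if matchesKnownSet { matchingVouchers += 1 }
        return matchingVouchers >= threshold
    }
}

var account = VoucherAccount()
for _ in 0..<5 { _ = account.upload(matchesKnownSet: false) }
print(account.upload(matchesKnownSet: true))   // false: one match, far below 30
```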
You're never going to get end-to-end encryption anyway, unless you elect some politicians who believe in freedom and privacy.

> Congratulations everyone, now what you've likely achieved is that Apple will just quietly scan iCloud Photos for CSAM server-side instead, and we'll probably never get end-to-end encryption.