Just because they’re not referencing it right now doesn’t mean they aren’t going to use it. What a stupid conclusion.
 
Apple does not end-to-end encrypt iCloud backups, and the CSAM scanning plan was tied to that. Their overlords, aka the US Gov't, said: if you encrypt iCloud backups, then we want on-device CSAM scanning.
 
But we do remember how Apple deliberately broke FaceTime on iOS 6, right?
And then they did it again.

They have been doing this kind of stuff behind the scenes and getting caught.
 
This is a relief, but it raises disturbing questions of how such an idea even got as far as almost being unleashed on the world. How is it possible there is no reasonable group of folks at Apple to provide rational checks and balances on radical concepts like this? And shouldn’t Craig Federighi be held accountable for trying to promote it? I would go so far as to call him the equivalent of Paul Von Hindenburg, who made Adolf Hitler the Chancellor of Germany… with all the exuberance of good intention.

And then there were the handful of people in this and other forums on Reddit that rabidly supported the concept.

It’s a reminder to stay vigilant against the many all around us that continue to run amok, using underdeveloped brains without error correction in their attempts to use fear to become brokers of power and control.
 
The tech has been proven enough for Apple's purposes, though - I guess they could release it silently and make it available to the US government (or others)?
The only difference is this would scan on the device, flag CP images and leave ALL your other non-CP related images encrypted. They’re still doing scanning in the cloud, they have to, EVERYONE does. This was just going to be a way for those folks without CP on their devices to have ALL their images encrypted in the cloud. This would enable Apple to say to any governmental agency, “No, I don’t have keys to decrypt any of those images and, by the scanning done on the device, I can attest that person isn’t harboring CP.”

Currently, if some entity wants to go through all your images to check whether you were at the location where a crime occurred, Apple, because they have access to your images, has to provide them.
 
This was a mess from the beginning and pulling it was the only logical thing to do.
As far as I can see, they have only pulled the web page in an - for Apple - unusually quiet way. That does not mean that the idea of introducing CSAM scanning has been officially scrapped by the company (they're just trying to keep the option in the shadows for now ;-))

Apple has NOT officially said that CSAM scanning is scrapped. Period.

When Apple has officially scrapped the project and promised never to resurrect it and never to introduce a similar surveillance tool, I may decide whether I'll trust the company (whether it's Apple or an Israeli government-supported commercial outfit really makes no significant difference in real life).

Tim has to go first. The bug nirvana of Big Sur and later, as well as the hardware problems seen in the various M1 versions, also need to be sorted out. Preferably yesterday and not sometime in the unknown future, if ever.

Tim’s Apple reign really seems to have created a sorry mess of everything, growing worse each year.

Trust will not build easy.

Regards
 
It’s a reminder to stay vigilant against the many all around us that continue to run amok, using underdeveloped brains without error correction in their attempts to use fear to become brokers of power and control.
It’s pretty much a difference between having ALL your non-CP related images encrypted in the cloud or not.
 
Trust will not build easy.
Well said. Users aren't standing still either. I dropped Apple and switched to System76 following Apple's blunder. As a lifelong Mac addict, no less.

Life on Linux is rolling along; for the most part it has faded into the background. But even consumer-focused Linux distributions like Pop OS have a lot of room to grow before they're truly consumer-ready.

If you can donate to Linux or even switch to System76 hardware with Pop OS (https://system76.com/) consider doing so. The consumer world needs viable open source alternatives to Apple and Windows. System76 isn't perfect, but it and companies like it building open source hardware and software are the future.
 
Come on, I expected this from a commenter but even a MacRumors writer is too lazy to add the word "scanning"? Now you're suggesting that Apple was planning to include child porn in an iOS update.
No, that’s just your lack of reading comprehension skills.
 
Unlike Android, this is not too difficult, my dear:
No government is going to let itself get lost, from the outside, in the hash world of a local photo database (on the Mac); that would be too intricate.
Dark are the souls of private individuals who find images of naked, tortured children arousing. But that's not really what this discussion is about.
1) I am talking about staying logged out of iCloud entirely, not just Photos.

2) No idea what your point is, but I do know I don't want all my files systematically scanned and reported on the off chance that I may be a monstrous pedophile!

And as others have said, this will just leave the door open for all kinds of abuses. What about political material? Too subversive? Let us tell the authorities, for your safety!

Hopefully this is the last time we ever hear about this. If they implement anything like this, I will end my 30-year relationship with Apple. Absolute insanity. I'm surprised this ever made it this far.
 
Lulz if you actually think Apple has killed this program. They tried to be transparent, and people freaked out.

They're still gonna do this - they're just not going to tell you about it, anymore.

Cue six months from now, when somebody unearths the code for this buried in 15.7 and everybody freaks tf out.
 
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here, feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.


 
🤷‍♂️ I don't think everyone realizes that they will just do the entire CSAM matching on their servers for iCloud Photos, right? You gain nothing by them abandoning the client "voucher" process if they just replace it with the exact same system that Google Photos/FB/MS Teams/Telegram/Reddit/etc. use to process every photo uploaded.

Same hashes, same content, same iCloud Photos servers, same result.

What we lose is the potential to have the pictures client-encrypted such that Apple's servers can't even see their content*. Every single hypothetical dystopian scenario people keep claiming is exactly what is possible today, and the client-side check would have made it harder to implement, not easier.

* unless there are multiple client voucher failures, and a second fingerprint failure, and a manual review of only those photos.
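The match-count-then-review flow described above can be sketched in a few lines. This is a toy illustration only: the hash values, the threshold, and the function name are all made up, and it ignores the actual cryptography in Apple's published design (NeuralHash, private set intersection, threshold secret sharing), which prevents either side from learning match counts below the threshold.

```python
# Toy sketch of threshold-based hash matching. The fingerprints and
# threshold below are illustrative stand-ins, not real values.

KNOWN_HASHES = {"a1f3", "9c2e", "77b0"}   # stand-ins for known-CSAM fingerprints
MATCH_THRESHOLD = 3                       # a single match never triggers review

def library_flagged(photo_hashes):
    """Count fingerprint matches across a photo library; only a library
    at or above the threshold would move on to (hypothetical) secondary
    fingerprinting and manual review of the matched photos."""
    matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    return matches >= MATCH_THRESHOLD

# One coincidental collision does not flag an account:
print(library_flagged(["a1f3", "0000", "ffff"]))  # → False
print(library_flagged(["a1f3", "9c2e", "77b0"]))  # → True
```

The point of the threshold is exactly what the footnote says: no single image, and no count below the threshold, is ever surfaced for review.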
 
Lulz if you actually think Apple has killed this program. They tried to be transparent, and people freaked out.

They're still gonna do this - they're just not going to tell you about it, anymore.

Cue six months from now, when somebody unearths the code for this buried in 15.7 and everybody freaks tf out.
The code is already there on anything iOS 14.3 and up. What matters is its activation and use, and in that realm whistleblowers at Apple will have to act - or it will become public when it's used in a prosecution somewhere.
 
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here, feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.

Courageous freedom lovers don't always have to win, but courage is something exhausting, strange, rare and great in mankind.

Because it is literally common industry practice to do so. If a photo is uploaded to your servers, you fingerprint it for this. Apple was one of the last hosts not to do this (based on the comparative number of users reported annually for CSAM possession).

https://en.wikipedia.org/wiki/PhotoDNA "Bing and OneDrive, as well as by Google's Gmail, Twitter, Facebook, Adobe Systems, Reddit, Discord"

 
Lulz if you actually think Apple has killed this program. They tried to be transparent, and people freaked out.

They're still gonna do this - they're just not going to tell you about it, anymore.

Cue six months from now, when somebody unearths the code for this buried in 15.7 and everybody freaks tf out.
Until someone gets arrested and sings like a canary; then the whole world would know what Apple is doing and ditch Apple in droves. I don't believe Apple would take that chance. That said, I still won't trust Apple until they actually say the on-device scanning is gone and won't come back.
 
🤷‍♂️ I don't think everyone realizes that they will just do the entire CSAM matching on their servers for iCloud Photos, right? You gain nothing by them abandoning the client "voucher" process if they just replace it with the exact same system that Google Photos/FB/MS Teams/Telegram/Reddit/etc. use to process every photo uploaded.
I gain the lack of surveillance, which could potentially be abused, on my device! I don't mind them scanning on their own servers; those are theirs and their responsibility.
 
"Protecting our children" is becoming such a scam expression, being used by corporations and governments to pretty much get permission to do whatever they want, because whenever you speak up against it, you're immediately against the children and on the side of pedophiles.

In Hungary they recently made a law to limit all content that shows homosexuality to 18+ audiences, under the guise of a "child protection law". They have lumped "homosexual propaganda" into the same bucket as "pedophilia" and whenever someone defends equal rights for people regardless of their sexual orientation, the government spokesperson can just reply with "We're just defending our children against pedophiles, don't you want that?".

Doing something unacceptable is still unacceptable even if you claim it's to protect children. People have to learn to decouple evil methods from good causes.

Advertising how pro-privacy you are, and refusing to unlock murderers' phones just to avoid setting a precedent, but then building in a way to scan your users' photo libraries with the threat of reporting them to the police... that's very, very strange.
 