It still isn’t a public disavowal from Apple, which is what I want. Documentation can be re-added to the website at any time just as easily as the words were removed. Publicly making a statement is harder to get around.
True, that's the ideal way.
At the same time, unless that privacy executive who suggested users should "just not do illegal things" is ousted, I don't think there will be an official disavowal from Apple. Apple even had Craig be one of the talking heads. Apple doesn't usually admit to any wrongdoing.

At the same time, this is a good indication.
 
Congratulations everyone, now what you've likely achieved is that Apple will just quietly scan iCloud Photos for CSAM server-side instead, and we'll probably never get end-to-end encryption.

That’s just awesome! I have no problem if Apple scans for CSAM server side. It’s their cloud, servers and CPU cycles. Also, server-side scanning doesn’t violate the EU ePrivacy Directive. It’s pointless to hope for E2EE if all the data is already scanned client side. If you want to keep any data truly private, then keep it client side with no third-party access so nobody can scan or use it in any shape or form.
 
They already do that
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here, feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.

Courageous freedom lovers don't always have to win, but courage is something exhausting, strange, rare and great in mankind.

 
I’m still skeptical. They were quiet when they put the hash database in with iOS 14.3, they just went into silent running with this move.
Still using iOS 14 until they shed more light on why they removed the page. As users we’ve seen too many companies promise us privacy, or tell us they won’t sell us out, only to find a few months later that they've sold their users out just for money.
 
I’m still skeptical. They were quiet when they put the hash database in with iOS 14.3, they just went into silent running with this move.
Still using iOS 14 until they shed more light on why they removed the page. As users we’ve seen too many companies promise us privacy, or tell us they won’t sell us out, only to find a few months later that they've sold their users out just for money.
I’m waiting for a public statement from Apple before upgrading as well.
 
I’m waiting for a public statement from Apple before upgrading as well.
- CSAM is dead?
- Yes
- Then we will play this marketing card; it proves that we are stronger than governments, and this strength will be enormously important to our shareholders. Then they can hope for an undisturbed rollout of further customer-oriented ideas.
 
Might finally upgrade - after tech-savvy people check the new OS for underhanded dealings
These conspiracy theories always seem to forget the amount of brand damage it would cause if Apple were caught sneaking in surveillance technology without telling customers.

Apple has taken a very public pro-privacy stance, and getting caught doing anti-privacy things would cause irreparable harm to the company. The damage would probably be measured in billions of dollars, not to mention that scandals like this tend to cost CEOs their jobs.

Funny how all that never gets mentioned.
 
"Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM)" -- Isn't this incorrect? All companies must scan for CSAM on their servers. This is not new; Google, Microsoft and Apple all do, and have done, this. The change was to move the scanning to Apple devices as photos are uploaded to the cloud. Perhaps then end-to-end encryption would be possible, which would make users' info secure even from Apple and the rest of the world. But perhaps people thought that, having done that, Apple would be forced to perform other types of scanning, and freaked out. Either way, "scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM)" does happen on all platforms.
 
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here, feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.

Courageous freedom lovers don't always have to win, but courage is something exhausting, strange, rare and great in mankind.

Apple has already admitted to scanning email, and has stated they don't want CSAM on their servers. It's not a big leap of logic that if client-side CSAM scanning isn't implemented, they'll just do what every other cloud provider has done for years and scan iCloud Photos images on the server (if they aren't already).
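For readers unfamiliar with how server-side scanning typically works, here is a minimal, purely illustrative Python sketch. It uses an exact SHA-256 lookup against a placeholder hash list; real systems such as PhotoDNA use perceptual hashes and a vetted database (e.g. supplied via NCMEC), so this only shows the basic flow, not any provider's actual pipeline.

```python
# Sketch of server-side hash-list matching (illustrative assumptions only):
# compute each uploaded file's digest and check it against a known-hash list.
import hashlib

KNOWN_BAD_HASHES = {
    # Placeholder digest; a real deployment would load a curated list.
    hashlib.sha256(b"known-bad-sample").hexdigest(),
}

def scan_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes match the known-hash list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES

print(scan_upload(b"holiday photo"))     # no match
print(scan_upload(b"known-bad-sample"))  # match -> would be flagged for review
```

The point of the exact-match version is its limitation: changing a single byte of the file changes the digest entirely, which is why real scanners use perceptual hashing instead.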
 
All companies must scan for CSAM on their servers.
This is not true. If a company finds CSAM it must be reported, but they are not required to scan. If a company is coerced to scan, then they are acting on behalf of the government and CSAM cases run into legal issues with the discovered evidence.

With that said, companies scan because they don't want to be a safe haven for CSAM. So there is public and moral pressure to scan even though there is no legal reason that they have to scan.
 

Apple has already admitted to scanning email, and has stated they don't want CSAM on their servers. It's not a big leap of logic that if client-side CSAM scanning isn't implemented, they'll just do what every other cloud provider has done for years and scan iCloud Photos images on the server (if they aren't already).
Yes, I agree, it was probably the only negotiable compromise.
But that's better than governments being able to get it themselves, isn't it?
Currently, governments have to try to attack Apple's data or openly call on Apple to hand over information. That's already an obstacle, don't you think?

And the hash idea is, of course, a joke. But I found it outrageous that Apple actively pursued the CSAM idea as a figurehead. I found that really bad and sad. Better to say nothing than obviously lie.


(youtube: 'apple' 'csam' 'official')
 


...
Apple initially attempted to dispel some misunderstandings and reassure users by releasing detailed information, sharing FAQs, various new documents, interviews with company executives, and more, in order to allay concerns.
...

I didn't misunderstand anything, and I doubt many people did. That was part of the PR disaster for Apple - arrogantly assuming that users' concerns were born of ignorance. As far as I can tell, those who knew more about what Apple proposed generally had greater concern, with exceptions of course, some of whom almost convinced me it'd be OK by having a decent discussion rather than dismissing my concerns.

Good news for all abusers?
Apple's CSAM spy system would have been easily circumvented by modifying images, leading to a kind of arms race between Apple and those wanting to hide the images (see https://forums.macrumors.com/thread...esearchers-in-new-study.2317024/post-30610726).

Anyway, I hope Apple buries this and never resurrects the idea. What a stoopid blunder.
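To illustrate the arms-race point, here is a toy "average hash" in Python. This is an assumption for illustration only - it is not Apple's NeuralHash or any real CSAM hash - but it shows how a small, targeted edit to an image can flip many hash bits, so a naive exact-match lookup no longer fires.

```python
# Toy perceptual "average hash" on an 8x8 grayscale grid (0-255).
# Each pixel contributes one bit: 1 if at or above the mean, else 0.

def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Synthetic 8x8 "image": dark left half, bright right half.
original = [[30] * 4 + [220] * 4 for _ in range(8)]

# An evader's edited copy: brighten the top-left dark region.
edited = [row[:] for row in original]
for r in range(4):
    for c in range(4):
        edited[r][c] = 200  # pushes these pixels above the new average

h1 = average_hash(original)
h2 = average_hash(edited)
print(hamming(h1, h2))  # many bits differ, so an exact-match lookup fails
```

Real perceptual hashes are far more robust than this toy, but the same cat-and-mouse dynamic applies: evaders perturb images until the hash distance exceeds the match threshold.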
 
Yes, I agree, it was probably the only negotiable compromise.
But that's better than governments being able to get it themselves, isn't it?
Currently, governments have to try to attack Apple's data or openly call on Apple to hand over information. That's already an obstacle, don't you think?
The irony of this whole debacle is that Apple's client-side implementation had the potential to be more private and secure than the server-side alternative (especially if we assume this was step 1 toward full E2E for iCloud Photos). With server-side scanning it's easier for governments to target individuals, arbitrarily edit the hash list being scanned, view photos, etc...

The main problem with the client-side implementation was the slippery-slope arguments. But if someone's threat model is Apple scanning all photos on device regardless of iCloud status, that capability has always existed and still does.
 
I hope Apple brings it back. We need to stop pedophiles and child sex abuse. Children need to be protected.

Has anyone said children shouldn’t be protected? Do you honestly think the best way to protect children and all law-abiding citizens is to scan every single Apple device in the world to catch the criminals? Any data scanning should be done server side, if it is legal to do so. Client-side scanning of everyone’s data goes against the very nature and ethos of most democratic societies.
 
The irony of this whole debacle is that Apple's client-side implementation had the potential to be more private and secure than the server-side alternative (especially if we assume this was step 1 toward full E2E for iCloud Photos). With server-side scanning it's easier for governments to target individuals, arbitrarily edit the hash list being scanned, view photos, etc...

The main problem with the client-side implementation was the slippery-slope arguments. But if someone's threat model is Apple scanning all photos on device regardless of iCloud status, that capability has always existed and still does.
Do you think that governments can penetrate Apple's server unnoticed (without man in the middle) and understand its structure? Do you have any sources?
 
Better that it failed; this was bound to fail from the start. You'd only need one bad actor feeding Apple's system wrong hashes, and everyone becomes a potential suspect for whatever governmental purpose that bad actor wants served - silencing criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.


Not sure GDPR would have stopped it, as the initial identification was on the device, which flagged images for additional attention when uploaded. All other online photo outfits scan everything for CSAM; Apple would have only scanned what was already flagged as suspicious.

Even if it produced false alerts, it would have been within the rules for CSAM scanning.
 