> It still isn’t a public disavowal from Apple, which is what I want. Documentation can be re-added to the website at any time just as easily as the words were removed. Publicly making a statement is harder to get around.

True, that's the ideal way.
> Congratulations everyone, now what you've likely achieved is that Apple will just quietly scan iCloud Photos for CSAM server-side instead, and we'll probably never get end-to-end encryption.

They already do that
Congratulations everyone, now what you've likely achieved is that Apple will just quietly scan iCloud Photos for CSAM server-side instead, and we'll probably never get end-to-end encryption.
> They already do that

Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here, feel free to bring facts.
> I’m still skeptical. They were quiet when they put the hash database in with iOS 14.3, they just went into silent running with this move.

I’m waiting for a public statement from Apple before upgrading as well.
Still using iOS 14 until they shed more light on why they removed the page. As users we’ve seen too many companies promise us privacy or tell us they won’t sell us out, only to find out a few months later that they sold the users out just for money.
> I’m waiting for a public statement from Apple before upgrading as well.

CSAM is dead?
> Might finally upgrade - after tech savvy people check their new os for underhanded dealings

These conspiracy theories always seem to forget the amount of brand damage it would cause if Apple were caught sneaking in surveillance technology without telling customers.
> Good news for all abusers?

Way to not get the point of all this. I mean really, 10 out of 10 plus a gold star.
Why are you making such an "enlightened" claim that Apple is already betraying its customers? Your personal gut feelings are of no interest to anyone here, feel free to bring facts.
If you end up like Julian Assange, you have my sympathy.
Courageous freedom lovers don't always have to win, but courage is something exhausting, strange, rare and great in mankind.
> All companies must scan for CSAM on their servers.

This is not true. If a company finds CSAM it must be reported, but they are not required to scan. If a company is coerced to scan, then they are acting on behalf of the government and CSAM cases run into legal issues with the discovered evidence.
Yes, I agree, it was probably the only negotiable compromise.
Apple already scans iCloud Mail for CSAM, but not iCloud Photos (9to5mac.com): "Apple has confirmed to me that it already scans iCloud Mail for CSAM, and has been doing so since 2019. It has not, however, been scanning ..."
Apple has already admitted to scanning email, and stated they don't want CSAM on their servers. It's not a big leap of logic that, if client-side CSAM scanning isn't implemented, they'll just do what every other cloud provider has done for years and scan iPhoto images on the server (if they are not already).
...
Apple initially attempted to dispel misunderstandings and allay concerns by releasing detailed information: FAQs, various new documents, interviews with company executives, and more.
...
> Good news for all abusers?

Apple's CSAM spy system would have been easily circumvented by modifying images, leading to a kind of arms race between Apple and those wanting to hide the images (see https://forums.macrumors.com/thread...esearchers-in-new-study.2317024/post-30610726).
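The circumvention point can be illustrated with a toy perceptual hash. The sketch below is a minimal average-hash in Python, not Apple's NeuralHash, and the tiny 4x4 "image" is invented for the example: even a small local edit flips hash bits, so an adversary making enough such edits can push an image past any matching threshold.

```python
# Toy average-hash sketch (NOT Apple's NeuralHash): perceptual hashes map
# similar images to similar bit strings, but edits can still flip bits,
# which is the "arms race" problem described above.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string:
    one bit per pixel, 1 if the pixel is brighter than the image mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Number of differing bits between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 4x4 "image" and a lightly edited, visually similar copy.
original = [[200, 200, 10, 10],
            [200, 200, 10, 10],
            [10, 10, 200, 200],
            [10, 10, 200, 200]]
edited = [row[:] for row in original]
edited[0][0] = 90   # one small local edit

h1, h2 = average_hash(original), average_hash(edited)
print(hamming(h1, h2))  # 1 bit of 16 differs; enough edits evade a match threshold
```

A matcher tolerates a few differing bits, so an evader's goal is simply to accumulate enough small edits to cross whatever threshold is used, while the image stays visually intact.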
> I hope Apple brings it back. We need to stop pedophiles and child sex abuse. Children need to be protected.

When can we install the camera in your home? We promise to never look at your spouse naked... it'll only be used to make sure you don't spank your kids.
> Yes, I agree, it was probably the only negotiable compromise.

The irony of this whole debacle is that Apple's client side implementation had the ability to be more private and secure than the server (especially if we assume this was step 1 in full E2E for iCloud photo). By scanning on the server it's easier for governments to target individuals, edit the list scanned arbitrarily, view photos, etc...
But that's better than governments being able to get it themselves, isn't it?
Currently, governments have to try to attack Apple's data or openly call on Apple to hand over information. That's already an obstacle, don't you think?
I hope Apple brings it back. We need to stop pedophiles and child sex abuse. Children need to be protected.
> The irony of this whole debacle is that Apple's client side implementation had the ability to be more private and secure than the server (especially if we assume this was step 1 in full E2E for iCloud photo). By scanning on the server it's easier for governments to target individuals, edit the list scanned arbitrarily, view photos, etc...

Do you think that governments can penetrate Apple's server unnoticed (without man in the middle) and understand its structure? Do you have any sources?
The main problem with the client side implementation was the slippery slope arguments. But, if someone's threat model is Apple scanning all photos on device regardless of iCloud status, that threat has always existed and still does.
Better that it's gone; this was bound to fail from the start. You'd only need one bad actor feeding Apple's system wrong hashes, and everyone becomes a potential suspect for whatever governmental purpose that bad actor wants to silence: criticism, dissent, protestors in Hong Kong, LGBT minorities in certain regions, you name it. Also, as an EU citizen, I'm glad, as the system Apple proposed wouldn't have been allowed here anyway because of the strong protections in our GDPR privacy laws.
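The "wrong hashes" worry can be sketched in a few lines. This is a hypothetical toy, not Apple's actual protocol (which used blinded perceptual-hash matching and threshold secret sharing; exact SHA-256 matching is used here only for brevity): the point is just that a device matching against an opaque list cannot tell what the entries represent, so whoever curates the list decides what gets flagged. All names and file strings below are invented.

```python
# Toy sketch of the trust problem with a client-side hash list:
# the device only sees opaque hashes, so it cannot distinguish CSAM
# entries from anything else a list maintainer chooses to insert.
import hashlib

def opaque_list(items):
    # The provider ships hashes, not the underlying content.
    return {hashlib.sha256(x.encode()).hexdigest() for x in items}

def scan(device_files, blocklist):
    # The client flags whatever matches; it cannot audit the list.
    return [f for f in device_files
            if hashlib.sha256(f.encode()).hexdigest() in blocklist]

# A hypothetical shipped list with a non-CSAM entry slipped in:
shipped = opaque_list(["known-abuse-image", "protest-flyer.jpg"])
print(scan(["vacation.jpg", "protest-flyer.jpg"], shipped))
# ['protest-flyer.jpg']
```

Because the list arrives as bare digests, the flagged result looks identical to the client whether the matched entry was genuine CSAM or something a bad actor added, which is the slippery-slope concern raised above.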