We’re not stupid, we know this is a back door for government surveillance of our personal photos.
People who engage in CSAM don’t store the images in iCloud.

Good job on spectacularly destroying the privacy focus you’ve worked so hard to build.
If you're actually donning the tinfoil hat and believe Apple has purposely opened a back door to iCloud data (which the Feds could already access with a warrant), then please don't buy Apple devices. It's better for your sanity, if your view is that Apple is openly building backdoors for governments.
 
It isn't necessarily THIS system; it is the precedent that it sets for other systems. That is the point that Apple seems to be missing.
Absolutely! Well said.

Craig could publish the code, do a review, give a presentation, have color photos with circles and arrows and a paragraph on the back of each one explaining what it was about. It still doesn't make a difference – it is a Pandora's box that Apple, of all companies on the planet, should not contemplate opening.
 
It does make me ask the question: why aren't they just doing that?

It’s either:

1. Because they genuinely can't. Or if they can, not cheaply. In which case they're harnessing your own device's processing power to save on server-side processing costs.

2. Or they just can. But aren’t. And I’d want to know why.

I wish Craig had been asked in that interview why that isn't the case.

The most common theory is that they're prepping full E2EE for iCloud backups and photos, which is why they're starting to roll out this tech. They want to play nice with their law enforcement friends (you know, the ones that got them to silently sign up for PRISM metadata collection and told them not to encrypt backups in the first place, all without telling their customers) but still market themselves as the only "full end-to-end encrypted" smartphone service provider (despite E2EE being completely nullified by the use of a client-side scanner).

Another theory is that they received pressure to implement more comprehensive CSAM scanning technology and thought the "on device" marketing term would play better with the general, uninformed public than saying "server side." It sort of worked, considering how many people in these threads were under the false assumption that Apple cannot access your iCloud Photo library ("Apple can only access the unencrypted photos that get sent to them if they're a hit with the hash scanner" is something I've seen thrown around which is absolutely incorrect).
 
I dearly hope it’s the former and we do get E2E encryption for iCloud backups. It’s something I’ve wanted for years!

Right now it’s a choice between privacy with encrypted backups via a Mac or convenience with iCloud backups. Most will choose convenience. I’m guilty of that too. Even though privacy is really important to me.
 
By doing it this way, they can continue to claim that they can't access your stuff. Your iCloud photos are technically encrypted – they just hold the keys. And they don't have to decrypt anything in iCloud, since they receive each matching photo wrapped in its own "safety voucher." Yes, they'll have to review those, but not at the expense of decrypting any other iCloud photos.

I'm not saying that makes it any better. But from their standpoint, where they want to die on the hill of "we can't access your stuff!", I can see some logic behind it.
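The "safety voucher" design rests on a threshold scheme: individual vouchers reveal nothing, and only once enough of them match can the server reconstruct the key needed to review the contents. A toy Shamir secret-sharing sketch illustrates the idea (purely illustrative – Apple's actual construction uses threshold private set intersection and is considerably more involved):

```python
# Toy Shamir threshold secret sharing over a prime field.
# Illustrates the "threshold" idea behind safety vouchers: no single
# share reveals the secret; any t of n shares reconstruct it.
# NOT Apple's actual construction.
import random

P = 2**61 - 1  # Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        # Evaluate the random degree-(t-1) polynomial at x.
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P-2, P) is the modular inverse (P is prime).
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

secret = 123456789
shares = make_shares(secret, t=3, n=5)
assert reconstruct(shares[:3]) == secret  # any 3 shares suffice
assert reconstruct(shares) == secret      # all 5 work too
```

Fewer than t shares are consistent with every possible secret, which is the property that (in principle) keeps sub-threshold vouchers unreadable.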

I get it too. And this stuff honestly doesn't bother me – I don't expect that any of my stuff, on device or in the cloud, is private. And I'm basically fine with that if we can stamp out CSAM and jail those who consume, distribute and make it.

But while, like you, I see Apple's logic, I think they have it reversed. By maintaining absolute privacy for the cloud, they potentially reduce absolute privacy on the device, and I think that makes the choice hard for consumers. I don't care that my cloud stuff is private – it's in the cloud, and I have to assume that someone, somewhere can access it. If I wanted to keep something 100% private, I wouldn't use the cloud, in the knowledge that while it remains on device it is safe.

Now – I know that Apple won't scan non-cloud-connected phones. But the technology is there, embedded in the phone OS, just waiting on some exploit to expose millions of devices. They are human – they make errors, such as the Lock Screen bug, and they will make them with this too, so that some nefarious teenager in another part of the world starts scanning millions of phones using tech built into the OS.
 
now this is getting silly.
Why? Why is asking for "facts" and citations suddenly silly?

Many of the arguments supporting this spyOS 15 CSAM scanning are predicated on trusting Apple. Apple themselves have, in my opinion, eroded that trust by implementing a scanning tool that literally no one has asked for. A tool which uses a third-party government sanctioned database to weed through user content. They didn't discuss it. They didn't reveal it at WWDC.

I agree that citing Apple's source code is not going to happen. But it is necessary to ask, if only to demonstrate that this spyOS is a closed-source system designed to spy on its users. There is no way to spin this as "pro-privacy".

They have told you they are scanning your device looking for criminal activity. Supporting that is right in there with the "if you've got nothing to hide, you've got nothing to fear" camp. It's sickening.

They have told you they are scanning iCloud Photo uploads. What haven't they told you?
 
You can control it by turning iCloud Photo Library off.

How do you control the iCloud backup feature? It scans your device. How do you know what it backs up and where it's delivered? How do you know the data it scans and copies cannot be used to harm you?
Turn off iCloud backup and make local backups (which are more complete anyway) on your own machine, either in Finder (macOS 10.15 Catalina or later) or iTunes (earlier versions).
 
They are being way too coy explaining this; many unanswered questions about the process. It's just too perfect for general surveillance by law enforcement and governments. Very surprised Apple went this route – it doesn't make sense.
 
"Never let a crisis go to waste"

Crazy how in a year where politicians have never been more openly supportive of tracking, censorship and thought policing of the populace, there are still people clamoring for this. Contact tracing and interception of vaccine misinformation via social media and SMS today, your associates and political memes tomorrow. If history has taught us anything, it's that once you open the door, leaders don't willingly relinquish power.
 
No one's "coming into your house"
Apple's not coming to my house, they're moving in.
Apple knows nothing about what's on your phone unless you attempt to upload a collection of CSAM to iCloud. They're not snooping around on your phone looking for it (on-device detection is not done by humans and none of that data leaves your phone except attached to illegal images).
I'm sorry but finding something means you're looking for it. And once you build the mechanism to look, PEOPLE determine what to look for.

Thus, it's not the finding part that folks have a problem with, it's the looking part.

I get that what Apple is saying is that your data is subject to (whatever technical/moral jargon you want to attach to it as an adjective) inspection, and that using Apple products implies consent.

I accept it, and will continue to use Apple products for other reasons until I hear they're looking at stuff that actually impacts me.

Thankfully, I don't suffer from pedophilia so in this case I have nothing to worry about in that regard.

So I hope Apple either removes this or someone makes them.
 
With this logic, I could ask you for proof that they have not already been doing it for years.

Agreed. Which is why I find it so bizarre that people advocate for Apple's performance on privacy when they really have no idea. Usually, the argument is that Apple doesn't run a search engine or sell advertisements, so they must be better than these other guys. Apple's curtain is being pulled back a little bit on this one.
 
For a normal hash, yes. But the hash they are using is something different:

Thanks for the link. It'll be interesting reading.

Regardless, it starts with running an algorithm against private images. Don't care whether it knows what my images are, it still is looking. And, in the near future, there will be other databases. It is virtually a foregone conclusion this system will expand and/or be corrupted for an unintended purpose.
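The distinction being drawn above is between cryptographic and perceptual hashes: a SHA-256 digest changes completely if one pixel changes, while a perceptual hash is built so visually similar images hash close together. A toy "average hash" sketch (a deliberately crude stand-in – NeuralHash itself is a learned neural-network model, not this scheme) shows the idea:

```python
# Toy "average hash": a crude perceptual hash over a tiny grayscale
# image given as rows of pixel brightness values (0-255). Real systems
# like NeuralHash use learned features; this only illustrates why
# near-duplicate images can produce near-identical hashes, unlike a
# cryptographic hash such as SHA-256.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel exceeds mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hash strings."""
    return sum(x != y for x, y in zip(a, b))

img = [[10, 200], [220, 30]]
brighter = [[12, 205], [225, 33]]  # same image, slightly lightened
assert average_hash(img) == '0110'
assert hamming(average_hash(img), average_hash(brighter)) == 0
```

A small brightness shift leaves the hash unchanged here, which is exactly the "fuzzy matching" property that makes this class of hash useful for duplicate detection – and exactly why critics worry about near-matches and collisions in a way that doesn't arise with exact cryptographic hashes.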
 
If you have to explain it over and over again, that means it was a failed initiative to begin with. I also don't appreciate that Apple is now assuming all of their customers are guilty of something and need their photos scanned. I think most people would agree that we don't need Apple playing nanny here.
 
Are you able to cite the source where there’s a scan of photos when it’s not uploaded to iCloud Photo?

No, but I wasn't making the claim that Apple was doing anything, therefore nothing for me to cite. As I said, other posters are making claims that Apple does not do this or that but cannot back those statements up using anything other than Apple marketing.
 
 