I read it... what are you inferring? Again, are we living in the real world, where illegal activities should be reported? Or are you living in a fantasy land where Apple, governments, or hackers are changing the whole hashed-picture system Apple has set up so that it finds and reports legal yet unaccepted content??

It has been said/agreed to before by me and others...they can already do this. How does this particular option which Apple has added make that any easier??
Never mind. I see you’re a very glass half-full type of person that believes this will be limited to only illegal activities that you deem as illegal. I’m much more cynical. What’s legal here isn’t in other places and can get you killed.
 
One in a trillion chance of that happening.

Also, they already have the technology in place and running for years now. What's to stop them from adding other hashes to it now? Ever thought of that?

I would like to know how they came up with that number.
What I have been learning over the last few days about hashes says that number is likely false. It is far more common.

Worst case someone gets accused falsely and a life is ruined. Can they legally sue Apple?
 
Never mind. I see you’re a very glass half-full type of person that believes this will be limited to only illegal activities that you deem as illegal. I’m much more cynical. What’s legal here isn’t in other places and can get you killed.
I'm very glass half empty...ha.

My main point is that there are a ton of other ways that are easier to get that info off of a phone.
 
Never mind. I see you’re a very glass half-full type of person that believes this will be limited to only illegal activities that you deem as illegal. I’m much more cynical. What’s legal here isn’t in other places and can get you killed.
Again, you've been using iCloud all along and never had a problem with how they scanned your photos, but now you do. Seems weird.
 
Since 2019, every photo uploaded to iCloud has been checked against the same list of known CSAM images using hashes. If they wanted to expand the list of hashes, they could have done that since 2019. The only difference now is that the check is done while the photo is being uploaded, not after.
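As a rough sketch of what that list check amounts to (SHA-256 is a stand-in here; Apple actually uses a perceptual hash, NeuralHash, so near-identical images still match, and the list entry below is a placeholder, not a real entry):

```python
# Toy sketch of hash-list matching. SHA-256 stands in for the perceptual
# hash a real system would use; the list entry below is a placeholder.
import hashlib

KNOWN_HASHES = {
    # SHA-256 of b"test", used here purely as a placeholder entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_list(photo_bytes: bytes) -> bool:
    """Hash the upload and look it up in the known-image list."""
    return hashlib.sha256(photo_bytes).hexdigest() in KNOWN_HASHES
```

The on-device version does the same kind of lookup at upload time rather than after the photo lands on the server.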

If true (2019?), why move it to the device? This is the one piece that makes little sense, unless the plan down the road is for Apple to scan content pre-encryption for other activities, like Messages or other communication/sharing activities.
 
How would "server-side" not be able to be abused though? Why couldn't governments force apple to add more hashes to the list and expand the search for other content?

It would not surprise me if the US or UK or ... has Google doing periodic searches for "X" on their server. Makes even more sense for Amazon to do it. I also suspect a lot of this is via "probable cause or a facsimile thereof".

I do not wish this activity, especially what has come to light post-Snowden days to migrate to my device.
 
It's so they can protect your privacy even more... would you prefer someone scanning every single photo you have in iCloud, or looking for a very specific set of data that is only flagged AFTER or IF it is passed along to iCloud?

On the cloud.
I use Google and Amazon services so it is already being done there.
What I do not want is the "private cop employed by someone else" sitting in my house looking over my shoulder.

Note: as I have little faith (via experience) in iCloud, I only use it for notes and basic device backup. All the rest of my backups use other methods.
 
The problem with it is that every person running the newest OS will be having their device continuously audited 24/7 who knows how many times an hour.
It is no different than having to check into the FBI every few minutes to prove you've got nothing to hide in your photos app.
I can see a company with a company issued phone with a locked down MDM doing this (because companies love to be control freaks) but a device manufacturer?

If people don't have an issue with a continuous scan and audit of their photos forever - go for it. But other people do take issue with it in principle.
 
The problem with it is that every person running the newest OS will be having their device continuously audited 24/7 who knows how many times an hour.
It is no different than having to check into the FBI every few minutes to prove you've got nothing to hide in your photos app.
I can see a company with a company issued phone with a locked down MDM doing this (because companies love to be control freaks) but a device manufacturer?

If people don't have an issue with a continuous scan and audit of their photos forever - go for it. But other people do take issue with it in principle.
How is this any different from iCloud continuously checking your uploaded photos against known CSAM images? If you're not using iCloud, then there's no checking for anything EITHER WAY. If you use iCloud, your photos are being scanned RIGHT NOW on iOS 14. You still have the same chances of a false positive (1 in a trillion), and even THEN, that set of images (yes, it can't just be one image) is subject to review by a human before being reported. So tell me again, why this is a bad thing?
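On the "it can't just be one image" point: an account is only surfaced for human review once the number of matches passes a threshold. A minimal sketch of that policy (the threshold value is an illustrative assumption, not a figure from this thread):

```python
# Minimal sketch of threshold-based flagging: a single match is never
# reported; human review only happens past MATCH_THRESHOLD. The value
# 30 is an illustrative assumption, not an official figure.
MATCH_THRESHOLD = 30

def flag_for_human_review(match_count: int) -> bool:
    """True only when an account's match count reaches the threshold."""
    return match_count >= MATCH_THRESHOLD
```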
 
How is this any different from iCloud continuously checking your uploaded photos against known CSAM images? If you're not using iCloud, then there's no checking for anything EITHER WAY. If you use iCloud, your photos are being scanned RIGHT NOW on iOS 14. You still have the same chances of a false positive (1 in a trillion), and even THEN, that set of images (yes, it can't just be one image) is subject to review by a human before being reported. So tell me again, why this is a bad thing?
The difference with scanning on the cloud server side is that it is technically limited to what has been uploaded by the user. This new on-device scanning system is only limited to scanning photos in the process of being uploaded as a result of a policy decision; technically, it can scan anything on your phone. That opens the door for mass surveillance, which is what people are legitimately worried about.
 
The difference with scanning on the cloud server side is that it is technically limited to what has been uploaded by the user. This new on-device scanning system is only limited to scanning photos in the process of being uploaded as a result of a policy decision; technically, it can scan anything on your phone. That opens the door for mass surveillance, which is what people are legitimately worried about.
Apple said themselves that scanning doesn't take place until a file is actively being uploaded to iCloud. So if you aren't uploading photos, they aren't being scanned.
 
Apple said themselves that scanning doesn't take place until a file is actively being uploaded to iCloud. So if you aren't uploading photos, they aren't being scanned.
As I said, that’s a policy decision and not a technical limitation. As soon as the system is in place, the policy could change at any time and for any reason.
 
I don't want to have their bad stuff search list on my device coded or not. They can fight child porn on their own servers not on my device if they want. We have a legal system to deal with crime. Apple is not needed to do this.
I might abandon Apple altogether and move on to Linux or similar. Not that I have anything to hide.
 
I don't want to have their bad stuff search list on my device coded or not. They can fight child porn on their own servers not on my device if they want. We have a legal system to deal with crime. Apple is not needed to do this.
I might abandon Apple altogether and move on. Not that I have anything to hide.
A bunch of 1's and 0's scares you huh? Good luck with your journey to finding another device or maybe just go completely off the grid for your own sake.
 
I am not scared; I am opposing some company inspecting my private phone. It's like buying a bookshelf at Ikea, and from then on Ikea has somebody sitting in my living room 24/7 inspecting whether I have child porn books on the bookshelf. (I don't)

I don't want to have to lose my privacy, along with everybody else, just because some freaks collect child porn.
 
I am not scared; I am opposing some company inspecting my private phone. It's like buying a bookshelf at Ikea, and from then on Ikea has somebody sitting in my living room inspecting whether I have child porn books on the bookshelf. (I don't)

I don't want to have to lose my privacy, along with everybody else, just because some freaks collect child porn.
Weird, because the way it worked before is that you'd knowingly send that shelf to Ikea with all your stuff on it, and they'd inspect it there for CP.
 
Weird, because the way it worked before is that you'd knowingly send that shelf to Ikea with all your stuff on it, and they'd inspect it there for CP.
This seems like the type of argument a person who cheats on their partner would make. "But I've been cheating on you for years! Why are you mad now? Isn't that kind of weird?"

Reminds me of life right after Edward Snowden/PRISM. Some people insisted on putting their head between their legs, closing their eyes and muttering "This isn't a surprise! You didn't know about this already?" over and over. Makes them feel better I guess.
 
This seems like the type of argument a person who cheats on their partner would make. "But I've been cheating on you for years! Why are you mad now? Isn't that kind of weird?"

Reminds me of life right after Edward Snowden/PRISM. Some people insisted on putting their head between their legs, closing their eyes and muttering "This isn't a surprise! You didn't know about this already?" over and over. Makes them feel better I guess.
Better throw your phone away.
 
I would like to know how they came up with that number.
If it's the chances of a "hash collision" - i.e. your cat photo having the same hash as a known illegal image - then it's a well-defined mathematical property of the hash algorithm and will be tiny. One of Apple's blunders was talking about AI detection of nude photos sent to kids in the same breath - which is far more sketchy and prone to overstatement by the AI writers. Hash matching is not AI/machine learning.
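To see why a match threshold drives the headline number so low even when per-image collisions aren't vanishingly rare: if each photo independently false-matches with probability p, the chance that an account with n photos racks up at least k false matches is a binomial tail. All three numbers below are made-up illustrative values, not Apple's:

```python
# Binomial tail P(X >= k) for X ~ Binomial(n, p), computed with the
# term recurrence for C(n,i) p^i (1-p)^(n-i) so huge intermediate
# integers never overflow a float. Illustrative values only.
def false_flag_probability(n: int, k: int, p: float) -> float:
    term = (1.0 - p) ** n            # i = 0 term
    total = term if k == 0 else 0.0
    for i in range(1, n + 1):
        term *= (n - i + 1) / i * p / (1.0 - p)
        if i >= k:
            total += term
        if term == 0.0:              # underflow: remaining terms negligible
            break
    return total

# 10,000 photos, a per-image false-match rate of 1 in a million, and a
# threshold of 5 matches: the account-level probability comes out tiny
# (well under one in a billion).
print(false_flag_probability(10_000, 5, 1e-6))
```

The point is that a modest per-image rate compounds down fast once several independent false matches are required before anything is reported.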

Also, they're talking about human confirmation of any matches - done properly that should reduce the probability of false positives to zero - but the "done properly" bit is the kicker. Ideally, the confirmation should be a blind test in which the "matched" images were mixed in with a stream of other random images (matching and non-matching) - otherwise the tester will look at every image expecting to see a matching image. Even if they're comparing the image with the one it supposedly matches, you want them to be hitting "false, false, false, false, true, false, false, false...." rather than vice versa.
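The blind-test idea above can be sketched simply: shuffle the flagged images in with random decoys so the reviewer has no expectation that any given image is a match. Everything here (names, structure) is illustrative, not how Apple's actual review pipeline works:

```python
# Illustrative blind-review queue: flagged items are shuffled in with
# decoys so a reviewer can't assume every image they see is a match.
import random

def build_blind_queue(flagged: list, decoys: list, seed: int = 0) -> list:
    """Return one shuffled stream of (item, is_flagged) pairs; the label
    would stay hidden from the reviewer until their answers are scored."""
    queue = [(item, True) for item in flagged] + [(item, False) for item in decoys]
    random.Random(seed).shuffle(queue)
    return queue
```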

The greatest risk, really, is that the agencies responsible for supplying the list of hashes will get careless or over-zealous in what they deem "illegal" and the checkers will be obliged to report anything that matches, even if it appears to be a picture of a fully-dressed kid wearing a "The President is a Big Silly" T-Shirt.

As I said, that’s a policy decision and not a technical limitation. As soon as the system is in place, the policy could change at any time and for any reason.
iPhones are continually downloading software updates - Apple can implement any technology they want at any time. If Apple wanted to "spy" on your phone, there hasn't been any technical issue stopping them for years, apart from the law.

The real issue here is not the technology, but the fact that ticking "I accept" on iCloud now includes granting Apple permission to do on-device checking against a third-party hash list. There's no immediate practical upshot - any cloud service will check your photos anyway - but you've crossed a line in the sand (and maybe waived a constitutional right in the US) by granting Apple that permission.
 