Define "many"?? :p

I would say "many" on here are jumping to conclusions not based on any facts whatsoever and are simply stating what "could" happen no matter how outrageous... a lot of "the sky is falling" type of sentiment.

I'm glad there are a "few" on here who understand what COULD happen, but are erring on the side of trusting the system to work as Apple has stated, while also taking into account the tech, what is currently on the phone, and Apple's own history with this type of data/tech.

Many as in what I am reading, listening to, and watching. The questions "Why device-side?" and "Why this design?" have been recurring points.

For me, I am in the camp of wanting to know exactly what this is, and in the meantime I will watch and proceed cautiously.
Until I understand this, any new purchases are on hold.

Except for my iPad Pro, Apple products are secondary to me. Android, Linux, and Windows are my main systems at present. My 12 Pro Max is a backup and my MBP is for contract/support activities.
 
This guy gets it. As of yet, there is no reason to believe this will be expanded on in the future. I'll worry about that if it happens, but for right now all we can do is go by what they say.

I still don't understand why it would be so much better if they decrypted all of your photos in the cloud and scanned them there... on a system that can't be seen or controlled. You don't know what tech they're using, or what database they're comparing against, or anything.

I may be incorrect, but Apple did state that this new design is not the end product. There is more to come. No mention of what that is.
 
Okay? What do you want us to do about it? Get all angry and quit Apple before we have all the information?

iOS 15 isn't even out, so people are getting bent out of shape over nothing. I'm sure when the feature is rolled out, it will be picked apart by every security researcher and curious programmer.
 

Huh?
Not getting you at all.

Some people are getting "bent" out of shape.
Some people are accepting it as is.
Some people are asking for more information.

There is a lot we do not know about this new set of features, and outside of the announcements, despite the widespread concern expressed, Apple has remained silent.

I have expressed my stance and path forward on this, along with many others. I'm not asking anyone to do anything other than to ask Apple for more information.
 
So nothing. It will be an on-device scanner. I regard that as unacceptable. Details are inconsequential, as is your and others' faith in Apple's promises. So, yes: I have already begun extricating myself from Apple's clutches.
...and I will respond as always with, "there are and have been identical on-device scanners on the iPhone for years now", so that is not an excuse. The excuse (and I'm not saying you don't have a right to have one) is more around what they are doing with that information.

BUT... my answer to that is "turning off iCloud Photos means no photos will be scanned in the way you object to." People then say they shouldn't have this capability in the first place, to which I respond, "please see my first point up above."

Then, my only assumption is that people do not trust Apple... which, again, is fine... BUT, "why now and not when they had this ability before?"

If a government (or any other entity) could force Apple to turn over any data (on device OR in iCloud), it could have had Apple modify hashes, run scans, etc. for years now. Nothing Apple is doing now changes that fact... so it comes down to trust. I trust Apple to do the right thing until they prove to me that they don't or can't.
 
It is a perfectly legitimate demand that people want their own private phones to be private: private from software, scanning, and searches, whatever the "good" intention might be. If this red line needs to be crossed, there is a legal system to provide a search warrant. Otherwise, everybody must be presumed innocent, without having to prove that innocence to Apple over and over.

It's no argument to claim other parties have violated this right before. This is about the new Apple software scanning private devices before they send pictures to Apple.
 
Literally makes no difference. Even if you have 30 of these non-porn images, how do you know those hashes are the same as the ones in the CSAM database, and even then, would they pass through the second server-side perceptual hash as CSAM? I think not. Nice try though. Keep reaching.

Edit: The site is just a proof of concept using Apple's old code. Having these images on your device will do nothing. Also, I stand by what I said. Even if these false positives had the exact same hashes as a CSAM photo, the photo would not make it through the second server-side perceptual hash.
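For anyone wondering what "perceptual hash" means in practice, here's a toy sketch in Python. This has nothing to do with Apple's actual NeuralHash (which is a neural network); it uses two classic textbook hashes (average hash and difference hash) over fake 8x8 "images", just to show the general idea: two independently designed perceptual hashes key on different features of an image, so passing one check says nothing about passing the other.

```python
# Toy illustration only -- NOT Apple's NeuralHash. Two classic perceptual
# hashes (average hash and difference hash) computed over fake 8x8
# grayscale "images" (lists of pixel values 0-255).

def average_hash(img):
    # One bit per pixel: is the pixel brighter than the image's mean?
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def difference_hash(img):
    # One bit per horizontal neighbour pair: is the left pixel brighter?
    return [1 if a > b else 0 for row in img for a, b in zip(row, row[1:])]

def hamming(h1, h2):
    # Number of differing bits; 0 means the hashes match exactly.
    return sum(a != b for a, b in zip(h1, h2))

# A left-to-right gradient and its brightness-inverted counterpart.
gradient = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
inverted = [[252 - p for p in row] for row in gradient]

# Identical images always match under both hashes:
assert hamming(average_hash(gradient), average_hash(gradient)) == 0
assert hamming(difference_hash(gradient), difference_hash(gradient)) == 0

# But the two hashes disagree about *how* the images differ -- they key on
# different features, which is why clearing one check implies nothing
# about clearing an independent second one:
print(hamming(average_hash(gradient), average_hash(inverted)))        # 64
print(hamming(difference_hash(gradient), difference_hash(inverted)))  # 56
```

That's the intuition behind the second server-side check: an image adversarially crafted to collide under the first hash still has to independently survive a second, differently designed hash.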
 
This is a slippery slope that we can't afford to go down. The technology behind this scanning method is only as good as the integrity and principles of the government overseeing the company deploying it. With the political climate being what it is, I sincerely hope Apple doesn't open this "Pandora's box".
 
So other cloud providers have been on a slippery slope for years. Why isn't anyone mad about that?
 
It is a perfectly legitimate demand that people want their own private phones to be private: private from software, scanning, and searches, whatever the "good" intention might be. If this red line needs to be crossed, there is a legal system to provide a search warrant. Otherwise, everybody must be presumed innocent, without having to prove that innocence to Apple over and over.

It's no argument to claim other parties have violated this right before. This is about the new Apple software scanning private devices before they send pictures to Apple.
I would say on-device scanning of every single one of my pictures, so they can be matched against hashes of dogs, cats, buildings/places of interest, etc., is done with "good intentions", but I didn't hear anyone freaking out about that before... and I can't even turn that scanning off! So where was the outrage, or the "assumption", before that additional hashed database items could be added so "someone" would know what they are when those pics are uploaded to iCloud??

All I have to do here is not use iCloud Photos, and no (yes, I repeat, NO) scanning, in your words, takes place at all against that particular hash database. And the fact that the ability to do so is still "on your device" in case you turn iCloud Photos on doesn't change the fact that the phone is already doing this for other reasons.
 
The excuse (and I'm not saying you don't have a right to have one) is more around what they are doing with that information.
Correct.

Then you skipped an argument or two...

BUT...my answer to that is "turning off iCloud Photos means no photos will be scanned in the way you object to." People then say, ...
... but that somewhat cripples the Apple Experience. (This is inarguable.)

Along those lines: as I noted in another post on a related topic, I wonder whether it ever occurred to Apple how doing a CSAM thing linked to iCloud usage might impact customer retention.

... they shouldn't have this capability in the first place, to which I respond, "please see my first point up above."
Except your first argument is a poor one, because the two do not have the same purpose, nor the same beneficiaries.

Then, my only assumption is that people do not trust Apple... which, again, is fine... BUT, "why now and not when they had this ability before?"
Because, as previously explained, many of us believe this is a truly bad idea and, as such, reflects deficient thinking, decreased commitment to customer privacy and security, ulterior motives, or some combination of those.

I trusted Apple. Now, not so much.
 
Literally makes no difference. Even if you have 30 of these non-porn images, how do you know those hashes are the same as the ones in the CSAM database, and even then, would they pass through the second server-side perceptual hash as CSAM? I think not. Nice try though. Keep reaching.

Edit: The site is just a proof of concept using Apple's old code. Having these images on your device will do nothing. Also, I stand by what I said. Even if these false positives had the exact same hashes as a CSAM photo, the photo would not make it through the second server-side perceptual hash.

It appears you went into that post link assuming it was of no consequence or just wrong instead of taking a thoughtful look. Try a few of the links too.

It shows, based on what we know, that there are apparent design flaws in Apple's solution. If these are incorrect, why can't Apple call them out in discussion?

Your comment on the second check makes no sense. If db001 has hash A1, the fake on your device has hash A1, and the second db has hash A1... what am I missing?
 
Correct.

Then you skipped an argument or two...


... but that somewhat cripples the Apple Experience. (This is inarguable.)

Along those lines: as I noted in another post on a related topic, I wonder whether it ever occurred to Apple how doing a CSAM thing linked to iCloud usage might impact customer retention.


Except your first argument is a poor one, because the two do not have the same purpose, nor the same beneficiaries.


Because, as previously explained, many of us believe this is a truly bad idea and, as such, reflects deficient thinking, decreased commitment to customer privacy and security, ulterior motives, or some combination of those.

I trusted Apple. Now, not so much.
I'm sorry, but using iCloud Photos is not a "right" and does not cripple anything other than easier access to your photos across Apple devices or via iCloud on the web. There are hundreds of different options for you to back up photos that do not involve iCloud, and some are even available as apps on the phone that make accessing your photos across devices practically as easy as using iCloud (Amazon Photos, Flickr, Google Photos, etc.). Opening those is barely more effort than opening the Photos app native to Apple devices.

At the end of the day, you don't trust Apple because of the way they approached this, whereas I trust them even more.

Imagine if, when introducing on-device scanning for the reasons I listed in my post (dogs, cats, flowers, POI, etc.), Apple had said, "We use software hard-coded on your device to scan every one of your pictures to identify these items." Would it have been received any differently than this? Probably not... in trying to be so open about how this works and what they are doing, they called attention to it.

At the end of the day, it is optional and Apple will continue to scan all of your photos for dogs, cats, flowers, POI's and so forth.

Also, outside of those of us who care way too much about Apple and these things, I predict this will disappear from people's minds once they flash their fancy new iPhone 13 and watch as people line up to get one. Even when Bill Maher mentioned it on his show last week as a major talking point, I asked my wife what she thought, and she had no clue what he was talking about (and didn't care). Hah! I hate to say it, but 99% of the population will also not care and will go about their lives. Yes, I do think they should "care", but to me, this is a positive step, not a negative one.
 
It appears you went into that post link assuming it was of no consequence or just wrong instead of taking a thoughtful look. Try a few of the links too.

It shows, based on what we know, that there are apparent design flaws in Apple's solution. If these are incorrect, why can't Apple call them out in discussion?

Your comment on the second check makes no sense. If db001 has hash A1, the fake on your device has hash A1, and the second db has hash A1... what am I missing?
The second database is a completely different set of hashes that nobody has access to. The second check is a perceptual scan (meaning the photo not only has to match the first hash, but also has to visually resemble the offending photo). Good luck trying to trick that one without having actual CSAM on your device.
 

So you are saying:
Hash A1 (from the Center)
Hash A1 (from Apple, calculated on our device)
Both are compared.
If a match, then:
Hash B1 (from Apple, calculated on our device)
Hash B1 (from this other Center)
Both are compared.
If a match, flag and store.
If count is >29, send photo to Apple for visual verification.

Kindly correct any portion I am incorrect on...
 
...and when it came out, as it eventually would, the pushback would have been loud and fierce.

Yes. Once 30 matches are found, Apple uses those 30 safety vouchers to unlock the photos, and then they run through a server-side perceptual hashing system that compares them to a totally different database that resides only on Apple's servers. If they somehow make it through that system as well, they go to human review for further confirmation.

I got all of this information directly from Apple. https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

"Once Apple's iCloud Photos servers decrypt a set of positive match vouchers for an account that exceeded the match threshold, the visual derivatives of the positively matching images are referred for review by Apple. First, as an additional safeguard, the visual derivatives themselves are matched to the known CSAM database by a second, independent perceptual hash. This independent hash is chosen to reject the unlikely possibility that the match threshold was exceeded due to non-CSAM images that were adversarially perturbed to cause false NeuralHash matches against the on-device encrypted CSAM database. If the CSAM finding is confirmed by this independent hash, the visual derivatives are provided to Apple human reviewers for final confirmation. The reviewers are instructed to confirm that the visual derivatives are CSAM. In that case, the reviewers disable the offending account and report the user to the child safety organization that works with law enforcement to handle the case further." -- Page 13
 
It appears you went into that post link assuming it was of no consequence or just wrong instead of taking a thoughtful look. Try a few of the links too.

It shows, based on what we know, that there are apparent design flaws in Apple's solution. If these are incorrect, why can't Apple call them out in discussion?

Your comment on the second check makes no sense. If db001 has hash A1, the fake on your device has hash A1, and the second db has hash A1... what am I missing?
I'll just give one example of why this kind of site is ridiculous (which, if you really look at it, is nothing more than an attempt to sell dumb t-shirts... there is no new information here...).

He uses the example of the kitten-hashed image matching that of the dog-hashed image... as if that means anything other than the fact that people can create these different images to "match". Yes, create them from a known image. They never talk about the chances that a random image you may have might match a known hashed image. It's so close to being impossible that they never discuss it... just that it CAN happen. And of course, they show a dog and a cat... no chance those could be the same. Does he show you the hashes or how they were created to match? Does that process actually match what Apple is doing? Hmmm... no detail there for some reason.

Have you seen, read, or heard of ANY image NOT part of the CSAM database being matched with a CSAM image, much less 30?

Why not? Because that database is about as secure as (if not more secure than) your own phone, much less iCloud.

Just because something CAN be done does not mean it is even remotely possible in this particular case.

Yes, someone COULD get access to the CSAM database... and then grab 30 of those horrible images so they could create 30 innocent images with matching hashes... and then they COULD break into your phone somehow and upload those images... and then they COULD be flagged so Apple reviews them... only to find out what? That they aren't true matching images? You would never even know. It would be easier to go to child porn sites and grab HUNDREDS or THOUSANDS of images, hoping they are known, and then just upload those to your phone somehow (of course with no way for anyone to track that you didn't do it).

It would be easier for me to find out where you live, grab a bunch of illegal stuff, put it in your house, and simply call the cops on you. I guarantee you'd get in trouble for that. Why do people even consider the MINUTE chance that someone would or could use this system for anything other than catching predators/child pornographers? It simply makes no sense in the grand scheme of things.

I'll stick to my stance that this is about nothing more than trust. If for whatever reason you don't trust Apple anymore, there are other options out there. If you don't want to use iCloud, there are other options out there. If you want to have your stuff more secure and "private" than with any other phone manufacturer or service provider out there, stick with Apple, IMHO.
 