"By extension: Those of us who do not agree with you do not 'get it'?"

Precisely.
Define "many"??
I would say "many" on here are jumping to conclusions not based on any facts whatsoever and are simply stating what "could" happen no matter how outrageous... a lot of "the sky is falling" type of sentiment.
I'm glad there are a "few" on here who understand what COULD happen, but are erring on the side of trusting the system to work as Apple has stated, while also taking into account the tech, what is currently on the phone, and Apple's own history around this type of data/tech.
This guy gets it. As of yet, there is no reason to believe this will be expanded on in the future. I'll worry about that if it happens, but for right now all we can do is go by what they say.
I still don't understand why it would be so much better if they decrypted all of your photos in the cloud and scanned them there... on a system that can't be seen or controlled. You don't know what tech they're using or what database they're comparing against or anything.
"I may be incorrect, but Apple did state that this new design is not the end product. There is more to come. No mention of what that is."

Okay? What do you want us to do about it? Get all angry and quit Apple before we have all the information?
iOS 15 isn't even out, so people are getting bent out of shape over nothing. I'm sure when the feature is rolled out, it will be picked apart by every security researcher and curious programmer.
So nothing. It will be an on-device scanner. I regard that as unacceptable. Details are inconsequential--as is your and others' faith in Apple's promises. So, yes: I have already begun extricating myself from Apple's clutches.
...and I will respond as always with, "there are and have been identical on-device scanners on the iPhone for years now," so that is not an excuse. The excuse (and I'm not saying you don't have a right to have one) is more around what they are doing with that information.
A lot of good information in there.
"This is a slippery slope that we can't afford to go down. The technology behind this scanning method is only as good as the integrity and principles of the government overseeing the company deploying it. With the political climate being what it is, I sincerely hope Apple doesn't open this Pandora's box."

So other cloud providers have been on a slippery slope for years. Why isn't anyone mad about that?
"It is a perfectly legitimate demand that people want their own private phones to be private. Private from software, scanning, and searches, whatever the 'good' intention might be. If this red line needs to be crossed, there is a legal system to provide a search warrant. Otherwise everybody has to be regarded as innocent, and this must not have to be proven to Apple all the time."

I would say on-device scanning of every single one of my pictures, so they can be matched with hashes of dogs, cats, buildings/places of interest, etc., is for "good intentions," but I didn't hear anyone freaking out about that before... and I can't even turn that scanning off! So where was the outrage, or the "assumption," that additional hashed database items could be added so "someone" would know what they are when those pics are uploaded to iCloud??
It's no argument to claim other parties have violated this right before. This is about the new Apple software scanning private devices before they send pictures to Apple.
"The excuse (and I'm not saying you don't have a right to have one) is more around what they are doing with that information."

Correct.
"So other cloud providers have been on a slippery slope for years. Why isn't anyone mad about that?"

Objection! Begs the question.
Literally makes no difference. Even if you have 30 of these non-porn images, how do you know those hashes are the same as the ones in the CSAM database, and will they pass through the second server-side perceptual hash as CSAM? I think not. Nice try though. Keep reaching.
Edit: The site is just a proof of concept using Apple's old code. Having these images on your device will do nothing. Also, I stand by what I said. Even if these false positives had the same exact hashes as a CSAM photo, the photo would not make it through the second server-side perceptual hash.
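To make that concrete, here is a toy Monte Carlo sketch in Python. The hash size, match threshold, and single-entry "databases" are all made-up assumptions (real perceptual hashes are far longer, so real rates are far lower), but it shows the basic math: if the second hash is independent of the first, the chance of an innocent image passing both checks is roughly the first pass rate squared.

```python
import random

HASH_BITS = 16   # toy size -- real perceptual hashes are much longer
MAX_DIST = 3     # toy "match": Hamming distance <= 3
TRIALS = 300_000

def rand_hash() -> int:
    return random.getrandbits(HASH_BITS)

def close(a: int, b: int) -> bool:
    return bin(a ^ b).count("1") <= MAX_DIST

target_a = rand_hash()   # an entry in the first (on-device) database
target_b = rand_hash()   # an entry in the second (server-only) database

first = both = 0
for _ in range(TRIALS):
    h1, h2 = rand_hash(), rand_hash()   # an innocent photo's two independent hashes
    if close(h1, target_a):
        first += 1
        if close(h2, target_b):
            both += 1

print(f"passed first check: {first}/{TRIALS}")   # roughly 1 in 100 at these toy sizes
print(f"passed both checks: {both}/{TRIALS}")    # roughly the first rate squared
```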
Objection! "The Worse (or Worst) Negates The Bad" fallacy.Apple could've silently done this without us knowing, but they decided to tell us.
I'm sorry, but using iCloud Photos is not a "right," and it does not cripple anything other than easier access to your photos across Apple devices or via iCloud on the web. There are hundreds of different options for backing up photos that do not involve iCloud, even available as apps on the phone, that make accessing your photos across devices practically as easy as using iCloud (Amazon Photos, Flickr, Google Photos, etc.). Opening those is nearly no more effort than opening the Photos app native to Apple devices.
Then you skipped an argument or two...

"BUT...my answer to that is 'turning off iCloud Photos means no photos will be scanned in the way you object to.' People then say, ..."

... but that somewhat cripples the Apple Experience. (This is inarguable.) Along those lines: As I noted in another post, on a related topic: I wonder if it ever occurred to Apple how doing a CSAM thing linked to iCloud usage might impact customer retention?

"... they shouldn't have this capability in the first place, in which I respond, 'please see my first point up above.'"

Except your first argument is a poor one, because the two do not have the same purpose, nor the same beneficiaries.

"Then, my only assumption is that people do not trust Apple...which again, is fine....BUT, 'why now and not when they had this ability before?'"

Because, as previously explained, many of us believe this is a truly bad idea and, as such, reflects deficient thinking, decreased commitment to customer privacy and security, ulterior motives, or some combination of those.

I trusted Apple. Now I no longer do, so much.
"It appears you went into that post link assuming it was of no consequence or just wrong instead of taking a thoughtful look. Try a few of the links too. It shows, based on what we know, that there are apparent design flaws in Apple's solution. If these are incorrect, why can't Apple call them out in discussion? Your comment on the second check makes no sense. If db001 has hash A1, the fake on your device has hash A1, and the second db has hash A1... what am I missing?"

The second database is a completely different set of hashes that nobody has access to. The second check is a perceptual scan (meaning the photo not only has to match the first hash, but also has to physically look like the offending photo). Good luck trying to trick that one without having actual CSAM on your device.
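The reply's point can be shown with a toy sketch in Python. Here hash_a is an assumed, deliberately weak stand-in for the first (device-side) hash, weak enough that a fake can be padded into a collision, in the same spirit as the published NeuralHash preimages; hash_b stands in for the independent server-side hash. Neither is Apple's real function, and the byte strings are placeholders:

```python
from hashlib import sha256

def hash_a(data) -> int:
    """Weak stand-in for the first hash: just a byte sum, trivially collidable."""
    return sum(data) % 65536

def hash_b(data) -> int:
    """Independent stand-in for the server-side hash; the attacker can't target it."""
    return int.from_bytes(sha256(bytes(data)).digest()[:8], "big")

real = b"bytes of an image in the database"
fake = bytearray(b"harmless look-alike image bytes")

# Pad the fake until its hash_a collides with the real image's hash_a.
diff = (hash_a(real) - hash_a(fake)) % 65536
fake += b"\xff" * (diff // 255) + bytes([diff % 255])

print(hash_a(fake) == hash_a(real))   # True:  forged match on the first check
print(hash_b(fake) == hash_b(real))   # False: the independent second check fails
```

Forcing a collision in one function does nothing to the other; an attacker would have to break both at once, without ever seeing the second database.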
Apple could've silently done this without us knowing, but they decided to tell us.
And when it came out, as it eventually would, the pushback would have been loud and fierce.
"So you are saying:
Hash A1 (from the Center)
Hash A1 (from Apple, calculated on our device)
Both are compared.
If a match, then:
Hash B1 (from Apple, calculated on our device)
Hash B1 (from this other Center)
Both are compared.
If a match, flag and store.
If count is >29, send photo to Apple for visual verification.
Kindly correct any portion I am incorrect on....."

Yes, once 30 matches are found, Apple uses those 30 safety vouchers to unlock the photos, and then they run through a server-side perceptual hashing system that compares them to a totally different database that only resides on their server. If they somehow make it through that system as well, it goes to human review for further confirmation.
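Taking the reply's description at face value, the corrected flow looks roughly like this Python sketch. Everything here is an illustrative stand-in: plain sets instead of blinded databases, a visible boolean instead of a private-set-intersection result, and a length check instead of threshold secret sharing. The names and numbers are assumptions, not Apple's code.

```python
THRESHOLD = 30  # matching vouchers needed before anything can be opened

def device_check(hash_a1: int, db_a: set) -> bool:
    """Step 1, on device: device-computed hash A1 vs. the database's A1.
    On a real device, the result of this lookup is hidden from the device."""
    return hash_a1 in db_a

def server_check(vouchers: list, db_b: set) -> list:
    """Steps 2-4, server side: nothing unlocks below THRESHOLD; above it,
    each photo is re-compared under the independent perceptual hash B1,
    and only the survivors go on to human review."""
    if len(vouchers) < THRESHOLD:
        return []   # vouchers stay sealed
    return [v["photo_id"] for v in vouchers if v["hash_b1"] in db_b]

# Toy run: 31 photos whose A1 hash matched; only one also matches under B1.
db_a, db_b = {111}, {999}
vouchers = []
for i in range(31):
    if device_check(111, db_a):   # every photo passed the first check in this toy
        vouchers.append({"photo_id": f"img{i}", "hash_b1": 999 if i == 0 else 123})

print(server_check(vouchers, db_b))   # ['img0'] -> sent to human review
```

The key correction to the list above is that the B1 comparison never happens on the device: it only becomes possible server-side after the 30-voucher threshold is crossed.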
I'll just give one example of why this kind of site is ridiculous (which, if you really look at it, is nothing more than an attempt to sell dumb t-shirts... there is no new information here...).