Really...when did the actions of a rogue employee scale up to the general business practices of Apple?
[...] poor girl's personal photos should've been a wake up call to Apple's stance on privacy.
The way I heard it described on a podcast was like this:
Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.
They are, instead, comparing hashes of *known* CSAM images. These are photos that have already been labeled as child porn.
So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.
With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.
But as others have said... all of the big companies are doing similar things. So I dunno.
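For anyone who wants the "comparing hashes, not looking at photos" part made concrete, here is a deliberately simplified sketch. It uses a plain SHA-256 digest of the file bytes, which is not what Apple describes (their write-up talks about a perceptual NeuralHash and on-device matching); the folder name and the entry in the hash list below are made up. The point is only that a hash comparison never inspects what is actually in the picture:

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the database of hashes of already-known images.
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def file_hash(path: Path) -> str:
    """SHA-256 digest of the file's raw bytes; the image itself is never interpreted."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def is_known(path: Path) -> bool:
    """True only if this exact file's digest appears in the known-hash set."""
    return file_hash(path) in KNOWN_HASHES

if __name__ == "__main__":
    for p in Path("photos").glob("*.jpg"):  # "photos" is just an example folder
        print(p.name, "-> match" if is_known(p) else "-> no match")
```

An exact digest like this breaks if a single pixel changes, which is presumably why the real system uses a perceptual hash instead; that difference is what the Picture 1 / Picture 2 scenario further down is arguing about.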
The only recourse is to vote with your dollars. But I'm not sure the company is out of touch with reality. Unlike the app store anti-trust investigations, Congress will not step in to stop Apple. I guarantee it.
[...] It's insane, an extraordinary abuse of power, and shows how out of touch with reality this company is.
We all knew Google was untrustworthy, but if you advertise privacy and then don’t provide privacy, that’s when people speak out. Google is creepy, and now Apple just became just as creepy.
Big questions for sure. You might find this thread instructive: https://news.ycombinator.com/item?id=28106867
Exactly
Apple will refuse any such demands…
Yeah, well, the NCMEC (as being well-connected to the government) may not.
Once the hash is in the database, it becomes merely a question of having an entry (exploit) into the user’s device to check for any matches. The three-letter agencies probably have that already - or will buy it (such as Pegasus).
I just deactivated iMessage and messages to iCloud.
As has been stated before (several times), Apple currently will not view photos. They just scan for the hashes of photos that some poor law-enforcement agent had to view, flag as illegal, and add to a database of similar images. But this has so many ramifications:
1. It doesn't matter what Google, Adobe, or other companies are doing - in terms of this conversation. This is about Apple rolling out this 'feature' on your Apple device.
2. Chester the Molester isn't going to keep his illegal collection of photos on iCloud anyway. The percentage of offenders who do this must be vanishingly small. I'm sure Apple already has these statistics and is aware of this. Why subject 99.999% of the population to this? Do police stop all cars on the road and check IDs to try to catch the 0.01%?
3. For the last two decades, we've seen tremendous losses of freedom begin with the 'to protect the children' line... You can't really rally against that; are you a monster?! They know it, we know it, they know we know it. It's common practice because it works.
4. This doesn't even 'actively protect children' as the photos must already be cataloged and reside in a database of known perv images. For active abusers - there won't be any known hash to scan for. So they're only looking for images that are already widely shared.
5. Many think that what's on our phone is an extension of what's in our brain. It holds our thoughts (if you write them down), photos, locations, preferences, searches, contacts, and the feelings and life experiences of the friends who communicate with us. While no court order can get into your brain, one can get into your devices. But with this new 'feature,' a court order isn't even needed. They automatically scan for illegal hashes or whatever.
6. This isn't really a 'slippery slope' argument; we've seen this technology used again and again in nefarious ways by govt agencies. Allowing a company to scan your private data 24/7 for whatever the govt today deems illegal is a big gaping hole in the floor called 'the future.' Who is to say what the govt will deem illegal in two, five, or ten years?
7. The trend line isn't good. We already see people that promote views that are against the govt's line getting removed from social platforms (at the governments request) - and labeled as 'extremists.' Don't want a vaccine in 1999? You're just a fool - today, you're now an extremist. Where will it end?
8. Imagine waking up one day and finding that your iPhone updated and iCloud was turned on by default with the new update? (oops - we're sorry). Or additional 'spyware' was installed, or the govt sent Apple new things to search for? Law Enforcement needs a warrant to search your property today - well, that just flew out the window, as all your data has already been captured.
9. Does anyone 'really' believe that the NSA would not utilize every bit of data it can get its hands on for its own purposes? Do you really think they're girl scouts selling cookies for merit badges? Will scanning of your hard drive come next? In the coming years we probably won't even have hard drives; everything will be cloud-based (we're already headed that way), so essentially you've just handed over your freedom of privacy for the future.
10. Apple has made a big deal of privacy in the last few years. Funny (but sad) to see it was all BS.
Agreed...so what's next for you? What are you changing based on this?
Exactly
How long before the USA and/or China demand that Apple start scanning for content other than CSAM on all iPhones (not just the ones with iCloud Photo Library enabled), or else it won't be allowed to sell devices in those markets? These two countries represent two-thirds of Apple's revenue and therefore have a lot of leverage on the company.
What is a "collision"? A match or not a match?My concern with this, and people have never really addressed other than being patronizing because I am ASKING for what I am missing here. We have two scenarios.
Picture 1: I stand in front of my house, thumbs up next to my mailbox.
Picture 2: I keep the pose because my eyes were a bit closed and they took another picture seconds later.
Picture 1 and 2 are VERY VERY CLOSE in terms of a pixel match.
Okay, let's walk through the scenario. Assume Picture 1 made it onto the CSAM list. Apple's claims and documentation state that any modifications to Picture 1 would still result in a collision; it's not a 1:1 hash match to the entry in the CSAM database. Are we on the same page so far? Picture 1 gets flagged but Picture 2 does not. Correct in that assumption, yes? From what people have said, this is indeed correct.
Okay, so I launch Photoshop and adjust Picture 1 to match Picture 2 exactly. I spend hours until it matches Picture 2 pixel for pixel. Still with me? I now have three pictures: Picture 1, modified Picture 1, and Picture 2.
According to Apple, any crops, color adjustments, rotations, transforms, and edits to the picture will still result in a collision. So are they essentially saying my modified Picture 1 will result in a collision? Yet people are somehow stating Picture 2 will NOT be a collision? One-in-a-trillion odds it will be a collision?
This is a clear contradiction between Apple's own statements (modifying an image still results in a collision) and people's own explanation of the logic (Picture 2 CANNOT be matched).
The reason this type of scenario is concerning is that there might be quite a few legitimate pictures with, let's say, the same "pixel attributes" that could produce the same hash as a matched image.
And please, do not respond with just "I don't understand". This is EXACTLY why I am posting this scenario: if someone DOES know why Picture 2 truly will not match while modified Picture 1 does, and there is no concern here, then please explain it. I looked through Apple's document (modifications will be a match, and it's using AI to get the match). I am looking to understand this type of scenario. How can two identical images not match, yet any modification of the matched image will match?!
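For what it's worth, here is a toy perceptual hash that may help frame the Picture 1 / Picture 2 question. It's the classic "average hash" idea, not Apple's NeuralHash, and the 4x4 pixel values, the one-pixel retouch, and the match threshold are all invented for illustration. The mechanics it shows: two files with identical pixels always produce the identical hash, a small edit usually lands only a few bits away, and a genuinely different photo usually lands far away:

```python
# Toy "average hash": not Apple's NeuralHash, just the classic perceptual-hash
# idea, applied to a made-up 4x4 grayscale "photo" (hypothetical pixel values).

def average_hash(pixels):
    """Each bit records whether a pixel is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """How many bits differ between two hashes; a small distance counts as a 'collision'."""
    return sum(x != y for x, y in zip(a, b))

picture_1 = [10, 200, 30, 180, 90, 220, 15, 170,
             60, 240, 25, 190, 80, 210, 35, 160]

pixel_identical = list(picture_1)        # e.g. an edit that ends up matching another file exactly
light_retouch = list(picture_1)
light_retouch[4] = 160                   # one area brightened, like a small edit
different_photo = [5, 15, 25, 35, 45, 55, 65, 75,
                   200, 210, 220, 230, 240, 250, 245, 235]

h = average_hash(picture_1)
print(hamming(h, average_hash(pixel_identical)))   # 0 -> identical pixels, identical hash
print(hamming(h, average_hash(light_retouch)))     # 1 -> small edit, still close
print(hamming(h, average_hash(different_photo)))   # 8 -> different scene, far away
```

On this toy model, at least, a modified Picture 1 that is truly pixel-for-pixel identical to Picture 2 would hash identically to Picture 2, so the two would either both collide or both not. Whether NeuralHash behaves the same way on near-identical but not identical exposures is exactly what Apple's document doesn't spell out, and the one-in-a-trillion figure is described as an account-level number, not a per-image one.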
It's about bashing Apple. No one yells at Google for Gmail scanning for child sex abuse imagery. But Apple is for some reason different.
Maybe it’s because many of us eschewed Google for Apple because they assured us they think different, have a business model not based on intrusion, and they see privacy as a human right.
The others don’t go around pretending they care about your privacy as a selling point.
Exactly.
Everybody is hating on Apple when just about everybody else has been doing it too - and possibly for longer. Apple is an easy target, apparently.
My simple response to these questions: explain to me how you could have a picture so close to one that has been marked as child pornography without it actually being child pornography. Now explain how you could have more than one picture like this, to the point where it would actually trigger a review by an Apple employee. And then try to imagine that this will happen one in one trillion times, and comprehend what that number means in real-world terms.
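To put a rough sense of scale on that, here is a back-of-the-envelope sketch using a plain binomial model. The per-image false-match rate, the library size, and the 30-match threshold are all assumptions made up for illustration; Apple's published claim is the account-level one-in-a-trillion figure, not these inputs:

```python
from math import comb

# All three inputs are assumptions for illustration; Apple has not published
# a per-image false-match rate, only the account-level "one in a trillion" claim.
p = 1e-6      # assumed chance that one innocent photo falsely matches the hash list
n = 10_000    # photos in a hypothetical user's iCloud library
t = 30        # assumed number of matches needed before anything is reviewed

# Probability of exactly t false matches under a simple binomial model.
# With n*p this small, higher counts are rarer still, so this term is a
# reasonable stand-in for "at least t matches".
p_account_flagged = comb(n, t) * p**t * (1 - p)**(n - t)
print(f"{p_account_flagged:.2e}")   # around 4e-93 for these made-up inputs
```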
Apple did put their foot down and refused the FBI's request to unlock the iPhone of the San Bernardino mass murderer...
But that was a few years ago.
Do you think their stance has changed now?
Apple can’t even get iTunes Match right, yet everyone buys this “1 in a trillion” like it’s gospel and Apple can do no wrong.
Again, you do realize that it would take more than one picture matching to flag an account… and the chances of more than one picture “matching,” being reviewed by an Apple employee, and NOT being child pornography are one in one trillion?
Oh…and for the other idiots up above (not you), they are reviewing the flagged images. They do NOT have access to every pic in your iCloud account.
This is spot on. What's more, Apple is the only tech company I would trust to execute something like this while at the same time maximizing privacy.
Step 1. Apple does something new, but in a different way than the rest of the industry.
Step 2. The internet cries that it's evil and that people will vote with their wallets and leave the ecosystem, completely ignoring that Apple does it in a better, more privacy-oriented way.
Step 3. Apple explains a bit better what it did.
Step 4. Everybody shuts up and the industry follows Apple’s lead.
For your lame example, I would say that means child pornographers would get away with the system, not innocent people getting flagged incorrectly.
People on here aren't upset about the iMessage thing...and yes, those that may be should read more and stop commenting.
People should probably read more about it before peeing into their shoes. It's tied to a youth's phone and login being part of a family plan. It can be deactivated. Everyone just calm down.