It doesn't seem like anybody read any details about what Apple is actually doing here (not a surprise). This appears to include Snowden and the EFF (also not much of a surprise).
Apple is not scanning all your pictures looking for certain types of content that "might" be images of child sexual exploitation or abuse.
Rather, there is a database of known child sexual exploitation images maintained by law enforcement. Cryptographic hashes are generated for each of these known images/files.
The only thing Apple is scanning for is files that match the cryptographic hashes of known images of child sexual abuse. They are not looking at your images using machine learning, or anything close to that.
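To make that concrete: matching against such a database boils down to something like the sketch below (Python, purely illustrative; the KNOWN_HASHES set and the plain SHA-256 lookup are assumptions for this example, and the real pipeline layers more machinery on top):

    import hashlib

    # Hypothetical stand-in for the database of digests of known material.
    KNOWN_HASHES = {
        "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
    }

    def file_sha256(path: str) -> str:
        # Hex SHA-256 digest of a file, read in chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def matches_known_database(path: str) -> bool:
        # The photo's content is never interpreted; only its digest
        # is compared against the known set.
        return file_sha256(path) in KNOWN_HASHES

An exact cryptographic hash says nothing about what a photo depicts; it only answers "is this byte-for-byte one of the known files?"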
If this is something you feel is worthy of criticism, go to town. But criticize what they are actually doing, not some inflated imaginary version of what they're doing.
Oh, and as per yesterday's article on the subject -- Apple and all other major tech companies have already been doing this for years. It's nothing new.
I understand the details.
But if I don't have access to the source content that was used to create the hash database (which I shouldn't have, because it's child porn), how can I trust the database? How can I know they don't sneak that picture from Tiananmen Square in '89 in there, to find out if I'm secretly against the communist party? The only proper thing to do is design the system so that nobody can judge what's on my phone.

I don't even use iCloud Photos, because I know it's not end-to-end encrypted and could get hacked and leak at any time. But what they're doing here opens a channel where my phone can decide whether Apple or the authorities should know about the private things I do on my device. And because the phone software is essentially a black box, I can't just install something like Little Snitch to prevent that data from being sent to Apple.
It also doesn't matter that Apple vows to only share content if it indeed contains child porn. The authorities can get a court order for all the content anyway, and Apple then has to comply. This breaks the whole "but we don't even HAVE the data" argument that is one of the main reasons to get an iPhone in the first place. If the iPhone can, on its own, "decide" for whatever reason to send such an "alert", it can and will be abused.
Seems like Apple really IS trying to please their CCP masters in the mainland. I definitely won't be updating to the new iOS and will be looking at other manufacturers for mobile devices if Apple decides to follow through with this awful idea. I thought they were all about privacy and protection of information.

Yes, child porn and exploitation need to be addressed and dealt with. No, this isn't the way to go about it. Aside from opening the can of worms that leads to scanning for all other types of pictures and abuse of power, let's say someone intentionally trying to get a person in trouble sends a child porn photo to their device. The owner didn't ask for it and doesn't want it, but it was sent to them. Framed, essentially.

And what about adult movies with role playing, where one actor looks 'young' and acts the part? Definitely not my taste and not my interest, but there's nothing illegal about that in many locales. How does the scanning software distinguish between genuine child porn and two consenting adults playing out fantasy roles? Like I, and many others, have stated: Apple has good intentions, but this is a truly awful way to go about it.
The stuff you mentioned shouldn't be an issue because, again, it wouldn't be in the database and thus wouldn't be flagged. The database would only contain known, authentic child pornography material. In theory.
Can they not do the same hash comparison in the cloud and then gather image data based off of that?
Yes, I don't understand it either. If iCloud photos aren't end-to-end encrypted anyway, why not do it in the cloud and let people opt out if they don't want their photos X-rayed?
iPhones already scan every photo today for content, faces, etc. Creating an additional hash isn't resource-heavy at all. Also, iPhones do most of their image processing while connected to power and not in use.
It's probably locality-sensitive hashing, which doesn't require an exact match. Google is the leader here and even presented a paper (in China!) in 2008 on how to do it, and they have had implementations for at least 10 years.
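For anyone unfamiliar: unlike a cryptographic hash, a perceptual (locality-sensitive) hash maps visually similar images to nearby hash values, and a "match" is a small Hamming distance rather than exact equality. Here's a toy "average hash" in Python (illustrative only; real systems like PhotoDNA are far more sophisticated, and the threshold below is made up):

    def average_hash(pixels):
        # pixels: an 8x8 grid of grayscale values (0-255), e.g. from a
        # downscaled thumbnail. Produces a 64-bit hash: bit is 1 where
        # the pixel is brighter than the mean, 0 otherwise.
        flat = [p for row in pixels for p in row]
        mean = sum(flat) / len(flat)
        bits = 0
        for p in flat:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a, b):
        # Number of differing bits; small distance = visually similar.
        return bin(a ^ b).count("1")

    def is_match(a, b, threshold=5):
        # Re-encoding, resizing, or slight edits flip only a few bits,
        # so near-duplicates of a known image still match.
        return hamming_distance(a, b) <= threshold

This is also why the debate matters more here than with exact hashes: any near-match scheme inherently carries some false-positive rate.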
The probability of a false positive is very low. And Apple has additional controls.
You have to have several matches. People who download child pornography usually have thousands or even hundreds of thousands of pictures. Apple could easily set the threshold to 50 matches to reduce the probability of false positives dramatically.
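A back-of-the-envelope calculation shows how quickly a match threshold crushes the false-positive odds (the library size n and per-photo rate p below are made-up numbers, purely for illustration):

    from math import comb

    def prob_at_least_k(n, p, k):
        # P(X >= k) for X ~ Binomial(n, p), computed via the complement
        # so the binomial coefficients stay small.
        return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k))

    n, p = 10_000, 1e-6  # assumed library size and per-photo false-positive rate
    print(prob_at_least_k(n, p, 1))  # ~0.01: one stray false match is plausible
    print(prob_at_least_k(n, p, 5))  # ~8e-13: effectively never

Requiring dozens of matches, as suggested above, pushes the number even further down, assuming the per-photo rate really is that low and photos match independently.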
Yes, someone at Apple would be looking at the photos that have been flagged. Apple already has this power today if they want to use it. And if they're served with a search warrant, they turn everything over as needed.
Google and Facebook have been doing this for a decade. How many governments are forcing Google and Facebook to do as you describe?
Facebook complies with a lot of law-enforcement requests to obtain chat conversations under court orders. Apple, with iMessage, can say that they do not have access to those conversations, which is one of the big reasons to use iMessage over Facebook Messenger in the first place. If Apple now says that they'll put software on your phone that will read your messages and alert an outside party to whatever dirty or politically inconvenient business you're doing, the whole thing breaks down. Unacceptable.