I understand that it doesn't do this at present; however, this sets the company on a trajectory, no matter how much people want to ignore that fact. No company that makes users concede ground on privacy ever gives it back, and the same goes for any government.
No, it doesn't, as that health data was never visible to Apple at all; they would need to specifically break things in order to make it readable. Unlike images, that data was never accessible to them.
And at the end of the day, somewhere down the line after you are flagged for review, an actual real-life person can physically go through your photo library.
Apple always could go through your photos (and has, when a warrant was provided). All they are doing now is this: if you specifically have a specific photo (not an approximation or something slightly similar out of context) that is on a list of circulated child pornography photos, then after 30 hash hits your account gets flagged for review by a real human. Only after a real person verifies the matches do the authorities get involved, not before.

So look at what has to happen: a hash of the image data has to match a pre-approved bank of hashes, there have to be 30 hits before a human review is triggered (a threshold I actually think is kind of low), and only then are the authorities involved. And this applies only to data that is already visible to Apple whenever they want. The scan for hashes is specific to images, and even then it is done by machine learning, so it's no different from Siri scanning your emails for dates you might want to add to the calendar or numbers you might want to add to a contact. Only after an incredibly high threshold is met is a review even triggered, never mind law enforcement contacted; and even then, honestly, they could have been tracking your IP during the images' acquisition and obtained a warrant regardless. If you don't want this, turn off iCloud image backup and try Dropbox, which already does this kind of scanning.
A backdoor, at the end of the day, no matter how secure, is still a backdoor.
It's not a backdoor, as that implies someone could get into your phone, and they cannot; it's a closed system. They are reading hashes of what they already had access to, and that is a very important distinction. A backdoor implies access that was not previously held, but Apple could always look at your photos if needed, as they held the keys.
There are arguments on both sides, but this technology has been reviewed by several security researchers, who have condemned its use as a bad idea and a bad sign of things to come.
They condemned it on slippery-slope grounds, but the reality is that it's not about the tech; it's about what Apple may one day consider actionable content. This service is specific to images: images that are not approximations but identical to what is being circulated. The system would need to be heavily modified before it could be used for anything else, and at that point they would need to let us know (just as we know about this change coming), and we could make more informed and rational decisions. Right now, everyone wants to blame Apple for what they could do and condemn them for the worst-case scenario, when they have had the best track record at actually securing individual privacy.
Assuming this system will be modified to mine other data completely ignores the reality that if Apple wanted to do any of this, they could have done it a long time ago: they not only built the system but also house the data, and had no requirement to encrypt it. If that is what they wanted to do, they didn't need this as an in, and it is naive to think otherwise.