The predators with CSAM on their computers or in iCloud now have a little time to clean up their storage, or move to an Android phone.
> Good! They should do something, but on-device scanning is not it.

They're already doing it on the server side, like everyone else.
> The predators with CSAM on their computers or in iCloud now have a little time to clean up their storage, or move to an Android phone.

Apple already does scans on the server side, like everyone else.
> According to their documentation, the implementation is even more secure than this (i.e., this situation, if Apple's rhetoric is to be believed, is itself not even plausible). Specifically, if China wants to catch dissidents, they'd need to both 1) compel their local CSAM maintainers to upload dissident photos (easy), and 2) compel at least one foreign jurisdiction's CSAM maintainer to upload the same dissident photos (much harder).
>
> While the details are murky, I'm guessing the intersection of >=2 jurisdictions' CSAM material is not just any two jurisdictions. If Apple were smart, to increase trust in the database B, they would require >=2 non-cooperative jurisdictions to agree. So China and Russia might not be able to team up to say "this is CSAM," but China and perhaps the US could. Again, this is my guess, but only a brief moment pondering the "increase safety of B" goal (https://www.usenix.org/system/files/sec21summer_kulshrestha.pdf) led me to this idea; my bet is Apple spent more than a brief moment, unless their idea from the get-go was malevolent.

You're not thinking creatively enough.
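The ">=2 non-cooperative jurisdictions" idea can be sketched in a few lines: only hashes that appear in the intersection of independently maintained databases make it into the on-device set, so no single jurisdiction can unilaterally insert a target image. This is a toy illustration, not Apple's actual implementation; all hash values and database names are hypothetical.

```python
# Toy sketch: the on-device hash set is the intersection of hash
# databases contributed by independently operated jurisdictions.

def build_ondevice_set(*jurisdiction_dbs):
    """Keep only hashes vouched for by every participating jurisdiction."""
    common = set(jurisdiction_dbs[0])
    for db in jurisdiction_dbs[1:]:
        common &= set(db)
    return common

# Hypothetical hash values (real NeuralHashes are much longer digests).
us_db    = {0xAAA1, 0xBBB2, 0xCCC3}   # e.g., an NCMEC-style database
china_db = {0xAAA1, 0xBBB2, 0xD155}   # 0xD155: unilaterally added photo

ondevice = build_ondevice_set(us_db, china_db)
assert 0xD155 not in ondevice         # unilateral insertion is filtered out
assert ondevice == {0xAAA1, 0xBBB2}
```

The design choice being debated above is exactly which pairs of jurisdictions count as "non-cooperative" enough for their agreement to be trustworthy.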
> I understand why this is a slippery slope, but I don't like the idea of child predators breathing a sigh of relief.

Err, Apple is already doing the scans on the server side like everyone else. They shouldn't be scanning people's phones to begin with.
> According to their documentation, the implementation is even more secure than this (i.e., this situation, if Apple's rhetoric is to be believed, is itself not even plausible). Specifically, if China wants to catch dissidents, they'd need to both 1) compel their local CSAM maintainers to upload dissident photos (easy), and 2) compel at least one foreign jurisdiction's CSAM maintainer to upload the same dissident photos (much harder).

Countries like S. Korea have already forced Apple by law to allow third-party payments on the App Store. All China needs to do is pass a similar law, and Apple will have no choice but to concede if it wants to keep doing business in China. And Apple will have no recourse, since it set up the exploitable system itself.
> I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It's unbelievable!

Huh? People and experts involved in the field have said that this system won't do much; the root problem lies elsewhere. Yet Apple wanted to set up a mass on-device scanning system, basically presuming every iPhone user to be a predator. It's obvious that the actual intention is not child safety.
If you have an i-device, open up your Photos app and look at the search icon (bottom right, at least on my iPhone). Tap it and you'll see a prompt to search for <Photos, People, Places...>. Go ahead and search for something. On the same device, swipe down from the home screen and start typing something; results show up nearly instantly. The fact that you can search for these things is a result of your phone scanning your content and semantically tagging it. This is all built into iOS.
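What that search feature implies, mechanically, is an on-device classifier labeling each photo and an inverted index over those labels. A toy sketch, with all photo IDs and labels made up for illustration:

```python
# Toy sketch of on-device semantic tagging: a local classifier assigns
# labels to each photo, and search is a lookup in an inverted index.
from collections import defaultdict

def build_index(photo_labels):
    """photo_labels: {photo_id: [labels from a hypothetical classifier]}"""
    index = defaultdict(set)
    for photo_id, labels in photo_labels.items():
        for label in labels:
            index[label].add(photo_id)
    return index

index = build_index({
    "IMG_0001": ["beach", "dog"],
    "IMG_0002": ["dog", "park"],
    "IMG_0003": ["receipt"],
})
assert index["dog"] == {"IMG_0001", "IMG_0002"}  # instant local search
```

The point of the comment stands either way: the raw material for content-based search already sits on the device.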
> You're not thinking creatively enough.

I'll admit that you're more creative than I am in contriving a what-if.
If I were a state actor like China, first of all I would demand that Apple give me the right to have CSAM reports come to my office. How could Apple refuse? It's a reasonable request; just like any law enforcement organization, my office would genuinely review CSAM photos and push them forward or reject them.
However, since we know that the iOS 14 implementation of the neural hashes can be collided, I'd alter genuine CSAM images to have the hashes of images I *want* to flag, like HK protest or Uighur Muslim photos. I'd then submit these booby-trapped CSAM images to the NCMEC for approval.
Since we're talking about an adversarial government, they would have no problem creating terrible, novel CSAM for this kind of job. Once those images are in the NCMEC system, it's only a matter of waiting. Note that the Chinese would never be foolish enough to submit the images directly to the NCMEC.
Once the original non-CSAM images are shared by someone in China, their hashes would collide with the altered CSAM and quickly ramp up the CSAM counter. The vouchers for the images and the account would be flagged by the CSAM center in China, and although a genuine CSAM reviewer would reject the flagged images immediately, the IP and account would already have been captured by Chinese intel in the pipe (man in the middle) or by the supervisor at the CSAM center.
Although this type of ruse might eventually be found and guarded against, it would be a few months before this kind of operation was even discovered... and would Apple or the US even disclose this type of breach? The Chinese government would muzzle Apple faster than you could say boo, citing national security concerns. We all want to follow the law, right?
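The attack sketched above hinges on second-preimage collisions against a perceptual hash: the attacker crafts real CSAM whose hash equals that of the benign image they want flagged. A toy simulation follows; the stand-in hash below is deliberately trivial to collide, whereas real NeuralHash collisions are produced by gradient-based optimization. All image data here is made up.

```python
# Toy simulation of the hash-collision attack described above.

def weak_phash(pixels):
    # Deliberately weak stand-in for a perceptual hash: it reduces an
    # image to its average brightness, so collisions are easy to craft.
    return sum(pixels) // len(pixels)

target_protest_photo = [10, 200, 90, 60]  # benign image the state wants flagged
crafted_csam = [90, 90, 90, 90]           # altered so the hashes collide
assert weak_phash(crafted_csam) == weak_phash(target_protest_photo)

# The crafted image's hash gets laundered into the database as "CSAM".
database = {weak_phash(crafted_csam)}

def scan(photo, db):
    """On-device check: does this photo's hash appear in the database?"""
    return weak_phash(photo) in db

# Any user who later shares the benign target photo now matches.
assert scan(target_protest_photo, database)   # dissident's photo flagged
assert not scan([0, 0, 0, 0], database)       # unrelated photo passes
```

The human-review step only catches this after the match event has already been generated, which is exactly the window the scenario above exploits.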
> Just going to wait until everyone forgets and do a quiet release, a .1 update with some security enhancement, etc.

This. I mean, Apple themselves already said that the hash database is baked into iOS 15, so I don't think Apple would spend the time changing that at the last minute. All they need to do is flip the switch. Considering Apple will still be supporting iOS 14, it's best to just avoid iOS 15 altogether until Apple actually removes the database from the OS.
> Countries like S. Korea have already forced Apple by law to allow third-party payments on the App Store. All China needs to do is pass a similar law, and Apple will have no choice but to concede if it wants to keep doing business in China. And Apple will have no recourse, since it set up the exploitable system itself.

This concern is potentially one of the reasons why it was slated for release only in the U.S.
> Yes, but that happens on-device and you can easily turn that off.

I don't understand your point. The CSAM scanning would also happen on-device, and it could be easily turned off. However, you cannot easily turn off Spotlight indexing or object/face detection on iOS. If Apple were malevolent in their intentions, then everything on your device is a far greater treasure trove for spying than the CSAM scanning technique.
> This concern is potentially one of the reasons why it was slated for release only in the U.S.

But the capability is built into iOS 15, and Apple themselves said it is universal, meaning all iPhones around the world will have the system baked in if they run iOS 15. The system will be in place on all iPhones. Apple's pinky promise won't do any good when countries start making laws to take advantage of it.
> I've spent a few weeks trying to figure out how to cut ties completely with iCloud… maybe more. I love Apple for privacy, not because I want them to be police. I already have tax dollars paying for police.

I've moved my photos to Amazon Photos (free, unlimited space for photos with a Prime membership). The service is very good, and by their terms of service they don't sell or use photo metadata to profile you. I'll soon cancel my iCloud membership. I was about to switch from Spotify to Apple Music, and this was a gigantic RED FLAG! So I'll reduce my Apple dependency because of this. I can't trust Apple anymore until they CANCEL this, not PAUSE it!
> But the capability is built into iOS 15, and Apple themselves said it is universal, meaning all iPhones around the world will have the system baked in if they run iOS 15. The system will be in place on all iPhones. Apple's pinky promise won't do any good when countries start making laws to take advantage of it.

As I've said several times, we don't know the true security implications of the system until there are quality peer-reviewed studies on it. Until then, everything you or I conjecture is, to varying degrees, meaningless.
> I've moved my photos to Amazon Photos (free, unlimited space for photos with a Prime membership). [...] I can't trust Apple anymore until they CANCEL this, not PAUSE it!

They won't even find the actual criminals. All they would catch is random perverts. Knowing about this system, the actual abusers will go underground and/or make new content. So Apple is essentially encouraging new abuse.
You should never spy on billions of users in order to find thousands of criminals!
> As I've said several times, we don't know the true security implications of the system until there are quality peer-reviewed studies on it. Until then, everything you or I conjecture is, to varying degrees, meaningless.
>
> With that said, the system, as described, would require multiple (likely non-cooperative) jurisdictions to agree about what is CSAM. China making a law to "take advantage of the system" would require Apple to remove the existing system and replace it with something else. That puts the burden of engineering new spyware to fit China's needs on someone. If China is capable of strong-arming Apple into doing this, then 1) why haven't they already (or have they!?), and 2) why not just use the existing semantically tagged data on the phone as is? This vector is not a good one for spying purposes, at least if we take Apple's documentation at face value.

Err, the fact that Apple has set up a system so your device can do hash scanning locally, to be matched against the cloud, already speaks for itself. Prior to this, Apple could tell countries like China to pound sand. But with this system in place, Apple has no excuse. All China needs to do is pass a law forcing Apple to include their hash database; Apple wouldn't know what's in it either. And since the iCloud servers for China are already within China (and fall under Chinese law), it would be foolish to think this system cannot be exploited. And I'm sure Apple still wants to do business there, so they will surely accommodate whatever their laws require.
> All the news reports are saying that it is being delayed for a LATER rollout. Until this program, and anything like it, is CANCELED, accompanied by a pledge never to surveil users' wallets and purses, oops, I meant DEVICES, I will never buy Apple products again.

Me neither. I was about to upgrade to an iPhone 13, and now I'm researching privacy-focused Android AOSP forks like GrapheneOS, e.Foundation, CalyxOS, etc. I'd rather have a Pixel 4 or 5 with a de-Googled Android version than have my phone scanned without my authorization just because I want my photos stored in the cloud. I will stand against this and will probably leave the whole Apple ecosystem my family has: 4 iPhones, 1 iPad, 2 Apple Watches, and the MacBook I'm writing this on. I'll move to Microsoft Windows or Linux.
> As I've said several times, we don't know the true security implications of the system until there are quality peer-reviewed studies on it. Until then, everything you or I conjecture is, to varying degrees, meaningless.
>
> With that said, the system, as described, would require multiple (likely non-cooperative) jurisdictions to agree about what is CSAM. China making a law to "take advantage of the system" would require Apple to remove the existing system and replace it with something else. That puts the burden of engineering new spyware to fit China's needs on someone. If China is capable of strong-arming Apple into doing this, then 1) why haven't they already (or have they!?), and 2) why not just use the existing semantically tagged data on the phone as is? This vector is not a good one for spying purposes, at least if we take Apple's documentation at face value.
>
> In short: if Apple is going to cooperate with a government to conduct spying operations on its citizens, then there is no reason to expect it to be through the CSAM scanning technique, as it is quite possibly the worst way to do it. China et al. making laws that require Apple to spy on its users could happen with or without this CSAM technique in place, so the argument that this is the issue is moot.

From what I have read, I don't think you have this right. In this scheme, a database of "hashes" is downloaded to every device, and every photo on your device is scanned for matches (with iCloud Photos enabled, as most users have). All a totalitarian government would have to do is quietly demand that Apple add some additional hashes and report any matches back to the government. No one would detect it, because the hash files are hidden from examination. That's how easy it would be.
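The matching step both sides are arguing about can be sketched simply: each photo's perceptual hash is checked against the downloaded database, and the account is flagged only once the match count crosses a threshold. The hash values and the threshold below are hypothetical (Apple's public materials cited a threshold of roughly 30 matches):

```python
# Toy sketch of on-device hash matching with a flagging threshold.
# Hash values and THRESHOLD are illustrative, not Apple's real numbers.

THRESHOLD = 3

def flagged(photo_hashes, database, threshold=THRESHOLD):
    """Flag an account once enough of its photo hashes match the database."""
    matches = sum(1 for h in photo_hashes if h in database)
    return matches >= threshold

database = {0x01, 0x02, 0x03, 0x04}
user_a = [0x01, 0x99, 0x02, 0x03]   # three matches: at threshold
user_b = [0x01, 0x77]               # one match: below threshold

assert flagged(user_a, database)
assert not flagged(user_b, database)
```

The dispute above is really about who controls `database`: Apple's documentation says multiple jurisdictions jointly, while the critics argue a government with legal leverage controls it unilaterally.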
> Exactly. Apple already scans images on the phone for content. That is exactly why you can search for a particular subject and it finds relevant photos in your library. If Apple and the government wanted to do the things that many people here accuse them of wanting to do, the technology has been there for a while. There's no need for this overly complicated method of hashing and scanning and hashing and scanning some more.

That scanning is done on-device and never leaves the device. It is done independently on each device you own and is not sent up to any cloud service (though the photos themselves do get passed up and down the cloud). The hashing and scanning they propose is actually required to maintain privacy. I acknowledge the scheme is clever, and it does provide privacy protection if done as they lay out. But I oppose having any surveillance framework on a device that I've paid good money for.