According to their documentation, the implementation is even more secure than this (i.e., this situation, if Apple's rhetoric is to be believed, is itself not even plausible) --- specifically, if China wants to catch dissidents, they'd need to both 1) compel their local CSAM maintainers to upload dissident photos (easy), and 2) compel at least 1 foreign jurisdiction's CSAM maintainer to upload the same dissident photos (much harder).

While the details are murky, I'm guessing the intersection of >=2 jurisdictions' CSAM material is not just any two jurisdictions. If Apple were smart, then to increase trust in the database B, they would require >= 2 non-cooperating jurisdictions to agree. So China and Russia might not be able to team up to say "this is CSAM," but China and the US perhaps could. Again, this is my guess, but only a brief moment pondering the "increase safety of B" goal (https://www.usenix.org/system/files/sec21summer_kulshrestha.pdf) led me to this idea; my bet is Apple spent more than a brief moment, unless their idea from the get-go was malevolent.
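To make that concrete, here's a toy sketch of the intersection rule (the function and data layout are my assumptions, not Apple's published code):

```python
# Toy sketch of the ">= 2 jurisdictions" idea: a hash ships in the
# on-device database B only if BOTH independent sources submitted it.
# (My assumption about the design, not Apple's published code.)

def build_database_b(jurisdiction_a: set[bytes], jurisdiction_b: set[bytes]) -> set[bytes]:
    """Keep only the hashes vouched for by both jurisdictions."""
    return jurisdiction_a & jurisdiction_b

ncmec_hashes = {b"\x01", b"\x02", b"\x03"}   # toy stand-ins for real NeuralHashes
china_hashes = {b"\x02", b"\x03", b"\x99"}   # b"\x99" = a planted dissident photo

database_b = build_database_b(ncmec_hashes, china_hashes)
assert b"\x99" not in database_b   # a unilateral insertion never ships
```

Under that rule, China's planted hash only ships if a second, non-cooperating jurisdiction independently submits the same hash.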
You're not thinking creatively enough.

If I were a state actor like China, first of all I would demand that Apple give me the right to have CSAM reports come to my office. How could Apple refuse? It's a reasonable request; just like any law enforcement organization, my office would genuinely operate to review CSAM photos and push them forward or reject them.

However, since we know that the iOS 14 implementation of NeuralHash can be collided, I'd alter genuine CSAM images to carry the hashes of images I *want* to flag, like photos of HK protests or Uyghur Muslims. I'd then submit these booby-trapped CSAM images to NCMEC for approval.
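To sketch the shape of that trick in toy Python (a deliberately weak block-average hash stands in for NeuralHash here; the published NeuralHash collisions use fancier, gradient-based methods):

```python
import numpy as np

# Toy second-preimage attack on a deliberately weak perceptual hash.
# Perturb one image until its hash equals the hash of a completely
# different target image. NOT NeuralHash; illustration only.

def toy_hash(img: np.ndarray) -> np.ndarray:
    """64-bit hash: mean of each cell in an 8x8 grid, thresholded at mid-gray."""
    h, w = img.shape
    return (img.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3)) > 128).flatten()

def make_collision(img: np.ndarray, target: np.ndarray) -> np.ndarray:
    """Nudge grid cells of `img` until toy_hash(img) == toy_hash(target)."""
    out, want = img.astype(float).copy(), toy_hash(target)
    bh, bw = img.shape[0] // 8, img.shape[1] // 8
    while not np.array_equal(toy_hash(out), want):
        bits = toy_hash(out)
        for i in np.flatnonzero(bits != want):
            r, c = divmod(int(i), 8)
            out[r*bh:(r+1)*bh, c*bw:(c+1)*bw] += 2.0 if want[i] else -2.0
        out = out.clip(0, 255)
    return out

rng = np.random.default_rng(0)
protest_photo = rng.integers(0, 256, (64, 64)).astype(float)  # image I want flagged
csam_decoy    = rng.integers(0, 256, (64, 64)).astype(float)  # image I alter and submit
forged = make_collision(csam_decoy, protest_photo)
assert np.array_equal(toy_hash(forged), toy_hash(protest_photo))
```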

Since we're talking about an adversarial government, they would have no problem creating terrible, novel CSAM for this kind of job. Once those images are in the NCMEC system, it's only a matter of waiting. Note that the Chinese would never be foolish enough to submit the images to NCMEC directly.

Once the original non-CSAM images are shared by someone in China, the hashes would collide with the altered CSAM and ramp up the CSAM match counter quite quickly. The vouchers for the images and the account would be flagged by the CSAM center in China, and although a genuine CSAM reviewer would reject the flagged images immediately, the IP and account would already have been captured by Chinese intel in the pipe (man in the middle) or by the supervisor in the CSAM center.

Although this type of ruse might eventually be found and guarded against, it would take a few months before the operation was even discovered... and would Apple or the US even disclose this type of breach? The Chinese government would muzzle Apple faster than you could say boo, citing national security concerns. We all want to follow the law, right?
 
According to their documentation, the implementation is even more secure than this (i.e., this situation, if Apple's rhetoric is to be believed, is itself not even plausible) --- specifically, if China wants to catch dissidents, they'd need to both 1) compel their local CSAM maintainers to upload dissident photos (easy), and 2) compel at least 1 foreign jurisdiction's CSAM maintainer to upload the same dissident photos (much harder).

While the details are murky, I'm guessing the intersection of >=2 jurisdictions' CSAM material is not just any two jurisdictions. If Apple were smart, then to increase trust in the database B, they would require >= 2 non-cooperating jurisdictions to agree. So China and Russia might not be able to team up to say "this is CSAM," but China and the US perhaps could. Again, this is my guess, but only a brief moment pondering the "increase safety of B" goal (https://www.usenix.org/system/files/sec21summer_kulshrestha.pdf) led me to this idea; my bet is Apple spent more than a brief moment, unless their idea from the get-go was malevolent.
Countries like S. Korea have already forced Apple by law to allow third-party payments on the App Store. All China needs to do is pass some similar law, and Apple will have no choice but to concede if they want to continue doing business in China. And Apple will have no recourse, as they set up the exploitable system themselves.
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered advocacy groups. It's unbelievable!
Huh? People and experts involved in the field have said themselves that this system won't do much; the root problem lies elsewhere. Yet Apple wanted to set up a mass on-device scanning system, basically treating every iPhone user as a presumed predator. It's obvious that the actual intention is not child safety.
 
If you have an i-Device, open up your Photos app and look at the search icon on the bottom right (at least on my iPhone). Upon tapping it, you'll see a prompt to search for <Photos, People, Places...>. Go ahead and search for something. On the same i-Device, from the home screen, swipe down and start typing something --- results show up nearly instantly. The fact that you can search for these things is a result of your phone scanning your content and semantically tagging it. This is all built into iOS.
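As a rough illustration of what that tagging amounts to (a hypothetical sketch with an off-the-shelf classifier; Apple's actual models and labels are private, and `tag_photo` is my own stand-in):

```python
# Hypothetical sketch of on-device semantic tagging with an off-the-
# shelf classifier; it shows the mechanism behind photo search, not
# Apple's actual pipeline.

import torch
from torchvision.models import resnet18, ResNet18_Weights
from PIL import Image

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights).eval()
preprocess = weights.transforms()

def tag_photo(path: str, top_k: int = 3) -> list[str]:
    """Return the classifier's top-k labels for one photo."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = model(x).softmax(dim=1)[0]
    return [weights.meta["categories"][int(i)] for i in probs.topk(top_k).indices]

# Tag the library once; "search" is then a plain lookup over the local index:
# index = {p: tag_photo(p) for p in my_photo_paths}
```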

Yes, but that happens on-device, and you can easily turn it off.
 
Off topic... why is it taking more than 3 months to get a ruling in Epic vs. Apple? The trial only lasted 3 weeks.
 
You're not thinking creatively enough.

If I were a state actor like China, first of all I would demand that Apple give me the right to have CSAM reports come to my office. How could Apple refuse? It's a reasonable request; just like any law enforcement organization, my office would genuinely operate to review CSAM photos and push them forward or reject them.

However, since we know that the iOS 14 implementation of NeuralHash can be collided, I'd alter genuine CSAM images to carry the hashes of images I *want* to flag, like photos of HK protests or Uyghur Muslims. I'd then submit these booby-trapped CSAM images to NCMEC for approval.

Since we're talking about an adversarial government, they would have no problem creating terrible, novel CSAM for this kind of job. Once those images are in the NCMEC system, it's only a matter of waiting. Note that the Chinese would never be foolish enough to submit the images to NCMEC directly.

Once the original non-CSAM images are shared by someone in China, the hashes would collide with the altered CSAM and ramp up the CSAM match counter quite quickly. The vouchers for the images and the account would be flagged by the CSAM center in China, and although a genuine CSAM reviewer would reject the flagged images immediately, the IP and account would already have been captured by Chinese intel in the pipe (man in the middle) or by the supervisor in the CSAM center.

Although this type of ruse might eventually be found and guarded against, it would take a few months before the operation was even discovered... and would Apple or the US even disclose this type of breach? The Chinese government would muzzle Apple faster than you could say boo, citing national security concerns. We all want to follow the law, right?
I'll admit that you're more creative than I am in contriving a what-if.

A few issues (according to their documentation, again; up to the reader to believe it or not) -- the process of revealing the >= 30 hits on the CSAM database relies on a handshake between Apple's servers and the safety vouchers from your phone -- i.e., China would also need access to Apple's proprietary server-side code, which I'm guessing Apple wouldn't be keen to fork over.
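As I understand the public summary, the ">= 30" part rests on threshold secret sharing: each matching voucher carries one share of a per-account secret, so the server can decrypt nothing until it holds at least 30 shares. A toy sketch of that primitive (toy prime and parameters; not Apple's code):

```python
import random

# Toy Shamir secret sharing: the per-account decryption secret is the
# constant term of a random degree-(threshold-1) polynomial; each
# matching voucher contributes one point on it. Below the threshold,
# the secret is information-theoretically hidden.

P = 2**127 - 1  # a prime; all arithmetic is in the field mod P

def split(secret: int, threshold: int, n_shares: int):
    """Shares are points (x, f(x)) with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    f = lambda x: sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation of f(0) from the given points."""
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = split(secret=0xC0FFEE, threshold=30, n_shares=100)
assert reconstruct(shares[:30]) == 0xC0FFEE   # 30 matches: secret revealed
assert reconstruct(shares[:29]) != 0xC0FFEE   # 29 matches: garbage (w.h.p.)
```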

To your booby-trapped CSAM images argument, I see at least three possible issues:
  1. Assuming they are able to do this (see (2)), there is still the issue that they would need near-exact replicas of the images they want to flag -- i.e., you could attend the same protest, take pictures of the same people (from perhaps a slightly different angle), and still, with very high likelihood, produce a different hash (see the toy sketch after this list).
  2. Talking about an adversarial government that would have no problem generating new CSAM images does not immediately make the problem of training a GAN to do this easy. I would argue that they would have quite a hard time doing this without many millions of novel CSAM images. They could likely make semantically meaningless images matching the hashes of the images they want to flag, but creating novel CSAM having a particular hash is a much harder problem. Not impossible, but I gather very difficult without an absurd amount of data.
  3. Why the heck would China (or any other repressive country) choose this as the best spying vector? Aside from the absurd cost involved, it's just about the least efficient way to get the job done. Your phone already semantically tags nearly everything on it --- it would be much easier to require Apple to simply report whenever a user has content tagged with anything in <set of objectionable things>. If China has the ability to make Apple bend to its every demand, then there's no way they would choose the CSAM hashing vector for spying.
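On point (1), here's the toy sketch: even with a simple block-average stand-in for a perceptual hash (real perceptual hashes are more robust to small transforms, but two independently taken photos are not a small transform), two shots of the same scene from slightly different positions disagree in many bits:

```python
import numpy as np

# Toy illustration of point (1): two "photos" of the same scene taken
# from slightly different positions (simulated as shifted crops of one
# array) already disagree in many hash bits, so they would not match.

def toy_hash(img: np.ndarray) -> np.ndarray:
    """64-bit hash: mean of each cell in an 8x8 grid, thresholded at mid-gray."""
    h, w = img.shape
    return (img.reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3)) > 128).flatten()

rng = np.random.default_rng(1)
scene = rng.integers(0, 256, (80, 80)).astype(float)  # the protest, say

shot_a = scene[0:64, 0:64]   # the photo seeded into the database
shot_b = scene[8:72, 8:72]   # your photo, taken a few steps to the side

differing = int(np.sum(toy_hash(shot_a) != toy_hash(shot_b)))
print(f"{differing}/64 hash bits differ")  # far beyond any sane match threshold
```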
 
Just going to wait until everyone forgets, then quietly release a .1 update with some security enhancements, etc.
This. I mean, Apple themselves already said that the hash database is baked into iOS 15, so I don't think Apple would spend the time changing that last minute. All they need to do is flip the switch. Considering Apple will still be supporting iOS 14, it's best to just avoid iOS 15 altogether until Apple actually removes the database from the OS.
 
Countries like S. Korea have already forced Apple by law to allow third-party payments on the App Store. All China needs to do is pass some similar law, and Apple will have no choice but to concede if they want to continue doing business in China. And Apple will have no recourse, as they set up the exploitable system themselves.
This concern is potentially one of the reasons why it was slated for release only in the U.S.
 
Yes, but that happens on-device, and you can easily turn it off.
I don't understand your point. The CSAM scanning would also be something that happens on-device, and it could easily be turned off. However, you cannot easily turn off Spotlight indexing or object/face detection on iOS. If Apple were malevolent in their intentions, then everything you have on your device is a far greater treasure trove for spying than the CSAM scanning technique.
 
This concern is potentially one of the reasons why it was slated for release only in the U.S.
But the capability is built into iOS 15, and Apple themselves said it is universal, meaning all iPhones around the world will have the system baked in if they run iOS 15. The system will be in place on all iPhones. Apple's pinky promise won't do any good when countries start making laws to take advantage of the system.
 
I’ve spent a few weeks trying to figure out how to cut ties completely with iCloud… maybe more. I love Apple for privacy, not because I want them to be the police. I already have tax dollars paying for police.
I've moved my photos to Amazon Photos (free and unlimited space for photos with a Prime membership). The service is very good, and per their terms of service they don't sell or use photo metadata to profile you. I'll soon cancel my iCloud membership. I was about to change from Spotify to Apple Music, and this was a gigantic RED FLAG! So I'll reduce my Apple dependency due to this. Can't trust Apple anymore until they CANCEL this, not PAUSE it!
You should never spy on billions of users in order to find thousands of criminals!
 
The upgrade cycle for our company phones is in a couple of months. We've always gone with iPhones due to their stance on security and privacy. With the announcement of the OS having built-in watchdog capabilities policing our devices, the iPhone 13 was off the list for our next phones. With today's announcement, nothing changes. For iPhones to be considered, Apple needs to give assurance that we will not face a forced upgrade including this "feature" for the 2-year life cycle of the phone. If the iPhone 13 comes with iOS 14 without CSAM scanning and they promise to continue security updates for iOS 14 for 2 years, we would be okay with that. Apple had better come up with a "clear" plan pretty quickly or they are going to lose out on a lot of sales.
 
But the capability is built into iOS 15, and Apple themselves said it is universal, meaning all iPhones around the world will have the system baked in if they run iOS 15. The system will be in place on all iPhones. Apple's pinky promise won't do any good when countries start making laws to take advantage of the system.
As I've said several times, we don't know the true security implications of the system until there are quality peer-reviewed studies of it. Until then, everything you or I conjecture is, to varying degrees, meaningless.

With that said, the system, as described, would require multiple (likely non-cooperating) jurisdictions to agree about what is CSAM. China making a law to "take advantage of the system" would require Apple to remove the existing system and replace it with something else. That puts the burden of engineering some new spyware to fit China's needs on Apple. If China is capable of strong-arming Apple into doing this, then 1) why haven't they already (or have they!?), and 2) why not just use the existing semantically-tagged data on the phone as is? This vector is not a good one for spying purposes, at least if we take Apple's documentation at face value.

In short: if Apple is going to cooperate with a government to conduct spying operations on its citizens, then there is no reason to expect it to be through the CSAM scanning technique, as it is quite possibly the worst way to do it. China et al. making laws that require Apple to spy on its users could happen with or without this CSAM technique in place, so the argument that this is the issue is moot.
 
I've moved my photos to Amazon Photos (free and unlimited space for photos with a Prime membership). The service is very good, and per their terms of service they don't sell or use photo metadata to profile you. I'll soon cancel my iCloud membership. I was about to change from Spotify to Apple Music, and this was a gigantic RED FLAG! So I'll reduce my Apple dependency due to this. Can't trust Apple anymore until they CANCEL this, not PAUSE it!
You should never spy on billions of users in order to find thousands of criminals!
They won't even find the actual criminals; all they would catch is random perverts. Knowing about this system, the actual abusers will go underground and/or make new content. So Apple is essentially encouraging new abuse.
 
As I've said several times, we don't know the true security implications of the system until there are quality peer-reviewed studies of it. Until then, everything you or I conjecture is, to varying degrees, meaningless.

With that said, the system, as described, would require multiple (likely non-cooperating) jurisdictions to agree about what is CSAM. China making a law to "take advantage of the system" would require Apple to remove the existing system and replace it with something else. That puts the burden of engineering some new spyware to fit China's needs on Apple. If China is capable of strong-arming Apple into doing this, then 1) why haven't they already (or have they!?), and 2) why not just use the existing semantically-tagged data on the phone as is? This vector is not a good one for spying purposes, at least if we take Apple's documentation at face value.
Err, the fact that Apple has set up a system so your device can do hash scanning locally, to be matched against the cloud, already speaks for itself. Prior to this, Apple could tell countries like China to pound sand. But with this system in place, Apple has no excuse. All China needs to do is pass a law forcing Apple to include their hash database, and Apple wouldn't know what's in it either. And since the iCloud servers for China are already located within China (and fall under Chinese law), it would be foolish to think this system cannot be exploited. And I'm sure Apple still wants to do business there, so they will surely accommodate whatever their laws require.
 
All the news reports are saying that it is being delayed for a LATER rollout. Until this program, and anything like it, is CANCELED, accompanied by a pledge never to surveil users' wallets and purses, oops, I meant DEVICES, I will never buy Apple products again.
Me neither. I was about to upgrade to the iPhone 13, and now I'm researching privacy-focused Android AOSP forks like GrapheneOS, e.Foundation, CalyxOS, etc. I'd rather have a Pixel 4 or 5 running a de-Googled Android version than have my phone scanned without my authorization just because I want my photos stored in the cloud. I will stand against this and probably leave the whole Apple ecosystem my family uses: 4 iPhones, 1 iPad, 2 Apple Watches. And this MacBook I'm writing on? I'll move to Microsoft Windows or Linux.
 
For the sake of argument, let's assume Apple thinks 1/3 of those who might otherwise purchase an iPhone 13 would decide against it because of CSAM.

IMO, today's announcement by Apple is simply a carrot to those 1/3, to try to get them to (hopefully) buy a new iPhone 13 upon release.

Trust in Apple is very likely Out the Window with a certain % of them; I have NO idea what that % might be.

CSAM is clearly Apple's BIGGEST screw-up under Tim Cook!

BTW, I am 100% against Apple's (described) CSAM implementation.

I was NOT going to update my apps to support iOS 15 because of it.

And, I strongly suspect Apple was seeing that with other App Devs as well.

That "may" have contributed to today's announcement.
 
As I've said several times, we don't know the true security implications of the system until there are quality peer-reviewed studies of it. Until then, everything you or I conjecture is, to varying degrees, meaningless.

With that said, the system, as described, would require multiple (likely non-cooperating) jurisdictions to agree about what is CSAM. China making a law to "take advantage of the system" would require Apple to remove the existing system and replace it with something else. That puts the burden of engineering some new spyware to fit China's needs on Apple. If China is capable of strong-arming Apple into doing this, then 1) why haven't they already (or have they!?), and 2) why not just use the existing semantically-tagged data on the phone as is? This vector is not a good one for spying purposes, at least if we take Apple's documentation at face value.

In short: if Apple is going to cooperate with a government to conduct spying operations on its citizens, then there is no reason to expect it to be through the CSAM scanning technique, as it is quite possibly the worst way to do it. China et al. making laws that require Apple to spy on its users could happen with or without this CSAM technique in place, so the argument that this is the issue is moot.
From what I have read, I don't think you have this right. In this scheme, there is a database of "hashes" downloaded to every device, and every photo on your device is scanned for matches (with iCloud Photos enabled, which most users have). All a totalitarian government would have to do is quietly demand that Apple add some additional hashes to the database and report any matches back to the government. No one would detect it, because the hash files are hidden from examination. That's how easy it would be.
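To put the "hidden from examination" point concretely (a toy sketch; Apple's published scheme blinds the NeuralHash database with server-side elliptic-curve secrets, for which a keyed HMAC stands in here):

```python
import hashlib, hmac, os

# Toy sketch of why the shipped database can't be audited on-device:
# entries arrive blinded under a server-side secret, so one more entry,
# CSAM or dissident photo, looks like just another random blob.

SERVER_SECRET = os.urandom(32)  # never leaves the server

def blind(image_hash: bytes) -> bytes:
    return hmac.new(SERVER_SECRET, image_hash, hashlib.sha256).digest()

official_db = {blind(h) for h in [b"csam-hash-1", b"csam-hash-2"]}
tampered_db = official_db | {blind(b"dissident-photo-hash")}  # hypothetical extra row

# Both databases are just sets of uniformly random-looking 32-byte
# blobs; the device (and its owner) has nothing to audit.
for entry in sorted(tampered_db):
    print(entry.hex())
```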
 
Exactly. Apple already scans images on the phone for content; that is exactly why you can search for a particular subject and it finds relevant photos from within your library. If Apple and the government wanted to do the things that many people here accuse them of wanting to do, the technology has been there for a while. There's no need for this overly complicated method of hashing and scanning, and hashing and scanning some more.
That scanning is done on-device, and the results never leave the device. It is done independently on each device you own and is not sent up to any cloud service (though the photos themselves do get passed up and down the cloud). The hashing and scanning they propose is actually required to maintain privacy. I acknowledge the scheme is clever, and it does provide privacy protection if done as they lay out. But I oppose having any surveillance framework on a device that I've paid good money for.
 