Too much of this is largely uninformed people getting hold of crumbs of data and getting outraged, assuming they have a reasonably good understanding of the whole situation, and all the other moving pieces, when they very much do not.
Uninformed? The agency's name is NCMEC, and their database contains pictures that *might be* CSAM. The vast majority are intimate selfies of teens.
Go read up on the facts.
 
You should absolutely destroy all your smartphones, tablets, home automation, and computers today, and never touch the Internet again, because everything that you're deathly afraid will happen probably already has. If the people you're paranoid about really wanted to come after you the way you think, it wouldn't be some upcoming thing, it would have already happened. Your best bet is to ditch all your electronics, sell all your major possessions for cash, cut up your credit cards, get a fake ID, move somewhere far away where no one knows you, and try to keep your head down for the rest of your life. Go completely off-grid. Make sure to duck into the shadows whenever a helicopter comes over - oh, wait, modern drones at 1,000 ft are basically inaudible. Probably best to just stay inside, or maybe under tree cover in a forest.

So your argument is that since it is already so easy for the government to invade our privacy, there's no point opposing technology that makes it even easier?
 
So basically you cannot take family pictures with your kids without running the risk of being flagged, despite them being regular family pictures.

No algorithm works 100% of the time, and I'm sure many false positives will show up. And once you get accused despite being innocent, the damage is already done.
Go read up on the actual proposal before making easily disproven assertions. Unless you're taking family pictures that include your kids being raped, and then handing those out to pedophiles, the images won't be listed in the CSAM database. It's not any sort of AI or machine learning looking at the images trying to figure out if they're naughty; it's triggered by matches to already-circulating images of child sexual abuse.
 
The technology opens up the possibility to scan for just about anything; that's the issue.
OPEN UP??? :) Is it your thinking that this one technology, this ONE THING is preventing the government from scanning for just about anything now across all the different ways they have available to them for surveillance? The absolute BEST this is likely to do for the government is just to provide a less effective and slower way to get at information they already have access to.
 
And stop mentioning facts when you don't even know the facts about machine learning, thinking it works perfectly on real-world data.
Nobody is arguing that machine learning works perfectly on real-world data; they keep trying to tell you that pictures are not being looked at by AI or machine learning code in the way that you keep thinking, but you won't listen. The images being "scanned" are simply getting hashed, and the hashes are getting compared against those of already-known, circulating CSAM images.
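To make the distinction concrete, here's a minimal sketch of hash-based matching. This is illustrative only: the real system uses Apple's perceptual NeuralHash delivered in blinded form, not SHA-256, and the digest below is a placeholder.

```python
import hashlib

# Hypothetical stand-in for the database of hashes of already-known,
# circulating CSAM images.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_image(image_bytes: bytes) -> bool:
    # No classifier ever inspects the content; a photo can only match if
    # its digest is already in the database of known images.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES
```

A brand-new photo, whatever it depicts, hashes to a value that isn't in the database and can never match.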
 
OPEN UP??? :) Is it your thinking that this one technology, this ONE THING is preventing the government from scanning for just about anything now across all the different ways they have available to them for surveillance? The absolute BEST this is likely to do for the government is just to provide a less effective and slower way to get at information they already have access to.

I don’t see how the government having these capabilities is an argument for giving them another tool.

“Well this guy just murdered someone with an axe, don’t see the harm in giving him an AK47”.
 
So what you’re saying is that if the Chinese tell Apple to use a hashed database they supply by law, Apple will … refuse?
No, what I'm saying is that the way the system is engineered, it cannot function with a single database: it only matches on an intersection of hashes from multiple competing jurisdictions. In other words, the Chinese can provide all the hashes they want, but unless other jurisdictions' databases also contain those hashes, nothing will match.
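In code terms, the gatekeeping step being described is roughly this (hash values and database names are made up for illustration):

```python
# Hypothetical hash sets supplied by two independent jurisdictions.
hashes_from_ncmec = {"a1f3", "b2e4", "c3d5"}    # US (NCMEC) database
hashes_from_eu_body = {"b2e4", "c3d5", "d4c6"}  # a second, non-US database

# Only hashes present in BOTH databases are eligible to match at all.
eligible_hashes = hashes_from_ncmec & hashes_from_eu_body  # {"b2e4", "c3d5"}

def can_match(image_hash: str) -> bool:
    # "d4c6", supplied unilaterally by one jurisdiction, never matches.
    return image_hash in eligible_hashes
```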
 
easy way for a law-breaking bureaucrat to just do whatever they want, and Apple will have to comply.

That was kind of my point…….

For all the people out there who talk about how they put their trust in Apple to not use this for anything other than CSAM, my response is, well Apple is surely going to follow the law…whatever that might be…..
 
I don’t see how the government having these capabilities is an argument for giving them another tool.

“Well this guy just murdered someone with an axe, don’t see the harm in giving him an AK47”.
The government ALREADY has the capability to access the very images folks are saying this "will give" the government a new tool for. "The government might use this for…" No. If the government has the equivalent of a network-enabled search engine, they are not going to use the equivalent of a physical library card catalog and perform the equivalent of walking around a big building to find the information.

The government already has an AK47, but folks are concerned because it’s being proposed that they will ALSO have access to 4 sharp popsicle sticks!
 
So your argument is that since it is already so easy for the government to invade our privacy, there's no point opposing technology that makes it even easier?
Thinking that this makes it easier is pretty laughable. They literally already scan all of the images on your device to tag their content. Ever wonder how your photo library is able to suggest groupings of photos based on faces? Need to find that one photo from a friend's birthday when your kid tripped into the cake? Search for cake, and voilà! Chinese government wants to see if you have pictures from Tiananmen Square? Search for Tiananmen Square!

To invade your privacy with this method (assuming Apple is honest about the way the method is described) would be many orders of magnitude more difficult than using technology that has existed in iOS for over a decade. For one, competing jurisdictions' hash databases need to appear in an intersection for a match to occur. Second, only near-exact matches would be flagged. So, take the Tiananmen Square situation and apply it to CSAM detection: first, since it's perceptual hashing, a particular photo of Tiananmen Square would need to be hashed and uploaded. Next, jurisdictions that politically disagree with China would have to agree to upload a hash for the same particular photo. Then, you'd have to upload that particular photo to your iCloud account. Then, and only then, would it be flagged for review by a human to see if it's CSAM. If China has jurisdiction for the match, they would finally see that someone had that particular photo from Tiananmen Square. Much easier for China to just require access to all the tags already existing on all of its citizens' phones, and then any photo from Tiananmen Square would be matched, not just a bespoke one.
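"Near-exact" here means a tiny distance between perceptual hashes, not a scene classifier. A toy difference hash (dHash) shows the idea; this is a stand-in for NeuralHash, not Apple's algorithm, and it assumes Pillow is installed:

```python
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    # Shrink to (size+1) x size grayscale, then record whether each pixel
    # is brighter than its right-hand neighbor: a 64-bit fingerprint.
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# A re-compressed or resized copy of the SAME photo lands within a few
# bits of the original; a different photo of the same scene does not.
# is_match = hamming(dhash("my_photo.jpg"), known_hash) <= 2
```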
 
I think most people's concern is their device being scanned. I don't think many people oppose Apple doing this in the cloud, though, as it is their storage; you're just renting it.
 
No, what I'm saying is that the way the system is engineered, it cannot function with a single database: it only matches on an intersection of hashes from multiple competing jurisdictions. In other words, the Chinese can provide all the hashes they want, but unless other jurisdictions' databases also contain those hashes, nothing will match.
Apple claims that is how it's set up, but what if Chinese law says that Apple has to use CCP-provided image hashes?

You know Apple LOVES excusing whatever they do with ‘we follow all local laws’.
 
I think most people's concern is their device being scanned. I don't think many people oppose Apple doing this in the cloud, though, as it is their storage; you're just renting it.
Except, if I understand correctly, the pictures are scanned on your device right before being sent to iCloud. The alternative would be for them to be scanned right after they arrive on iCloud. But for that to happen, the pictures have to be sent to iCloud unencrypted. If they're scanned on your device right before sending to iCloud, they can be transmitted and stored encrypted, so there's no possibility of Apple or anyone else scanning them on iCloud. Given that the scanning is going to happen, would you rather it happen once, right before they're uploaded, or up in the cloud, where the pictures can be scanned as often as they (Apple, or some government agency) want? Oh, plus, if they're uploaded unencrypted, there's always the possibility that some hacker will break in and download your pictures (or everyone's pictures). Having the encryption happen locally, before uploading, makes it harder for surreptitious scans to be done in the cloud for nefarious purposes.
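The ordering being described looks roughly like this. A minimal sketch with made-up names: Apple's published design wraps the match result in a cryptographic "safety voucher" built with private set intersection, and a real client would use an authenticated cipher, not the toy XOR below.

```python
import hashlib
from typing import NamedTuple

class Upload(NamedTuple):
    ciphertext: bytes  # the encrypted photo; the server can't scan it
    voucher: bytes     # the on-device match result that travels with it

def compute_voucher(photo: bytes) -> bytes:
    # Stand-in for the on-device matching step.
    return hashlib.sha256(photo).digest()

def encrypt_locally(photo: bytes, key: bytes) -> bytes:
    # Toy XOR cipher, purely to show that encryption happens on-device.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(photo))

def prepare_upload(photo: bytes, key: bytes) -> Upload:
    voucher = compute_voucher(photo)          # 1. match on-device
    ciphertext = encrypt_locally(photo, key)  # 2. encrypt before it leaves
    return Upload(ciphertext, voucher)        # 3. cloud never sees plaintext
```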
 
It could be that way, but I guess there's the possibility of Apple implementing such technology without announcing it or making it public.

I say this because, while it would be easier for Apple to drop its plans to implement such technology, I think the European Commission and the European Parliament are looking for ways to detect this content easily, as well as to end or limit E2E encryption. I don't have the sources at hand, but I've read news about it.

Bad times for privacy I’m afraid. Good times for governments who want more and more power over their citizens.
Between the US and the EU, if anything is clear, it is that the US has the upper hand when it comes to surveillance of citizens. Not just their own people, but the whole world. Did you not hear and see Edward Snowden? Of course, the US secret services do it a hundred times better, in every way. They are better in every aspect of it, both technically and in keeping it secret. But make no mistake: they are the world's true dominant force.
 
Apple claims that is how it's set up, but what if Chinese law says that Apple has to use CCP-provided image hashes?

You know Apple LOVES excusing whatever they do with ‘we follow all local laws’.
This would require them to re-engineer how the system works. What's preventing Chinese law from forcing Apple to engineer something today, with the additional requirement that Apple not disclose it? Better yet, use the existing metadata that's already on your phone, with no re-engineering required.
 
Except, if I understand correctly, the pictures are scanned on your device right before being sent to iCloud. The alternative would be for them to be scanned right after they arrive on iCloud. But for that to happen, the pictures have to be sent to iCloud unencrypted. If they're scanned on your device right before sending to iCloud, they can be transmitted and stored encrypted, so there's no possibility of Apple or anyone else scanning them on iCloud. Given that the scanning is going to happen, would you rather it happen once, right before they're uploaded, or up in the cloud, where the pictures can be scanned as often as they (Apple, or some government agency) want? Oh, plus, if they're uploaded unencrypted, there's always the possibility that some hacker will break in and download your pictures (or everyone's pictures). Having the encryption happen locally, before uploading, makes it harder for surreptitious scans to be done in the cloud for nefarious purposes.
Too much reason in your post.
 
I guess you must be smarter than “security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.”

You also must have missed this part of the article:



I'm all for protecting children, and anyone in general, from abuse, but invading the privacy of the entire rest of the population to do it isn't the way to go. You don't let someone into your house to check for illegal substances or content just because you might have them.

Most of those people didn't know how the specific technology worked. So I would say those of us who studied it know more than most of those people.

Being a security researcher doesn't help you, since what is needed is an understanding of some of the algorithms and the mathematics behind them.
 
You also must have missed this part of the article:



I'm all for protecting children, and anyone in general, from abuse, but invading the privacy of the entire rest of the population to do it isn't the way to go. You don't let someone into your house to check for illegal substances or content just because you might have them.


Apple didn't need CSAM Detection to scan an iPhone. They control all the firmware and software and could do that in many different ways.

Both Photos and iCloud Backup already have all the software needed for scanning the device.
 
Imagine if the Chinese government could one day use this system to check who has the Tank Man image in their cloud storage and deport that person to a murder camp the next day without question.

They can do this on Android phones today without any problems. No Android phone in China uses Google's Android, so it would be easy for the Chinese government to ask the manufacturers of Chinese Android phones to implement a much more effective scan, if they haven't already done so.

The CSAM Detection System was pretty inefficient for what an authoritarian government would want. Much simpler and better options are available.
 
I think there is potential for this system to go badly where innocent photos are viewed in a different context. For example…

Let’s say, you are a parent and you take completely innocent photos of your child.
Some parents take photos where the child may be nude (doesn't everyone have the classic embarrassing "baby butt" shot in their childhood photo album?), but nobody is offended because everyone knows there is no ill intent behind the image. It's just a child in a state of undress.
So, you have a photo like this of your kid, or your kid in the bath, or your kid at the pool/beach, etc. And you post it to social media, and nobody thinks anything of it because to anyone with a properly working brain, there is nothing to think about it.


But THEN, some creeper trolls public social media accounts like Facebook and Instagram for pictures other people post of their children, sees a few that excite them for reasons of their own, saves the good ones to their computer and shares them online on some sicko forum, or trades them with other perverts, etc.

Now, when one of them gets caught, or their website gets raided, etc., all their files get flagged as CSAM because of the context in which they were being distributed and viewed by these people, completely unbeknownst to you, the child's parent, who still has the original photo on their phone or in iCloud. And the checksums match because it's the same file. Do you see how this goes wrong?

I do not know nearly enough about the process in which material is determined to be CSAM but this scenario doesn’t seem implausible to me.

The images are usually more hardcore than (semi-)nude photos of children in a bathroom or at a beach. They are really hardcore.

Also, you would have to have many of them match, and they would have to pass a human review performed by Apple.
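The safeguards described above amount to a threshold plus a human gate; roughly this shape in code (the ~30 figure is the order of magnitude Apple stated publicly, and the function names are made up):

```python
MATCH_THRESHOLD = 30  # roughly the number of matches Apple said was required

def should_escalate(match_count: int) -> bool:
    # A few coincidental matches (e.g. a re-shared family photo) do
    # nothing; only crossing the threshold triggers review at all.
    return match_count >= MATCH_THRESHOLD

def report_account(match_count: int, reviewer_confirms: bool) -> bool:
    # Even past the threshold, a report only happens if Apple's human
    # reviewer confirms the flagged images really are CSAM.
    return should_escalate(match_count) and reviewer_confirms
```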
 
Let’s hope this gets introduced. Harmful material and the individuals who share it could be held to account.

Those who think Apple will be spying on their photos need to learn how hashing works.
This is like people who cheer when the government does something egregious just because their "team" did it. The point is that it introduces a power that can be used by anyone at any time in the future. Right now it's known pictures; in the future it's pictures that Apple doesn't like or politicians don't like. "Do it for the children" is a cliché because nearly everyone wants to protect children, so lots of terrible things are introduced under that guise.
 
Does anyone know what happens if someone sends his enemy a bunch of CSAM pictures in iMessage? Suppose the target does not always check his messages and attachments and he just leaves them there. What happens then?

I am not using this as an argument against CSAM Detection. I am only curious if anyone knows how something like this will be handled.

If there is not a technological or procedural system in place to handle this, then CSAM can be weaponized and become like swatting somebody.

Does anyone know what happens?

The CSAM Detection System didn't work on iMessage, only on iCloud Photo Library.

The system for iMessage is only for minors, and it just censors the image; the child is still allowed to override the censorship.
 