I hope everyone who has CSAM in their iCloud photos gets what's coming to them, which is Bubba in a 5 x 7 cell.

For the ignorant people who don't know how hashing works, take five minutes out of your day and learn about it. Take another five minutes and read a few victim impact statements from children who are/were impacted by distribution of CSAM. If you're not victimizing children, then move along, you have nothing to worry about. Yeah, hashing iCloud albums will create a "back door". Ignorance at its best. More people who don't know how hashing works.

BTW, Facebook, Google, Dropbox and a host of others have been hashing images and videos for years now. And thank god, it's put a lot of pedophiles behind bars. Have you heard about any "back doors" from these providers?
Hashing expert here. Here’s where you miss the mark. You are essentially saying their entire argument is moot because they don’t understand hashing.

Hashing isn't the issue here. It's the list they compare it to. That's where the vulnerability for abuse can happen. In one country, could they be comparing hashes to a list of banned memes? This system cannot exist on-device. I'm all for protecting Apple's servers, but my device should be private.
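To make that concrete, here's a toy sketch (plain Python, not Apple's actual NeuralHash/PSI protocol): the matching code never cares what the hashes represent, so whoever controls the list controls what gets flagged.

```python
import hashlib

# Toy illustration only -- Apple's real system uses a perceptual hash plus
# cryptographic blinding, not a plain SHA-256 set lookup. The point is that
# the matching step is agnostic to whatever the provider put in the list.
provided_hash_list = {
    # hypothetical entry supplied by the provider / a government
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(image_bytes: bytes) -> bool:
    """Hash the photo and check it against whatever list was supplied."""
    return hashlib.sha256(image_bytes).hexdigest() in provided_hash_list

print(is_flagged(b"family vacation photo"))  # False -- not in the list
print(is_flagged(b"test"))                   # True -- b"test" hashes to the entry above
```

Nothing in is_flagged() knows or cares whether the list holds CSAM fingerprints or banned memes; that property is exactly why the contents of the list matter so much.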
 
I think now is the perfect time for MacRumors to suspend comments. When this story gets some traction, we know it's going to be a race to the bottom and will get nasty like previous stories about this.

Meanwhile +1 to Apple for keeping this going. I hate paedophiles.
*Correction*

Podofile is someone who likes feet.
 
Hashing expert here. Here’s where you miss the mark. You are essentially saying their entire argument is moot because they don’t understand hashing.

Hashing isn't the issue here. It's the list they compare it to. That's where the vulnerability for abuse can happen. In one country, could they be comparing hashes to a list of banned memes? This system cannot exist on-device. I'm all for protecting Apple's servers, but my device should be private.
You're right that understanding hashing isn't important here. But you're missing the mark on what the issue is. Specifically, the hashes have to be agreed upon across competing jurisdictions. "Banned memes" would have to be banned in non-cooperative (politically, etc.) jurisdictions for their hashes to be included in the scan, so it doesn't matter if, e.g., Russia bans memes against Putin unless the U.S. also bans those memes. This is the type of safety mechanism the researchers who spoke out against this said would be required for this type of system to remain safe against government abuse.
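For what it's worth, Apple's published plan described that safeguard as an intersection: a hash only ships in the on-device database if it appears in the lists of at least two child-safety organizations operating under different governments. A rough sketch of the idea (my simplification with made-up hash values, not Apple's code):

```python
# Hypothetical hash lists from two independent organizations in different
# jurisdictions. Only hashes present in BOTH lists would ever be deployable.
us_org_list     = {"hashA", "hashB", "hashC"}
non_us_org_list = {"hashB", "hashC", "hashBannedMeme"}

deployable = us_org_list & non_us_org_list  # set intersection
print(deployable)  # contains only 'hashB' and 'hashC' -- the single-jurisdiction meme hash never ships
```

Whether you find that safeguard sufficient is a separate question, but that intersection is the mechanism the researchers were pointing at.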
 
It's only terrible in the minds of those who don't understand the extensive technological process that Apple implemented to maintain privacy for end users.

It is a well-designed system that is worlds apart from how other companies handle photo gallery scanning.

As for false positives, if someone is intentionally 'planting' photos into someone's iCloud Photo Gallery, then there's another issue that needs to be addressed. The account is compromised, and that has nothing to do with this feature.

Regarding the possibility of back doors, this is just a matter of trust. Apple already clearly stated that they would not allow this feature to be abused by law enforcement agencies for any other purpose, and I choose to believe that. If they were going to allow that, they could do it in secret, and they never have to date.

I trust that Apple will follow the law. Presumably, none of us support corporations that think they are above the law and can do whatever they want.
 
… Are there legitimate concerns (like you say, abusers uploading CSAM intentionally to someone else's account)? Sure. But the backdoor and government abuse thing isn't actually a concern based on the actual implementation.
Based upon what Apple claims they are doing behind the scenes.

You love just saying ‘hash’ to swipe away any concern. Other images have hashes as well. And a CCP-friendly company like Apple would either be happy to, or be forced at the end of a barrel to, compare hashes of forbidden images in China, which would lead to people getting murdered.

All the CCP has to do is make it a law, and Apple would eagerly comply, as they often say: “we follow all local laws.”
 
Come on Apple. Don’t leave unfinished business behind. Let go of CSAM. It is best for the business.
As much as I would like to see Apple reverse its course and take a strong stance on privacy, I'm afraid the genie is out of the bottle, if only because Apple's implementation gave governments around the world ideas (as far as I know the concept has been picked up by politicians both within the EU and in the UK).

If Apple cares at least a bit about privacy, though, then they should move the scanning to their servers and only scan shared albums/pictures to minimize the impact.
 
Let’s say you are a parent and you take completely innocent photos of your child.
...
I do not know nearly enough about the process by which material is determined to be CSAM, but this scenario doesn’t seem implausible to me.
My understanding (and I'm also not an expert in any of this) is that the pictures in the CSAM database being used for this scanning purpose are much, much worse than any picture a well-meaning parent would take. Yes, the pedophiles might end up with bath-time pics of kids scraped from Facebook. They won't be caught by having those; they'll be caught because of matches to extremely graphic pictures of child sexual abuse. The agencies involved are not looking to "catch" and prosecute people who have pics of their kids in the bath; it's not worth their time. They're going after the actual pedophiles.
 
*Correction*

Podofile is someone who likes feet.
No idea what you’re talking about. I never said podofile… 🤷🏻‍♂️ Maybe you need to check a dictionary. Clearly you’re confusing children with podiatry. Weird….
 
I hope everyone who has CSAM in their iCloud photos gets what's coming to them, which is Bubba in a 5 x 7 cell.

For the ignorant people who don't know how hashing works, take five minutes out of your day and learn about it. Take another five minutes and read a few victim impact statements from children who are/were impacted by distribution of CSAM. If you're not victimizing children, then move along, you have nothing to worry about. Yeah, hashing iCloud albums will create a "back door". Ignorance at its best. More people who don't know how hashing works.

BTW, Facebook, Google, Dropbox and a host of others have been hashing images and videos for years now. And thank god, it's put a lot of pedophiles behind bars. Have you heard about any "back doors" from these providers?
Kind of funny that your whole comment is bashing "ignorant" people when you, yourself, don't actually know what you're talking about.

The issue isn't that iCloud photos are being checked for CSAM - it's that the "checking" is happening ON THE DEVICE. This is what people are worried about and why major cyber security experts sounded the alarm about it when it was announced. It is a system that can and will be used to spy on dissidents if it's ever actually released.
 
The "with privacy in mind" bit made me laugh. Here, let us constantly scan your device. whether you consent or not. It'll be in the back ground. You'll never know we're snooping through all your things. But don't worry. We'll do it with your privacy in mind!

😭
 
They did let go of CSAM
If they are an American data hosting company, they have not “let go” of CSAM. :) They just gave up on the plan they were going to use that would allow them to encrypt ALL the images they host. If anyone uploads an image to iCloud that matches the CSAM hash, you can bet they’re going to be reported.
 
You're right that understanding hashing isn't important here. But you're missing the mark on what the issue is. Specifically, the hashes have to be agreed upon across competing jurisdictions. "Banned memes" would have to be banned in non-cooperative (politically, etc.) jurisdictions for their hashes to be included in the scan, so it doesn't matter if, e.g., Russia bans memes against Putin unless the U.S. also bans those memes. This is the type of safety mechanism the researchers who spoke out against this said would be required for this type of system to remain safe against government abuse.
This doesn’t mean it won’t happen.
 
This system cannot exist on-device. I’m all for protecting apple’s servers, but my device should be private
The proposed system would exist on-device ONLY for folks uploading to iCloud. That's because, as you say, the government only has a say over the images that Apple hosts, and the flagging allows them to keep all the images encrypted while still informing Apple (and the government) of a potential match. For anyone not using iCloud, there would be nothing running on-device.
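Here's a rough sketch of that flow, just to make the claim concrete. The helper names and the "voucher" structure are my own placeholders, not Apple's API; the point is only that the hash check sits inside the iCloud upload path, so nothing runs for photos that never leave the device.

```python
import hashlib

# Hypothetical provider-supplied hash list (placeholder value).
KNOWN_HASHES = {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"}

def make_safety_voucher(photo: bytes) -> dict:
    # Stand-in for Apple's encrypted "safety voucher": it records whether the
    # photo's hash matched, without the server ever seeing the photo itself.
    return {"matched": hashlib.sha256(photo).hexdigest() in KNOWN_HASHES}

def upload_photo(photo: bytes, icloud_photos_enabled: bool):
    if not icloud_photos_enabled:
        return None  # photo stays local: no hashing, no voucher, no upload

    voucher = make_safety_voucher(photo)
    encrypted_photo = bytes(reversed(photo))  # placeholder for real encryption
    return {"encrypted_photo": encrypted_photo, "voucher": voucher}

print(upload_photo(b"photo kept on-device only", icloud_photos_enabled=False))  # None
print(upload_photo(b"photo headed to iCloud", icloud_photos_enabled=True))
```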
 
I think there is potential for this system to go badly wrong when innocent photos are viewed in a different context. For example…

Let’s say you are a parent and you take completely innocent photos of your child.
Some parents take photos where the child may be nude (doesn’t everyone have the classic embarrassing “baby butt” shot in their childhood photo album?) but nobody is offended because everyone knows there is no malicious intent behind the image. It’s just a child in a state of undress.

This. It's only a matter of time until (innocent) non-nude photos of people's children suddenly get blurred and red-flagged. It could even be a clothed child wearing salmon-colored (skin-colored) clothes, which could trick the CSAM algorithm into thinking the child was unclothed.

And then Child Protection Services and the local police suddenly come knocking at the parents' door.... hmm.... could be a tragic scenario.
 
MacRumors doesn't remain silent, though. Good to have controversial topics every day to drive traffic. Haha!
 
I hope everyone who ~~has~~ is accused of having CSAM in their iCloud photos gets what's coming to them, which is Bubba in a 5 x 7 cell...
Fixed it for you, because THAT is what will happen. People will be arrested under false accusations. Maybe you will be among them. Me, I'm looking for an alternative to iCloud. Even if I have to build my own version.
 
This. It's only a matter of time until (innocent) non-nude photos of people's children suddenly get blurred and red-flagged. It could even be a clothed child wearing salmon-colored (skin-colored) clothes, which could trick the CSAM algorithm into thinking the child was unclothed.
Matter of time, yes. And that time is probably a few thousand years. :) While I know you’re likely not interested in knowing how it works, in case anyone else who finds this thread is interested…

There are images of illegal activities that have been captured. A mathematical hash of each of these images has been created. All image hosting companies (including Apple) scan their repositories, not using an actual image or machine learning algorithm (which may cause false positives), but specifically using the hash to see if any images match. The CSAM algorithm is not looking for “salmon-colored (skin-colored)” tones. It mathematically computes each image’s hash and determines whether that hash matches the hash of one of the known illegal images.
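A toy version of the idea, for anyone curious. Apple's published design actually uses a perceptual hash (NeuralHash), so a resized or recompressed copy of a catalogued image still matches while an unrelated photo doesn't; the tiny "average hash" below is just my stand-in to show that only previously catalogued images can match and that color content is never examined.

```python
# Toy perceptual hash over tiny 9-pixel grayscale "images" (values 0-255).
# This is NOT NeuralHash; it only illustrates hash matching vs. content analysis.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel is brighter than the image mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

catalogued   = [10, 200, 30, 220, 15, 210, 25, 230, 12]   # known, catalogued image
recompressed = [12, 198, 33, 218, 17, 207, 27, 228, 14]   # same image, re-encoded
family_photo = [200, 210, 220, 230, 100, 20, 30, 10, 15]  # unrelated photo

known_hash = average_hash(catalogued)
print(hamming(known_hash, average_hash(recompressed)))  # 0 -> still matches
print(hamming(known_hash, average_hash(family_photo)))  # 4 of 9 bits differ -> no match
```

The lookup only ever asks "is this hash (nearly) identical to the hash of an image already catalogued as CSAM?" A brand-new photo of someone's clothed toddler has no counterpart in the database to match against.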
 
My understanding...
And THAT is the problem! Everybody thinks they understand how all of this will work, and then we just fall into our old, bad habit of TRUSTING THE GOVERNMENT to have our best interests at heart.

But just you wait until you are accused of a crime and all you can afford is an attorney who only has a Samsung phone and doesn't know what "cloud" or "hash" or "operating system" even mean.

Nice knowing ya, but hey, you can rest easy: the Biden Administration will let you out of jail because of Covid. Oh wait, unless you were found to be cavorting with the "wrong people". Even if you just spoke to them in line at Walgreens while you were waiting for your pharmacist to fill your prescription for that special cream you need "down there", we have you on camera, colluding with known terrorists who go to PTA meetings and express anger about the school board teaching their first-grade kids about sex. You're never going to see your freedom again!

It's easy to make up charges and go after somebody. Apple doing CSAM hashes will make it even easier.
 
Fixed it for you, because THAT is what will happen. People will be arrested under false accusations. Maybe you will be among them. Me, I'm looking for an alternative to iCloud. Even if I have to build my own version.
Not likely, but you’d have to have a better understanding of the method that’s currently being used by all image hosting companies to check for CSAM. You CAN look up how difficult it is to intentionally create a false match using this method, though. Just remember that researchers are only able to create false hash matches when they know, precisely down to the last pixel, the image they’re trying to match against. And none of the false matches they create, even KNOWING the original image, would pass a human review.
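There's also a threshold before any human ever sees anything: under Apple's published design, vouchers could only be opened for review after a number of independent matches accumulated on one account. Here's a back-of-the-envelope sketch of why a stray false positive goes nowhere. Every number below (per-image false-match rate, library size, threshold) is a placeholder I made up for illustration, not Apple's published parameters.

```python
from math import exp

def prob_at_least(t: int, lam: float) -> float:
    """P(X >= t) for X ~ Poisson(lam), summed from the tail upward.
    (Poisson is the standard approximation for a Binomial count with tiny p.)"""
    term = exp(-lam)            # P(X = 0)
    for k in range(t):          # walk up to P(X = t)
        term *= lam / (k + 1)
    total, k = 0.0, t
    while term > 0.0:
        total += term
        k += 1
        term *= lam / k
    return total

p = 1e-6          # assumed per-image false-match probability (placeholder)
n = 20_000        # photos in the library (placeholder)
threshold = 30    # matches required before any review (placeholder)
lam = n * p       # expected number of false matches: 0.02

print(prob_at_least(1, lam))          # ~0.02 -> one stray false match is plausible
print(prob_at_least(threshold, lam))  # ~4e-84 -> thirty independent false matches is not
```

Deliberately planting matching images on someone's account is a different threat, as discussed above, but random false positives piling up to the review threshold isn't a realistic failure mode.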
 