They are not scanning iCloud. Not at all. This sort of false accusation is what the hubbub is about. People don’t understand how this system works.
They are scanning the images on your phone. They open the file, create a hash, and when that file is uploaded to iCloud, they flag it (if it was flagged on the device). If there are enough positive flags, then a human being will look at the content.

I fully understand how this system works. I've created similar systems (though not for images, it was for data protection purposes to ensure data being exfiltrated out of the organization doesn't contain any known prohibited content).

Regardless, whether they are scanning images in iCloud or images on your phone matters little. They are scanning images. And if you use iCloud, it's likely your phone has the same exact images as your iCloud gallery, so there's no point in scanning iCloud.
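
For anyone trying to picture the flow described above, here's a rough Swift sketch. Everything in it is a placeholder made up for illustration (the hash function, the database contents, the voucher shape); Apple's actual system uses a perceptual NeuralHash and encrypted safety vouchers, not this code.

```swift
import Foundation

// Illustrative sketch only; names and logic here are assumptions,
// not Apple's implementation.

/// Stand-in for a perceptual hash (Apple's NeuralHash reportedly outputs
/// about 96 bits). A real perceptual hash is derived from image features so
/// that visually similar images collide; this byte-folding placeholder
/// has no such property.
func perceptualHash(of imageData: Data) -> UInt64 {
    imageData.reduce(UInt64(5381)) { ($0 << 5) &+ $0 &+ UInt64($1) }
}

/// Hypothetical on-device database of known-CSAM hashes. On a real device
/// this is blinded, so the phone can't read it either.
let knownHashes: Set<UInt64> = [ /* opaque, supplied by NCMEC et al. */ ]

/// Runs only at iCloud upload time: the match result rides along with the
/// photo as an encrypted "safety voucher"; the device reports nothing on
/// its own.
func safetyVoucher(for imageData: Data) -> (matched: Bool, payload: Data) {
    let matched = knownHashes.contains(perceptualHash(of: imageData))
    return (matched, imageData) // real vouchers are encrypted blobs
}
```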
 
Ultimately, as it's written, everything Apple is doing with CSAM is probably no different than what they're already doing on iCloud, they're just moving the processing to the end user so that they can cut down on cloud computing costs. And I'm OK with that, really.
And I say the opposite is true. We should be removing their right and even ability to scan any device.
 
It's multi-faceted:

1) I just don't like my phone spying on me. Full stop, I'm not OK with this.
2) Just because iCloud might be off today doesn't mean it won't get enabled accidentally during an update.
3) Just because it targets child porn in the USA today doesn't mean it won't target political rivals or "wrongthink" tomorrow.
4) I don't know where these hashes come from. If I take my phone to China, and have a photo (or 30) of CCP opposition leaders on it, will I get arrested?

Ultimately, as it's written, everything Apple is doing with CSAM is probably no different than what they're already doing on iCloud, they're just moving the processing to the end user so that they can cut down on cloud computing costs. And I'm OK with that, really. What I'm not OK with is Apple building a framework that's ripe for abuse. What flies in one country doesn't fly in others (memes of the Prophet Muhammad, Tiananmen Square, etc.), and it's ridiculously easy to take an already implemented feature and co-opt it for something else.
Tell me you didn’t watch the interview with Craig Federighi without telling me you didn’t watch the interview with Craig Federighi
 
So you believe that Cook and other execs, on a whim or out of strong personal belief, decided to put this system in place all on their own. And felt confident that Apple customers would be just fine with it.

Really?
I don't "believe" either situation -- I see both situations as possible. Keep in mind, they could have implemented this and never told us about it -- iOS is closed source.
 
So the real issue is we need to fix how cloud storage agreements work. There is no excuse for this policy to exist as you quoted.

Sure there is. It's their servers (which are just computers) that you're storing things on. Their rules. No one put a gun to your head to make you agree to the terms. You have the choice to spend the money to maintain your own private server if you'd like or simply don't use cloud-based services.
 
Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, only does so if you are trying to upload them onto Apple’s servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
Without legal probable cause AND a subsequent court order, the only cop allowed to search my safe deposit boxes, for ANY REASON, should be me.

The moral, highly technical, and/or clever mechanisms of how or why Apple (or anyone) is violating the above are completely IRRELEVANT.
 
What I find astonishing is that so many here believe Apple executives just woke up one morning and decided to put this program in place all on their own, without any outside mandate or pressure from the government. And without pushing back.

When Craig was asked about any external pressure to put this feature in place in his interview with WSJ, he specifically denied any such pressure. If he is publicly lying then we have a problem.
 
Sure there is. It's their servers (which are just computers) that you're storing things on. Their rules. No one put a gun to your head to make you agree to the terms. You have the choice to spend the money to maintain your own private server if you'd like or simply don't use cloud-based services.
Apple put a gun to my head because they don't refund me if I choose to not use it.
 
They are scanning the images on your phone. They open the file, create a hash, and when that file is uploaded to iCloud, they flag it (if it was flagged on the device). If there are enough positive flags, then a human being will look at the content.

I fully understand how this system works. I've created similar systems (though not for images, it was for data protection purposes to ensure data being exfiltrated out of the organization doesn't contain any known prohibited content).

Regardless, whether they are scanning images in iCloud or images on your phone matters little. They are scanning images. And if you use iCloud, it's likely your phone has the same exact images as your iCloud gallery, so there's no point in scanning iCloud.

It’s not flagged on upload - at least, the flagging can’t be detected by Apple. Only once thirty matching images have been uploaded can the flagging be detected.
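
The thirty-image mechanism is a threshold scheme: each matching photo's voucher carries one share of a per-account key, and only with enough shares can Apple reconstruct the key and learn that anything matched at all. Here's a toy Shamir secret-sharing sketch in Swift to show the principle; the threshold of 3, the small prime field, and all the names are assumptions for illustration (Apple's stated threshold is around 30, and its real construction also layers in private set intersection).

```swift
import Foundation

// Toy Shamir threshold sharing, to illustrate the principle only.

let p: UInt64 = 2_147_483_647 // prime modulus (2^31 - 1), chosen so all
                              // intermediate products fit in UInt64

/// Modular exponentiation by squaring.
func powMod(_ base: UInt64, _ exp: UInt64, _ m: UInt64) -> UInt64 {
    var (b, e, r) = (base % m, exp, UInt64(1))
    while e > 0 {
        if e & 1 == 1 { r = r * b % m }
        b = b * b % m
        e >>= 1
    }
    return r
}

/// Modular inverse via Fermat's little theorem (p is prime).
func inverse(_ a: UInt64) -> UInt64 { powMod(a, p - 2, p) }

/// Split `secret` into `count` shares; any `threshold` of them suffice.
func makeShares(secret: UInt64, threshold: Int, count: Int) -> [(x: UInt64, y: UInt64)] {
    // Random polynomial with the secret as its constant term.
    let coeffs = [secret] + (1..<threshold).map { _ in UInt64.random(in: 1..<p) }
    return (1...count).map { i in
        let x = UInt64(i)
        var (y, xPow) = (UInt64(0), UInt64(1))
        for c in coeffs {
            y = (y + c * xPow) % p
            xPow = xPow * x % p
        }
        return (x, y)
    }
}

/// Lagrange interpolation at x = 0 recovers the secret from >= threshold shares.
func reconstruct(_ shares: [(x: UInt64, y: UInt64)]) -> UInt64 {
    var secret: UInt64 = 0
    for (i, si) in shares.enumerated() {
        var num: UInt64 = 1, den: UInt64 = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = num * (p - sj.x) % p              // (0 - x_j) mod p
            den = den * ((si.x + p - sj.x) % p) % p // (x_i - x_j) mod p
        }
        secret = (secret + si.y * num % p * inverse(den)) % p
    }
    return secret
}

let accountKey: UInt64 = 123_456_789
let shares = makeShares(secret: accountKey, threshold: 3, count: 5)
assert(reconstruct(Array(shares.prefix(3))) == accountKey) // 3 shares: recovered
// With fewer than 3 shares the key is information-theoretically hidden,
// so the server can't even tell whether anything matched.
```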
 
Without legal probable cause AND a subsequent court order, the only cop allowed to search my safe deposit boxes, for ANY REASON, should be me.

The moral, highly technical, and/or clever mechanisms of how or why Apple (or anyone) is violating the above are completely IRRELEVANT.

Apple isn’t a cop. I don’t need a warrant to search your bags when you enter my house.
 
What I don't understand is why Apple chose to make this feature in the first place. They have always been about privacy, and even took measures to make it easier for them to fight government requests for information. These features don't benefit the end-users (although I suppose one can make a case about the sexting detection for parents) and it completely opens them up to government requests. Where did this come from?
 
4) I don't know where these hashes come from. If I take my phone to China, and have a photo (or 30) of CCP opposition leaders on it, will I get arrested?

You'd need to have thirty of the exact photos of CCP opposition leaders (not just photos containing those CCP opposition leaders, but those particular photos). Unless you're the one taking those photos, then disseminating them widely enough that China sees them and then forces Apple to upload the hashes to the database to match against, you wouldn't get arrested. But seriously, the what-ifs are getting more and more far-fetched.
 
Does anyone know how long the hash is? Hopefully more than 8 characters, but I doubt it. It’s gonna be a huge database we have to store on our iOS devices. How will that affect battery life, storage, and performance if we keep 50k photos on our devices? Will it have to scan every photo again when there is an update to the database? What about the environmental impact of all these unnecessary CPU cycles? Will people be encouraged to jailbreak their devices in order to disable this “feature”?
 
Nope - I don't have access to their code, and I don't 100% understand every last detail of the technology (just like with most of the software and tech already on the iPhone). Do YOU? YOU'RE the one accusing them of wrongdoing, not I, so the onus is on YOU to prove it, or at least provide some evidence. In the real world, we don't convict and hang people based on irrational suspicion fueled by paranoia, at least not in the civilized parts of the world.
We don't have to know the exact implementation details to know this is rotten to the core. They've already said that hashes will be generated on the phone and compared against hashes of known child porn content. To create that hash, they have to read the 1's and 0's in the file. The software does this. If I had software on my computer that scanned my files without my consent, I'd call that software spyware.

And that's exactly what this is. Unless they give users a way to defeat this feature (which would render it wholly useless), Apple has installed spyware on our phones.
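
For concreteness, "reading the 1's and 0's" looks roughly like this minimal Swift sketch: load the file's raw bytes, run them through a hash. SHA-256 from CryptoKit stands in here for Apple's perceptual NeuralHash (which it is not), and the path is hypothetical.

```swift
import Foundation
import CryptoKit

// Minimal sketch: hashing a file means reading its raw bytes first.
// SHA-256 is a stand-in; Apple's system uses a perceptual hash instead.
let url = URL(fileURLWithPath: "/path/to/photo.jpg") // hypothetical path
if let bytes = try? Data(contentsOf: url) {
    let digest = SHA256.hash(data: bytes)
    print(digest.map { String(format: "%02x", $0) }.joined())
}
```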
 
Damage control. Damage control. Damage control.

As many others have pointed out, someone at Apple decided the strategy for handling this would be to set the tone of the conversation around “The backlash is simply because you don’t understand the technology, idiot customer!” This is textbook gaslighting, a prime example of an otherwise overused term.

Apple have a real “we’re sorry if you’ve misunderstood us” tone going on.

Classic sign of an abusive/unhealthy relationship.
 
Does anyone know how long the hash is? Hopefully more than 8 characters, but I doubt it. It’s gonna be a huge database we have to store on our iOS devices. How will that affect battery life, storage, and performance if we keep 50k photos on our devices? Will it have to scan every photo again when there is an update to the database? What about the environmental impact of all these unnecessary CPU cycles?
It's probably no larger than a few MB, at most. Also, the language suggests it will only compute hashes of photos you elect to upload (i.e., it won't rescan your existing library, just the new photos you want to upload). And a trained neural network works much like a function f(x) = h: it's just computing some math on your image, not a ton of work to do.
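
As a back-of-envelope check on that size estimate (the entry count is a pure guess, and the 96-bit width is the figure reported for NeuralHash; the shipped database is blinded and encoded, so the real size will differ):

```swift
// Rough storage math under assumed parameters.
let hashBits = 96               // reported NeuralHash output width
let entries = 1_000_000         // hypothetical number of database entries
let totalBytes = hashBits / 8 * entries
print(String(format: "%.1f MB", Double(totalBytes) / 1_048_576)) // ~11.4 MB raw
```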
 
I'm shocked that 99% of them don't actually bother getting to know how it works; they even talk about backdoors without knowing how that would even be possible, lol. But sure, hop on the trend and say you don't like this feature.
I totally read and understood what they say about how it works, but I will NEVER allow anyone to scan my data, and for sure I will NOT PAY anyone to be able to do so.
 
“The backlash is simply because you don’t understand the technology, idiot customer!”
The irony in Apple taking that stance on it is that I came across a thread here on MR a little while ago about how even APPLE employees are expressing concerns over it as well...

If Apple's stance is that customers are angry because they don't "understand", I'd love to hear their explanation for why their own employees don't like it. Do they not "understand" it either...?
 
When Craig was asked about any external pressure to put this feature in place in his interview with WSJ, he specifically denied any such pressure. If he is publicly lying then we have a problem.

If that's true and there was zero pressure from the government (I don't believe that, as there are many laws regarding child exploitation the government is invested in), he must have given an explanation of how this came about: the reasoning among the execs, the thought processes, how it would weigh with Apple customers, how privacy is paramount but really overrated, a positive impact on revenue, Christian beliefs... whatever. What was it?
 
Tell me you didn’t watch the interview with Craig Federighi without telling me you didn’t watch the interview with Craig Federighi
Stop saying we don't understand the issue, that Craig explains it all in this video, and that we're all being emotional children. At least we're not naive sheep. It's you who is misunderstanding the fundamental issue. Stop drinking Apple's BS-flavored Kool-Aid.

I've watched the video. It's complete white-washed BS that entirely skirts the fundamental problem with this technology, which is this: nothing prevents anyone with the clout to strong-arm Apple from replacing the set of images they're scanning for with an entirely different set of criteria for a completely different purpose. What they're putting in place is the groundwork for any government agency to scan for any content they like.

The "auditing" and "transparency" schtick is complete crap. Who are the auditors and who audits them? There's one simple solution... Do not put this system in place, for any reason, no matter how noble. It's a slippery-slope down a razor-blade slide into a pool of alcohol.
 
Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, only does so if you are trying to upload them onto Apple’s servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
So... you'd be OK with a car that automatically reports you for speeding? It's the same concept. Your devices shouldn't be repurposed to work against you or to monitor you. The rule to date was that if/when you upload data to the cloud, it's fair game to be scanned/analyzed.
 