I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.

The off switch on the iCloud Photos feature already gave them the opt-out they needed. I doubt this changes anything for them. Whether they will toggle it back on for a while now, I don't know.
 
Their mistake wasn't technical. Their mistake was that they didn't foresee the public backlash. There is no version of spying on your device at a consumer electronics company's discretion that isn't spying on your device at a consumer electronics company's discretion.

It doesn’t matter how you twist it technically, it’s about the precedent. It’s conceptual. It’s philosophical.

OK - I get that side of the discussion. I get that viewpoint somewhat.

BUT - EFF.org was not satisfied, and if you read their statements about it, I think you would likely agree that Apple's idea was lacking and not thought out well enough. An interesting concept, maybe, but not at all ready to be deployed.

My own opinion is that Apple executives should be realizing that they are throwing away all of their positive privacy public-relations goodwill in one fell swoop! It's a multi-billion-dollar mistake, or at least it could have been. I feel as though this is some kind of infiltration of societal stupidity inside Apple. I mean, they are a huge company, and with all the crazy stuff going on in the world, Apple has been affected too! They almost made a big mistake. Yes, yes, I get it: it is intended to protect the children. If you look at my first post on this subject, I was on the other side of it, but then I realized that it's a backdoor and a slippery slope, and I changed my mind. I hope Apple takes a good close look at how important privacy is.

A smart friend of mine said, "most people won't notice or care about such privacy issues," and that's true. Only a small percentage of people can be bothered to think about possible consequences. What's even scarier to me is that the majority of society is oblivious to the erosion of privacy over the last few decades. "Brave New World" and "1984" are pretty tame compared to the real surveillance state we live in. Apple was, and maybe still is, uniquely positioned to be the last company that can honestly give us privacy, and that's worth billions of dollars!
 
I'm surprised that Apple blinked on this one. Too bad it is a delay and not an outright cancellation. They'll probably make some meaningless changes and then quietly roll this out in the middle of the iOS15 lifecycle after most people have already upgraded.
 
None. As people have said over and over again, it takes about a decade for any image of abuse to enter the CSAM database. So the system does nothing to stop active abusers.
A decade is a bit of an exaggeration, but it's definitely not something that happens as quickly as most people think.

Firstly, a photo has to reach a critical mass of circulation before it gets caught, although modern social media and "dark web" channels have sped that process up dramatically in recent years.

More importantly, however, CSAM circulates for years, and the disturbed people who collect this stuff can't ever get enough of it. There's a very high probability that anybody in this situation will have enough photos in their collection that also happen to be in the CSAM database, which is also likely why the threshold is only set at 30. I haven't ever heard of a case where a consumer of CSAM had fewer than several hundred photos in their collection.

Sadly, you're partially right that it's not going to do anything to stop "active abusers" — at least not directly. The animals who are creating CSAM are usually smart enough to avoid public online services in the first place, but even if 30 of their photos strayed into iCloud, they'd be too new to be caught by the CSAM Detection algorithm.

However, this is where old-fashioned detective work and forensics come in, and from the law enforcement agents I've spoken to, more often than not a collector of CSAM provides invaluable leads to track down the distributors and creators.
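
To put the threshold point above in concrete terms, here is a minimal sketch in Swift of the general shape of the matching logic as I understand it. Every name and hash value here is my own invention for illustration; the real system uses blinded NeuralHash comparisons, not a plain set lookup.

// Illustration only: invented names and hash values, not Apple's code.
// Stands in for the blinded perceptual-hash database shipped to devices.
func loadKnownHashDatabase() -> Set<UInt64> {
    [0x1A2B, 0x3C4D, 0x5E6F]
}

let knownHashes = loadKnownHashDatabase()
let reportingThreshold = 30

// Count how many library photos match *already known* images.
// A brand-new photo, by definition, cannot be in the known set.
func matchCount(of libraryHashes: [UInt64]) -> Int {
    libraryHashes.filter { knownHashes.contains($0) }.count
}

// Nothing is surfaced until the count crosses the threshold,
// which is why a handful of false collisions stays invisible.
func shouldEscalate(_ libraryHashes: [UInt64]) -> Bool {
    matchCount(of: libraryHashes) >= reportingThreshold
}

The point of the sketch is that someone with two or three accidental collisions never trips the system; the people described above, with collections in the hundreds, almost certainly would.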
 
I always applauded Apple privacy measures but they are dropping the ball big time here with this child safety measure. I get the reasoning behind it but the last thing I want is Apple scanning my photos, documents, etc. Not that I need to hide anything, but this is the first step to go further and it's an infringement on my privacy rights.
I guess I'll keep my iPhone 7+ longer and avoid updating the iOS.
 
What you upload to iCloud doesn't stay on your iPhone. (Duh)

EDIT
Alternatively phrased,
“What happens on iPhone, stays on iPhone.

Unless you upload it to the cloud. Or offload it onto a computer. Or text it to a friend. Or share it on social media.

Because in all of these cases, you’re obviously sending it somewhere not on iPhone.”
 
Child predators probably are dumb but likely not this dumb. The rate of criminal prosecution resulting from this technique is likely zero.
Based on some podcasts I've listened to, they're not even that dumb. There was a server in Europe that was incredibly well guarded. Law enforcement officials finally figured out who was running it, and they had to take it over and arrest an accomplice at exactly the same time. The operator had set up a sort of dead man's switch: if the accomplice didn't post at a specific regular interval, everyone would know the site had been compromised by the authorities. (The reasoning was that the cops might take over the site but would not also be able to access the account of the accomplice, who would then be unable to post.) The authorities managed to post the all-clear just in time.

Then the authorities spent months facilitating the distribution of CSAM to keep the sting going. So essentially, Australian law enforcement officials were distributing child sexual abuse material for months and months after taking over the site, in the hopes of snagging the largest new-content providers.

It's a bit of a slippery slope there, allowing low-level offenders to traffic this material in the hopes of catching a bigger fish. I don't know if they ever managed to catch one of those whales, and I don't think they prosecuted anyone smaller. Once it came out that the site had been run by the officials, people just slipped away to other sites.
 
I'm surprised that Apple blinked on this one. Too bad it is a delay and not an outright cancellation. They'll probably make some meaningless changes and then quietly roll this out in the middle of the iOS15 lifecycle after most people have already upgraded.

I'm not surprised. Privacy is a huge deal for Apple, given how they market themselves. When even Rene Ritchie is making lengthy YouTube videos discussing concerns with the plans, you know there must be a problem on some level. Let's hope Apple has some solid ideas for how to improve upon their initial plan and satisfy most critics.
 
If the CSAM detection feature leads to full E2EE in iCloud (and that's still a big IF), then it really has the potential to be a much bigger win for privacy in the end.

I get where you're coming from on this, but are you really gaining E2EE? I mean, I get it... technically speaking this would allow them to do that. But while you're gaining E2EE in iCloud, you're still losing out on one of the biggest benefits of E2EE - knowing that nobody can access your E2EE protected data.
 
What you upload to iCloud doesn't stay on your iPhone. (Duh)
Really? That's weird, because the last few thousand of my photos are both in iCloud and on my device. It's only the older photos that have been moved to the cloud, and even then there are still thumbnails of them on my phone.

Also, the ability to scan a user's phone without them having an iCloud account is part of this.
 
Ya know…I’ll take the risk to save even one child….
Nobody is against saving kids. I'm all on board with more funding and more resources to help catch and convict child predators, but what is not needed is the degradation of everyone's security and privacy. Once you start trading away your individual rights, you'll soon have none left. That is what this protest/fight is about. We've all seen this show before: the "Won't anyone think of the children!" outcry has always been code for invading your privacy, taking away your freedom, and censoring you. Pick your poison; history is littered with it, and littered with people like you who willingly go along with it. For the powers that be, this is not about saving any children; it is about controlling you, served up on a plate of propaganda where, if you oppose it, they can call you all sorts of nasty things.

So, OK, fine, you have no problem with it. But for the rest of us who see the danger, we are being forced into it with no option to refuse.
 
Oops, crisis management in action. Too many screeching voices.

iPhone season is coming. As I said, they will make record sales, and after that CSAM will be on again, with a "better implementation" that will not change the underlying problem of hosting hashed sensitive material provided by third parties, be they governments or private corporations.

For me, this event has triggered total distrust of anything "closed source." I will not buy Apple products again.
This corporation is too big and too powerful. Recent workforce scandals show that discrimination and abuse are part of Apple's production culture. The perception of Apple as a user-centric company is over for me.

And the decision to put this backdoor on macOS was the final straw.

If some form of production work arises that requires running macOS, the only logical solution is to air-gap those machines as hard as possible.
 
Even in the worst plausible "what if" scenario — that Apple allows foreign governments to add their own "CSAM" entries of things that aren't technically CSAM — the photos still have to match photos that already exist somewhere else. No photo that you take or create yourself will ever match the algorithm, except in the case of false collisions, and you'd have to have at least 30 of those false collisions before anybody would even know about them.

In other words, let's say that China wanted Apple to scan for photos of known dissidents. The current implementation would only allow the scans to be for photos that were already out in the wild. If you took a photo of someone with your own camera — even if that were a person who was in one of the forbidden photos — that would never get flagged by these CSAM algorithms.

So, while I agree that it has the potential for abuse, it's not as insidious as what many seem to think it is, and it's certainly less so than a government agency being able to look at every single photo in your library.
According to their documentation, the implementation is even more secure than this (i.e., if Apple's rhetoric is to be believed, this situation is itself not even plausible). Specifically, if China wants to catch dissidents, it would need both to 1) compel its local CSAM maintainers to upload dissident photos (easy), and 2) compel at least one foreign jurisdiction's CSAM maintainer to upload the same dissident photos (much harder).

While the details are murky, I'm guessing the intersection of two or more jurisdictions' CSAM material can't be just any two jurisdictions. If Apple were smart, then to increase trust in the database B, they would require at least two non-cooperative jurisdictions to agree. So China and Russia might not be able to team up to say "this is CSAM," but China and, say, the US could. Again, this is my guess, but only a brief moment pondering the "increase safety of B" goal (https://www.usenix.org/system/files/sec21summer_kulshrestha.pdf) led me to this idea; my bet is that Apple spent more than a brief moment, unless their idea from the get-go was malevolent.
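
If that guess is right, the gate is essentially a set intersection: a hash only becomes deployable if at least two independent jurisdictions both submitted it. A toy sketch in Swift, with made-up hash values (my own illustration of the idea in the paper linked above, not Apple's actual pipeline):

// Toy illustration of the "two independent jurisdictions" idea.
let jurisdictionA: Set<UInt64> = [0x01, 0x02, 0x03]   // e.g. a US clearinghouse
let jurisdictionB: Set<UInt64> = [0x02, 0x03, 0x04]   // e.g. another country's body

// A hash added unilaterally by only one government never reaches devices;
// only entries both databases agree on survive the intersection.
let deployableHashes = jurisdictionA.intersection(jurisdictionB)
print(deployableHashes)   // 0x02 and 0x03 only

Whether Apple actually requires the jurisdictions to be non-cooperative, as speculated above, is the part we can't verify from the outside.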
 
So, while I agree that it has the potential for abuse, it's not as insidious as what many seem to think it is, and it's certainly less so than a government agency being able to look at every single photo in your library.
I suspect you're missing one of the big concerns: It's not this specific implementation that's necessarily the problem, although there has been some evidence to suggest it is, or could become, problematical. It's also the precedent of doing any on-device scanning for the purpose of ferreting-out illegal activity that's a problem.

If doing on-device scanning for CSAM is ok, then why not on-device scanning for prohibited <name your thing>? Weapons? Political gatherings? If scanning images that people plan to upload to cloud storage is ok, then why not scan images regardless of whether they're to be uploaded anywhere? If scanning for image matches is ok, then why not scanning for "hate speech?" (Some countries do have "hate speech" laws and there are people, right here in "the land of the free," that would like to see them here, too.) Or planned protests? Or...?

Yes, this is a slippery slope argument. But that doesn't necessarily make it fallacious, as some here are wont to claim.

Bottom line: It is felt by many, and by all security researchers and privacy advocates, that this crosses a line that should not be crossed. I agree. Emphatically.

Besides: Viscerally, having some kind of scanner not under my direct control, on my devices, is... icky
 
I get where you're coming from on this, but are you really gaining E2EE? I mean, I get it... technically speaking this would allow them to do that. But while you're gaining E2EE in iCloud, you're still losing out on one of the biggest benefits of E2EE - knowing that nobody can access your E2EE protected data.
What E2EE? You're making an unfounded assumption that Apple would implement E2EE. And if there's a bypass, is it really E2EE anyway?
 
Too little, too late for their reputation, but good to see. We now know Apple is Microsoft is Google is Apple is Microsoft is Google is Apple.

We are now forewarned. Even if Apple fixes this, it is nevertheless good to see.
 
Hence the hypothetical. I'd be happy to see an e2ee implementation, but based on feedback from Apple's CSAM techniques, it seems that it would be either a) impossible or b) highly disliked in western democracies.

I don't think the nuance of how this technique balances privacy concerns has ever been discussed in the security literature, which is probably the biggest reason for the large pushback from much of (not all) the security community.

The back-and-forth rhetoric here about whether people understand the tech or not is overblown, whether you're an ex-software dev or not. I've contributed to it to a degree, as what we do know about the technique has been misrepresented countless times on these forums. I'm a current CS PhD student studying some of the concepts at play here, and the truth is that none of us really understands this well, because there is not yet solid peer-reviewed literature for us to draw on.

Had Apple presented the ideas openly and allowed public discourse (and peer-reviewed studies) to commence before setting a release date, we could have had a less emotionally driven conversation. Instead we have what we've seen over the last several weeks, which is really a shame, because I see real value in what they're proposing (the value being the hypothetical prospect of this becoming the enabling feature for E2EE).

This is because it's quite obvious that Apple works very closely with the intelligence community under the table. They get along pretty well, despite the grand PR display. Things like this are part of the "you give me this, I give you that" deal-making behind the scenes.
 
Also, the ability to scan a user's phone without them having an iCloud account is part of this.
Not really.

I mean, sure, technically speaking, the algorithm is there and could keep on running even if iCloud Photo Library were disabled, but since it relies on iCloud Photo Library to store the necessary "safety vouchers" alongside the photos, it's not as simple as just leaving it running. Where would this data even get stored for a user who doesn't have an iCloud account? How would Apple associate this information with an identity if there's no iCloud account at all? These are all solvable problems, of course, but there's more to it than just "leaving the algorithm running."
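
As a rough mental model of that coupling (my own sketch in Swift; SafetyVoucher, makeVoucher, and the rest are placeholder names, not Apple's API), voucher creation lives inside the upload path rather than being a free-standing scanner:

// Placeholder types, purely to illustrate the coupling described above.
struct Photo { let id: String }
struct SafetyVoucher { let photoID: String }

// Stands in for the real cryptographic voucher construction.
func makeVoucher(for photo: Photo) -> SafetyVoucher {
    SafetyVoucher(photoID: photo.id)
}

// Vouchers only exist as part of the iCloud Photos upload; with the
// feature off, there is nowhere for them to be stored or associated.
func uploadToICloud(_ photo: Photo, iCloudPhotosEnabled: Bool) -> SafetyVoucher? {
    guard iCloudPhotosEnabled else { return nil }
    let voucher = makeVoucher(for: photo)
    // ... photo and voucher are sent to iCloud together ...
    return voucher
}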

Now, if you assume Apple is lying, or that it could be forced to lie, then this new CSAM Detection feature changes nothing at all. You should have run away screaming and ditched all of your Apple devices when iOS 10 was first released five years ago.

Back then, Apple introduced a revolutionary new feature that allowed faces and objects to be indexed and catalogued directly on the iPhone. This happened whether you were using iCloud or not, and there has never been any way to turn it off. Apple made a big point of saying that this only happened on the iPhone and that this information never left the iPhone. However, we have only ever had Apple's word for that.

The bottom line is that if you're using an iPhone, you really have no choice but to trust what Apple says it's doing. These latest developments change nothing in that regard. If you think this is a matter of Apple "showing its true colours," that's fair enough, but if Apple is going to lie about how its CSAM Detection algorithms work now, there's no reason to believe it hasn't been lying to its customers about everything else.
 
I suspect you're missing one of the big concerns: It's not this specific implementation that's necessarily the problem, although there has been some evidence to suggest it is, or could become, problematical. It's also the precedent of doing any on-device scanning for the purpose of ferreting-out illegal activity that's a problem.

If doing on-device scanning for CSAM is ok, then why not on-device scanning for prohibited <name your thing>? Weapons? Political gatherings? If scanning images that people plan to upload to cloud storage is ok, then why not scan images regardless of whether they're to be uploaded anywhere? If scanning for image matches is ok, then why not scanning for "hate speech?" (Some countries do have "hate speech" laws and there are people, right here in "the land of the free," that would like to see them here, too.) Or planned protests? Or...?

Yes, this is a slippery slope argument. But that doesn't necessarily make it fallacious, as some here are wont to claim.

Bottom line: It is felt by many, and by all security researchers and privacy advocates, that this crosses a line that should not be crossed. I agree. Emphatically.

Besides: Viscerally, having some kind of scanner not under my direct control, on my devices, is... icky
Yes, the fundamental violation is on-device scanning. No "improvements" will make it not on-device scanning.

Additionally, there is the verification step once the scanner triggers enough "matches": your "matched" data is sent to Apple for review. Are enterprises really OK with this? The scanner thinks it's a match, so it's going to send potentially proprietary company information to... someone... at Apple? Obviously the theory is that it only sends data when there's high confidence it's CSAM and not something innocuous, but the existence of the review step indicates that this is not the same as certainty. And of course we all know there can't be certainty without human review. But this inherently means that, eventually, non-infringing material will be sent to Apple for review.
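
To spell out why that review step inevitably exposes some non-infringing material, here is a rough sketch of the flow as described, in Swift with invented names (this is my illustration, not Apple's implementation):

// Invented names; just illustrating "threshold first, then human review."
struct MatchedVoucher { let photoID: String }

let reviewThreshold = 30

enum Outcome {
    case nothingHappens
    case sentForHumanReview([MatchedVoucher])   // the step the post objects to
}

func process(_ matches: [MatchedVoucher]) -> Outcome {
    // Below the threshold the vouchers stay sealed (per Apple's description);
    // above it, the matched content becomes visible to a human reviewer,
    // including any false positives that happened to cross the line with it.
    guard matches.count >= reviewThreshold else { return .nothingHappens }
    return .sentForHumanReview(matches)
}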
 
What E2EE? You're making an unfounded assumption that Apple would implement E2EE. And if there's a bypass, is it really E2EE anyway?
Right. That's literally exactly what I was saying to the previous guy I was replying to. And he acknowledged that it was a speculative assumption. ;)
 
So what should they do?
Apple should scan for illicit material when it is uploaded to their servers, just like Google, Facebook, and Microsoft currently do. The only reason I can fathom for them not doing this is that they are, or were, planning on doing end-to-end encryption for iCloud. But even with end-to-end encryption, they would still need to hold the keys in order to verify that the photos were CSAM.
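
For comparison, the conventional server-side approach described here checks the photo after it arrives on the provider's servers. A minimal sketch in Swift, where perceptualHash is an invented stand-in for a real fingerprinting function such as PhotoDNA (this is an illustration, not any provider's actual API):

// Sketch of conventional server-side scanning at upload time.
let knownIllicitHashes: Set<UInt64> = [0xAAAA, 0xBBBB]   // made-up values

// Placeholder: a real system would use a robust perceptual hash here.
func perceptualHash(_ imageBytes: [UInt8]) -> UInt64 {
    imageBytes.reduce(UInt64(0)) { ($0 &* 31) &+ UInt64($1) }
}

// Runs on the provider's servers, after upload, so the provider must be
// able to read the photo, which is exactly why full end-to-end encryption
// makes this approach impossible unless the provider holds the keys.
func screenUploadedPhoto(_ imageBytes: [UInt8]) -> Bool {
    knownIllicitHashes.contains(perceptualHash(imageBytes))
}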
 