Do you have proof that turning off any feature actually turns it off? What I mean is... you've always felt like a feature is disabled when you disable it, right? How is this any different?

Disabling features does not remove the code from your device. Never has.

Spying code is not a "feature" in my book.

These claims that Apple has a moral and legal duty to prevent CSAM from being stored on its servers are a sham!

Every business would claim the same if asked; no business or person wants CSAM on their property, but:
  • My bank doesn't search items in my safe deposit box for CSAM.
  • Parking garages do not search my car for CSAM before I park in them.
  • Hotels do not search my suitcase for CSAM when I check in.
  • Towns/Cities/Apartments do not search my moving truck for CSAM when I move in.
  • U-Store-It businesses do not search boxes for CSAM before they are stored in their facilities.
How is my phone any different? This is a gateway to further invasions of privacy.
 
Spying code is not a "feature" in my book.

These claims that Apple has a moral and legal duty to prevent CSAM from being stored on its servers are a sham!

Every business would claim the same if asked; no business or person wants CSAM on their property, but:
  • My bank doesn't search items in my safe deposit box for CSAM.
  • Parking garages do not search my car for CSAM before I park in them.
  • Hotels do not search my suitcase for CSAM when I check in.
How is my phone any different?
Because Apple has a God complex.

Seriously, they think they’re above democracy, individual rights, and whatever else you could throw at them.
 
ISTR it had, in some literature. (No, I do not have cites handy, and I'm not going to try to find them.)
I'd honestly be interested in this.

The problem is that security researchers and privacy advocates feel this is such a fundamentally bad idea that its negatives outweigh any conceivable positives. I agree.
There are (at least) two major sentiments circulating from qualified security folk: 1) this was done by Apple in isolation, which makes it incredibly suspect (i.e., a conservative but constrained view that maybe it does what it says in a privacy-increasing way), and 2) what the hell does Apple think they're doing installing a backdoor on billions of devices?! (i.e., a less constrained, knee-jerk reaction without proper studies or peer-reviewed literature to back up its assertions).

The primary issue here is that with (1) people are theorizing, whereas with (2) people are asserting unknowns as facts, which, if done in a peer-reviewed setting, would be a great way to ensure you'd only ever publish on arXiv. While the overall thrust of both is that this was not a good way to do this, the nuance between how (1) and (2) approach it is: it was not good for Apple to develop this in isolation from the onlooking world vs. it's not good to do this, period. Both sentiments are negative, but they differ in what exactly about this is negative.

Even if Apple had been promising E2EE (which they have not), I would still see on-device CSAM-scanning as a net negative. And not even close.
I'm perfectly aware that Apple has not promised or even hinted at E2EE (hence, again, the hypotheticals); however, I disagree that this can yet be absolutely qualified as a net negative. I'm reserving judgement until I see quality peer-reviewed literature on the subject.
 
Spying code is not a "feature" in my book.
I agree 100% with your sentiment, but @Jayson A's seeming sentiment is that there are other features on your phone that could be spying on you (in a much more effective way) than the CSAM detection as presented.

E.g., you turn off location tracking, but rather than turning off, it continues to track you and reports that you attended <place of interest>. Or you turn off object/face recognition in your local photos library, but it continues to do so and reports that you have photos of <object/person of interest> on your device. The sentiment is that we don't really know that these features are in fact disabled, whether Apple's CSAM detection plans were revealed or not. It could be that Apple, China, et al. have been spying on us for years without users knowing.
 
I find this very unlikely. Not one of these *******s is going to be so stupid as to sync with a public cloud service, where the evidence is easily acquired via court order. (Court orders are fine, btw.)

The type of animal we hunt is not dumb or ignorant. They are smart and cunning, and that makes them dangerous. As such, they do not use something like iCloud Photo Library.
Not entirely true in my experience, and I've worked with law enforcement on these issues.

The monsters who are creating CSAM are generally smart and cunning, but those who are merely consuming it are usually not at all. Most of them don't even think they're doing anything wrong. After all, they're just downloading and collecting photos, and some of the photos may even seem relatively harmless to these people (not all CSAM is of disgusting sex acts).

Most consumers of CSAM are caught as a result of their own stupidity, which includes uploading and sharing photos on cloud services. Catching these idiots doesn't just help cut off the demand, but it often gives law enforcement critical leads on tracking down and arresting the actual distributors and creators.

I'm not saying that Apple isn't being a bit too clever for its own good here, although I do believe we'll never see full end-to-end encryption in iCloud without at least some attempts to find another way to address these kinds of scenarios.

Sadly, digital search warrants don't work like physical ones. If a cop shows up at your door with a search warrant to look for a dead body, they're only allowed to look in places that a dead body is likely to be — they can't open your briefcase or search through your laptop. However, if a law enforcement agency serves Apple with a warrant for your digital content, they'll get every single photo in your iCloud account, whether it's relevant to the investigation at hand or not.

If the CSAM detection feature leads to full E2EE in iCloud (and that's still a big IF), then it really has the potential to be a much bigger win for privacy in the end.
 
The thing is, the bulk of child porn is consumed in the US and produced in Japan and the Philippines. China culturally does not sexualize children. Neither did Japan, actually, until anime and rorikon (the lolita obsession) kicked in. This is a relatively recent social phenomenon. The US soldiers stationed in the Philippines are the biggest consumers of not child porn but child prostitution.

I researched this extensively when I vacationed in the Philippines and Thailand. I was appalled by the treatment of women and the condition of children there. Thailand was much better in comparison, but it has the issue of the "transgender freak show". Boys from poor families are sold off to be neutered and end up in circuses as ladyboys. Those ladyboys don't just show you their bits and entertain you; they do things that are truly damaging to their bodies, and as a result they live very short lives. For example, there are the things they inject to keep themselves looking a certain way, and when they do their freak show, think about stainless steel ping-pong balls coming out of places that are not supposed to host 8 ping-pong balls. I will leave you to imagine the rest, and the type of freak show they do.

Edit:

To clarify, lolita generally refers to girls around 14 years old. Rorikon refers to much younger, preteen girls.

I have to disagree. The latest examination by the EU stated that the bulk of child porn exists in the EU, followed by the US.
This is by overall volume.
 
I'd honestly be interested in this.
I wish I could oblige, but satisfying your interest would have required that I kept notes on which articles presented which arguments, etc. It never occurred to me to do that, and finding what I believe I recall reading would require hunting it all down and reading it all over again. I'm simply not going to do that.

And yet what happens on device does not result in knowing whether a match has been found.
I'm not certain what point you're trying to make.
 
One more time for the people (in the front, I guess?): spying implies you have no idea it's happening. This was not, and never will be, spyware.
For those in the middle(?): this is an invasion of privacy regardless of whether it is known or not, and you do not have a choice to opt in or out; you are being forced into it despite what you want for your privacy. Clear enough?

As far as part two goes, once you embolden a company or government by allowing something like this, there are and will be further steps taken later that will be spyware, and most likely you will not know about it. History has proven this to be true again and again, and I do not trust that Apple will not do the same.

Put a stop to it now or you're saying you are willing to give up your privacy and freedom for "safety".
 
Not on-device, they're not.
I did not say so, only that they scan whatever hits their systems. For external parties, that is enough leverage to push Apple to do *something*: “Other large tech companies are fighting child porn, Tim. Why does Apple not fight this terrible thing?” Pretty soon, investors, large customers, etc. start feeling the pressure. It’s legal blackmail, without any legal mandate. (Such things should be enshrined in law and overseen by the judiciary.)

The battle against crime, and particularly child pornography, has been abused to chip away at hard-won privacy, while being extremely unlikely to affect the perpetrators. (Because, sick as they are, they are unfortunately not stupid.)
 
Not entirely true in my experience, and I've worked with law enforcement on these issues.
From what I've read this seems to be true.

If I were engaging in an activity I knew to be illegal, the very last thing I'd do is keep evidence of it even on my computer, which is entirely under my control (TTBOMK), much less on a mobile device, the innards of which are a complete mystery to me. But people engaging in criminal acts often behave very stupidly. Some might argue that behaving in a criminal manner suggests a propensity to be stupid in the first place. Law enforcement types have often commented along the lines of "if criminals were smart, it would make my job a lot harder."

If the CSAM detection feature leads to full E2EE in iCloud (and that's still a big IF), then it really has the potential to be a much bigger win for privacy in the end.
Matter of opinion. I disagree.
 
They should cancel it and apologise for insisting “customers simply misunderstand the noblest intention of Apple team”.

Once you scan for CSAM, it becomes obligatory to also stop drug use, terrorism, family abuse and piracy. And then other political and religious offences overseas if Apple wants to do business there, and to “abide by the local laws”.
 
For those in the middle(?): this is an invasion of privacy regardless of whether it is known or not, and you do not have a choice to opt in or out; you are being forced into it despite what you want for your privacy. Clear enough?
If that's what is happening here, then absolutely clear. The issue is in the murkiness of all the "what if's" and what's actually presented. We won't (can't) know until there are quality peer-reviewed studies on the technique.
 
Because Apple has a God complex.

Seriously, they think they’re above democracy, individual rights, and whatever else you could throw at them.
No. Apple does not. They are being forced to do this, because all tech companies are doing CSAM scans. (Just not on device, before it hits the cloud, because they process everything in the cloud anyway, contrary to Apple, which processes everything on device for privacy reasons.)

I can only see this as being external pressure, either direct or through shareholders, politicians (lots of senators are angry at Apple for not just decrypting iPhones when the FBI asks), etc. Plenty of organizations and individuals do not like Apple’s stance on privacy.
 
They should cancel it and apologise for insisting “customers simply misunderstand the noblest intention of Apple team”.

Once you scan for CSAM, it becomes obligatory to also stop drug use, terrorism, family abuse and piracy. And then other political and religious offences overseas if Apple wants to do business there, and to “abide by the local laws”.
It seems the nuanced argument Apple is presenting is that CSAM is universally accepted (in the places they do business, at least) as illegal and immoral, whereas X = {the set of things some people find questionable} is not universally accepted as illegal and immoral. The argument that this becomes an obligatory slippery slope is a non sequitur.
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
 

I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!

None. As people have said over and over again, it takes about a decade for any images of abuse to enter the CSAM database. So the system does nothing to stop active abusers.
 
I'm puzzled by all the celebration. It's a delay, not a cancellation. "Delay to make improvements" in corporate-speak means, "wordsmithing the next press release to make the news more palatable to customers". It's what happens when the first attempt at addressing "confusion" was unsuccessful. ;)

I suspect many are amazed Apple listened and did anything (other than moving forward) at all.
Color me surprised.
 
TL;DR: "We will wait for you to forget, then quietly turn it on at the request of the CCP/FBI/EU/whatever."

Anyone who thinks this is actually about CSAM is a fool. This is a backdoor for government surveillance of the media you have on your device.
 
From what I've read this seems to be true.

If I were engaging in an activity I knew to be illegal, the very last thing I'd do is keep evidence of it even on my computer, which is entirely under my control (TTBOMK), much less on a mobile device, the innards of which are a complete mystery to me. But people engaging in criminal acts often behave very stupidly. Some might argue that behaving in a criminal manner suggests a propensity to be stupid in the first place. Law enforcement types have often commented along the lines of "if criminals were smart, it would make my job a lot harder."
Yup, but I think with CSAM it's even more true. Most of these people don't consider themselves "criminals," and they also often start on a slippery slope where they feel that what they're looking at isn't really illegal. For example, one culprit defended their CSAM collection as being "artistic."

Then there's the whole sordid underbelly of the CSAM world that thinks the government has no business declaring this stuff illegal in the first place.

Matter of opinion. I disagree.
Fair enough. My own take is that having my entire iCloud Photo Library encrypted in such a way that nobody can ever see what's in it is worth running an algorithm on my device that merely checks to see if anything I'm uploading matches any existing photos.

Even in the worst plausible "what if" scenario — that Apple allows foreign governments to add their own "CSAM" entries of things that aren't technically CSAM — the photos still have to match photos that already exist somewhere else. No photo that you take or create yourself will ever match an entry in the database, except in the case of false collisions, and you'd have to have at least 30 of those false collisions before anybody would even know about them.

In other words, let's say that China wanted Apple to scan for photos of known dissidents. The current implementation would only allow the scans to be for photos that were already out in the wild. If you took a photo of someone with your own camera — even if that were a person who was in one of the forbidden photos — that would never get flagged by these CSAM algorithms.
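
To make that concrete, here's a minimal, purely illustrative sketch of the threshold-matching idea. The names, hash values, and plain set lookup below are all hypothetical; the actual system uses perceptual hashing (NeuralHash) and private set intersection rather than anything this simple.

```python
# Hypothetical illustration only: the real pipeline uses perceptual hashing
# and private set intersection, not plain string comparison.

# Stand-in for the database of hashes of already-known images (made-up values).
KNOWN_IMAGE_HASHES = {"9f2c41a7", "b3e8d012"}

# Nothing is surfaced until at least this many matches accumulate;
# the stated threshold was on the order of 30.
MATCH_THRESHOLD = 30

def is_known_image(image_hash: str) -> bool:
    """An original photo hashes to a value outside the database, so it never matches."""
    return image_hash in KNOWN_IMAGE_HASHES

def should_flag_account(uploaded_hashes: list[str]) -> bool:
    """Flag for human review only once matches against known images cross the threshold."""
    matches = sum(1 for h in uploaded_hashes if is_known_image(h))
    return matches >= MATCH_THRESHOLD

# Example: thousands of original photos plus a handful of chance collisions
# still stay below the threshold, so nothing is reported.
print(should_flag_account(["deadbeef"] * 5000 + ["9f2c41a7"] * 5))  # False
```

The point the sketch tries to capture is that a photo you take yourself never appears in the known-hash set, so it can only be flagged through rare false collisions, and even then only after the threshold is crossed.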

So, while I agree that it has the potential for abuse, it's not as insidious as what many seem to think it is, and it's certainly less so than a government agency being able to look at every single photo in your library.
 