I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!

You are conflating issues.
What this would catch are the idiots who look at those photos. It wouldn't catch the more intelligent viewers or the creators of this stuff.
 
Perhaps now the people needlessly peeing their shoes, who have no idea what they are yelling about other than "IT'S A SLIPPERY SLOPE......", will calm down, or at least find something else in their lives to wet themselves over.
Sounds like you're the one wetting yourself.

"No idea what they are yelling about" - Are you referring to "security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees?"

You must know more than them, right?
 
People really don't understand this system...

Putting aside the automated scanning of images sent to children via iMessage:

The on-device scanning for CSAM in iCloud Photo Library was a safer alternative to what every other image host does: directly scanning uploaded photos on their servers to detect known child pornography.

The difference with Apple's system is that instead of directly accessing users' unencrypted data on their servers, scanning was being done privately on-device.

I understand the slippery-slope argument that oppressive regimes like China and Russia could force Apple to add other image hashes, but the relatively open manner in which they designed this system would have helped prevent that, or at least made it obvious.

Apple isn't wrong to want to keep CSAM off of their servers and services—they have a clear moral obligation to do this. In response, they developed a system that protected user privacy while ensuring that users hosting child pornography on iCloud could be rooted out and reported to the authorities. They even set a high threshold on the number of red flags that would have to arise before an investigation could be opened.
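
To make the threshold idea concrete, here's a minimal sketch in Swift. This is not Apple's actual implementation (as Apple described it, the real system used NeuralHash plus private set intersection and threshold secret sharing, so individual match results stay hidden until the threshold is crossed); the types, names, and hash values below are invented purely to illustrate "match against a list of known hashes, act only past a threshold."

```swift
import Foundation

// Invented types for illustration; not Apple's API.
struct PerceptualHash: Hashable {
    let value: UInt64
}

struct ThresholdMatcher {
    /// Hashes of known illegal images, as supplied by a child-safety organization.
    let knownHashes: Set<PerceptualHash>
    /// Number of matching photos required before anything is escalated to human review.
    let reportThreshold: Int

    /// Counts how many of the library's hashes appear in the known-hash set.
    func matchCount(in libraryHashes: [PerceptualHash]) -> Int {
        libraryHashes.filter { knownHashes.contains($0) }.count
    }

    /// True only when the number of matches meets or exceeds the threshold.
    func shouldFlagForReview(_ libraryHashes: [PerceptualHash]) -> Bool {
        matchCount(in: libraryHashes) >= reportThreshold
    }
}

// Usage with made-up hash values (Apple publicly described a threshold of roughly 30 matches).
let matcher = ThresholdMatcher(
    knownHashes: [PerceptualHash(value: 0xDEAD), PerceptualHash(value: 0xBEEF)],
    reportThreshold: 30
)
let library = (0 ..< 1_000).map { PerceptualHash(value: UInt64($0)) }
print(matcher.shouldFlagForReview(library)) // false: no matches, so nothing is ever reported
```

The point of the threshold is that a single false-positive hash collision can never trigger a report on its own; a whole collection of known images has to be present before a human reviewer ever gets involved.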

Their main mistake was forgetting that people don't read past the alarmist headlines.
 
I wonder how many additional children will be victimized from now until then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered, selfish advocacy groups. It’s unbelievable!
Because they weren’t being victimized before Apple existed? Get real 🙄🙄🙄🙄
 
As a father, I share your contempt for anyone who would abuse or exploit children, but comments like this assume most of those scumbags are naive enough to store and share that material in ways that would make it readily detectable by something like Apple's CSAM scanning. I believe most of them are not breathing a sigh of relief, simply because this CSAM scheme was never a serious threat to what they do.
You'd probably be surprised how careless these scumbags are when it comes to the collections of CSAM they hoard. Most of them don't really believe they're doing anything morally wrong, especially the ones who are "just collecting pictures off the internet."

Years ago, I had the misfortune of having an employee in a youth organization I was in charge of who was charged with, and convicted of, both possession and "making available" of CSAM. His laptop was full of the stuff, and while cloud services weren't a big thing back then (it was 2004), I fully believe he wouldn't have thought twice about using them to store his collection. Had he owned an iPhone, I'm almost sure everything would have been in his Photos app, and therefore in iCloud Photo Library by extension.

In fact, he was caught because he was showing off his new PDA (a Palm device) to a co-worker at his day job as an oncology nurse at a large and well-known children's hospital. The other nurse happened to stumble across a CSAM photo after only a few seconds with the device, and subsequently reported it to their superiors, who of course involved law enforcement.

He was only a "consumer" of CSAM (the "making available" charge had to do with sharing files over a peer-to-peer service, Kazaa I think it was), but in the process of the whole case I had to deal with the detectives as well, since I was his direct supervisor in this particular youth organization. They were initially pretty convinced that they were going to find that he had done more than just "look at" photos, but nothing else ever came of it. However, they did tell me that the forensic details they gleaned from his collection helped lead to the arrest of at least one guy who was creating this crap in his basement. As far as the cops are concerned, they want to nail everybody, since the little fish often lead to the bigger ones.

As I’ve said in related threads, I’m fine with Apple scanning whatever we upload to their cloud. Just don’t perform ANY portion of the scanning/verification process on my device.
From the beginning, I've believed that this is the first step toward Apple improving end-to-end encryption in the cloud, and if the entire iCloud Photo Library were to be end-to-end encrypted, then it would be impossible for Apple to scan for CSAM anywhere else but on your device. That's a trade-off I'd be more than happy to make.

IMHO, Apple is morally and ethically obligated to keep this crap off its servers, but it also goes far beyond that. I have no doubt that U.S. lawmakers would go for the nuclear option and legislate end-to-end encryption out of existence if Apple ever tried to implement it without making allowances for dealing with CSAM. It's already the standard bogeyman pulled out by U.S. senators and district attorneys attempting to pass laws against secure on-device encryption, but the fact that data in iCloud is generally accessible by warrant has made them unwilling to expend too much political capital on this. Close that door and it's all but guaranteed that we'd see new laws requiring Apple to give the FBI a back door into everyone's iPhones, both on-device and in the cloud.

Apple is walking a fine line and choosing its battles very carefully here. Its entire set of privacy initiatives could fall like a house of cards if it pushes too far.
 
I hope they at least keep the parental control part where parents can have it prescreen messages sent to their young kids. That might even be useful for adults who don't want unsolicited imagery. It's just the CSAM part that's controversial.

I'm with you on this, but it should be opt-in. I've gotten pictures of a random dude's genitalia before (via text) because someone transposed some numbers. I think this feature should be offered to everyone, kind of like Twitter's adult content filter that forces you to click if you want to see an image.
 
Good to hear. I hope this means they come to their senses and abandon the plans for on-device checking altogether (I'm perfectly fine if they don't want to admit that this was a bad idea in the first place, as long as they abandon the actual plan and/or resort to scanning in the cloud like everybody else).

The only downside I see is that now the new iPad Mini looks a lot more tempting again :)
 
While I agree with what you’ve said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It’s a huge problem on all social media and cloud networks, and Facebook alone has millions of cases to investigate per year.
Exactly this, and as I mentioned in a previous post, catching these morons often helps law enforcement actually nail the scumbags who are creating this stuff.

The bottom line is that the vast majority of the disturbed people who collect CSAM honestly don't think they're doing anything wrong. After all, "they're just pictures," so they don't make much of an effort to hide them.
 
The mother of all Trojans is still part of the new SpyOS!

It will already be implemented, and there is no way back once you've installed it.

They just want to prevent a huge drop in sales over the holidays. In February they will press the button and start the global mass surveillance of their customers. The first step is pictures, then come voice, text, video, and GPS (if a crime happens close to you and you are not the victim, you are a suspect).
 
I hope they at least keep the parental control part where parents can have it prescreen messages sent to their young kids. That might even be useful for adults who don't want unsolicited imagery.
True, except that it won't be available for adults, at least not in its current form. It can only be applied to users who are under 18 years of age and part of a Family Sharing group where the parents have opted into the feature.
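
For what it's worth, the gating described above boils down to a simple conjunction of conditions. A rough sketch in Swift, with invented property names that are not Apple's API:

```swift
// Invented for illustration; not Apple's API.
struct CommunicationSafetyEligibility {
    let ageInYears: Int
    let isInFamilySharingGroup: Bool
    let parentHasOptedIn: Bool

    /// The Messages image screening applies only when all three conditions hold.
    var isEligible: Bool {
        ageInYears < 18 && isInFamilySharingGroup && parentHasOptedIn
    }
}
```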
 