I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
I wonder how many additional children will be victimized between now and then? Apple, the greatest company in history with the greatest humanitarian intentions, forced to deal with grandstanding, ignorant politicians and self-centered advocacy groups. It's unbelievable!
Ya know…I’ll take the risk to save even one child….
The program, as it has been outlined so far, would have increased child abuse, not decreased it. We have seen over and over that the best solution is to punish the dealers and treat the sick.
 
I agree it should be cancelled completely, but this is good to see, at least. It's better than Apple ignoring the criticism and forcing the feature on us.
 

If it were only politicians and advocacy groups delaying this, I would be pissed. But since it's also cybersecurity experts, I'm glad Apple is listening and will improve this whole thing.
 
So, is anyone else disturbed just a tad that there is a whole group at Apple studying how to identify child porn and how to program a computer to recognize it? That means they have to have examples of it... that means they have to study it, develop requirements for this SW, and develop algorithms to decide that a given picture is child porn versus a kid taking a bath or playing in a swimming pool...

Then someone has to review these results to make sure they are correct and meet the requirements of the SW product.

What kind of staff are working on this task?
By having and reviewing examples of the kiddie porn, they are breaking the very same laws.
Who is vetting these Apple employees?
This is making me queasy to think about.
 
It’s been a hot mess. Even good ol’ Craig (executive) admitted it.
To be fair, Craig Federighi said that Apple's announcement was a hot mess. Which it totally was. He never said the feature itself was a mess.

Apple's PR when it comes to things like this is a cross between being slightly tone-deaf and assuming that people trust Apple more than they actually do. It's pretty much the same thing Apple did with "batterygate" a few years ago: it buried a change in the release notes that it thought wouldn't be a big deal, since it was doing it with the noblest of intentions, not realizing that many people would take it completely the wrong way.

You'd think they'd have learned their lesson from that. The CSAM Detection is way more far-reaching and potentially insidious than slowing down iPhones with older batteries. Yet, they still totally screwed up the messaging on it.

I think Apple's executives live inside a utopian bubble. They honestly don't think they're doing anything wrong with the CSAM Detection. As somebody who has studied the encryption and technology being used, I'm inclined to agree, as it's really a much bigger win for privacy — assuming we can take Apple at its word about how the system is designed (and if we can't, then we shouldn't be using an iPhone anyway, as they could have been doing all sorts of things behind our backs for years).

However, it's extremely naive for Apple to expect people to just take their word for things like this, especially when the underlying technology is hard to understand — most people understandably just hear it as "Apple is spying on everything on my iPhone."
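For what it's worth, the core idea is simpler than the crypto around it. Here's a very rough sketch (my own illustration, not Apple's code; names like CSAMMatchSketch and the threshold value are made up for the example): the device isn't "looking at" or classifying your photos, it's comparing a perceptual hash of each photo against a database of hashes of already-known illegal images, and nothing becomes reviewable until a threshold number of matches is hit. The real design also blinds the hashes and uses private set intersection and safety vouchers, which this sketch leaves out entirely.

[CODE]
import Foundation

// Simplified sketch of the publicly described idea, NOT Apple's implementation.
// Assumptions: each Data value stands in for a NeuralHash-style perceptual hash,
// knownHashes for the database of hashes of known illegal images, and threshold
// for the match count required before anything could be reviewed. The real
// system blinds these hashes so neither side learns about non-matching photos.
struct CSAMMatchSketch {
    let knownHashes: Set<Data>
    let threshold: Int

    // The device never interprets the photo's content; it only checks whether
    // the photo's perceptual hash appears in the known-hash set.
    func matches(_ photoHash: Data) -> Bool {
        knownHashes.contains(photoHash)
    }

    // Only when the number of matching photos reaches the threshold would any
    // human review step even become possible.
    func exceedsThreshold(photoHashes: [Data]) -> Bool {
        photoHashes.filter { matches($0) }.count >= threshold
    }
}
[/CODE]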

They saw that with the "batterygate" issue, and I'm sort of amazed that they're surprised that the same thing is happening with CSAM Detection on an even larger scale.
 
As a father, I share your contempt for anyone who would abuse or exploit children, but comments like this assume most of those scumbags are naive enough to store and share that material in ways that would make it readily detectable by something like CSAM Detection. I believe most are not breathing a sigh of relief simply because this CSAM scheme was never a serious threat to what they do.

As I’ve said in related threads, I’m fine with Apple scanning whatever we upload to their cloud. Just don’t perform ANY portion of the scanning/verification process on my device.
While I agree with what you've said about server-side scanning, what you said about people sharing CSAM is not accurate. The initial makers and distributors of CSAM may be savvy enough to avoid being caught, but there are plenty of morons among the consumers who spread it around using Facebook, WhatsApp, Google, etc. It's a huge problem on all social media and cloud networks, and FB alone has millions of cases to investigate per year.

So hopefully something will be done, just not on device?
 
I think every 5 to 10 years Apple gets a black eye. In 2001 it was the OS X 10.0 bug that made iTunes delete files.

In 2010 it was the iPhone 4 antenna gate, requiring bumpers.

2016, keyboard gate

2021, CSAM

2025, Apple VR headset sees through people's clothes, causing a huge invasion of privacy

2030, Apple car crashes into living rooms because of a Hey Siri bug that misinterpreted "take me home"
 
I hope they at least keep the parental control part where parents can have it prescreen messages sent to their young kids. That might even be useful for adults who don't want unsolicited imagery. It's just the CSAM part that's controversial.
 