It will protect children by catching perpetrators who make illegal child images and it will protect children by catching perpetrators who distribute illegal child images.

If a criminal intends to use their iPhone to browse illegal images of children, something they can currently do with ease as there are no checks in place, having CSAM detection on the iPhone means they will get caught out.

No, it won’t. These are older images. This will nab idiotic viewers at best.
Now LEO may be able to use someone who is caught as the start of a breadcrumb trail in an attempt to catch a creator.
However, this feature does nothing to prevent new material, nor does it do anything to stop or nab the creators.
 
As a father of two small children, a lot of you make me sick. Child abuse and the distribution of such material is a HUGE WIDESPREAD problem. After many years, Apple *finally* wants to implement a system that detects child abuse material on someone's phone to counter this problem. And you bunch of babies cry about your precious "privacy".

If you do not store child abuse images on your phone, how will this even affect you IN ANY WAY? And don't give me all this slippery slope BS about what this possibly *could* lead to. We are talking about a very specific piece of technology designed for one very specific purpose. When they are proposing 24/7 body cams for all adults or scanning phones for political content, we'll talk about that. BUT THEY ARE NOT. They are proposing detecting child abuse images on people's phones. They deserve applause.

You need to rethink that.
Take the term “CSAM” out of the equation. Would you want this feature on your device?
Insert any other “illegality” in its place. Folks are not looking beyond the “CSAM” term.

Now before you get all bent, I’ve raised four of my own and am a foster resource for abused teens.
Yes, I have some small idea.
 
CSAM detection is really bad because it demonstrates that Apple doesn't care about privacy. Money talks.
Here’s what really terrifies me: there is no profit in digging in their heels and sticking with this program for so long. There’s no money in catching people with kiddie porn — their alleged goal.

So why were they so insistent for so long? Why even now is it just “on hold” instead of dead completely?

Clearly there’s more to this program and its motives than we realize.
 
Said this before, will say it again for those who missed the first go-around.

This is not a fallacy when there is history, a lot of history mind you, of "Will anyone think of the children!" being used to take away your privacy and freedom, and to censor. It works because, for those trying to manipulate the public, it is easy to come back with "What, you are trying to protect pedophiles?" *faux shocked face* whenever anyone fights back against what is really happening.

It is only a fallacy when there is no evidence of anything bad resulting from the action. I am especially suspicious because Apple has spent decades trying to convince us that they are super-duper concerned about our privacy and are now suddenly willing to throw all that hard work away. The powers that be at Apple aren't stupid... so what is really going on? Whatever it is, it is shady, and you should be concerned.

What evidence (of Apple going down this road)?
 
It's no fallacy. It's how freedoms and privacy are eroded, and how authoritarian, fascist governments come to power.



When Apple receives this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know it's doing something wrong.

There. Fixed it for you!

Apple isn't a government. If a government wanted to abuse the phone in your pocket then they would just do it. The existence of this system would have zero bearing on it.
 
Here’s what really terrifies me: there is no profit in digging in their heels and sticking with this program for so long. There’s no money in catching people with kiddie porn — their alleged goal.

So why were they so insistent for so long? Why even now is it just “on hold” instead of dead completely?

Clearly there’s more to this program and its motives than we realize.
I strongly suspect the government is involved in this. Still, the fact that they have at least delayed it bodes well. People need to hold off on the iPhone 13 because of this so Apple gets the message.
 
Ah, the slippery slope fallacy.

The reality is if they planned to "censor protected speech" or "threaten the privacy and security of people around the world" then they could do that anyway. With or without this technology.
You are gonna die someday anyway, so I will just give you a glass of poisoned wine now to help you along with the process.

Ah, the slippery slope of self-preservation.
 
If the purpose is to detect and combat child abuse, then HELL YES, ABSOLUTELY!!!

Sounds like you may not realize how big and widespread of a problem this actually is.

What else then? Is it only child abuse, or would you like them to start looking for other kinds of material as well? What about our financial records? Should they dig deep into those also? Prevent you from doing stupid things and all… What about traffic violations? I mean, over 38,000 people die on US roads every year. That’s way worse than all child neglect and abuse deaths, which are around 1,800 combined. While we’re at it, we might want to start monitoring all the violent crimes. That’s millions of bad actors caught! Now that’s some big numbers right there. But then again, instead of catching those who have committed crimes, we should try to prevent them. We should let an AI monitor everything we do, and once a threshold of future-unwanted-conduct probability is reached, the right authorities are informed. I mean seriously, guys. Forget CSAM! Let’s monitor everything. If you’re not a criminal, then you have nothing to hide! Right?
 
If Apple had gotten the EFF's concerns addressed and blessing before the initial announcement, I personally would have had most if not all of my skepticism eliminated.

If. Heh. If Ifs and Buts were fruits and nuts we'd have Christmas every day.

One thing I'm sure of though - there has to be a way to thread this needle.
 
At the moment, iCloud is only partly end-to-end encrypted; the rest is encrypted with Apple’s own keys. Therefore, when law enforcement requests it (with a warrant in the US), Apple decrypts what it can and promptly provides it. The new CSAM solution would make it possible to end-to-end encrypt everything yet still satisfy any requirements for CSAM scanning. This would be a massive win for privacy, as even with a warrant Apple wouldn’t be able to provide any data to law enforcement. Well, except for CSAM offenders.

This is great, in theory. The only problem is Apple has given zero indication this is their endgame. I also don't know how helpful it will be to have true E2EE if there's a way "around it", so to speak.
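For anyone wondering how on-device matching and full end-to-end encryption could coexist in principle, here is a minimal Swift sketch of the flow described above. The names (Voucher, Upload, prepareUpload) are invented for illustration, and the design is an assumption based on the post, not Apple’s published protocol, which relies on NeuralHash, blinded hash sets, and private set intersection rather than the plain SHA-256 lookup used here.

```swift
import Foundation
import CryptoKit

// Hypothetical, highly simplified model: the device checks a photo against
// a local hash list *before* upload, then uploads only end-to-end-encrypted
// bytes plus a small "voucher". This is an illustration, not Apple's actual
// protocol.

struct Voucher {
    let matched: Bool          // in the real design this is hidden from the server
    let encryptedMetadata: Data
}

struct Upload {
    let ciphertext: Data       // E2EE photo; the server never holds `key`
    let voucher: Voucher
}

func prepareUpload(photo: Data,
                   knownHashes: Set<String>,
                   key: SymmetricKey) throws -> Upload {
    // Stand-in for a perceptual hash; real systems use NeuralHash/PhotoDNA,
    // which tolerate re-encoding and resizing, while SHA-256 does not.
    let photoHash = SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
    let matched = knownHashes.contains(photoHash)

    // The photo itself is encrypted end to end; only the client knows `key`.
    let ciphertext = try AES.GCM.seal(photo, using: key).combined!
    let metadata = try AES.GCM.seal(Data(photoHash.utf8), using: key).combined!

    return Upload(ciphertext: ciphertext,
                  voucher: Voucher(matched: matched, encryptedMetadata: metadata))
}
```

The point the post is making is that, in a flow like this, the server only ever sees ciphertext plus match evidence, so even with a warrant there would be no plaintext photos for Apple to hand over.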
 
It will protect children by catching perpetrators who make illegal child images and it will protect children by catching perpetrators who distribute illegal child images.

If a criminal intends to use their iPhone to browse illegal images of children, something they can currently do with ease as there are no checks in place, having CSAM detection on the iPhone means they will get caught out.

Apple has made this too public. They should've enabled iCloud scanning like all of the other companies do, made a subtle quiet announcement about it, and left it at that. THEN they would've caught a lot more.
 
…is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they reply 'Yes' to that question.
YES to your question. I am not scared of being judged, as I am confident that the majority of people with sound judgement and an understanding of how meaningless this plan from Apple is agree with me. Some will disagree, but so be it.
 
I have also been a programmer for more than 28 years. You can be a developer and still lack system design thinking and skills; that is one of the reasons I pay salaries to devs and not the other way around. The design flaw in Apple's approach is that they are introducing on-device processing with third-party, non-publicly-auditable hashes. Period. The industry has used server-side processing with PhotoDNA for years, and that is enough.

1. The CSAM hash database comes from the same sources regardless of where the scan occurs: client-side, server-side, or the mix Apple proposed.

2. Server-side scans go too far: a server-side CSAM scan would prevent end-to-end encryption of iCloud (see the sketch below).
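For contrast with the client-side sketch earlier in the thread, here is an equally rough sketch of the server-side, PhotoDNA-style pipeline that point 2 refers to. serverSideScan and serverKey are hypothetical names, and SHA-256 again stands in for a real perceptual hash:

```swift
import Foundation
import CryptoKit

// Hypothetical server-side (PhotoDNA-style) pipeline: the provider must be
// able to decrypt, or already hold in plaintext, every uploaded photo in
// order to hash and compare it. That requirement is exactly what rules out
// end-to-end encryption for those photos.
func serverSideScan(uploads: [Data],
                    serverKey: SymmetricKey,
                    knownHashes: Set<String>) -> [Int] {
    var flagged: [Int] = []
    for (index, blob) in uploads.enumerated() {
        // The server holds the decryption key -- by definition not E2EE.
        guard let box = try? AES.GCM.SealedBox(combined: blob),
              let plaintext = try? AES.GCM.open(box, using: serverKey) else { continue }
        let hash = SHA256.hash(data: plaintext)
            .map { String(format: "%02x", $0) }
            .joined()
        if knownHashes.contains(hash) {
            flagged.append(index)
        }
    }
    return flagged
}
```

Because the server must hold serverKey (or receive the photos in plaintext) to compute the hashes at all, this approach cannot coexist with end-to-end encryption, which is the trade-off the reply is pointing at.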
 
Maybe the defenders of privacy need to talk with the families, relatives and even the children themselves who have been abused and explain to them why your privacy is more important than their protection.

Victims of child abuse have come out against this. Not all, obviously, but some have.

And again, this isn't going to help that much with the way Apple has implemented and announced this so publicly.
 
Too many people here have no actual idea of what Apple is proposing.

Too many people here are concerned about the rights of child abusers.

Too many people think that just because someone opposes this, they don't understand what Apple is proposing. That couldn't be further from the truth.

I'm afraid the fact that you think someone who opposes this is concerned about the rights of child abusers shows that you're the one who has no actual idea what's going on here.
 
Getting fed up of fallacies being trotted out here.

First, this move is not legally required. Yes, Apple is not allowed to store illegal images on its servers, once alerted to their presence, but it is not required to actively scan for them.

Second, this will not necessarily prevent new instances of child abuse. This is about detecting past acts of abuse. Indeed the link between viewing child pornography and committing sexual offences against children is weaker than some seem to imagine (see https://doi.org/10.1186/1471-244X-9-43). I think we all want people to stop viewing and producing these images, but Apple's actions are about the former, not the latter.

Third, will people posting here please stop claiming that objections to this scheme are due to ignorance? This forum is visited by many technologically well-educated people, and the EFF is no novice in the field.

Fourth, the fact that Apple could have abused its position in the past by conducting surveillance, but didn't, is irrelevant. Apple is now not only telling us it is going to conduct surveillance on our phones, but seems to be proud of doing so.

Fifth, yes, this might reduce the online presence of pedophiles, and that might protect some children at some point and find justice for others. However, there are other children at risk in authoritarian countries. Apple's scheme can be used to detect anything in any type of file. That might mean the faces of minorities (e.g., Uyghurs), gay people (yes, there has been an academic publication claiming to detect gay sexuality from the shape of faces), flags (e.g., BLM and rainbow flags, or even Confederate flags), memes, words, and even sounds (want to catch all the people recording a demonstration? Simply play a series of loud sounds near the crowd and scan for that). The 'hash' in Apple's scheme is simply a perceptual summary (see the sketch below), and that could be applied to anything.

This was a monumentally bad idea. It is the first step in extending AI-based surveillance to our mobile devices. Even if Apple abandons the idea, others will pursue it for far less worthwhile goals than detecting CSAM. Indeed, I honestly think the damage may have been done and it might be irreparable.
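To make the 'perceptual summary' point concrete, here is a toy difference hash (dHash) in Swift. It is not NeuralHash, and the function names are invented for illustration; it only shows how a short hash can describe what an image looks like rather than its exact bytes:

```swift
// Toy "difference hash" (dHash), purely for illustration: it is NOT
// NeuralHash. `pixels` is an 8-row by 9-column grid of grayscale values
// that a real implementation would obtain by shrinking and desaturating
// the image first.
func dHash(pixels: [[Double]]) -> UInt64 {
    precondition(pixels.count == 8 && pixels.allSatisfy { $0.count == 9 })
    var hash: UInt64 = 0
    for row in 0..<8 {
        for col in 0..<8 {
            hash <<= 1
            // Each bit records whether brightness increases left to right,
            // so the hash describes how the image *looks*, not its exact bytes.
            if pixels[row][col] < pixels[row][col + 1] {
                hash |= 1
            }
        }
    }
    return hash
}

// Two images are treated as "the same picture" when their hashes differ
// in only a handful of bits.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}
```

Because such a hash matches visual appearance rather than exact file contents, swapping in a database of, say, flag or meme hashes would repurpose exactly the same machinery, which is the concern raised above.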
 
The numbers look weak for a petition. Just as I suspected — this seems to only be an issue for a loud minority, conspiracy theorists, and people who don’t quite understand the tech.

In comparison, the California recall petition received over 2M signatures.

The problem is the general public isn't aware of this. It's hardly been touched by any mainstream media, short of a passing article. Members of the general public who are aware simply don't care, because they're not going to change their device over something that doesn't inconvenience them.
 
You need to rethink that.
Take the term “CSAM” out of the equation. Would you want this feature on your device?
Insert any other “illegality” in its place. Folks are not looking beyond the “CSAM” term.

Now before you get all bent, I’ve raised four of my own and am a foster resource for abused teens.
Yes, I have some small idea.
I think you need to re-think this as well.

If your children grew up in the age of the internet and mobile phones with cameras, and someone had used their phone to take indecent pictures of your children when they were younger and uploaded them to some cloud service, wouldn't you have wanted some type of system in place to catch that and to prevent the pictures from being distributed around?

Or are you still of the view that even if a system was in place to prevent that from happening, you would not want it because it intrudes on your privacy?
 