Every year many kids get killed on the roads; they should prohibit cars and trucks.
Child safety first!

Btw, why not prohibit cameras?!
10 years of prison for owning a camera, that's what we need.
Okay, to build on your stupid analogy: what if they built a system inside your car that detected when you're about to run someone over, applied the brakes, and asked, "Are you sure you want to run over this person? This could send you to prison for a very long time, and the cops will be automatically called if you continue." That's basically what this system is doing, and it's only for kids under 13, for pete's sake.
 
Of course, anyone with half a brain cell understands what they meant, but it's just a signal that says "this person is below average intelligence, don't even bother arguing with them".
Of course, words have meaning, and using a word in isolation while meaning something else is just a signal that says "this person is below average intelligence, don't even bother arguing with them".
 
As much as I understand certain criticisms of the CSAM detection system, and I'm not sure I'm sold on it personally, we're all naive for thinking governments don't already monitor this stuff one way or another. Various countries already employ many avenues of surveillance and have been doing so for decades.
 
Nothing new here. Governments (foreign or otherwise) already pressure Apple for access to our private data. At some point you either need to trust Apple’s public statements about this, or throw your phone away and go live in the wilderness. (A lot to be said for that actually!)
Sad, but true.
 
I understand wanting to filter material for children, but that really should be the parents’ responsibility. Companies should probably filter certain materials within their corporate scope of responsibility; for example, if someone was breaking the law on Facebook and trying to prey on children, then I think Facebook should ban them. But trying to police the entire internet is an impossible task, and it isn’t the corporations’ responsibility to police it. They should report suspicious activity to the parents and authorities, though. As the old saying goes, the road to hell is often paved with good intentions.

Ultimately, the parental units are responsible for protecting their kids. It is far too easy to let kids entertain themselves on the internet so as not to be constantly bothered by them. However, if they brought them into this world, then they damn sure should watch over them and protect them.

I think Apple is confusing good intent with responsibility. It is a thin line between good intent and encroaching into someone else’s business. Perhaps Apple should try to bring together many of the major tech companies with parents, legal advisors and lawmakers and approach it from that angle.
I agree with what you're saying, but there is an increasing trend of parents expecting everyone other than themselves to parent their child.
 
Right. Look, the US government is very flawed, but I’m not cynical about them abusing something like this. This feature probably won’t affect any US citizens except actual child sex offenders.

But I’m looking at the bigger picture. I can see the governments of Russia, Iran, Saudi Arabia, etc. abusing this technology.
I don't understand how a country like Russia or China could use this technology if Apple owns it, though? I suppose if those governments go to Apple and request information under law, Apple may have to provide it, but how is that any different from how things currently are? If Russia "subpoenas" a person's iCloud data, it's already in the cloud for the taking. CSAM detection doesn't change that.
 
That's exaggerating a bit. I see your point, but someone would have to be under 13 AND receive a photo flagged as nude AND open the photo AND click through the warning before parents are notified. Not that it can't happen, especially to young people who just blindly smash buttons, but I doubt the false positive rate among photos received by 12-year-old children would be high enough to be a huge concern. You also have to consider that it could actually protect children who legitimately get harassed and are too scared to talk about it.

That said, I do have total sympathy for LGBT children and think anyone who disowns or harms their child because of their sexual or gender identity is a horrible person and should not be allowed to have kids. But for now I'm slightly more positive given all the safeguards that have been put in.

And so far none of the documents say that the actual photo will ever be stored unencrypted on Apple's servers or sent to the parents. Of course, parents can simply take the child's phone, but at least there aren't wider privacy implications here (and your account would have to be marked as 12 or younger for this system to kick in anyway; for ages 13-17 it's completely on-device and the parents are never notified).
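To make the flow concrete, here's a rough sketch of the decision logic as the published descriptions lay it out. This is purely my own illustration with made-up names, not Apple's actual code:

```swift
// Hypothetical sketch of the Messages communication-safety flow as publicly
// described. All types and names are invented for illustration only.

enum SafetyAction {
    case deliverNormally           // no nudity detected: nothing happens
    case blurAndWarn               // blur the photo and show a warning
    case blurWarnAndNotifyParents  // as above, plus parents are notified
                                   // if the child taps through and views it
}

func actionForIncomingPhoto(flaggedAsNude: Bool,
                            childAge: Int,
                            parentalNotificationEnabled: Bool) -> SafetyAction {
    // Detection runs on-device; nothing in this feature is sent to Apple.
    guard flaggedAsNude else { return .deliverNormally }

    // Ages 13-17: warning only, parents are never notified.
    guard childAge < 13 else { return .blurAndWarn }

    // Under 13: notification only happens if the parents opted in AND the
    // child clicks through the warning and views the photo anyway.
    return parentalNotificationEnabled ? .blurWarnAndNotifyParents : .blurAndWarn
}
```

In other words, every one of those conditions has to be true before anything leaves the device, and even then it only goes to the parents.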
I often read comments discussing LGBT+ kids being adversely affected by these child protection implementations. LGBT+ kids are more at risk of being victimized by predators, especially those (remember, we are talking about kids under the age of 13) in repressive households or communities. Apple's child safety feature directly tries to prevent vulnerable kids from being groomed or victimized in private messages. We shouldn't suggest that LGBT+ kids are less deserving of being protected.

Are there bigoted parents and adults? Yes, there are. Almost all policies are a compromise of potential harms. I think it is best to err on the side of preventing active victimization of children. We can continue to work collectively to reduce the harm of repressive cultures that can also harm LGBT+ young adults as they explore and discover their sexuality.
 
So Apple waited till the heat died down a bit and is introducing the backdoor spyware we all knew was coming. Such a detestable company, run by an equally despicable CEO. Apple is clearly no longer about privacy and security, and in many ways never was.
 
I agree with what you're saying, but there is an increasing trend of parents expecting everyone other than themselves to parent their child.
I’m for a license (like a driver's license) to allow giving birth; that would save many children from incompetent parents.
 
Right. Look, the US government is very flawed, but I’m not cynical about them abusing something like this. This feature probably won’t affect any US citizens except actual child sex offenders.

But I’m looking at the bigger picture. I can see the governments of Russia, Iran, Saudi Arabia, etc. abusing this technology.
And yet there is precisely nothing stopping any country from requiring that today, with or without Apple's CSAM scanning tech (which isn't even really new; it's just fuzzy image matching). Look at how iCloud operates in China, or the lack of VPN apps (among many, many other categories) on the App Store there. Look at the various things that iPhones sold in Russia are required to do that others aren't, such as offering certain apps during setup.

The fact that anyone sits and tries to argue that the only thing stopping Russia, or China, or anybody else from mandating scanning of user data on the device for any given purpose is that Apple hasn't yet actually implemented it on a device is absolutely insane. That would be like arguing that the only thing stopping governments from requiring surveillance cameras was that facial recognition wasn't yet available. Have you ever been to the UK? I haven't, but I am keenly aware of the nearly ubiquitous CCTV coverage that existed long before facial recognition tech.

One thing is not being prevented by the other. Anybody who argues otherwise has no handle on the state of the world or how governments operate.

And for the record, as somebody else mentioned upthread, Apple long ago implemented actual ML scene and item recognition in Photos. The fact that that went by without so much as a whimper from anybody, yet as soon as they pivot to matching against known CSAM images there is suddenly a "massive problem", is so ridiculous I don't even have the words to describe my utter disbelief.
 
So Apple waited till the heat died down a bit and is introducing the backdoor spyware we all knew was coming. Such a detestable company, run by an equally despicable CEO. Apple is clearly no longer about privacy and security, and in many ways never was.
The "backdoor spyware" that sends information nowhere except an autogenerated notification to a parent. Right. Even for this conversation, you are a shining star of having no idea what you're talking about.
 
I just don't understand. Apple says the CSAM detection is done on device and known images are reported. But in that case, couldn't Apple use "on-device machine learning" as an excuse to carry out anything a government wants them to do by law? What's to stop a government from making a law that requires Apple to report images that promote homosexuality? I'm usually not a slippery-slope kind of guy, but this concerns me.
The CSAM detection as proposed cannot search for something as vague as "promoting homosexuality"; it only matches against specific known photographs. In addition, the device doesn't know whether an image is CSAM or not; that determination happens on the server.

What's stopping the detection you've proposed? The answer is nothing, and the answer has been nothing for years. Apple's proposal doesn't change anything.
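To illustrate the difference: the system matches fingerprints of specific known images rather than classifying content. Here's a purely illustrative sketch (this is not NeuralHash or Apple's actual pipeline; the hash below is a stand-in and all names are made up):

```swift
import Foundation

// Stand-in "perceptual" hash. A real perceptual hash (e.g. NeuralHash) is
// derived from image content so near-identical images map to the same value;
// this FNV-1a byte hash is only here to make the example self-contained.
func imageFingerprint(_ imageData: Data) -> UInt64 {
    var hash: UInt64 = 0xcbf29ce484222325
    for byte in imageData {
        hash ^= UInt64(byte)
        hash = hash &* 0x100000001b3
    }
    return hash
}

// The database contains fingerprints of specific, individually known images.
// There is no notion of a "topic" or "theme": an image either matches one of
// these fingerprints or it doesn't.
let knownImageFingerprints: Set<UInt64> = [ /* hashes of specific known images */ ]

func matchesKnownImage(_ imageData: Data) -> Bool {
    knownImageFingerprints.contains(imageFingerprint(imageData))
}

// Searching for images "promoting X" would instead require a classifier making
// a semantic judgement about never-before-seen images: a completely different
// mechanism from matching against a fixed list of fingerprints.
```

And in Apple's actual proposal even this match result is blinded, so the device itself never learns whether anything matched; only the server can tell, and only after a threshold of matches is crossed.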
 
The CSAM detection as proposed cannot search for something as vague as "promoting homosexuality"; it only matches against specific known photographs. In addition, the device doesn't know whether an image is CSAM or not; that determination happens on the server.

What's stopping the detection you've proposed? The answer is nothing, and the answer has been nothing for years. Apple's proposal doesn't change anything.
And what's stopping another country from building a database of photos they find troubling and telling Apple, "you must detect matches of these, you have to comply"?
 
Hopefully no CSAM scanning, ever… The system is going to be exploited by some states one way or the other.
What is wrong with the child safety feature in the Messages app? Or the disclaimer in Siri and Search? How can they be exploited?
 