Sounds like more than just knee-jerking.

The number of people losing their minds over a feature meant to protect children from being sent, or asked to send, illegal material is eye-opening; it shows how much society has underestimated the prevalence of child sexual exploitation.

Protect?
Chuckle. Not at all.
This will be another "info pop-up" that the kids who have it active will just blow right through.
Pretty much toothless.

Or it will push them, if they're not there already, onto a different messaging platform.
 
Checking for nudity in photos is done on-device, with Messages analyzing image attachments. The feature does not impact the end-to-end encryption of messages, and no indication of the detection of nudity leaves the device. Apple has no access to the Messages.
You have proof of this… how?
I’m not touching this update until after the security and privacy groups have torn into it and given their feedback.
 
You have proof of this… how?

I'm just quoting from the article. Seems like facts to me.

Checking for nudity in photos is done on-device, with Messages analyzing image attachments. The feature does not impact the end-to-end encryption of messages, and no indication of the detection of nudity leaves the device. Apple has no access to the Messages.

I’m not touching this update until after the security and privacy groups have torn into it and given their feedback.
Whatever...

"The truth may be puzzling. It may take some work to grapple with. It may be counter-intuitive. It may contradict deeply held prejudices. It may not be consonant with what we desperately want to be true. But our preferences do not determine what’s true."

- Carl Sagan
 
this feature is explicitly opt-in, and only for parents on their kids' devices.

i don't know how you get to this passive-aggressive stance when the very clear use-case is preventing nudity from being sent to or by minors, something that is a clear felony in the US, even if the subject material is an image of the sender themselves.

do you somehow have an issue with that?
The problem is that this isn’t for actual minors. This is for people whose account is linked to a family and marked as a minor.

Maybe by an abusive spouse. Maybe by overly controlling parents of adult children.
 
The problem is that this isn’t for actual minors. This is for people whose account is linked to a family and marked as a minor.

Maybe by an abusive spouse. Maybe by overly controlling parents of adult children.

Is this even possible? From my understanding, it’s very difficult to change the age of your Apple ID once it has been set.

Reading the comments here, it feels like many people still either don't understand the feature or are choosing not to understand it.

Hate on Apple if you wish. At least hate Apple for the right reasons.
 
The problem is that this isn’t for actual minors. This is for people whose account is linked to a family and marked as a minor.

Maybe by an abusive spouse. Maybe by overly controlling parents of adult children.

the problem is you're projecting something onto apple that isn't specifically their problem. this *is* for minors, as their terms have *specifically* described.

apple's use-case is specifically described. the illegal uses you've described are not relevant here. it's like saying cars are for drunk drivers.

this comment is nonsensical, i don't know what to say about it otherwise.
 
Is it too much to ask that at least the writers at the site properly distinguish between CSAM features and anti-CSAM features? This constant repetition that iPhones come with CSAM installed is going to bring the wrong kind of customers.

It is not; thank you for the reminder that we need to be more careful with our wording. I've clarified in the article that this is an anti-CSAM feature.
 
Protect?
Chuckle. Not at all.
This will be another "info pop-up" that the kids who have it active will just blow right through.
Pretty much toothless.

Or it will push them, if they're not there already, onto a different messaging platform.
Except that parental controls also allow you to restrict what websites are available, and what apps can be downloaded from the App Store.
So for most children under the age of 13 who will have this enabled by their parents, it's not gonna be as simple as just "download Facebook Messenger instead."
 
The problem is that this isn’t for actual minors. This is for people whose account is linked to a family and marked as a minor.

Maybe by an abusive spouse. Maybe by overly controlling parents of adult children.
Parental controls have been on the iPhone since 2008.
Find My iPhone has been on the iPhone since 2009.
Now there are such things as AirTags and an entire Find My ecosystem.
All of these things have the possibility of being abused.
Doesn’t mean that they shouldn’t exist.
You can injure someone by dropping a Mac Pro on their face; doesn't mean that it shouldn't exist.
Also, it's almost impossible to change an Apple ID to say that you're younger than 13 after it's been created, and even if you create a new Apple ID, what are you gonna do the second it turns 13? Make another one? It's not an easy system to exploit; it's clearly meant for children and children only.
 
Except that parental controls also allow you to restrict what websites are available, and what apps can be downloaded from the App Store.
So for most children under the age of 13 who will have this enabled by their parents, it's not gonna be as simple as just "download Facebook Messenger instead."

and it doesn't take long for kids to figure out how to get around that.
then again, they tend to borrow each other's devices - that's common.
 
Yuck! CSAM. First wave of Roll Out, Watch! Don’t let Apple use the word “Kids” as an excuse. This is CSAM. It’s all related, connected.

Is there a way to disable this feature? Or is it set by default by Apple?

SMH!


Update: 11:24AM PST.

So many disagreements… you guys don’t play around here. It's a tough crowd!

If this is all true… then this is a good addition. CSAM is way more controversial, honestly. I hope it is not connected to CSAM.
We’re not a tough crowd. You just spam replies to new articles without actually reading the article. In this case, it appears you only read the article title before making your first post.

You’re a frequent member of the community, so please start reading the articles first.

Everyone else has called you out on misinterpreting the facts, but I wanted to call you out on the dangers of commenting without actually reading the article.
 
This constant repetition that iPhones come with CSAM installed is going to bring the wrong kind of customers
As we discuss this can we please be aware that CSAM is the acronym for child sexual abuse material. So CSAM's a naked picture of Bart Simpson, or some lolicon anime, or even porn/abuse of real children.

To say that iPhone will come with CSAM installed means Apple would be putting this content on the phone.

What people need to say is CSAM-Scanning, or CSAM-Hashing, when referring to Apple's Minority Report software, and CSAM to refer to the aforementioned naked Bart.

(And yes, in some countries 4-fingered yellow cartoons in sexual drawings are defined as CSAM and do carry a jail term)
 
As we discuss this can we please be aware that CSAM is the acronym for child sexual abuse material. So CSAM's a naked picture of Bart Simpson, or some lolicon anime, or even porn/abuse of real children.

To say that iPhone will come with CSAM installed means Apple would be putting this content on the phone.

What people need to say is CSAM-Scanning, or CSAM-Hashing, when referring to Apple's Minority Report software, and CSAM to refer to the aforementioned naked Bart.

(And yes, in some countries 4-fingered yellow cartoons in sexual drawings are defined as CSAM and do carry a jail term)

CSAM is NOT what you posted. It is far worse.
Think about what the acronym stands for.

Try ICMEC or NCMEC for a better definition.
 
CSAM is NOT what you posted. It is far worse.
Think about what the acronym stands for.

Try ICMEC or NCMEC for a better definition.

What can be worse than "porn/abuse of real children"?

As for the broader term, it does depend on the country whose hashes are being checked against. In some countries CSAM is defined in law to include naked Bart Simpson, fiction in text form, anime, and others.

Apple has said it won't just be using the US-based NCMEC database, but will include hashes from multiple countries.
 
Perhaps this might answer your doubts.


Yup, it did. Both of those articles detail the three new features Apple is adding. CSAM detection was delayed (hopefully to be cancelled), but this feature is not CSAM detection; even the articles you included say so.
 
Listen, this child safety feature is a red herring. You realise that it will probably still scan everything regardless of how the switch is set; the setting just means the warning message won't be shown. Further, I am nearly sure there will be many false positives: when this sort of technology was tested by the UK police, it thought pictures of the desert were nudity. See: https://petapixel.com/2017/12/20/uk-police-porn-spotting-ai-gets-confused-desert-photos/

Point is, you cannot rely on technology to be a babysitter. Maybe those old-people phones you see advertised are more suitable for children. They can use a tablet at home where they can be supervised.

But at the end of the day, this scanning is going on whether you like it or not. And it's always "about the children", just like how politicians win elections. This is not a good thing, if only because of how it can be used. Now that the device can scan, it won't be hard for that capability to be weaponised, just as the CSAM scanning can. The complete trust some people place in Apple doing the right thing is as bizarre as the people who believed Google would do no harm and that Facebook was just a way to play Farmville.
 
What can be worse than "porn/abuse of real children"?

As for the broader term, it does depend on the country whose hashes are being checked against. In some countries CSAM is defined in law to include naked Bart Simpson, fiction in text form, anime, and others.

Apple has said it won't just be using the US-based NCMEC database, but will include hashes from multiple countries.

From your post: “So CSAM's a naked picture of Bart Simpson, or some lolicon anime, or even porn/abuse of real children.”

Sexual Abuse is the start.
 
The human brain is full of billions of brain cells. The number of people who aren't using them to read the article fully to the end and comprehend the differences is just sad. The first several posts on this article alone were not even accurate and were spouting arguments against something the article wasn't even about and explicitly said so.
 
Wow, maybe this comment section could be locked and the article amended to have giant bullet points and lettering to describe what is actually happening. We can have the CSAM convo on an article about CSAM.
 