Make NO mistake, at some point Apple will flip the switch & reverse it!

They will do it in the dead of night, when nobody is paying attention!
You did. Said it like it is. They will force it. You'll see.

I’m sorry, but the amount of misinformation here is crazy. CSAM and the communication safety features are two separate things. I am very opposed to the CSAM scanning process, but I don’t have any problem with giving parents an option to prevent minors from seeing explicit, unsolicited pictures.
 
When will they ever
I’m sorry, but the amount of misinformation here is crazy. CSAM and the communication safety features are two separate things. I am very opposed to the CSAM scanning process, but I don’t have any problem with giving parents an option to prevent minors from seeing explicit, unsolicited pictures.
All for protecting the kids, but how about parents being parents: talk with their kids and have a few conversations educating them, instead of depending on giving up their privacy. Apple will misuse this. Others will misuse this feature. It will happen, one way or another.
 
They announced two photo-searching features, CSAM and this. They never changed the wording.
Perhaps this might answer your doubts.


 
As predicted, Apple is going to sneak CSAM into iOS piecemeal after the backlash (and rightfully so) they received when they tried shoving it in all at once. People, the EFF, and countless others need to hold Apple's feet to the fire and demand they abandon this backdoor spyware IMMEDIATELY & COMPLETELY. The pro-Apple crowd needs to wake up and realize what their beloved company is doing.

Apple can no longer be trusted with privacy or security.
 
In the original article, and in the code that we saw, the implementation sent notifications to parents when a child under 13 viewed a photo with nudity. That feature has been removed and that's no longer how it works, so that's the difference.

I’m curious how the code you saw knows the device owner is under 13 years old in order to send a notification to a parent?
 
Ugh, people, please read.
CSAM scanning is something completely different from this.
This is literally just a feature that you can turn on, if you so choose, for children under the age of 13, so they can’t send or receive pictures of explicit body parts and such.
And it’s all done with machine learning, not hash matching.
So…
A: A one hundred percent opt-in feature, only for people under the age of 13.
B: Not hash matching.
C: 99.99999% of people will never use it.
I am 100% against Apple’s implementation of CSAM scanning that they talked about a couple of months ago. This is something different, and it’s something that I’m not against.
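To put the difference in rough pseudo-Swift (completely made-up names on my part, nothing to do with Apple’s actual code), this is basically what keeps getting mixed up:

```swift
import Foundation

// Hypothetical sketch only -- not Apple's code, just made-up names to show
// the difference between the two features people keep conflating.

// Communication Safety: an on-device classifier guesses whether an image
// *looks* explicit; if it does, Messages blurs it and asks the child first.
// Nothing about the image ever leaves the phone.
func communicationSafetyCheck(imageLooksExplicit: Bool) -> String {
    guard imageLooksExplicit else { return "Show the photo normally" }
    return "Blur the photo and ask: 'Are you sure you want to view this?'"
}

// CSAM detection (the separate, shelved proposal): the photo's fingerprint
// would be matched against a database of hashes of *known* illegal images,
// and repeated matches would eventually be surfaced to Apple.
func csamCheck(imageHash: String, knownHashes: Set<String>) -> String {
    return knownHashes.contains(imageHash)
        ? "Match recorded; enough matches would trigger a report"
        : "No match; nothing happens"
}

print(communicationSafetyCheck(imageLooksExplicit: true))
print(csamCheck(imageHash: "abc123", knownHashes: ["deadbeef"]))
```

The first one is a local "are you sure?" prompt and nothing else; the second one is a database lookup that can eventually end in a report. That's the entire argument.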
 
Yuck! CSAM. First wave of the rollout, watch! Don’t let Apple use the word “kids” as an excuse. This is CSAM. It’s all related, all connected.

Is there a way to disable this feature? Or is it set by default by Apple?

SMH!


I'd read the article first before leaving a comment, if I were you.
 
Ugh, people, please read.
CSAM scanning is something completely different from this.
This is literally just a feature that you can turn on, if you so choose, for children under the age of 13, so they can’t send or receive pictures of explicit body parts and such.
And it’s all done with machine learning, not hash matching.
So…
A: A one hundred percent opt-in feature, only for people under the age of 13.
B: Not hash matching.
C: 99.99999% of people will never use it.
I am 100% against Apple’s implementation of CSAM scanning that they talked about a couple of months ago. This is something different, and it’s something that I’m not against.
Sigh… for the last time, this isn’t CSAM.
They announced two photo-searching features, CSAM and this. They never changed the wording.

If this is all true… then this is a good addition. CSAM is way more controversial.
 
I’m curious how the code you saw knows the device owner is under 13 years old in order to send a notification to a parent?
When you create an iCloud account for a child, you have to enter their age.
That’s how.
Also, it no longer sends a notification to the parent; Apple abandoned that feature because they got feedback that it would be too risky.
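In rough pseudo-Swift (again, hypothetical names I made up, not a real Apple API), the gating is basically just this:

```swift
// Rough sketch of the gating logic as I understand it -- my own hypothetical
// names, not a real Apple API. The age comes from the child account the
// parent set up in Family Sharing, and the switch is off until a parent
// turns it on.
struct ChildAccount {
    let age: Int
    let communicationSafetyEnabled: Bool   // off by default; parent opts in
}

func warningsApply(to account: ChildAccount) -> Bool {
    // Only child accounts under 13 with the feature switched on get warnings.
    return account.age < 13 && account.communicationSafetyEnabled
}

let kid = ChildAccount(age: 9, communicationSafetyEnabled: true)
print(warningsApply(to: kid))   // true -> the blur/warning sheet can appear
```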
 
Make NO mistake, at some point Apple will flip the switch & reverse it!

They will do it in the dead of night, when nobody is paying attention!
Yep, you're absolutely right. Also, Apple will use the GPS tracking inherent in the iPhone's design to hunt down and kill every single iPhone owner on the entire planet. Because reasons. Better get rid of your iPhone now!

My scenario is just as provable as yours.
 
Cool story. Your proof is what? Oh that's right. Apple said so.
And yet you believe that Apple is adding these features now because of things Apple has said.

If you don't trust Apple, how do you know they didn't add all sorts of creepy tracking/spying/reporting stuff five years ago and are just keeping it a secret? Remember, the only reason you believe they haven't already done that is because Apple said so.

Why is everything fine up to this point, but NOW you think they're out to spy on you? What changed, and what makes what Apple did five years ago trustworthy? If you're that paranoid, better get rid of your iPhone now. It could already be spying/tracking/reporting on you.
 
I am completely against Apple's implementation of CSAM. However, now that they've removed the notifying-parents-or-anyone-else BS, I am okay with this aspect of the OS. I am glad they listened to the experts who said it could cause more harm to children than not having anything at all. Now hopefully they listen just as well about CSAM. I refuse to update to iOS 15 or purchase any new iOS device until they explain how CSAM is going to be removed or changed (no on-device policing is acceptable).
 
Someone help me out here... whether it's opt-in or opt-out, the feature is officially programmed and built in, so it would come down to trusting Apple / backdoors, right?
Um… no?
Literally all this feature does is put up a warning when machine learning detects that a picture being sent to or received by someone who has the feature enabled and is under the age of 13 may contain explicit content.
It doesn’t notify anyone, it doesn’t do any matching, no authorities are contacted, nothing like that.
It’s just a warning asking you, “Are you sure you want to open this picture?”
It’s mainly in place so young people aren’t opening pictures of old people with their junk hanging out, something that’s unfortunately become a lot more common in the days of social media.
But yeah, no one should be worried about this. It uses the same technology that knows a face is a face in your Photos app, no scanning or hash matching, and it’s not on by default for anyone; it has to be manually enabled by a parent for an iCloud account that’s under 13.
Once again, this is not the CSAM thing; that’s something much different and significantly more worrisome.
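And if it helps, here is one more made-up sketch (my own pseudo-flow, not Apple’s implementation) of what happens when a flagged photo arrives, the point being what is NOT in it:

```swift
// Hypothetical pseudo-flow, not Apple's code. The interesting part is what's
// missing: no message to a parent, no report to Apple, no report to anyone.
enum ChildChoice {
    case dontView
    case viewAnyway
}

func handleFlaggedPhoto(choice: ChildChoice) -> [String] {
    var steps = ["Photo arrives blurred, with a warning sheet"]
    switch choice {
    case .dontView:
        steps.append("Photo stays blurred; conversation continues")
    case .viewAnyway:
        steps.append("Photo is shown")
    }
    // Deliberately absent here: notifyParent(), reportToApple(), hashUpload()
    // -- none of that exists in this flow as described.
    return steps
}

print(handleFlaggedPhoto(choice: .viewAnyway))
```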
 
I am completely against Apple's implementation of CSAM. However, now that they've removed the notifying-parents-or-anyone-else BS, I am okay with this aspect of the OS. I am glad they listened to the experts who said it could cause more harm to children than not having anything at all. Now hopefully they listen just as well about CSAM. I refuse to update to iOS 15 or purchase any new iOS device until they explain how CSAM is going to be removed or changed (no on-device policing is acceptable).
Finally, someone talking some sense.
This feature? Using machine learning to stop a child from accidentally opening an inappropriate picture? Good.
CSAM matching on device? Bad.
Personally, if I were in Apple’s shoes I would just cancel the on-device CSAM matching feature they announced earlier this year and keep things the way they have been since 2019.
 