
MacRumors

macrumors bot
Original poster


Update: We've learned from Apple that the Communication Safety code found in the first iOS 15.2 beta is not a feature in that update and Apple does not plan to release the feature as it is described in the article.


Apple this summer announced new Child Safety Features that are designed to keep children safer online. One of those features, Communication Safety, appears to be included in the iOS 15.2 beta that was released today. This feature is distinct from the controversial CSAM initiative, which has been delayed.

[Image: iPhone Communication Safety warning]

Based on code found in the iOS 15.2 beta by MacRumors contributor Steve Moser, Communication Safety appears to be included in the update. The code is there, but we have not been able to confirm that the feature is active, because verifying it would require sensitive photos to be sent to or from a device set up for a child.

As Apple explained earlier this year, Communication Safety is built into the Messages app on iPhone, iPad, and Mac. It will warn children and their parents when sexually explicit photos are received or sent from a child's device, with Apple using on-device machine learning to analyze image attachments.
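Apple has not published the API behind this analysis, but as a rough sketch of what running an on-device image classifier can look like, here is a minimal Swift example using the public Vision framework. The sensitive label set and the confidence threshold are hypothetical placeholders; Apple's actual model and categories are not public, and this is not Apple's implementation.

```swift
import CoreGraphics
import Vision

// Hypothetical label set and threshold; Apple's real model and categories are not public.
let sensitiveLabels: Set<String> = ["explicit_nudity"]
let confidenceThreshold: VNConfidence = 0.9

/// Runs Vision's built-in, fully on-device classifier and reports whether any
/// of the hypothetical sensitive labels was detected above the threshold.
func imageLooksSensitive(_ image: CGImage) -> Bool {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
    } catch {
        return false // If analysis fails, do not flag the image.
    }
    let observations = request.results ?? []
    return observations.contains { observation in
        sensitiveLabels.contains(observation.identifier) &&
            observation.confidence >= confidenceThreshold
    }
}
```

The key property this illustrates is that classification runs entirely on the device: the image never has to leave the phone for a label and confidence score to be produced.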

If a sexually explicit photo is flagged, it is automatically blurred and the child is warned against viewing it. For kids under 13, if the child taps the photo and views it anyway, the child's parents will be alerted.
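Purely as an illustration, the flow described above can be sketched in Swift; every type and function name below is hypothetical and stands in for whatever Messages actually does internally.

```swift
// Illustrative sketch only; names and structure are hypothetical, not Apple's implementation.

// Placeholder actions standing in for the real Messages behavior.
func blurPhoto() { print("Photo blurred") }
func showWarningToChild() { print("Warning shown to child") }
func notifyParents() { print("Parents notified") }

enum ChildAgeGroup {
    case under13
    case thirteenOrOlder
}

struct IncomingPhoto {
    let isFlaggedAsSensitive: Bool
}

func handle(_ photo: IncomingPhoto, ageGroup: ChildAgeGroup, childChoosesToView: Bool) {
    guard photo.isFlaggedAsSensitive else { return }

    blurPhoto()          // Flagged photos are blurred automatically.
    showWarningToChild() // The child is warned before viewing.

    // Per the article, a parental alert is sent only for children under 13,
    // and only if the child taps through and views the photo anyway.
    if childChoosesToView && ageGroup == .under13 {
        notifyParents()
    }
}
```

For example, calling handle(IncomingPhoto(isFlaggedAsSensitive: true), ageGroup: .under13, childChoosesToView: true) would blur, warn, and notify; the same call with .thirteenOrOlder would blur and warn but skip the parental notification.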

Code in iOS 15.2 features some of the wording that children will see:
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also block this person.
  • You are not alone and can always get help from a grownup you trust or with trained professionals. You can also leave this conversation or block contacts.
  • Talk to someone you trust if you feel uncomfortable or need help.
  • This photo will not be shared with Apple, and your feedback is helpful if it was incorrectly marked as sensitive.
  • Message a Grownup You Trust.
  • Hey, I would like to talk with you about a conversation that is bothering me.
  • Sensitive photos and videos show the private body parts that you cover with bathing suits.
  • It's not your fault, but sensitive photos can be used to hurt you.
  • The person in this may not have given consent to share it. How would they feel knowing other people saw it?
  • The person in this might not want it seen-it could have been shared without them knowing. It can also be against the law to share.
  • Sharing nudes to anyone under 18 years old can lead to legal consequences.
  • If you decide to view this, your parents will get a notification to make sure you're OK.
  • Don't share anything you don't want to. Talk to someone you trust if you feel pressured.
  • Do you feel OK? You're not alone and can always talk to someone who's trained to help here.
There are specific phrases for both children under 13 and children over 13, as the feature has different behaviors for each age group. As mentioned above, if a child over 13 views a nude photo, their parents will not be notified, but if a child under 13 does so, parents will be alerted. All of these Communication Safety features must be enabled by parents and are available for Family Sharing groups.
  • Nude photos and videos can be used to hurt people. Once something's shared, it can't be taken back.
  • It's not your fault, but sensitive photos and videos can be used to hurt you.
  • Even if you trust who you send this to now, they can share it forever without your consent.
  • Whoever gets this can share it with anyone-it may never go away. It can also be against the law to share.
Apple in August said that these Communication Safety features would be added in updates to iOS 15, iPadOS 15, and macOS Monterey later this year. iMessage conversations remain end-to-end encrypted and are not readable by Apple.

Communication Safety was also announced alongside a new CSAM initiative that will see Apple scanning photos for Child Sexual Abuse Material. This has been highly controversial and heavily criticized, leading Apple to choose to "take additional time over the coming months" to make improvements before introducing the new functionality.

At the current time, there is no sign of CSAM wording in the iOS 15.2 beta, so Apple may first introduce Communication Safety before implementing the full suite of Child Safety Features.

Article Link: Code for Apple's Communication Safety Feature for Kids Found in iOS 15.2 Beta [Updated]
 
I am hoping that there are far more details and explanations of what Apple is doing on device, and in the cloud for this feature before it is activated or officially offered to consumers. I get what they are trying to do, but for some there is a huge creep factor attached to this type of service / feature.
 
This should serve as a reminder to all that Apple has the ability to find not only child pornography, but any sort of pornography.

Perhaps they already do...

You can search for "Sunsets" and "Beaches" in Photos, maybe Apple is also keeping track of other types of images at the same time.
 
This should serve as a reminder to all that Apple has the ability to find not only child pornography, but any sort of pornography.

Perhaps they already do...

You can search for "Sunsets" and "Beaches" in Photos, maybe Apple is also keeping track of other types of images at the same time.
They can search it all; they own all the data in iCloud and want to own all the data on your device.
Not only images, but all types of data, probably even including industrial espionage.
 
It's sad that we live in a world that needs this kind of thing, but we do. Good for Apple for implementing this. I see absolutely ZERO downside to such parental controls.

EDIT: Ok, for the "disagrees" that are now coming in: If your, say, 10-year-old son or daughter were being sent pornographic images, are you telling me you wouldn't want to know about that? Please, in the name of all that is decent, explain to me how this is a bad thing?

EDIT 2: All these dislikes, yet no one can give me a rational reason for attacking Apple over this. All you have are wacky conspiracy theories about how this optional parental control feature will somehow result in totalitarian governments running our lives.
 
CSAM is coming.

And just like Apple must have worked on CSAM for a long time without telling anybody, they might be working on a whole lot of other things too. And all those things are coming to our devices, like it or not.
People need to stop using "CSAM" to mean "CSAM detection". Let's expand the acronym in your sentence:

"Child sexual abuse material is coming. [...] Apple must have worked on child sexual abuse material for a long time"

So, what you're basically saying is Apple is dealing in child porn illegally.
 
At the current time, there is no sign of CSAM wording in the iOS 15.2 beta, so Apple may first introduce Communication Safety before implementing the full suite of Child Safety Features.
What would be "CSAM wording"? I thought the feature was silent.
 
I just don't understand. Apple says the CSAM stuff is done on device and known images are reported. But in that case, couldn't Apple use "on-device machine learning" as an excuse to carry out anything a government wants them to do by law? What's to stop a government from making a law that requires Apple to report images that promote homosexuality? I'm usually not a slippery-slope kind of guy, but this concerns me.
 