
MacRumors

macrumors bot
Original poster


Starting with iOS 17, iPadOS 17, and macOS Sonoma, Apple is making Communication Safety available worldwide. The previously opt-in feature will now be turned on by default for children under the age of 13 who are signed in to their Apple ID and part of a Family Sharing group. Parents can turn it off in the Settings app under Screen Time.

communication-safety-feature-yellow.jpg

Communication Safety first launched in the U.S. with iOS 15.2 in December 2021, and has since expanded to Australia, Belgium, Brazil, Canada, France, Germany, Italy, Japan, the Netherlands, New Zealand, South Korea, Spain, Sweden, and the U.K. With the software updates coming later this year, Apple is making the feature available globally.

Communication Safety is designed to warn children when they receive or send photos that contain nudity in the Messages app. Apple is expanding the feature on iOS 17, iPadOS 17, and macOS Sonoma to cover video content, and it will also work for AirDrop content, FaceTime video messages, and Contact Posters in the Phone app.

When the feature is enabled, photos and videos containing nudity are automatically blurred in supported apps, and the child will be warned about viewing sensitive content. The warning also provides children with ways to get help. Apple is making a new API available that will allow developers to support Communication Safety in their App Store apps.

Apple says Communication Safety uses on-device processing to detect photos and videos containing nudity, ensuring that Apple and third parties cannot access the content, and that end-to-end encryption is preserved in the Messages app.
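
For developers, the new API appears to be the SensitiveContentAnalysis framework Apple introduced alongside iOS 17. As a rough sketch (the entitlement note and the error handling here are assumptions based on the framework documentation, not details from the article), a third-party messaging app might check a received image like this:

```swift
import Foundation
import SensitiveContentAnalysis

// Rough sketch, not Apple's sample code. Assumes the
// com.apple.developer.sensitivecontentanalysis.client entitlement and
// iOS 17 / iPadOS 17 / macOS Sonoma or later.
func shouldBlurImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If the user (or a parent, via Screen Time) has the feature turned off,
    // the policy is .disabled and there is nothing to check.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis happens entirely on device; the image never leaves it.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // Whether to fail open or closed on an analysis error is the app's call.
        return false
    }
}
```

The policy check reflects the Screen Time setting mentioned above, so an app is expected to skip analysis entirely when the feature is disabled.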

iOS 17, iPadOS 17, and macOS Sonoma will be released later this year. The updates are currently available in beta for users with an Apple developer account.

Article Link: iOS 17 Expands Communication Safety Worldwide, Turned On by Default
 
I wish they had this for social media! It’s disgusting the stuff I see on a daily basis! On Twitter I got a suggested video (of all things) of someone getting stabbed? LIKE WHAT THE ACTUAL HELL! Why would I want to see anything like that! Or constant posts of nudity and porn, I’m never letting my kids anywhere near that insanity. Disgusting stuff everywhere. I wish I could just use it loaded up with content blockers and protect my sanity, but alas, the world is screwed and I either have to accept it or quit. Blows my mind those are the only options for users…
 
This is different from CSAM detection. This checks the photos on device; no info is sent to Apple.
Essentially it is the same machine looking for something, be it sensitive images (whatever that means) or CSAM or union activity; it doesn't really matter, this machine looks for whatever someone told it to look for. Also, for children a notification is sent to the parents IIRC, which infringes on the privacy of the children, especially if it is a false positive. And of course you can send it to anyone: Apple, the FBI, Europol, YouTube.

So it might be a good tool for some, but I think it has its issues and we shouldn't look away. It can improve safety, but it could also become part of a surveillance environment, which would have been the fever dream of people like Erich Mielke.

Where this is happening doesn't matter.
 
Essentially it is the same machine looking for something, be it sensitive images (whatever that means) or CSAM or union activity; it doesn't really matter, this machine looks for whatever someone told it to look for. Also, for children a notification is sent to the parents IIRC, which infringes on the privacy of the children, especially if it is a false positive.
The two mechanisms are completely different. The CSAM scanning mechanism was never machine learning. It was looking for matches to a specific set of images already in the possession of NCMEC (National Center for Missing and Exploited Children), which is the only entity authorized to catalog such images. No “looking for things that look like naughty bits”, it was only looking for a specific set of images. The technical paper that explains the mechanism is freely available.

This mechanism is entirely different from the CSAM detection mechanism, and it does look for nudity, with machine learning. If it finds something it thinks might be that, it tells the person holding the phone, right at the point of being about to view the image. The notion of sending messages to the parents was removed very early on, when it was pointed out that some kids might be in unsafe situations (like, say, parents who would harm their kids if they found out their kid was gay). So it isn't sending a notification to anybody; it’s just asking the kid if they really want to see the image - that’s all.
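
To make the distinction concrete, here is a purely illustrative sketch (none of this is Apple's code, and the function names are invented): the withdrawn CSAM proposal was a membership check against a fixed catalog of image hashes, while Communication Safety runs an on-device classifier over the image itself and only informs the person about to view it.

```swift
import Foundation

// Illustrative only; not Apple's implementation of either system.

// 1) The withdrawn CSAM proposal: an image's perceptual hash is checked for
//    membership in a fixed set of known catalog hashes. Nothing is "recognized";
//    only a match against that specific catalog can trigger anything.
func matchesKnownCatalog(imageHash: Data, catalog: Set<Data>) -> Bool {
    catalog.contains(imageHash)
}

// 2) Communication Safety: an on-device model scores the image content itself.
//    nudityScore(for:) is a hypothetical stand-in for the real, private model,
//    and the result is shown only to the person about to view the image.
func nudityScore(for imageData: Data) -> Double {
    // Placeholder: a real implementation would run a trained classifier here.
    return 0.0
}

func shouldWarnViewer(about imageData: Data, threshold: Double = 0.8) -> Bool {
    nudityScore(for: imageData) >= threshold
}
```

The first approach can only ever flag exact catalog matches; the second makes a judgment about new images, which is why it runs, and stays, on the device.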
 
I’d argue pictures of those stupid challenges (the blackout challenge, for example) are more dangerous than nudity. Imitation of certain illegal activities such as excessive speed driving is also bad. On the other hand, doing that sort of filter would step into mass surveillance territory, so idk…
 
I’d argue pictures of those stupid challenges (the blackout challenge, for example) are more dangerous than nudity. Imitation of certain illegal activities such as excessive speed driving is also bad. On the other hand, doing that sort of filter would step into mass surveillance territory, so idk…
I partially agree; I wish Apple would just ban TikTok because of stupid crap like that. But I also think that making sure kids don’t see porn is important.
 
Is it possible to block only male nudity?
That'll be the day I sell all my AAPL.

I really appreciate them keeping privacy at the forefront, off their servers, and finally enabling something useful ON by default.

Wonder how many days after launch till there's enough feigned outrage for corporate to change course and toggle it OFF by default?
 
So that’s why iOS 17 needs A12 and above: because of the picture scanning. How is it done in iPadOS 17, which supports older SoCs?
 