Apple today held a question-and-answer session with reporters regarding its new child safety features, and during the briefing, Apple confirmed that it would be open to expanding the features to third-party apps in the future.


As a refresher, Apple unveiled three new child safety features coming to future versions of iOS 15, iPadOS 15, macOS Monterey, and/or watchOS 8.

Apple's New Child Safety Features

First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when receiving or sending sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, the photo will be automatically blurred and the child will be warned.
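Apple has not published any of this as developer API, so purely as a way to picture the flow described above, here is a minimal Swift sketch. The ExplicitImageClassifier protocol and IncomingAttachmentHandler type are invented for illustration; only the overall shape, classify on device and blur before display, comes from Apple's description.

```swift
import UIKit
import CoreImage

// Hypothetical stand-in for Apple's private on-device model; Apple has not
// published this API, so the protocol and names below are invented.
protocol ExplicitImageClassifier {
    func isLikelyExplicit(_ image: CGImage) -> Bool
}

struct IncomingAttachmentHandler {
    let classifier: ExplicitImageClassifier
    let context = CIContext()

    /// Returns the image to display plus a flag telling the caller to show
    /// the warning sheet. Classification and blurring both happen on device.
    func imageForDisplay(_ original: UIImage) -> (image: UIImage, flagged: Bool) {
        guard let cgImage = original.cgImage,
              classifier.isLikelyExplicit(cgImage) else {
            return (original, false)
        }

        // Blur the flagged photo before it is shown to the child.
        let input = CIImage(cgImage: cgImage)
        let blur = CIFilter(name: "CIGaussianBlur")!
        blur.setValue(input, forKey: kCIInputImageKey)
        blur.setValue(30.0, forKey: kCIInputRadiusKey)

        guard let output = blur.outputImage,
              let rendered = context.createCGImage(output, from: input.extent) else {
            return (original, true)
        }
        return (UIImage(cgImage: rendered), true)
    }
}
```

Per Apple's description, both the classification and the blurring happen locally, so the photo is never sent anywhere for analysis.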

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos and not videos.

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.

Expansion to Third-Party Apps

Apple said that while it does not have anything to share today in terms of an announcement, expanding the child safety features to third parties so that users are even more broadly protected would be a desirable goal. Apple did not provide any specific examples, but one possibility could be the Communication Safety feature being made available to apps like Snapchat, Instagram, or WhatsApp so that sexually explicit photos received by a child are blurred.

Another possibility is that Apple's known CSAM detection system could be expanded to third-party apps that upload photos to services other than iCloud Photos.
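Apple gave no specifics on what a third-party expansion would look like, so the following Swift sketch is pure speculation: a rough idea of the kind of surface an app like Snapchat or WhatsApp might adopt if Communication Safety were ever exposed as a framework. Every name and type below is invented.

```swift
import UIKit

// Purely speculative sketch of what a third-party Communication Safety API
// could look like. Apple has announced no such framework; all names are invented.
enum SensitiveContentVerdict {
    case clear
    case likelyExplicit
}

protocol CommunicationSafetyChecking {
    /// Runs entirely on device; the image never leaves the phone for analysis.
    func analyze(_ image: CGImage) async -> SensitiveContentVerdict
}

/// How a messaging app might gate an incoming attachment for a child account:
/// returns true to show the photo normally, false to blur it and show a warning.
func shouldDisplayUnblurred(_ attachment: UIImage,
                            using checker: CommunicationSafetyChecking) async -> Bool {
    guard let cgImage = attachment.cgImage else { return true }
    switch await checker.analyze(cgImage) {
    case .clear:
        return true
    case .likelyExplicit:
        return false
    }
}
```

Keeping the analysis behind an on-device protocol like this would be one way to preserve the privacy properties Apple says any expansion must not undermine.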

Apple did not provide a timeframe for when the child safety features could expand to third parties, noting that it still has to complete testing and deployment of the features, and the company also said it would need to ensure that any potential expansion would not undermine the privacy properties or effectiveness of the features.

Broadly speaking, Apple said that expanding features to third parties has been the company's general approach ever since it first allowed third-party apps with the App Store on iPhone OS 2 in 2008.

Article Link: Apple Open to Expanding New Child Safety Features to Third-Party Apps
 
Third party Apps? Come on now.

This is just getting creepier and creepier.
Is it?
Personally, if I had a kid under the age of 13, and I had that restriction of sending or receiving explicit photos turned on, I would want it to be universal across all apps, first party and third party.
I wouldn’t want my kid sneaking onto Snapchat to send stuff or receive stuff they couldn’t send or receive on iMessage.
 
Is it?
Personally, if I had a kid under the age of 13, and I had that restriction of sending or receiving explicit photos turned on, I would want it to be universal across all apps, first party and third party.
I wouldn’t want my kid sneaking onto Snapchat to send stuff or receive stuff they couldn’t send or receive on iMessage.
I agree with you 100%. However, Apple is not obligated to scan our iPhones. There has to be an alternative way to fight against child pornography.

Privacy is being exposed to its fullest, especially now that a third party will be involved. Imagine if the information/photos get leaked by a third party. Who's held responsible for that?
 
Apple used to be about privacy and security. Not anymore. Apple has no high ground left to stand on.
What I don't get is that they are still using this privacy and security high ground against Epic. They are now just talking out of both sides of their mouth. It's really getting worse and worse. I'm such a huge Apple fan and really invested in their ecosystem, but damn it, this makes me want to move back to Linux more and more. I moved from Linux to Apple a few years ago.
 
It seems like a lot of people here in the comments are confusing the CSAM detection system with this child safety feature. This is only enabled for children as a parental control feature. I see nothing wrong or creepy about this but I do agree that the CSAM detection system is creepy.
 
Or you know, just don't have iCloud Photos turned on.
Or be like 99.999% of people and don't worry about features that will never apply to you.
That's not how this works..... that's not how any of this works!

haha... anyways - this argument is super weak and just begging for exploits/issues. Typical weak-ass defense of mass surveillance. "I'm not hiding anything, why do I care if the police randomly pull me over and throw me out of my car and search it." This thinking rapidly escalates, and it's a VERY slippery slope that's hard to turn back from.
 
I agree with you 100%. However, Apple is not obligated to scan our iPhones. There has to be an alternative way to fight against child pornography.

Privacy is being exposed to its fullest.
Apple *is not* scanning your photos.
That’s not how it works.
There's a one in one trillion chance that anyone from Apple will ever see any of your images.
 
Apple *is not* scanning your photos.
That’s not how it works.
There's a one in one trillion chance that anyone from Apple will ever see any of your images.
Quoted directly from Apple:

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

Sounds like scanning to me.
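Setting aside whether to call it scanning, the per-photo flow Apple describes in that passage can be sketched roughly as follows. The real system relies on perceptual image hashing and a private set intersection protocol that are far more involved than this; the types and closures below are invented stand-ins, not Apple's actual code.

```swift
import Foundation

// Invented stand-ins for Apple's perceptual hashing and private-set-intersection
// machinery. This only mirrors the shape of the flow Apple describes, not the real crypto.
struct PerceptualHash: Hashable {
    let bits: Data
}

struct SafetyVoucher {
    let sealedMatchResult: Data    // in the real system, the device cannot read this on its own
    let encryptedImageInfo: Data
}

/// Per-photo step performed on device before the photo is uploaded to iCloud Photos.
func makeSafetyVoucher(for photo: Data,
                       knownCSAMHashes: Set<PerceptualHash>,
                       perceptualHash: (Data) -> PerceptualHash,
                       seal: (Data) -> Data) -> SafetyVoucher {
    // 1. Derive a perceptual hash of the photo on device.
    let hash = perceptualHash(photo)

    // 2. Check it against the on-device database of known CSAM hashes. The real
    //    private set intersection protocol yields a result the device itself cannot
    //    interpret; this sketch fakes that by immediately sealing the boolean away.
    let matched = knownCSAMHashes.contains(hash)
    let sealedResult = seal(Data([matched ? 1 : 0]))

    // 3. Package the voucher that is uploaded to iCloud Photos alongside the image.
    return SafetyVoucher(sealedMatchResult: sealedResult,
                         encryptedImageInfo: seal(hash.bits))
}
```

Per Apple's description, the device packages the match result in a form it cannot interpret on its own, and the voucher only leaves the device as part of an iCloud Photos upload.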
 
Apple *is not* scanning your photos.
That’s not how it works.
There's a one in one trillion chance that anyone from Apple will ever see any of your images.
Actually that's EXACTLY what it does. As pointed out above, they scan your photos on-device before they go into the cloud, to "protect your privacy." Apple doesn't give a **** if you have CSAM images on your phone (which is why this doesn't work if you keep iCloud Photos off) - they do however very much care if you store CSAM images on their servers, i.e. iCloud Photos. Thus, they are LITERALLY scanning your photos. Please properly inform yourself.
 
if one of those 3rd party apps happens to be from a developer that's 47% owned by a Chinese tech company... which is controlled by a Chinese state corporation with direct ties to the Chinese Communist Party... well... they'll never use this as a backdoor into scanning the iPhones of U.S. users to detect unflattering images of Chinese communist leaders... right?
 