
Last week, Apple previewed new child safety features that it said will be coming to the iPhone, iPad, and Mac with software updates later this year. The company said the features will be available in the U.S. only at launch.

[Image: iPhone Communication Safety feature in Messages]

A refresher on Apple's new child safety features from our previous coverage:
First, an optional Communication Safety feature in the Messages app on iPhone, iPad, and Mac can warn children and their parents when they receive or send sexually explicit photos. When the feature is enabled, Apple said the Messages app will use on-device machine learning to analyze image attachments, and if a photo is determined to be sexually explicit, it will be automatically blurred and the child will be warned (a sketch of this flow follows the refresher below).

Second, Apple will be able to detect known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, enabling Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with U.S. law enforcement agencies. Apple confirmed today that the process will only apply to photos being uploaded to iCloud Photos, not to videos (see the matching sketch below).

Third, Apple will be expanding guidance in Siri and Spotlight Search across devices by providing additional resources to help children and parents stay safe online and get help with unsafe situations. For example, users who ask Siri how they can report CSAM or child exploitation will be pointed to resources for where and how to file a report.
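To picture the first feature, here is a minimal Swift sketch of an on-device classify-then-blur flow of the kind Apple describes. The `ExplicitImageClassifier` protocol, the confidence threshold, and the attachment type are illustrative assumptions, not Apple's actual model or API; the only real framework call is Core Image's standard Gaussian blur.

```swift
import CoreImage

// Hypothetical on-device classifier; Apple has not published its model or API.
protocol ExplicitImageClassifier {
    /// Returns a confidence in [0, 1] that the image is sexually explicit.
    func explicitConfidence(for image: CIImage) -> Double
}

struct IncomingAttachment {
    let image: CIImage
    var isBlurred = false
    var childWasWarned = false
}

/// Sketch of the classify-then-blur flow: analysis happens on the device,
/// and nothing leaves it.
func screenAttachment(_ attachment: inout IncomingAttachment,
                      using classifier: ExplicitImageClassifier,
                      threshold: Double = 0.9) { // threshold value is an assumption
    guard classifier.explicitConfidence(for: attachment.image) >= threshold else {
        return // not flagged: the photo passes through untouched
    }
    // Blur the preview with a standard Core Image filter and mark the
    // attachment so the UI can show the warning described above.
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(attachment.image, forKey: kCIInputImageKey)
    blur.setValue(20.0, forKey: kCIInputRadiusKey)
    if let blurred = blur.outputImage {
        attachment = IncomingAttachment(image: blurred,
                                        isBlurred: true,
                                        childWasWarned: true)
    }
}
```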
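The second feature, detection of known images, is generally built on perceptual hashing: each photo queued for upload is hashed and compared against a database of hashes of previously identified CSAM, and nothing is surfaced for human review until a threshold number of matches is reached. The Swift sketch below shows that general pattern only; the hash type, the threshold value, and all names here are placeholders, not Apple's NeuralHash implementation.

```swift
import Foundation

// Placeholder for a perceptual hash. Apple's system derives its own
// "NeuralHash" from image content; that algorithm is not reproduced here.
typealias PerceptualHash = UInt64

struct UploadCandidate {
    let photoID: UUID
    let hash: PerceptualHash
}

/// Threshold-gated matching against a database of known-image hashes.
/// `matchThreshold` is an assumption; Apple has said only that a match
/// threshold exists before anything is flagged.
struct KnownImageMatcher {
    let knownHashes: Set<PerceptualHash>
    let matchThreshold: Int

    /// Returns the IDs to flag for review, and only if the number of
    /// matches across the upload set reaches the threshold.
    func photosToFlag(in uploads: [UploadCandidate]) -> [UUID] {
        let matches = uploads.filter { knownHashes.contains($0.hash) }
        return matches.count >= matchThreshold ? matches.map(\.photoID) : []
    }
}
```

Consistent with Apple's statement above, matching of this kind would apply only to photos at the point of upload to iCloud Photos, not to videos.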
Since announcing the plans last Thursday, Apple has received some pointed criticism, ranging from NSA whistleblower Edward Snowden claiming that Apple is "rolling out mass surveillance" to the non-profit Electronic Frontier Foundation claiming that the new child safety features will create a "backdoor" into the company's platforms.

"All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children's, but anyone's accounts," cowrote the EFF's India McKinney and Erica Portnoy. "That's not a slippery slope; that's a fully built system just waiting for external pressure to make the slightest change."

The concerns extend to the general public, with over 7,000 individuals having signed an open letter against Apple's so-called "privacy-invasive content scanning technology" that calls for the company to abandon its planned child safety features.

At this point, it does not appear that any negative feedback has led Apple to reconsider its plans. We confirmed with Apple today that the company has not made any changes to the timing of the new child safety features — that is, they are still due later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. With the features not expected to launch for several weeks to months, though, the plans could still change.

Apple sticking to its plans will please several advocates, including Julie Cordua, CEO of the international anti-human trafficking organization Thorn.

"The commitment from Apple to deploy technology solutions that balance the need for privacy with digital safety for children brings us a step closer to justice for survivors whose most traumatic moments are disseminated online," said Cordua.

"We support the continued evolution of Apple's approach to child online safety," said Stephen Balkam, CEO of the Family Online Safety Institute. "Given the challenges parents face in protecting their kids online, it is imperative that tech companies continuously iterate and improve their safety tools to respond to new risks and actual harms."

Article Link: Apple Remains Committed to Launching New Child Safety Features Later This Year
 
Those who are complaining obviously did not read how the technology works.

You have a higher chance of winning the lottery than Apple erroneously looking through your photos.
That's not the problem. The big issue is creating a method that allows Apple to review content that's supposed to be private. It opens the possibility for someone to add 'reasons' to review any content they want to accuse a user of creating, saving, or sharing. Once it has been created and is active, the risk increases that others may figure out how to piggyback on that system and modify it for their own purposes. All privacy is at risk then.

This also sets up a situation where a court can say, "See, Apple does have a method to examine contents without permission from the device owners, therefore we can order Apple to allow an investigatory agency access to that content."

Not good.
 
They're getting ahead of the govts. worldwide with this so they can dictate the terms. They know that giving the finger to govts. regarding phone access will end at some point.

What I don't like here is just that...

Apple should refuse to help governments go through people's files for any reason (absent just cause and a warrant) and force this to be a political issue in front of voters and politicians.

Corporations with any backbone - as I thought Apple had - should be pushing this up to a public policy debate.

This is a chicken-ish approach by Apple.
 
Those who are complaining obviously did not read how the technology works.
Oh, damn. Another case where all these experts with PhDs and professorships in relevant fields, giving talks at Def Con & co. and consulting for companies and governments around the world have not read about how the technology works. Problem solved. What a world we live in. 😂
 
They're getting ahead of the govts. worldwide with this so they can dictate the terms. They know that giving the finger to govts. regarding phone access will end at some point.
I don’t know about that. At the risk of shutting down this thread (sorry in advance), how are they going to resist the pressure to give this tech to China? All the CCP has to do is say we want this software engineering or you no longer have access to our market.

This will be reverse engineered by somebody for nefarious purposes.
 
This IS a backdoor for censorship of any kind, period. It must not be allowed to be implemented, since the pattern it searches for is easily changed, and then in some countries gays will be reported, or certain religions, or, even worse, the opposition...

This must stop before it starts, or I'll have bought my last Apple device for sure! Would be a shame!
 
As I’m learning more about how it is supposed to work, as far as I understand it, the current tech is fine by me and not as invasive as some make it out to be. As to what the future additions to the system may be... it can still go either way, but it’s not my concern right now. No other comparable platform will offer more privacy, not until you ditch tech altogether. That’s the price I’ll pay for convenience. That’s my take so far.

Edit: To clarify, I’m no expert on this subject; I can’t possibly know all the ins and outs. The coming months will surely shed more light on this, and I’ll be interested to see the response from governments and the like.
 
I don’t know about that. At the risk of shutting down this thread (sorry in advance), how are they going to resist the pressure to give this tech to China? All the CCP has to do is say we want this software engineering or you no longer have access to our market.

This will be reverse engineered by somebody for nefarious purposes.

The whole China angle is its own issue.

They basically should have two sets of iOS and rules to deal with that country if they are going to keep operating there.

China doesn't even really hide what sort of regime they are running there.

The same isn't true of other first-world countries, which at least purport to be somewhat democratic and to prioritize privacy and liberty for their citizens.
 
Is Tim Cook about to retire and, like a disgruntled employee walking away from a dumpster fire they started, leave this mess for someone else to inherit?

This isn’t the same Apple I’ve bought from for almost three decades.
As said above, they are up to something or their hand is being forced. Caring for children? Then why do album covers with near nudity get prominence on the Music app browse page?
 