Too late.

Now that I've been woken up, there's no going back. In these short weeks I've realized how unsustainable it is to trust a company with these decisions.

I've also realized how weak our other options, like Linux, are.

Linux is not viable right now for regular use by regular people. But it's close.

The day will come when Apple does implement something like this. Our only solution is to build strong alternatives that are open source and end-to-end encrypted. On that day we need Linux and open source options to be ready.
 
Just more reasons I'll never buy an iPhone, or use the "cloud" for anything (especially Apple's iCloud).
People are so afraid that children are being exploited through their phones by predators. Hmmm.... Maybe kids shouldn't have a PHONE.
We have completely lost our minds, worrying, wringing our hands about "our children", while we hand them a connection to the world wide internet that they can carry around in their pocket. Why don't you just give your kid a hand grenade and "trust them", while you're at it?
Does anybody honestly think that if a teenage girl starts conversing with some perv on her phone, that she's not going to be keenly aware of these "safety features" and thwart them? I give it a month before there are convenient apps a kid can throw on the phone that will effectively encrypt photo files so they won't be detected as "inappropriate" by this ridiculous "safety" feature.
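For what it's worth, the evasion is trivial to demonstrate. Here's a toy Python sketch of my own (a plain cryptographic hash stands in for Apple's NeuralHash, which is perceptual rather than cryptographic, and the XOR "cipher" stands in for real encryption): a scanner can only match bytes it can actually read.

```python
import hashlib

def content_fingerprint(data: bytes) -> str:
    # Stand-in for a content hash; Apple's system used a perceptual
    # hash (NeuralHash), not SHA-256, but the principle is the same.
    return hashlib.sha256(data).hexdigest()

photo = b"pretend these are the JPEG bytes of a known flagged image"
known_fingerprints = {content_fingerprint(photo)}

key = 0x5A  # any fixed non-zero byte; stand-in for a real cipher like AES
encrypted = bytes(b ^ key for b in photo)

print(content_fingerprint(photo) in known_fingerprints)      # True
print(content_fingerprint(encrypted) in known_fingerprints)  # False
```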
 
Absolutely correct. That’s why they’re taking “a few months to think about it more.”

As soon as they have everyone’s iPhone 13 money, they’ll resume the plan to scan everyone’s camera rolls.

After iPhone 13 comes iPhone 14.

I think they’re trying to let it die down in terms of media exposure, and depending on how that goes, they’ll decide.

Of course short term it’s all about the new iPhone release.
 


Apple today announced that it has delayed the rollout of the Child Safety Features it unveiled last month, following negative feedback.


The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Apple confirmed that feedback from customers, non-profit and advocacy groups, researchers, and others about the plans prompted the delay, giving the company time to make improvements.

Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by releasing detailed information, publishing FAQs and other new documents, giving interviews with company executives, and more.

The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the "critically important" features, but the company still appears to be intent on releasing them.

Article Link: Apple Delays Rollout of Controversial Child Safety Features to Make Improvements


If Apple can get Edward Snowden and the EFF to approve their plan, they can go for it.

Really, they should have asked the EFF for a critique before making this obvious mistake!

I mean, Apple NOW realizes they made a mistake. Thank goodness for people and groups like the EFF pointing out the weaknesses.
 
Just more reasons I'll never buy an iPhone, or use the "cloud" for anything (especially Apple's iCloud).
People are so afraid that children are being exploited through their phones by predators. Hmmm.... Maybe kids shouldn't have a PHONE.
We have completely lost our minds, worrying, wringing our hands about "our children", while we hand them a connection to the world wide internet that they can carry around in their pocket. Why don't you just give your kid a hand grenade and "trust them", while you're at it?
Does anybody honestly think that if a teenage girl starts conversing with some perv on her phone, that she's not going to be keenly aware of these "safety features" and thwart them? I give it a month before there are convenient apps a kid can throw on the phone that will effectively encrypt photo files so they won't be detected as "inappropriate" by this ridiculous "safety" feature.

While I believe Apple is 100% wrong to deploy the camera roll scanning plan, I think your solution of just never giving a kid a phone is absolutely ridiculous.

There were some in the 1990s who said “maybe you shouldn’t let your kid use the computer!”

There were some in the 1940s who said “maybe you shouldn’t let your kid have a library card!”

Maybe work harder at being a better, more mindful, more conscientious, more influential parent so that your kids grow up with a healthy sense of mutual trust and independence?
 
If Apple can get Edward Snowden and the EFF to approve their plan, they can go for it.

Really, they should have asked the EFF for a critique before making this obvious mistake!

I mean, Apple NOW realizes they made a mistake. Thank goodness for people and groups like the EFF pointing out the weaknesses.
Their mistake wasn’t technical. Their mistake was that they didn’t foresee the public backlash. There is no version of spying on your device at a consumer electronics company’s discretion that isn’t spying on your device at a consumer electronics company’s discretion.

It doesn’t matter how you twist it technically, it’s about the precedent. It’s conceptual. It’s philosophical.
 
I just don’t see how they can thread the needle with this. The goal is noble, but the reach is just really wide and even the most minor screw up or misflagged item - wouldn’t that really have the potential to cause innocent people some serious headaches?
 
I wonder how many additional children will be victimized from now until then? ...
I understand your and Apple's motivations to apprehend pedophiles, but the system does not prevent new abuse. It focuses on evidence of past abuse. And we should wonder how many children (and adults) will get abused if Apple's spying scheme is co-opted or copied by authoritarian governments.
People really don't understand this system...
...
Their main mistake was forgetting that people don't read past the alarmist headlines.
Many of the people who object to this know perfectly well what the system entails, including me. Moreover, because Apple cannot risk revealing all details of the system for fear of an arms race with pedophiles, informed speculation is unavoidable. So stop trying to explain away the objections as a lack of understanding. Perhaps you do not know enough about the subject to understand our objections.
The system sounded incredibly secure to me. I understand the concern about nefarious governments demanding that Apple scan for other types of content, but if China (for example) wanted to do that, they would and will do it anyway.
I think you miss the point. Apple just published a detailed blueprint on how to simultaneously spy on people but give them 'privacy'. Authoritarian governments will gobble that up and emulate it even if Apple decides to abandon this lunacy. Moreover, Apple proposed to use mobile phones as part of a distributed AI surveillance system to look for banned content. When mobile phones become powerful enough, local AI might allow authoritarian governments to monitor your pictures, browsing, texts, and even voice, in real time, for banned content. That's not a slippery slope. That's a cliff. The technology is almost upon us. We have to hold the line now, or privacy will be a thing of the past. Apple should never have opened Pandora's Box. The company needs to hire people who aren't engineers and who can see two steps ahead in the social and legal domains.

Anyway, while I have no objection to the child safety features, I sincerely hope Apple just outright cancels the CSAM scanning. Until then, my MR signature stands.
 
I just don’t see how they can thread the needle with this. The goal is noble, but the reach is just really wide and even the most minor screw up or misflagged item - wouldn’t that really have the potential to cause innocent people some serious headaches?
How is it noble for a consumer electronics company to go into the law enforcement business as it pleases? What’s next, Walmart catching terrorists?

What happens to elected officials, individual rights, privacy…?

Seriously, what’s so hard to understand?
 
Good news. Thanks for listening.

What they’re listening to are reports that the number of people interested in switching from Android to iOS is falling.

The problem is the on-device scanning. Rene Ritchie suggested moving the scan to an intermediate server before sending the files that pass the check up to iCloud, but I suspect Apple was looking to move the cost of processing to its customers’ phones.
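A rough sketch of the distinction, with hypothetical names and SHA-256 standing in for the perceptual hash: the matching logic is identical either way; what changes is whose hardware runs it, and whose electricity pays for it.

```python
import hashlib

# Illustrative blocklist: fingerprints of known images, supplied to
# whichever machine does the scanning. SHA-256 stands in for NeuralHash.
KNOWN_IMAGE = b"bytes of a known flagged image"
BLOCKLIST = {hashlib.sha256(KNOWN_IMAGE).hexdigest()}

def matches_blocklist(photo: bytes) -> bool:
    return hashlib.sha256(photo).hexdigest() in BLOCKLIST

def scan_on_device(photo: bytes) -> tuple[bytes, bool]:
    """Apple's announced design: the customer's phone spends the CPU."""
    return photo, matches_blocklist(photo)

def scan_on_relay_server(photo: bytes) -> tuple[bytes, bool]:
    """Ritchie's suggestion: an intermediate server that Apple pays for
    scans each file in transit, before it ever reaches iCloud storage."""
    return photo, matches_blocklist(photo)

print(scan_on_device(KNOWN_IMAGE)[1])                       # True
print(scan_on_relay_server(b"ordinary vacation photo")[1])  # False
```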

Frankly, I’m amazed that Apple didn’t realise how this would play out. All that goodwill, out the window …
 
I just don’t see how they can thread the needle with this. The goal is noble, but the reach is just really wide and even the most minor screw up or misflagged item - wouldn’t that really have the potential to cause innocent people some serious headaches?
No. Not at all.
 
I'm puzzled by all the celebration. It's a delay, not a cancellation. "Delay to make improvements" in corporate-speak means, "wordsmithing the next press release to make the news more palatable to customers". It's what happens when the first attempt at addressing "confusion" was unsuccessful. ;)
We will have to wait and then judge it based on the new information. No sense in worrying about it until we know what their next plan is. Maybe they’ll move it completely off device but use the same technology.
 
It's a poor solution to a serious problem, and Apple has had little success countering criticism. It's as if they didn't ask themselves any questions. As if they didn't prepare for any opinion besides unconditional praise. I don't understand how nobody at Apple said, "Hey, we've been making privacy the centerpiece of our company's ethos for at least the last five years; we may get pushback if we don't handle this right," or, if someone did, why they weren't taken seriously. Not consulting privacy advocates, waiting to announce it until shortly before its implementation, having no talking points to effectively defend their decisions: everything about this seemed half-baked.

I fear it's too little, too late: Apple has already revealed it has a surveillance tool waiting in the wings. The only thing needed now is a government willing to force Apple's hand. Everybody should brace themselves for Apple's stock excuse: we are just complying with local law.
 
Except end-to-end encryption should prevent anything on Apple's servers from being identified as CSAM, because that's the point of end-to-end encryption. Researchers figured out a way to circumvent this, and then immediately advised companies NOT to do it.
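To make the point concrete, here's a minimal sketch using the third-party cryptography package (the names and data are made up): once the client encrypts, the bytes on the server are useless for content matching, which is exactly why Apple pushed the scan onto the device.

```python
import hashlib
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # lives only on the user's devices
photo = b"jpeg bytes of a family photo"

stored_on_server = Fernet(key).encrypt(photo)  # all the server ever holds

# Any hash the server computes reflects the ciphertext, not the photo:
print(hashlib.sha256(photo).hexdigest()[:16])
print(hashlib.sha256(stored_on_server).hexdigest()[:16])  # unrelated

# The only place plaintext exists is the device, so that's where any
# scan has to happen: the circumvention researchers warned against.
```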


Companies are required to report CSAM materials to the authorities. If Apple were to just shift them into a folder/album, then Apple would be hosting child exploitation materials themselves. The mere transfer of those materials to their own storage would make them, probably overnight, the largest host of CSAM on the internet.

I don't really know of a system that scans attachments on upload, aside from a file server itself, and in that case you're talking about a corporate file server. When you're sending work email through a work server, you're sending email on behalf of the employer, and they can do whatever they want.

But even then, the whole device isn't scanned for material, and encrypting the data prevents the email server from being able to scan it for viruses.

Looking at a file's size doesn't involve looking inside the file's content. Checking whether the file format is supported doesn't require the email to be read by someone. The message is simply rejected.
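In code, the difference is stark. A hypothetical mail-server check that rejects attachments on metadata alone, without a single byte of content being inspected:

```python
ALLOWED_EXTENSIONS = {".jpg", ".png", ".pdf"}
MAX_SIZE = 25 * 2**20  # 25 MiB, an assumed (typical) mail-server cap

def accept_attachment(filename: str, size_bytes: int) -> bool:
    # Metadata-only decision: neither the file nor the email is read.
    if size_bytes > MAX_SIZE:
        return False
    return any(filename.lower().endswith(ext) for ext in ALLOWED_EXTENSIONS)

print(accept_attachment("holiday.jpg", 3 * 2**20))  # True
print(accept_attachment("setup.exe", 1 * 2**20))    # False, rejected unread
```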

In this case, when the data reaches a threshold, it's passed to a human to look at the content. What happens when instead of CSAM materials they read something that's anti-government in a country where that's illegal? There's nothing that stops a government from requiring a company to scan every piece of email or every attachment for anti-government sentiment. It already happens in China and Hong Kong.
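Mechanically, that escalation step is just a counter and a threshold. A simplified sketch of my own below; Apple's actual scheme used threshold secret sharing so the server learns nothing below the threshold, and the company cited a figure of around 30 matches, but the control flow is the same. Notice that nothing in it is specific to CSAM: swap the blocklist and the identical machinery flags whatever content a government cares to enumerate.

```python
THRESHOLD = 30  # illustrative; Apple publicly cited roughly this figure

def review_queue(match_counts: dict[str, int]) -> list[str]:
    # Accounts at or past the threshold are escalated to a human
    # reviewer; everyone below it is never surfaced at all.
    return [acct for acct, n in match_counts.items() if n >= THRESHOLD]

counts = {"alice": 0, "bob": 2, "mallory": 31}  # hypothetical accounts
print(review_queue(counts))  # ['mallory']
```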

Let's say, you are having sex with your 15-year-old daughter, and it's consensual. You are not raping her against her will, and she's not bothered by it. So, she will not tell anyone about your act, let alone the police. Let's further assume that your wife or her mother is out of the picture for the sake of this thought experiment.

So, everything is fine then?

No.

Set aside the genetic issue: sex doesn't necessarily mean reproduction, so the biological problem with incest is the lesser concern here.

It's the moral, ethical and legal issues behind this act that make it wrong. Just because she is okay with it doesn't mean you should let her be okay with it. You are the adult here, and her dad, for Christ's sake. You should be the one she relies on, never one who betrays or takes advantage of her. Just because law enforcement agencies have no way of knowing about such acts doesn't mean you are legally immune should this one day come to light. Say her future husband notices her strange proclivities in the bedroom and digs deeper.
 