> Apple is really struggling to deliver anything this iOS version

Apple struggles this year in general.
> You beat me to the punch! I DID actually send him an email...

Are you going to town hall meetings, too?
> Great news and a small victory for the "screeching voices of the minority"!

That is probably why they are doing it.
My fear is they just want the iPhone 13 launch to happen without any bad press or having to launch it with 14.X.
> Ya know…I’ll take the risk to save even one child….

I'll assume that you drive a vehicle but are not a paedophile? In which case I will also assume that you will no longer be getting behind the wheel, since statistically that has a far greater chance of saving "just one child's" life.
> Ed Snowden, Permanent Record, pp. 208–209

Is he a reference point now, one ***** CIA agent?
Looks like they want to clear the path for iPhone season.
Absolutely correct. That’s why they’re taking “a few months to think about it more.”
As soon as they have everyone’s iPhone 13 money, they’ll resume the plan to scan everyone’s camera rolls.
Apple today announced that it has delayed the rollout of the Child Safety Features it unveiled last month, following negative feedback.
The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.
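For the iCloud Photos piece, the publicly described flow is: hash each image on-device, compare the result against a database of known CSAM hashes, and flag the account only after a match threshold is crossed. The sketch below is a loose approximation, not Apple's implementation: it substitutes a cryptographic hash for Apple's perceptual NeuralHash, and the blocklist contents, helper names, and threshold value are invented for the example.

```python
import hashlib

def image_digest(data: bytes) -> str:
    """Digest of the raw image bytes. Apple's real system uses a
    perceptual hash (NeuralHash) that tolerates resizing and
    re-encoding; a cryptographic hash is used here only to keep
    the sketch runnable."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of known-image digests (a harmless stand-in here).
KNOWN_HASHES = {image_digest(b"stand-in for a known CSAM image")}

def scan_library(images: list, threshold: int = 30) -> bool:
    """Flag the account only once the number of matches crosses a
    threshold, mirroring the multi-match design Apple described."""
    matches = sum(1 for img in images if image_digest(img) in KNOWN_HASHES)
    return matches >= threshold
```

The threshold is the part Apple emphasized in its FAQs: a single false match is not supposed to be enough to surface an account for review.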
Apple confirmed that feedback from customers, non-profit and advocacy groups, researchers, and others prompted the delay, giving the company time to make improvements. In a statement, the company said it would take additional time over the coming months to collect input and make improvements before releasing the features.
Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by publishing detailed information and FAQs, releasing various new documents, and giving interviews with company executives.
The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the "critically important" features, but the company still appears intent on releasing them.
Article Link: Apple Delays Rollout of Controversial Child Safety Features to Make Improvements
Just more reasons I'll never buy an iPhone, or use the "cloud" for anything (especially Apple's iCloud).
People are so afraid that children are being exploited through their phones by predators. Hmmm.... Maybe kids shouldn't have a PHONE.
We have completely lost our minds, worrying, wringing our hands about "our children", while we hand them a connection to the world wide internet that they can carry around in their pocket. Why don't you just give your kid a hand grenade and "trust them", while you're at it?
Does anybody honestly think that if a teenage girl starts conversing with some perv on her phone, that she's not going to be keenly aware of these "safety features" and thwart them? I give it a month before there are convenient apps a kid can throw on the phone that will effectively encrypt photo files so they won't be detected as "inappropriate" by this ridiculous "safety" feature.
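There is a real technical wrinkle behind this objection: a perceptual hash is designed to survive resizing and re-compression, but encryption produces bytes statistically unrelated to the original, so no hash computed over the ciphertext (perceptual or cryptographic) will match a database entry until the file is decrypted. A toy sketch of the idea (the single-byte XOR "cipher" and variable names are invented for illustration; a real app would use proper encryption such as AES):

```python
import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def xor_cipher(data: bytes, key: int = 0x5A) -> bytes:
    """Toy reversible transform standing in for real encryption."""
    return bytes(b ^ key for b in data)

photo = b"\x89PNG fake image bytes"

# The transformed file no longer matches any hash of the original...
assert digest(photo) != digest(xor_cipher(photo))

# ...but applying the transform again restores the original exactly.
assert xor_cipher(xor_cipher(photo)) == photo
```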
> If Apple can get Edward Snowden and the EFF to approve their plan - they can go for it.

Their mistake wasn’t technical. Their mistake was that they didn’t foresee the public backlash. There is no spying on your device based on a consumer electronics company's decision without spying on your device based on a consumer electronics company's decision.
Really, they should have asked the EFF for a critique before making this obvious mistake!
I mean - Apple NOW realizes they made a mistake. Thank goodness for people and groups like EFF pointing out the weaknesses.
> Looks like they want to clear the path for iPhone season.

Yep. It will roll out with iOS 15.1 or 15.2. They're delaying it to avoid a potential sales hit on the iPhone 13.
> I wonder how many additional children will be victimized from now until then? ...

I understand your and Apple's motivation to apprehend pedophiles, but the system does not prevent new abuse; it focuses on evidence of past abuse. And we should wonder how many children (and adults) will be abused if Apple's spying scheme is co-opted or copied by authoritarian governments.
> People really don't understand this system...

Many of the people who object to this know perfectly well what the system entails, including me. Moreover, because Apple cannot risk revealing all details of the system for fear of an arms race with pedophiles, informed speculation is unavoidable. So stop trying to explain away the objections as a lack of understanding. Perhaps you do not know enough about the subject to understand our objections.
Their main mistake was forgetting that people don't read past the alarmist headlines.
> The system sounded incredibly secure to me. I understand the concern about nefarious governments demanding that Apple scan for other types of content, but if China (for example) wanted to do that, they would and will do it anyway.

I think you miss the point. Apple just published a detailed blueprint for how to spy on people while giving them "privacy". Authoritarian governments will gobble that up and emulate it even if Apple abandons this lunacy. Moreover, Apple proposed using mobile phones as part of a distributed AI surveillance system to look for sanctioned content. When mobile phones become powerful enough, local AI might allow authoritarian governments to monitor your pictures, browsing, texts, and even voice, in real time, for sanctioned content. That's not a slippery slope; that's a cliff. The technology is almost upon us. We have to hold the line now, or privacy will be a thing of the past. Apple should never have opened Pandora's box. The company needs to hire people who aren't engineers and who can see two steps ahead in the social and legal domains.
> I just don’t see how they can thread the needle with this. The goal is noble, but the reach is just really wide, and even the most minor screw-up or misflagged item - wouldn’t that really have the potential to cause innocent people some serious headaches?

How is it noble for a consumer electronics company to go into the law-enforcement business as it pleases? What’s next, Walmart catching terrorists?
Good news. Thanks for listening.
> I just don’t see how they can thread the needle with this. The goal is noble, but the reach is just really wide, and even the most minor screw-up or misflagged item - wouldn’t that really have the potential to cause innocent people some serious headaches?

No. Not at all.
> I'm puzzled by all the celebration. It's a delay, not a cancellation. "Delay to make improvements" in corporate-speak means "wordsmithing the next press release to make the news more palatable to customers". It's what happens when the first attempt at addressing "confusion" was unsuccessful.

We will have to wait and then judge it based on the new information. There's no sense in worrying about it until we know what their next plan is. Maybe they’ll move it completely off device but use the same technology.
> So Apple cares about the screeching minority after all?

This is more like the screeching vast majority. I haven't heard a single person defend it. At best, I've heard people say something like "but think of the children", and even then it's been half-hearted.
Except that end-to-end encryption should prevent anything on Apple's servers from being identified as CSAM; that's the point of end-to-end encryption. Researchers figured out a way to circumvent this, and then immediately advised companies NOT to do it.
Opinion | We built a system like Apple’s to flag child sexual abuse material — and concluded the tech was dangerous. There’s a reason civil liberties groups are calling for Apple to abandon its system. (www.washingtonpost.com)
Companies are required to report CSAM materials to the authorities. If Apple were to just shift them into a folder/album, then Apple would be hosting child exploitation materials themselves. The mere transfer of those materials to their own storage would make them, probably overnight, the largest host of CSAM on the internet.
I don't really know of a system that scans attachments on upload, aside from a file server itself, and in that case you're talking about a corporate file server. When you're sending work email through a work server, you're sending email on behalf of the employer, and they can do whatever they want.
But even in that case, the whole device isn't scanned for materials, and encrypting the data prevents the email server from being able to scan it for viruses.
Looking at a file's size doesn't involve looking inside the file's content. Checking the supported file format doesn't require the email to be read by someone; it's simply rejected.
In this case, when the data reaches a threshold it's passed to a human to look at the content. What happens when, instead of CSAM materials, they read something that's anti-government in a country where that's illegal? There's nothing stopping a government from requiring a company to scan every piece of email or every attachment for anti-government sentiment. It already happens in China and Hong Kong.
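Part of why this escalation step worries critics is that the scan-threshold-human-review machinery is content-agnostic. A minimal sketch of the point (all names, the toy hash database, and the threshold values are hypothetical): the same code retargets completely by swapping the database it is handed.

```python
import hashlib

def sha(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def make_scanner(banned_hashes: set, threshold: int):
    """Nothing in the machinery itself is specific to CSAM: whoever
    controls the hash database controls what gets flagged."""
    def scan(items: list) -> list:
        hits = [item for item in items if sha(item) in banned_hashes]
        # Only past the threshold does anything reach a human reviewer.
        return hits if len(hits) >= threshold else []
    return scan

# Same code, two very different deployments:
csam_scan = make_scanner({sha(b"stand-in for a known abuse image")}, threshold=30)
dissent_scan = make_scanner({sha(b"stand-in for a banned pamphlet")}, threshold=1)
```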