Nailed it and nailed it. My only caveat would be "turning Apple into something they are not." How could they have even considered this? It seems that something has gone very wrong in the company. I wonder if the newly hired "woke" employees finally gained enough clout to push the development agenda, while the Jobsian old guard like Cook, Schiller, Cue have lost energy and clout. They seem to have gone from making "insanely great" products that improved the lives of their customers, to being totally "woke" and pursuing all sorts of social justice issues. A little of it we could take, but this was a bridge WAY too far.
Someone doesn't know the definition of woke.
 
Apple is terminating this not because it wants to, but because it is unpopular and sales would go down. If sales go down, Apple loses money and the stock price drops. Apple has no choice but to terminate it. Now the iPhone 13 rollout can move forward.
 
Companies are required to report CSAM to the authorities. If Apple were to just shift it into a folder/album, then Apple would be hosting child exploitation material itself. The mere transfer of that material to its own storage would make Apple, probably overnight, the largest host of CSAM on the internet.
I don't think that's true.

Apple has no way of knowing whether a match is a real match. If the system ever became accurate enough that this defence no longer held up legally, all they would have to do is use differential privacy to "fuzz" the results, artificially injecting false positives some of the time.
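Purely as an illustration of that idea, not Apple's actual protocol (Apple's published design describes a different mechanism, synthetic safety vouchers plus a match threshold), a randomized-response-style fuzz could look something like this:

```swift
import Foundation

// Hypothetical sketch only: report a "match" for a small fraction of non-matches,
// so that no single report proves a real match occurred. The function name and
// the 5% rate are invented for illustration.
func reportedMatch(actualMatch: Bool, falsePositiveRate: Double = 0.05) -> Bool {
    if actualMatch { return true }
    // Occasionally report a match even when there was none.
    return Double.random(in: 0..<1) < falsePositiveRate
}

// A real match is always reported; a non-match is reported roughly 5% of the time.
print(reportedMatch(actualMatch: false))
```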

Also, I don't think Apple is doing anything here while iOS and the iPhone are in the user's hands. Apple is not storing, or helping to store, anything.

Apple can simply reject a photo because it suspects it violates company policies; it doesn't have to be CSAM.

I think the current law is: as long as you are not storing CSAM on my server, and I don't know for certain that you are using my service (note: service, not product) to do so, I'm not liable. Just as Seagate is not liable if its hard drives are used to store CSAM, Apple is not liable if its iPhones are used to store CSAM.

If someone wanted to transfer CSAM, from a technical point of view they would use something like BitTorrent Sync or SFTP on AES-256 encrypted volumes.
 
Apple has given us no indication whatsoever that E2EE was going to happen and, in fact, has recently backed off plans to implement it. Certain three-letter agencies didn't like it.
Hence the hypothetical. I'd be happy to see an e2ee implementation, but based on the reaction to Apple's CSAM techniques, it seems it would be either a) impossible or b) highly disliked in Western democracies.

I don't think the nuance of how this technique balances privacy concerns has ever been discussed in the security literature, which is probably the biggest reason for the large pushback from much of (not all) the security community.

The back-and-forth rhetoric here about people understanding the tech or not is overblown, whether you're an ex-software dev or not. I've contributed to it to a degree, as what we do know about the technique has been misrepresented countless times on these forums. I'm a current CS PhD student studying some of the concepts at play here, and the truth is that none of us really understand this well, as there is not yet solid peer-reviewed literature about it for us to understand it well enough.

Had Apple presented the ideas openly and allowed public discourse (and peer-reviewed studies) to take place before setting a release date, there could have been a less emotionally driven conversation. Instead we have what we've seen over the last several weeks, which is really a shame, as I see real value in what they're proposing (the value being the hypothetical prospect of this being the enabling feature of e2ee).
 
Project needs to be cancelled, not delayed.
I’ll gladly and enthusiastically support increased funding for targeted law enforcement action against the networks that distribute this harmful content.
What Apple proposed, though, was (is) a major violation of the basic privacy and integrity of their users' devices, and is totally the opposite of Apple's commitment to privacy in recent years. And even though they claim it's just for this one issue, governments will soon demand it be expanded to other types of objectionable content and speech they don't agree with.
 
Just trash it completely. Now.

Once the weapon is made, there is no turning back. I'm sure that in the not-so-distant future some governments will take their inspiration from this. China probably already has a "CSAM" feature in the works.

As Apple once said in that terrorism case: "It would be wrong to intentionally weaken our products with a government-ordered backdoor. If we lose control of our data, we put both our privacy and our safety at risk."

 
Nailed it and nailed it. My only caveat would be "turning Apple into something they are not." How could they have even considered this? It seems that something has gone very wrong in the company. I wonder if the newly hired "woke" employees finally gained enough clout to push the development agenda, while the Jobsian old guard like Cook, Schiller, Cue have lost energy and clout. They seem to have gone from making "insanely great" products that improved the lives of their customers, to being totally "woke" and pursuing all sorts of social justice issues. A little of it we could take, but this was a bridge WAY too far.
I suspect outside pressure. Google, Dropbox, and Facebook are all doing this; it's in their "privacy reports". But those companies have a fundamentally different business model and MO.

Social justice is fine, great even. But this isn't social justice. It's treating your entire customer base as suspects without a court order or legal mandate to do so.
 
Clearly she and I disagree...
...
Emotional viewpoints like this are not anchored in reason, fact ...
You mean like "it's for the children" and "If it saves just one..." arguments?

... or any sort of technical understanding.
I wish people would stop repeating this. Sure, some people clearly misunderstood how this system of Apple's was designed to work, but most people understood it quite well, and still objected to it.

There's no privacy invasion for those that aren't storing questionable content. Zero. Zip.
That's kind of like saying it's okay to enact "hate speech" laws because nobody's freedom of expression will be infringed so long as they don't say or write anything that runs afoul of what the arbiters of "hate speech" judge unacceptable.
 
I find this very unlikely. Not one of these *******s is going to be so stupid as to sync with a public cloud service, where the evidence is easily acquired via court order. (Court orders are fine, btw.)

The type of animal we hunt is not dumb or ignorant. They are smart and cunning, and that makes them dangerous. As such, they do not use something like iCloud Photo Library.

It would affect us, the innocent ones, due to the slippery slope you mention. It would open us up to misuse of the technology while turning Apple into something they are not. Evidence needs to be gathered via appropriate court orders; it is one of the pillars of democracy. (And very inconvenient, I am sure.)

Yeah, this tool will not be effective at deterring abusers. It’s safety theatre meant to ease people’s minds while doing little to actually combat the problem. It reminds me of the criticisms security advocates have about the TSA, that it’s all to create an illusion of security, and that we shouldn’t be okay with invasive tactics in service of performative and ineffective policies.
 
This is good news. I applaud Apple for listening to criticism instead of just forging ahead with their head in the sand. This was a dangerous precedent that would have redefined the meaning of ownership, and also ironically a threat to children's rights to privacy and security in the future.
 
Maybe a compromise would be giving parents a toggle to optionally enable the feature on a child's phone? The receiving child's phone would then scan incoming images from texts, messaging apps, etc. This would help protect children while not forcing every single iPhone user to give up their privacy. Thoughts?
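As a rough sketch of how that opt-in could gate the scanning (the type and property names below are invented, and this is not how Apple's announced Communication Safety feature is actually implemented):

```swift
import Foundation

// Hypothetical illustration of the opt-in idea above: incoming images are only
// checked on a child account where a parent has explicitly enabled the toggle.
struct ParentalControls {
    var isChildAccount: Bool
    var incomingImageScanningEnabled: Bool   // parent-controlled toggle
}

func shouldScanIncomingImage(under controls: ParentalControls) -> Bool {
    controls.isChildAccount && controls.incomingImageScanningEnabled
}

// Adult accounts and child accounts without the toggle are never scanned.
print(shouldScanIncomingImage(under: ParentalControls(isChildAccount: true,
                                                      incomingImageScanningEnabled: true)))   // true
print(shouldScanIncomingImage(under: ParentalControls(isChildAccount: false,
                                                      incomingImageScanningEnabled: false)))  // false
```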
 
Not good enough. We need a hard CANCEL. This is just to get iPhone 13 sales and get people onto the new OS. This release is important, and stockholders are going to be watching ALL these numbers carefully. If there is a visible dent in services uptake, or cancellations, or iMessage being turned off, or people switching from Google to DuckDuckGo, or disabling iCloud Photos, etc., all of these will hurt the bottom line; investors will notice and demand a change. Even a tiny hit is enough to have a CEO removed.

I am not buying a single piece of new Apple hardware or spending another dollar with Apple until this is cancelled. Not one song, one subscription, or one movie. Not one piece of hardware. I will cut every tie I possibly can.
 
I don't think the nuance of how this technique balances privacy concerns has ever been discussed in the security literature, ...
ISTR it has been, in some of the literature. (No, I do not have cites handy, and I'm not going to try to find them.) The problem is that security researchers and privacy advocates feel this is such a fundamentally bad idea that its negatives outweigh any conceivable positives. I agree.

I see real value in what they're proposing (the value being in the hypothetical prospect of this being the enabling feature of e2ee).
Even if Apple had been promising E2EE (which they have not), I would still see on-device CSAM-scanning as a net negative. And not even close.
 
Apple once said in that terrorism case: "It would be wrong to intentionally weaken our products with a government-ordered backdoor. If we lose control of our data, we put both our privacy and our safety at risk."
"The only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it." - Apple, inc.

I think the nuance Apple is going for is that it's both a) not a backdoor (government-ordered or otherwise), and b) not quite as powerful a tool (i.e., not as tunable to other domains) as many think.

Whether these things are actually the case remains to be seen. Apple's documentation suggests that (a) and (b) are true, but that's coming from Apple. For all we know, the scans of our photos (and other documents) that they've been doing for years might already be used to track down dissenters.
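For what it's worth, the matching step Apple describes is conceptually just a lookup against a fixed, pre-shipped hash list; in the published design the list is blinded and compared via private set intersection, and matches only become readable server-side after a threshold is crossed. A stripped-down sketch of that fixed-list idea, with placeholder hashes:

```swift
import Foundation

// Simplified sketch of fixed-list matching (not Apple's NeuralHash/PSI protocol):
// the device can only flag images whose hashes appear in a pre-shipped set, so
// repurposing it for other content means shipping a different list.
let knownHashes: Set<String> = ["a1b2c3", "d4e5f6"]   // placeholder hash strings

func isFlagged(imageHash: String) -> Bool {
    knownHashes.contains(imageHash)
}

print(isFlagged(imageHash: "a1b2c3"))   // true
print(isFlagged(imageHash: "ffffff"))   // false
```

Whether that list could quietly be swapped for something else is, of course, exactly the slippery-slope worry.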
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.

And in other news, pedos breathe a sigh of relief as the DOJ decides not to ban roads, which have been proven to be used for criminal activity.

This comes hot on the heels of UK Prime Minister Boris Johnson throwing a bone to child predators everywhere by rejecting a bid to ban TCP packets, which, as we all know, are the backbone of the communications used by sex traffickers worldwide.
 
Turning off iCloud photos does not remove the code used to create and compare hashes from my device. Claiming "we won't use it unless you do X" is not good enough.
Do you have proof that turning off any feature actually turns it off? What I mean is... you've always felt like a feature is disabled when you disable it, right? How is this any different?

Disabling features does not remove the code from your device. Never has.
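A trivial sketch of that point (the flag and function names are made up and are not actual iOS internals):

```swift
import Foundation

// Disabling a feature typically just flips a runtime setting; the code path
// still ships on the device. Names are invented for illustration.
struct Settings {
    var iCloudPhotosEnabled = false   // what the user's toggle controls
}

func syncPhotosIfEnabled(using settings: Settings) {
    guard settings.iCloudPhotosEnabled else {
        return   // the hashing/upload path is skipped, but it still exists in the binary
    }
    // ... hashing and upload logic would run here ...
}

syncPhotosIfEnabled(using: Settings())
```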
 


Apple today announced that it is delaying the rollout of the Child Safety Features it unveiled last month, following negative feedback.


The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

Apple confirmed that feedback from customers, non-profit and advocacy groups, researchers, and others about the plans prompted the delay, giving the company time to make improvements. In a statement about its decision, Apple said it would take additional time over the coming months to collect input and make improvements before releasing these "critically important" child safety features.

Following the announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees. Apple has since endeavored to dispel misunderstandings and reassure users by releasing detailed information, sharing FAQs and various new documents, giving interviews with company executives, and more.

The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It is now unclear when Apple plans to roll out the "critically important" features, but the company still appears to be intent on releasing them.

Article Link: Apple Delays Rollout of Controversial Child Safety Features to Make Improvements

Again ... CSAM detection reminds me of when the government said ...

"Don't worry ... we're not listening to your calls. We're only collecting metadata about your calls. And it's to catch the terrorists."
 