"It is a tale
Told by an idiot, full of sound and fury,
Signifying nothing."

You're just full of bravado, going off at Apple because you are angry.

Apple has always been a secretive company, especially since Jobs returned in 1996. To expect openness and detailed explanations from Apple is to expect it to be a company it has probably never been.

Also, it's not a backdoor when you are told about it. A backdoor implies secrecy. There is no secrecy here.

Apple summarised the process and the underlying technologies in a technical document; here it is for reference: https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
 
  • Like
Reactions: brucemr
These features have good intentions and I’m fine with all of them except having iCloud Photo Library report you to law enforcement. That feature has got to go. As long as parents have control over the other features, I’ve got no issue with them.

Yes, you can disable iCloud Photo Library (and I have), but you can’t add storage capacity that you didn’t purchase up front if you bought your devices expecting that iCloud Photo Library would reduce your local storage requirements.

The point is that, for a company that touts privacy as one of its main selling points and uses it to justify higher hardware prices, this is an epic failure. Thankfully I always buy more storage than I think I’ll need, so shutting off iCloud Photo Library on my Mac, MacBook Air, iPhone, and iPad and downloading all of my photos onto my devices was feasible. But if I had a huge library and hadn’t bought devices with the capacity to hold it all because of iCloud Photo Library, I’d be rightfully ticked off.

For the “Who cares if you have nothing to hide?” crowd, see the 4th Amendment to the US Constitution. The founding fathers knew from experience that if there weren’t explicit protections preventing the federal government from conducting unreasonable searches and seizures, that power would be abused. This is no different. This technology will be abused by somebody at some point, be it Apple, hackers, a government entity, a ticked-off significant other who knows your passcode, etc. You mean to tell me US intelligence and the Russian and Chinese governments weren’t licking their chops when they heard about these features? If you think they weren’t, I’ve got a bridge in New York City to sell you.

There is NO automated reporting proposed by Apple; I don’t know where you got that idea.

As far as privacy is concerned, IF CSAM SCANS MUST BE DONE, Apple’s solution is more privacy-oriented than the alternative used by other cloud providers: scanning entirely on the server.
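To make the contrast concrete, here’s a minimal Swift sketch of what server-side scanning amounts to. The names are hypothetical, and real providers use perceptual hashes such as PhotoDNA rather than SHA-256, but the key property is visible: the provider has to see the plaintext of every photo to run the check at all.

```swift
import Foundation
import CryptoKit

// Sketch only: server-side scanning as done by other cloud providers.
// Hypothetical names; real systems use perceptual hashing, not SHA-256.
func serverSideScan(uploadedPhotos: [Data], knownHashes: Set<Data>) -> [Int] {
    var flagged: [Int] = []
    for (index, photo) in uploadedPhotos.enumerated() {
        // The hash is computed on the provider's servers, so the provider
        // necessarily has the decrypted photo in hand at this point.
        let digest = Data(SHA256.hash(data: photo))
        if knownHashes.contains(digest) {
            flagged.append(index)   // the provider learns exactly which photos matched
        }
    }
    return flagged
}
```

Apple’s proposal moves that comparison onto the device, so the server never needs the plaintext just to run the check.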
 
  • Disagree
  • Haha
Reactions: Nuvi and bobcomer
As a father of two small children, a lot of you make me sick. Child abuse and the distribution of such material is a HUGE WIDESPREAD problem. After many years, Apple *finally* wants to implement a system that detects child abuse material on someone's phone to counter this problem. And you bunch of babies cry about your precious "privacy".

If you do not store child abuse images on your phone, how will this even affect you IN ANY WAY? And don't give me all this slippery slope BS about what this possibly *could* lead to. We are talking about a very specific piece of technology designed for one very specific purpose. When they are proposing 24/7 body cams for all adults or scanning phones for political content, we'll talk about that. BUT THEY ARE NOT. They are proposing detecting child abuse images on people's phones. They deserve applause.
You don’t understand that this is the start of a slippery slope? In China they will definitely use this to force Apple to search for “seditious” images. I guarantee they’ll do this in HK. And they won’t tell you they are doing it because the law won’t allow it.
It will also happen in the US and other countries. Imagine what Trump would like to do with it?
It’s astounding you cannot see the serious danger this poses to personal freedom. I care more about that issue for my children.
 
Sounds like the EFF supports the Pedos, not a good look for them.

Doesn’t sound like that at all. With that logic, anyone who doesn’t like biometric passports and endless security checks at every damn airport supports the Taliban.

A reduction in privacy for every user is not acceptable if that is a company’s supposed top level value.
 
Since when are they putting a backdoor in their encryption? If the EFF wants to be taken seriously they should get their facts right.

It is a backdoor in the sense that files, before they are encrypted and uploaded, pass through an intermediary on-device database for checksum/hash comparison and Safety Voucher creation. The implementation details don't matter as much as the fact that the comparison functionality exists. The plain fact is that this database could feasibly hold a match for any arbitrary data at all, and nothing except Apple's flimsy privacy practices (which would be trivial to change and wouldn't even necessarily require user notification) safeguards us from having this technology used in a more draconian way. Just because it's not a backdoor in the conventional sense doesn't mean it's not a backdoor. It's a backdoor.
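For what it's worth, here's a hypothetical, heavily simplified Swift sketch of that on-device step. The names (knownHashDatabase, SafetyVoucher, makeVoucher) are mine, and Apple's actual design uses NeuralHash plus a blinded database and private set intersection rather than a plain lookup, but the structural point above still shows: nothing in this machinery cares what the database contains.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the on-device matching described above.
struct SafetyVoucher {
    let imageID: UUID
    let payload: Data   // in Apple's design this is cryptographically blinded
}

/// Hashes an image, checks it against a database shipped to the device,
/// and produces a voucher that rides along with the iCloud upload.
func makeVoucher(for imageData: Data, knownHashDatabase: Set<Data>) -> SafetyVoucher {
    let digest = Data(SHA256.hash(data: imageData))
    // Swap in a different hash set and exactly the same code flags a
    // different kind of content -- which is the concern being raised here.
    let matched = knownHashDatabase.contains(digest)
    return SafetyVoucher(imageID: UUID(), payload: matched ? digest : Data())
}
```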
 
Apple has been very clear about how the overall process works and has described how the individual technologies it uses work. Any more detail and they’d have to be showing you code in their description, and if you want that because you’re a computer-science security researcher, Apple has promised to make the code available. I really don’t know what else they’d have to do to make it clearer to you.

No it hasn’t.
When I load 50 photos onto my phone, are they hashed then, with the check waiting for the iCloud upload, or is it all done at once? All Apple says is “Before iCloud”.

Apple won’t say, and the original docs plus Federighi’s interview with the WSJ are at odds.

Anyway, we will likely never know unless Apple opens up.
Crickets.
 
Doesn’t sound like that at all. With that logic, anyone who doesn’t like biometric passports and endless security checks at every damn airport supports the Taliban.

A reduction in privacy for every user is not acceptable if that is a company’s supposed top level value.
There is no reduction in privacy, though.
 
As a father of two small children, a lot of you make me sick. Child abuse and the distribution of such material is a HUGE WIDESPREAD problem. After many years, Apple *finally* wants to implement a system that detects child abuse material on someone's phone to counter this problem. And you bunch of babies cry about your precious "privacy".

If you do not store child abuse images on your phone, how will this even affect you IN ANY WAY? And don't give me all this slippery slope BS about what this possibly *could* lead to. We are talking about a very specific piece of technology designed for one very specific purpose. When they are proposing 24/7 body cams for all adults or scanning phones for political content, we'll talk about that. BUT THEY ARE NOT. They are proposing detecting child abuse images on people's phones. They deserve applause.

It's kowtowing like this, while our personal liberties are slowly eroded by corporate interests and overreaching political pressure, that makes me sick. You can virtue signal all you like, but I don't believe a word of it. If everything were kosher about this, Apple would not have flinched and would have rolled it out without delay. It's obviously fraught with systemic technical and legal issues that may not be rectifiable.
 
I have run a software company for more than 20 years, and I understand the tech perfectly well. Countless experts have already published explanations and criticism of this backdoor; go outside the echo chamber and do your research, I will not do it for you. Repeating Apple's PR mantra of "you're holding it wrong, understand the tech" makes you look stupid and uneducated. If you want to present a technical argument, please do. But at this point even Apple has understood that the "tech" is easy to fool with adversarial attacks, which is the reason for delaying it. And obviously the iPhone 13 is coming out soon, so they need a PR move.
My point was and is clear — if this is a big issue to the majority of users, why is there such a low number of petition signatures?

I couldn’t even find any information from “Fight for the Future” on their site. The other petitions only require a name, email address, and zip code or country. That’s it. Such a low barrier, and yet why such low numbers? The data I’m seeing doesn’t support the level of outrage I’m reading here. This has gotten worldwide press for weeks from the mainstream media, so you can’t argue that it hasn’t been covered to death.

In contrast, a single state in the US started a governor recall petition, which requires actual signatures, address, county, and a legal oath that the information is true. That petition gathered over 2M signatures, and it was only open to a single state in the US.

Apple says they have over 1.6B devices in use. The EFF petition has about 27,000 signatures. If the press coverage was representative of the feelings of the majority of users, why aren’t the numbers higher? I can only deduce that the majority of users trust Apple at their word or they just don’t care. Believe what you want to believe, but the numbers don’t support this being a concern for the vast majority of users.

All this being said, Apple could still decide to delay this further or drop it altogether. Either way, I’m not concerned. There are many things to be concerned about — this is not one of them.
 
  • Haha
Reactions: Pummers
If I were Apple, and was being pressured to do this, but really didn't want to do it, and was looking for an excuse to get out of doing it, this is exactly how I would do it.
 
My point was and is clear — if this is a big issue to the majority of users, why is there such a low number of petition signatures?

I couldn’t even find any information from “Fight for the Future” on their site. The other petitions only require a name, email address, and zip code or country. That’s it. Such a low barrier, and yet why such low numbers? The data I’m seeing doesn’t support the level of outrage I’m reading here. This has gotten worldwide press for weeks from the mainstream media, so you can’t argue that it hasn’t been covered to death.

In contrast, a single state in the US started a governor recall petition, which requires actual signatures, address, county, and a legal oath that the information is true. That petition gathered over 2M signatures, and it was only open to a single state in the US.

Apple says they have over 1.6B devices in use. The EFF petition has about 27,000 signatures. If the press coverage was representative of the feelings of the majority of users, why aren’t the numbers higher? I can only deduce that the majority of users trust Apple at their word or they just don’t care. Believe what you want to believe, but the numbers don’t support this being a concern for the vast majority of users.

All this being said, Apple could still decide to delay this further or drop it altogether. Either way, I’m not concerned. There are many things to be concerned about — this is not one of them.
Our own internal polling showed 15% would leave Apple over this, and that’s on an Apple fan site. You can assume Apple’s bean counters have been doing their own polling. It does not really matter how many of you are fine with the spyware; it just matters how many of us are not. Luckily it was enough of us to make Apple hit the pause button. Now we are just waiting to see what they do next.
 
You don’t understand that this is the start of a slippery slope? In China they will definitely use this to force Apple to search for “seditious” images. I guarantee they’ll do this in HK. And they won’t tell you they are doing it because the law won’t allow it.
It will also happen in the US and other countries. Imagine what Trump would like to do with it?
It’s astounding you cannot see the serious danger this poses to personal freedom. I care more about that issue for my children.

Why would the Chinese government do that when they can simply search through all iCloud user content, as can the US government (with a warrant)? And yes, they don’t have to tell you about it either. What’s astounding is your lack of comprehension of basic facts.
 
Our own internal polling showed 15% would leave Apple over this, and that’s on an Apple fan site. You can assume Apple’s bean counters have been doing their own polling. It does not really matter how many of you are fine with the spyware; it just matters how many of us are not. Luckily it was enough of us to make Apple hit the pause button. Now we are just waiting to see what they do next.
15% of how many polled? What internal poll are you referring to?
 
Our own internal polling showed 15% would leave Apple over this, and that’s on an Apple fan site. You can assume Apple’s bean counters have been doing their own polling. It does not really matter how many of you are fine with the spyware; it just matters how many of us are not. Luckily it was enough of us to make Apple hit the pause button. Now we are just waiting to see what they do next.
I am one of the biggest Apple fanboys around. I have supported them on issues that are highly contentious even in the Pro-Apple community, but I do not support them on this. This is a red line and I will be increasingly vocal about it until they shut it down. Just signed the petition. Keeping my wallet closed as well.
 
Apple is progressively moving towards a fully end-to-end encrypted iCloud. As a result, they will not be able to provide governments with any user data even when requested with a warrant. This will expose them to accusations that the platform allows and even protects users who store CSAM content. The CSAM scanning feature will be Apple’s answer. This way Apple will achieve the seemingly impossible: iCloud privacy, at the cost of somewhat invasive on-device scanning. It’s a massive win on one side at the expense of a small loss on the other. This is great, as far as I’m concerned. If you’re not, then just don’t sync your photos using iCloud; it’s super easy to opt out.
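If that speculation is right, the shape of it would be roughly this Swift sketch (all names here are mine, not Apple’s, and Apple has not announced such a design): the photo itself is encrypted client-side with a key Apple never holds, and only the small voucher produced on device travels in a form the server can act on.

```swift
import Foundation
import CryptoKit

// Sketch of the speculated tradeoff: end-to-end encrypted photo plus an
// on-device safety voucher. Hypothetical names throughout.
struct EncryptedUpload {
    let ciphertext: Data   // unreadable without the user's key
    let voucher: Data      // the only part the server could ever act on
}

func prepareUpload(photo: Data, userKey: SymmetricKey, voucher: Data) throws -> EncryptedUpload {
    // Client-side AES-GCM encryption; the key never leaves the device.
    let sealed = try AES.GCM.seal(photo, using: userKey)
    guard let combined = sealed.combined else {
        throw CryptoKitError.incorrectParameterSize
    }
    return EncryptedUpload(ciphertext: combined, voucher: voucher)
}
```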
 
Apple is progressively moving towards a fully end-to-end encrypted iCloud. As a result, they will not be able to provide governments with any user data even when requested with a warrant. This will expose them to accusations that the platform allows and even protects users who store CSAM content. The CSAM scanning feature will be Apple’s answer. This way Apple will achieve the seemingly impossible: iCloud privacy, at the cost of somewhat invasive on-device scanning. It’s a massive win on one side at the expense of a small loss on the other. This is great, as far as I’m concerned. If you’re not, then just don’t sync your photos using iCloud; it’s super easy to opt out.

Privacy is only intact if it’s both on device and in the cloud.

Apple hasn’t announced any plans to make the cloud fully end-to-end encrypted either.
 
  • Like
Reactions: Pummers and dk001
Apple is progressively moving towards a fully end-to-end encrypted iCloud. As a result, they will not be able to provide governments with any user data even when requested with a warrant. This will expose them to accusations that the platform allows and even protects users who store CSAM content. The CSAM scanning feature will be Apple’s answer. This way Apple will achieve the seemingly impossible: iCloud privacy, at the cost of somewhat invasive on-device scanning. It’s a massive win on one side at the expense of a small loss on the other. This is great, as far as I’m concerned. If you’re not, then just don’t sync your photos using iCloud; it’s super easy to opt out.

Please provide a source for this other than the speculation of others.

This is a grand idea, and would be great. But Apple has given zero evidence this is their end goal. This has mostly been postulated by those trying to find a logical reason for Apple's methodology.
 
  • Like
Reactions: Pummers


The Electronic Frontier Foundation has said it is "pleased" with Apple's decision to delay the launch of its controversial child safety features, but now it wants Apple to go further and completely abandon the rollout.


Apple on Friday said it was delaying the planned features to "take additional time over the coming months to collect input and make improvements," following negative feedback from a wide range of individuals and organizations, including security researchers, politicians, policy groups, and even some Apple employees.

The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.

In its response to the announced delay, the EFF said it was "pleased Apple is now listening to the concerns" of users, but "the company must go further than just listening, and drop its plans to put a backdoor into its encryption entirely."

The statement by the digital rights group reiterated its previous criticisms about the intended features, which it has called "a decrease in privacy for all ‌iCloud Photos‌ users, not an improvement," and warned that Apple's move to scan messages and ‌iCloud Photos‌ could be legally required by authoritarian governments to encompass additional materials.

It also highlighted the negative reaction to Apple's announced plans by noting a number of petitions that have been organized in opposition to the intended move.
The suite of Child Safety Features was originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It's not clear when Apple plans to roll out the "critically important" features or how it intends to "improve" them in light of so much criticism, but the company still appears determined to roll them out in some form.

Article Link: EFF Pressures Apple to Completely Abandon Controversial Child Safety Features
So how are pedophiles caught and prosecuted? I mean, how do the police find and track down creeps who may be taking pictures of your daughter or son? They can’t just randomly arrest people; they must have ways to see who is looking at girls and boys, right?
 
No it hasn’t.
When I load 50 photos onto my phone, are they hashed then, with the check waiting for the iCloud upload, or is it all done at once? All Apple says is “Before iCloud”.

Apple won’t say, and the original docs plus Federighi’s interview with the WSJ are at odds.

Anyway, we will likely never know unless Apple opens up.
Crickets.
The exact wording is “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes” which is a lot clearer than “Before iCloud” as you quoted it.

For obvious reasons, the hash needs to be calculated before the database lookup and the upload occur. The process is described in sufficient detail to understand the main concepts. If they queue up the hashing for multiple images before moving on to the lookups, I’m sure they’d parallelise the processing as much as possible, but that’s really just an implementation detail, inconsequential to the broader discussion here.
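For what it’s worth, the ordering being described would look roughly like this, a sketch under my own assumptions rather than Apple’s code: hash, local lookup, voucher, then the upload itself, with independent images free to be processed concurrently.

```swift
import Foundation
import CryptoKit

// Sketch of the per-image ordering: hash -> local lookup -> voucher -> upload.
// Hypothetical names; the real pipeline uses NeuralHash and blinded lookups.
func processForUpload(images: [Data], knownHashes: Set<Data>) {
    DispatchQueue.concurrentPerform(iterations: images.count) { i in
        let image = images[i]
        let hash = Data(SHA256.hash(data: image))   // 1. hash on device
        let matched = knownHashes.contains(hash)    // 2. check the on-device database
        let voucher = matched ? hash : Data()       // 3. build the (simplified) voucher
        upload(image: image, voucher: voucher)      // 4. only then does the upload happen
    }
}

func upload(image: Data, voucher: Data) {
    // Placeholder for the actual iCloud Photos upload.
}
```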
 
Apple is progressively moving towards a fully end-to-end encrypted iCloud. As a result, they will not be able to provide governments with any user data even when requested with a warrant. This will expose them to accusations that the platform allows and even protects users who store CSAM content. The CSAM scanning feature will be Apple’s answer. This way Apple will achieve the seemingly impossible: iCloud privacy, at the cost of somewhat invasive on-device scanning. It’s a massive win on one side at the expense of a small loss on the other. This is great, as far as I’m concerned. If you’re not, then just don’t sync your photos using iCloud; it’s super easy to opt out.
What a brilliant example of Applelogism. Congratulations. Keep the faith.
 
Apple is progressively moving towards a fully end-to-end encrypted iCloud. As a result, they will not be able to provide governments with any user data even when requested with a warrant. This will expose them to accusations that the platform allows and even protects users who store CSAM content. The CSAM scanning feature will be Apple’s answer. This way Apple will achieve the seemingly impossible: iCloud privacy, at the cost of somewhat invasive on-device scanning. It’s a massive win on one side at the expense of a small loss on the other. This is great, as far as I’m concerned. If you’re not, then just don’t sync your photos using iCloud; it’s super easy to opt out.
"Is progressively moving towards..." What's taking them so long? After all these years Apple is still "progressively moving"..? Doesn't sound right to me. I expect more from Apple, this is just too late if it ever arrives.
 
With all the kids being trafficked through our southern border, I would imagine all you pro-CSAM-scanning folks are also pro border wall?
 
  • Haha
Reactions: Pummers
The exact wording is “Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the database of known CSAM hashes” which is a lot clearer than “Before iCloud” as you quoted it.

For obvious reasons, the hash needs to be calculated before the database lookup and the upload occur. The process is described in sufficient detail to understand the main concepts. If they queue up the hashing for multiple images before moving on to the lookups, I’m sure they’d parallelise the processing as much as possible, but that’s really just an implementation detail, inconsequential to the broader discussion here.

Nice job trying to explain. It still comes down to “Before iCloud”. Just not when.
So are they pre-scanning/hashing the photos, which then wait in a queue until the next upload? Or are they executing the whole process only when an upload is triggered? I could see the former for a large photo album and the latter for small ones.

In the original interview it was said that no on-device scanning occurs until the upload begins.
Then we get a doc from Apple that says “before iCloud”.
Then we get more detailed wording that can be interpreted a couple of ways.
None of them answer the question: when?
 
  • Like
Reactions: Pummers