Apple has published a FAQ titled "Expanded Protections for Children" which aims to allay users' privacy concerns about the new CSAM detection in iCloud Photos and communication safety for Messages features that the company announced last week.


"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution, and some have reached out with questions," reads the FAQ. "This document serves to address these questions and provide more clarity and transparency in the process."

Some discussions have blurred the distinction between the two features, and Apple takes great pains in the document to differentiate them, explaining that communication safety in Messages "only works on images sent or received in the Messages app for child accounts set up in Family Sharing," while CSAM detection in iCloud Photos "only impacts users who have chosen to use iCloud Photos to store their photos… There is no impact to any other on-device data."

From the FAQ:
These two features are not the same and do not use the same technology.

Communication safety in Messages is designed to give parents and children additional tools to help protect their children from sending and receiving sexually explicit images in the Messages app. It works only on images sent or received in the Messages app for child accounts set up in Family Sharing. It analyzes the images on-device, and so does not change the privacy assurances of Messages. When a child account sends or receives sexually explicit images, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view or send the photo. As an additional precaution, young children can also be told that, to make sure they are safe, their parents will get a message if they do view it.

The second feature, CSAM detection in iCloud Photos, is designed to keep CSAM off iCloud Photos without providing information to Apple about any photos other than those that match known CSAM images. CSAM images are illegal to possess in most countries, including the United States. This feature only impacts users who have chosen to use iCloud Photos to store their photos. It does not impact users who have not chosen to use iCloud Photos. There is no impact to any other on-device data. This feature does not apply to Messages.
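
To make the iCloud Photos side of this concrete, here is a deliberately simplified sketch (in Python) of the set-membership idea behind "matching known CSAM image hashes." It is not Apple's implementation: per Apple's technical summary, the real system computes a perceptual hash called NeuralHash on-device and performs the comparison through a private set intersection protocol, so neither the device nor Apple learns the outcome for any individual photo. The `image_hash` helper and the empty hash set below are hypothetical stand-ins.

```python
import hashlib

# Hypothetical stand-in for a perceptual hash. Apple's system uses NeuralHash,
# which maps visually similar images to the same digest; SHA-256 lacks that
# property, but it is enough to illustrate a set-membership check.
def image_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes of known CSAM images, provided by child safety organizations and
# shipped inside the operating system. Only hashes are distributed, never images.
KNOWN_CSAM_HASHES: set[str] = set()

def matches_known_csam(image_bytes: bytes) -> bool:
    """Illustrative check for a photo about to be uploaded to iCloud Photos.

    In the published design the device cannot evaluate this test in the clear;
    a cryptographic protocol produces an encrypted "safety voucher" instead,
    and nothing is learned about photos that do not match.
    """
    return image_hash(image_bytes) in KNOWN_CSAM_HASHES
```

Photos that never go to iCloud Photos never enter this path at all, which is the distinction the FAQ is drawing above.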
The rest of the document is split into three sections, with answers to the following commonly asked questions under each:

Communication safety in Messages
  • Who can use communication safety in Messages?
  • Does this mean Messages will share information with Apple or law enforcement?
  • Does this break end-to-end encryption in Messages?
  • Does this feature prevent children in abusive homes from seeking help?
  • Will parents be notified without children being warned and given a choice?

CSAM detection
  • Does this mean Apple is going to scan all the photos stored on my iPhone?
  • Will this download CSAM images to my iPhone to compare against my photos?
  • Why is Apple doing this now?

Security for CSAM detection for iCloud Photos
  • Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
  • Could governments force Apple to add non-CSAM images to the hash list?
  • Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
  • Will CSAM detection in iCloud Photos falsely flag innocent people to law enforcement?
Interested readers should consult the document for Apple's full responses to these questions. It is worth noting, however, that for the questions that can be answered with a simple yes or no, Apple begins every response with "No," with the exception of the following three questions from the section titled "Security for CSAM detection for iCloud Photos":
Can the CSAM detection system in iCloud Photos be used to detect things other than CSAM?
Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations. There is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. As a result, the system is only designed to report photos that are known CSAM in iCloud Photos. In most countries, including the United States, simply possessing these images is a crime and Apple is obligated to report any instances we learn of to the appropriate authorities.

Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

Can non-CSAM images be "injected" into the system to flag accounts for things other than CSAM?
Our process is designed to prevent that from happening. The set of image hashes used for matching are from known, existing images of CSAM that have been acquired and validated by child safety organizations. Apple does not add to the set of known CSAM image hashes. The same set of hashes is stored in the operating system of every iPhone and iPad user, so targeted attacks against only specific individuals are not possible under our design. Finally, there is no automated reporting to law enforcement, and Apple conducts human review before making a report to NCMEC. In the unlikely event of the system flagging images that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
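
The three answers above lean on the same two safeguards: a match threshold and a human review step before any report. The FAQ excerpts do not give numbers, but Apple's accompanying technical summary describes a threshold of matches that must be exceeded before Apple can interpret anything about an account. A minimal sketch of that gating logic follows; the names, the threshold value, and the review and reporting functions are hypothetical placeholders, not anything Apple has published.

```python
from dataclasses import dataclass, field

# Hypothetical threshold value; the FAQ only says a threshold of matches must
# be exceeded before human review, not what the number is.
MATCH_THRESHOLD = 30

@dataclass
class Account:
    matched_photo_ids: list[str] = field(default_factory=list)
    disabled: bool = False

def human_review_confirms_csam(photo_ids: list[str]) -> bool:
    # Placeholder for Apple's manual review step described in the FAQ.
    return False

def file_report_to_ncmec(account: Account) -> None:
    # Placeholder for the report to NCMEC; there is no automated report to
    # law enforcement anywhere in this flow.
    pass

def evaluate_account(account: Account) -> None:
    """Illustrative gate combining the threshold and human-review safeguards."""
    if len(account.matched_photo_ids) < MATCH_THRESHOLD:
        return  # below threshold: Apple learns nothing about the account's photos
    if not human_review_confirms_csam(account.matched_photo_ids):
        return  # false positive path: account stays enabled, no report is filed
    account.disabled = True
    file_report_to_ncmec(account)
```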
Apple has faced significant criticism from privacy advocates, security researchers, cryptography experts, academics, and others for its decision to deploy the technology with the release of iOS 15 and iPadOS 15, expected in September.

This has resulted in an open letter criticizing Apple's plan to scan iPhones for CSAM in iCloud Photos and explicit images in children's messages, which has gained over 5,500 signatures as of writing. Apple has also received criticism from Facebook-owned WhatsApp, whose chief Will Cathcart called it "the wrong approach and a setback for people's privacy all over the world." Epic Games CEO Tim Sweeney also attacked the decision, claiming he'd "tried hard" to see the move from Apple's point of view, but had concluded that, "inescapably, this is government spyware installed by Apple based on a presumption of guilt."

"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow." The non-profit Electronic Frontier Foundation also criticized Apple's plans, stating that "even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor."

Article Link: Apple Publishes FAQ to Address Concerns About CSAM Detection and Messages Scanning
 
So as I said in the other thread regarding "slippery slope", this isn't some nefarious ploy and Apple has no intention of doing anything other than what they have stated.

They are fully aware that if they step out of line with something like this the backlash would be huge.
 
I bet you the Chinese government will be very interested in this technology.

Notice how Apple keeps changing the wording and being very careful with the document. I hope this backfires and Apple finds itself facing a massive lawsuit.

Privacy matters. Apple: let us (the consumers) decide if we want you to scan our iPhones.

Apple, you are a TECH company. You are not law enforcement. Don’t lose your vision. Stop chasing the $.

Reports like this will be out left and right…



The easiest way to avoid this spying technology:

1. Turn off iCloud Photos and Messages.
2. Do not log in to your iPhone using your iCloud credentials.
3. Possibly use a fake account to continue using your iPhone, or simply do not log in with your Apple ID at all.

 
I’m sorry, Apple, but you are not trustworthy. You and your comrades in big tech are evil.
It has become like the boy who cried wolf. Nobody really believes Apple or anyone else is even capable of protecting users from government snooping.
The more Cook grouses about privacy, the less I believe him and the more he sounds like a lying hypocrite.
 
If it’s as Apple states, I’m glad Apple gives us as parents that feature to protect our kids.

As for the iCloud Photos thing, I’m not gonna cry if sex offenders aren’t allowed to store their illegal and immoral pictures on Apple’s servers.
 
So as I said in the other thread regarding "slippery slope", this isn't some nefarious ploy and Apple has no intention of doing anything other than what they have stated.

They are fully aware that if they step out of line with something like this the backlash would be huge.

Think it's too late. The mob has already made up its mind. For what it's worth, I think the idea behind the implementation is pretty sound and is being done for the most respectable of reasons, and ultimately, if you don't want it, you don't use iCloud to store your photos. I understand the outrage, though, because of the scope for abuse… and the fact that today it's just looking for CSAM images, but who knows what it could be used for in the future? But with all this power, surely they have some duty of responsibility to look out for those who are incapable of defending themselves.
 
If it’s as Apple states, I’m glad Apple gives us as parents that feature to protect our kids.

As for the iCloud Photos thing, I’m not gonna cry if sex offenders aren’t allowed to store their illegal and immoral pictures on Apple’s servers.
Of course not. As the old joke goes, “won’t someone think of the children?”

The question is, now that Apple has created the ability to use the hash list, are we to expect that capability will not be expanded? At this point we only have a promise from Apple that it won’t. It would have been better for Apple not to put itself in a position where it had to make that promise in the first place.
 
People do realise that companies such as Google, Adobe, and Facebook already use some form of automated technology to scan for and detect CSAM, right? Adobe does it with Creative Cloud:


That's just one example.
So? One of the reasons we use Apple is that it had a modicum of respect for privacy. Those companies don't respect our privacy.
 
I fixed it for you guys. The message is very clear now.



"No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this," said prominent whistleblower Edward Snowden, adding that "if they can scan for kiddie porn today, they can scan for anything tomorrow."

YUP! Get ready for an Apple employee to snoop around your wife’s, girlfriend’s, ex’s, aunt’s, mom and dad’s, kids’, and friends’ photos. An Apple employee will definitely be checking them out… out of curiosity… human nature. People are nosy.

Everything will be exposed, unfortunately. They will know it all and will be collecting the data.


Expecting reports like this in the future…


When Apple talks about hashes and algorithms… this is what will be seen.

 
The way I heard it described on a podcast was like this:

Apple is not doing a visual scan of your photos. They're not looking at the actual contents of your photos.

They are, instead, comparing hashes of *known* CSAM images. These are photos that have already been labeled as child porn.

So there's no danger of Apple flagging a photo of your child in the bathtub or whatever.

With all that said... no one knows what else Apple could do in the future. Perhaps they could start scanning the actual contents of your photos. So I can see why people are freaked out.

But as others have said... all of the big companies are doing similar things. So I dunno.
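
If you want to poke at the "known images only" idea yourself, the third-party Python `imagehash` library (a perceptual hash, `pip install pillow imagehash`) makes a rough analogue easy to try. This is not NeuralHash and has nothing to do with Apple's code, and the file paths below are made up; it just shows why a brand-new photo, like the bathtub example, produces a hash that simply isn't in a database of previously identified images.

```python
from PIL import Image   # pip install pillow
import imagehash        # pip install imagehash

# Toy "known images" list built from files on disk (hypothetical paths).
# In the real system only hashes are ever distributed, never the images.
KNOWN_HASHES = [
    imagehash.phash(Image.open(path))
    for path in ("known_image_1.jpg", "known_image_2.jpg")
]

def is_near_duplicate_of_known(path: str, max_distance: int = 4) -> bool:
    """True only if the photo is a near-duplicate of something already known.

    Perceptual hashes tolerate small edits (resizing, recompression), hence the
    small Hamming-distance window instead of strict equality. A genuinely new
    photo will not land within range of any known hash.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in KNOWN_HASHES)

# Hypothetical usage: a never-before-seen family photo comes back False.
print(is_near_duplicate_of_known("kids_bathtub_photo.jpg"))
```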
 
Step 1. Apple does something new, but in a different way than the rest of the industry.

Step 2. The internet cries it’s evil and that they will vote with their wallets and leave the ecosystem, completely ignoring that Apple does it in a better, privacy-oriented way.

Step 3. Apple explains more clearly what it did.

Step 4. Everybody shuts up and the industry follows Apple’s lead.
 
This CSAM upgrade is the only one you will hear about. When it starts scanning for other things, you won't know and will have no way of finding out. The timing of it in this era of censorship is suspicious.

Under that assumption, they could do whatever they want behind closed doors already. Hell, why even publish this stuff about CSAM, if they could just do it without telling people, as you say?
 
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.

How long before the USA and/or China demand that Apple start scanning for content other than CSAM on all iPhones (not just the ones with iCloud Photo Library enabled), or Apple will no longer be allowed to sell devices in those markets? These two countries represent two-thirds of Apple's revenue and therefore have a lot of leverage over the company.
 
Step 1. Apple does something new, but in a different way than the rest of the industry.

Step 2. The internet cries it’s evil and that they will vote with their wallets and leave the ecosystem, completely ignoring that Apple does it in a better, privacy-oriented way.

Step 3. Apple explains more clearly what it did.

Step 4. Everybody shuts up and the industry follows Apple’s lead.

Bonus... iPhone 13 will be easier to get at launch since everyone is "boycotting" Apple.

:p
 
Step 1. Apple does something new, but in a different way than the rest of the industry.

Step 2. The internet cries it’s evil and that they will vote with their wallets and leave the ecosystem, completely ignoring that Apple does it in a better, privacy-oriented way.

Step 3. Apple explains more clearly what it did.

Step 4. Everybody shuts up and the industry follows Apple’s lead.

It's about bashing Apple. No one yells at Google for scanning Gmail for child sex abuse imagery, but Apple is for some reason different.
 