
MacRumors (macrumors bot), Original poster


Apple employees are now joining the chorus of individuals raising concerns over Apple's plans to scan iPhone users' photo libraries for CSAM, or child sexual abuse material, reportedly speaking out internally about how the technology could be used to scan users' photos for other types of content, according to a report from Reuters.


According to Reuters, an unspecified number of Apple employees have taken to internal Slack channels to raise concerns over CSAM detection. Specifically, employees are concerned that governments could force Apple to use the technology for censorship by finding content other than CSAM. Some employees are worried that Apple is damaging its industry-leading privacy reputation.
Apple employees have flooded an Apple internal Slack channel with more than 800 messages on the plan announced a week ago, workers who asked not to be identified told Reuters. Many expressed worries that the feature could be exploited by repressive governments looking to find other material for censorship or arrests, according to workers who saw the days-long thread.

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate is surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.
Apple employees in roles pertaining to user security are not thought to have been part of the internal protest, according to the report.

Ever since its announcement last week, Apple has been bombarded with criticism over its CSAM detection plans, which are still expected to roll out with iOS 15 and iPadOS 15 this fall. Concerns mainly revolve around how the technology could present a slippery slope for future implementations by oppressive governments and regimes.

Apple has firmly pushed back against the idea that the on-device technology used for detecting CSAM could be used for any other purpose. In a published FAQ document, the company says it will vehemently refuse any such demand by governments.
Could governments force Apple to add non-CSAM images to the hash list?
Apple will refuse any such demands. Apple's CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.
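For a concrete picture of the flow the FAQ describes, here is a deliberately simplified sketch of matching photo hashes against a list of known hashes with a review threshold. It is not Apple's actual implementation (the real system reportedly uses the NeuralHash perceptual hash and private set intersection on-device); the hash value, threshold, and function names below are illustrative only.

```python
# Simplified sketch: match photo hashes against a known-hash list and only
# escalate past a threshold. NOT Apple's implementation; the real system
# reportedly uses the NeuralHash perceptual hash and private set intersection
# on-device. Hash values, threshold, and names here are illustrative.

import hashlib
from pathlib import Path

KNOWN_HASHES = {
    # stand-in for the NCMEC-derived list of known CSAM hashes
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}
REVIEW_THRESHOLD = 3  # hypothetical: flag an account only after several matches


def file_hash(path: Path) -> str:
    """Exact (cryptographic) hash of the file bytes, the simplest possible stand-in."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path]) -> int:
    return sum(1 for p in photo_paths if file_hash(p) in KNOWN_HASHES)


def needs_human_review(photo_paths: list[Path]) -> bool:
    # Below the threshold nothing is escalated, mirroring the FAQ's claim that
    # non-matching accounts are neither disabled nor reported to NCMEC.
    return count_matches(photo_paths) >= REVIEW_THRESHOLD
```

The property the FAQ leans on is the last step: anything below the match threshold never reaches a human reviewer, let alone NCMEC.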
An open letter criticizing Apple and calling upon the company to immediately halt its plan to deploy CSAM detection has gained more than 7,000 signatures at the time of writing. The head of WhatsApp has also weighed in on the debate.

Article Link: Apple Employees Internally Raising Concerns Over CSAM Detection Plans
 

Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please.


Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in the hands of government and law enforcement.

1. How about scanning Apple executives' iPhones for CSAM first? No one wants their privacy exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15?


Also, this guy needs to be FIRED from Apple. He is the mastermind behind CSAM detection. What a joke!

[attached image]


Does anyone here have a game plan on how we can stop this crappy CSAM feature?​

 
I don't see a problem with this technology
If you don't see a problem with this technology, Apple is definitely playing you. Pretty sure you don't want Apple looking at, scanning, analyzing, or identifying your wife's pictures. This CSAM stuff needs to be shut down. Apple needs to respect our privacy, period. Apple needs to figure out another way if they are really interested in catching pedophiles... God knows for what reason. It really feels like this is the end of an era. All good things come to an end.

 
Good, but sorry it's too late for me to stay with Apple.

I'm done with big tech. Bought a Nokia 5310 (2020) for calls and texts. That'll do.

I also have a Wi-Fi-only, de-Googled Android for some apps when I'm at the house.

We'll see how it goes. I may make the de-Googled Android my main phone in the future, but for now, I'm going low-tech.
 
employees are concerned that governments could force Apple to use the technology for censorship by finding content other than CSAM.
This is exactly the problem. Apple insists they will refuse to do this, and I wish them luck, but I don't see how they can if countries pass laws requiring them to look for other things.

This pretty much says it all:
By making our phones run an algorithm that isn’t meant to serve us, but surveils us, it has crossed a line.
 
So from what I understand, Apple takes the image hash and compares it to an image hash database of known 'blacklisted' images. Now, I haven't looked into the CSAM feature too deeply, but there are many ways of changing an image hash without 'changing' the image itself. For example, you could use steganography to make one small change to the image, and this would change the hash.
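As a quick illustration of that point for an exact (cryptographic) hash: flipping a single bit of the file produces a completely different digest. The file name below is a hypothetical stand-in, and whether such a tweak also defeats a perceptual hash like NeuralHash, which is built to tolerate small edits, is a separate question.

```python
# Flip one bit of an image file and compare exact hashes; the digests differ
# completely. "photo.jpg" is a hypothetical local file used for illustration.

import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = open("photo.jpg", "rb").read()
tweaked = bytearray(original)
tweaked[-1] ^= 0x01  # flip the lowest bit of the last byte

print(sha256_hex(original))        # original digest
print(sha256_hex(bytes(tweaked)))  # entirely different digest
```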
 
This whole CSAM situation will take weeks, if not months, to settle down, and I fear Apple will only suffer some collateral damage, the whole thing will end up being "no big deal," and everyone hurts.
People are really taking this seriously and planning on leaving Apple. They don't care about iOS 15 anymore... They don't care about Apple's fall lineup.
 
This whole CSAM situation will take weeks, if not months, to settle down, and I fear Apple will only suffer some collateral damage, the whole thing will end up being "no big deal," and everyone hurts.

Always how it happens with big tech. That's what this issue fundamentally comes down to: the consumer's power has diminished to the size of plankton, and now the corporations have all the say. Apple can literally do whatever they want and they will not suffer any consequences, especially when emotionally charged propaganda is behind them. If this were a much smaller company (start-up size) and there were many other alternative options, they would be dead within a week, as most people do not accept this surveillance technology.
 
Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers.
I agree with your comments. But in other threads you are such a cheerleader for spending lots of cash on the new iPhones etc. How do you reconcile your double personalities?
 
Looks like you didn't get the point! We are not concerned about looking for CSAM material; we are concerned about what countries could do with a technology like this!
Apple's already been caught secretly recording and screening our private conversations, so why is everyone up in arms over THIS??

 
What I'm not getting is when this is said…

"Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC."

If Apple aren't "looking at your photos" and they're only looking at a hash coming from an iCloud user's end, then the only thing Apple could possibly check that against is another hash. So in reality, Apple could basically be checking anything against anything. The only thing they can be sure of is where they are reporting the matches to, in this case supposedly NCMEC.
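A toy sketch of that point, using a plain cryptographic hash purely for illustration (Apple's system reportedly uses the NeuralHash perceptual hash plus a blinded matching protocol): the digest on its own reveals nothing about the photo, and the checking side can only test it against whatever hash list it happens to hold. The byte strings below are hypothetical stand-ins.

```python
import hashlib

def digest(image_bytes: bytes) -> str:
    # A digest is not reversible back to the image; it can only be compared.
    return hashlib.sha256(image_bytes).hexdigest()

# All the checking side ever sees is a digest like this (hypothetical bytes):
received = digest(b"...user photo bytes...")

# The only possible operation is a membership test against whatever hash list
# is loaded; what that list actually contains is exactly the trust question.
server_side_list = {digest(b"...known flagged image bytes...")}
print(received in server_side_list)  # True only on an exact hash match
```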

And I really can't imagine that the human reviewer at Apple (they are an Apple employee and not a 3rd party like those Siri recording reviewers… right?!?!) is shown the actual matched image each time.

So what does "human review" actually mean??
 