If they find it.
They have no duty to actively search for it on the grounds that they suspect it might be there.
This is why Apple is a Safe Haven for this child material. They are doing NOTHING now, resulting in such low numbers it's got to be embarrassing for Apple. This is why every other major cloud storage provider actively scans for it.

And yes, actually, it's because everyone else is doing it that Apple needs to do SOMETHING. Otherwise, they will continue to be a Safe Haven.
 
OK, so it will be labeled as a safe haven. So what?
The cloud should be like a safe deposit box in a bank. The bank isn't responsible for what I keep in the box.

The government and a few angry moms will call it a safe haven, while millions of other users will applaud them for standing their ground and actually delivering the privacy they've been touting for years. You can't please everybody. If those who aren't pleased want to call you derogatory names, so be it.
 
This is why Apple is a Safe Haven for this child material. They are doing NOTHING now, resulting in such low numbers it's got to be embarrassing for Apple. This is why every other major cloud storage provider actively scans for it.

And yes, actually, it's because everyone else is doing it that Apple needs to do SOMETHING. Otherwise, they will continue to be a Safe Haven.

But Apple is already scanning the cloud; they've done it for years…
 
By the way, why is Apple a safe haven for CSAM pictures, but not a safe haven for other kinds of illegal materials? How many files linked to terrorist activities has Apple reported? How many terrorists have been identified thanks to Apple scanning iCloud files? My guess is, as few as CSAM, if any. So, why focus on CSAM only? iCloud could be a safe haven for anything. Why stop at children?
 
By the way, why is Apple a safe haven for CSAM pictures, but not a safe haven for other kinds of illegal materials? How many files linked to terrorist activities has Apple reported? How many terrorists have been identified thanks to Apple scanning iCloud files? My guess is, as few as CSAM, if any. So, why focus on CSAM only? iCloud could be a safe haven for anything. Why stop at children?
See this part I agree with (and my argument will also apply to Microsoft/Google and others). I do not agree about the complaining Apple is getting while nobody says a word about Microsoft, Google and others. However, I actually agree not only with what you said here, but also that the whole CSAM scanning is rather pointless. I want to see the actual people abusing children be put in jail or worse. While still absolutely horrible, at the end of the day, who cares if Joe Someone finds a random picture and saves it on their phone? It doesn't further the actual abuse; they didn't take the picture themselves, they didn't actually abuse the child.

And this is definitely a topic that provokes strong emotion - "BUT THE KIDS!!!!!" Yet Joe Someone is not the one actively abusing the kid. So we go after the Joe Someones of the world while Jim Someone can still continue to abuse that kid or other kids. Great! Why exactly can't we instead focus on the Jim in this scenario?
 
See this part I agree with. I do not agree about the complaining Apple is getting while nobody says a word about Microsoft, Google and others. However, I actually agree not only with what you said here, but also that the whole CSAM scanning is rather pointless. I want to see the actual people abusing children be put in jail or worse. While still absolutely horrible, at the end of the day, who cares if Joe Someone finds a random picture and saves it on their phone? It doesn't further the actual abuse; they didn't take the picture themselves, they didn't actually abuse the child.

And this is definitely a topic that provokes strong emotion - "BUT THE KIDS!!!!!" Yet Joe Someone is not the one actively abusing the kid. So we go after the Joe Someones of the world while Jim Someone can still continue to abuse that kid or other kids. Great! Why exactly can't we instead focus on the Jim in this scenario?
Bingo!
This is just a publicity stunt, which will not help the children much, if at all. Then why not just stuff it, live with the risk of being labeled as a safe haven by a few moms and activists, while on the other hand delivering that true privacy that we've been promised for so many years? I'm sure the benefits would outweigh the losses in the long run.
 
Bingo!
This is just a publicity stunt, which will not help the children much, if at all.
It won't stop the production, but it might put a dent in the distribution...and therefore can be seen as helping the children.
Then why not just stuff it, live with the risk of being labeled as a safe haven by a few moms and activists, while on the other hand delivering that true privacy that we've been promised for so many years? I'm sure the benefits will outweigh the losses in the long run.
 
This is why Apple is a Safe Haven for this child material. They are doing NOTHING now, resulting in such low numbers it's got to be embarrassing for Apple. This is why every other major cloud storage provider actively scans for it.

And yes, actually, it's because everyone else is doing it that Apple needs to do SOMETHING. Otherwise, they will continue to be a Safe Haven.
Do you know how many of our devices are used for illegal activities? It's not the devices or the clouds that are the problem, it's the people using them. For all of us to be monitored like criminals for the fraction of a percent that actually are is not right. Although I understand child porn is a big problem, why is that, and only that, the thing that got chosen to be scanned for? Isn't terrorism a big problem? Aren't mass shootings a big problem? I can go on and on. Why is it not okay for them to add hashes to look for pictures of guns or bomb materials? Why aren't they searching all text on the phone for signs that something bad is about to occur? All great causes, and if you are okay with someone searching your phone for child porn, you should be okay with them searching for anything else that your government deems inappropriate. It will only be a matter of time for these things to happen, because if policing our devices is acceptable to society, then why not add more?

Also, why is this only Apple's and Google's problem? I think Ford and GM need to add interior cameras that scan our cars in case a child is being abused in there. I think all TVs need to come with cameras that search the room for child porn. Don't you know that child porn happens in some of those rooms? It's got to be really embarrassing for Sony to allow such a safe haven by doing nothing about this. Again, this or any other crime is not Apple's or any other private company's problem to solve. As it has been for centuries past, it's the police's and the FBI's job to do it.
 
Then where is the outrage at Google, Microsoft, Dropbox? They already do this "policing". Oh right, because it is just cool to hate on Apple. The minute Apple wants to improve their NCMEC reporting they get called out, meanwhile every other major company that hosts files actively does this policing without anyone saying a word.
No, because they do it on their servers, not my phone.
 
Nothing new there. I've dealt with all this before on this forum in various threads on this topic. First of all, you can't argue against hypothetical conspiracy theories - people will cling to them to their dying day. They're not based on facts. I simply put ZERO stock in them and see absolutely no reason why Apple's proposed CSAM scanning method would lead to the horrible things they're "warning" about. Apple isn't handing the "keys to the kingdom" to any third party - they still have as much control over the system as they want, and I see no evidence that they're all of a sudden planning on going rogue and turning into some evil company in cahoots with authoritarian governments.

The letter also mischaracterizes the CSAM detection process. Apple has never stated that the system automatically notifies law enforcement of potential CSAM. Once the threshold is reached, the photos in question are manually reviewed to verify they are actually CSAM before being reported to NCMEC, who in turn notifies law enforcement. So there is absolutely no reasonable chance that someone will be reported to the authorities for innocent images (and Apple has stated that the chance of an account even getting to the point of suspension for 30+ LEGAL images (i.e. false positives) is less than 1 in 1 trillion).
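The flow described above can be sketched roughly like this (a hypothetical, heavily simplified Python illustration - the function names and structure are my own assumptions, not Apple's actual implementation; only the 30-image threshold and the review-then-NCMEC ordering come from the description):

```python
# Toy sketch of the reporting flow (all names hypothetical).
# Key points: nothing happens below the match threshold, a human reviews
# every flagged image, and reports go to NCMEC rather than directly to
# law enforcement.

THRESHOLD = 30  # Apple's stated threshold of matched images

def process_account(matched_images, human_review):
    """Return the action taken for an account's matched images."""
    if len(matched_images) < THRESHOLD:
        return "no action"  # below threshold: nothing is flagged at all
    confirmed = [img for img in matched_images if human_review(img)]
    if not confirmed:
        return "no action"  # false positives filtered out by manual review
    return "report to NCMEC"  # NCMEC, in turn, works with law enforcement
```

The point the sketch makes is structural: law enforcement never appears as a direct output of the automated matching step.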

The letter also fails to mention that users have the ability to disable any scanning by simply disabling iCloud for photos. So if someone is paranoid and thinks some government agency is after them for an anti-Biden meme image on their phone, then they can simply not use iCloud to upload their photos. And they shouldn't use any other cloud service either, because those services can scan their photos too (and not privately, as Apple is proposing), so they also have the technical potential to be in cahoots with abusive governments.

Look, I'm glad there are people and organizations concerned about security, but in this case I believe their concern lacks substance and is based on hypotheticals/slippery-slope fallacies.
I get a big laugh out of people such as yourself who claim that those who have concerns with this are just making up conspiracy theories and fallacies. Either you are trolling or you are very naive about how the real world works. Just in the last couple of months alone:

- Pegasus spyware from NSO Group can be purchased to break into any iPhone. It's been proven that many governments have this software, and someone got tracked and killed through its use.
- Apple and Google caved to the Russian supreme leader and removed opposition political apps to sway an election.
- Just yesterday on MacRumors: "...Apple Ignored Three Zero-Day Security Vulnerabilities Still Present in iOS 15"

I am sure this list can be expanded tremendously, but there is enough there to make my point. What I listed are not theories; they are facts. Apple has proven over and over again that they will buckle to governments that threaten to no longer allow sales in their country. It has been proven over and over again that Apple's phones are not secure and that fixing vulnerabilities is not very high on their priority list. It's one thing not to agree with or care about the issues that a lot of us are bringing up, but to chastise us as some kind of loons who are coming up with conspiracy theories is just ignorant.
 
Just wow, how ignorant some can be! What hypothetical conspiracy theory is behind the fact that children/teens may get abused by their own parents/community (religion) when their parents/community get to know the child is gay/trans…?

Well, gee, if you're a CHILD and living with your parents, I think parents should know whether their CHILD is receiving sexually explicit photos, regardless of who is sending them.

But I wasn't even talking about the Messages part of Apple's plan, but rather the CSAM scanning. That's where all the conspiracy theories are coming from.
 
I get a big laugh out of people such as yourself who claim that those who have concerns with this are just making up conspiracy theories and fallacies. Either you are trolling or you are very naive about how the real world works. Just in the last couple of months alone:

- Pegasus spyware from NSO Group can be purchased to break into any iPhone. It's been proven that many governments have this software, and someone got tracked and killed through its use.
- Apple and Google caved to the Russian supreme leader and removed opposition political apps to sway an election.
- Just yesterday on MacRumors: "...Apple Ignored Three Zero-Day Security Vulnerabilities Still Present in iOS 15"

I am sure this list can be expanded tremendously, but there is enough there to make my point. What I listed are not theories; they are facts. Apple has proven over and over again that they will buckle to governments that threaten to no longer allow sales in their country. It has been proven over and over again that Apple's phones are not secure and that fixing vulnerabilities is not very high on their priority list. It's one thing not to agree with or care about the issues that a lot of us are bringing up, but to chastise us as some kind of loons who are coming up with conspiracy theories is just ignorant.
And you think eliminating CSAM scanning will be the end-all be-all? If the government wants it THAT BADLY, they will force Apple to comply, with or without CSAM scanning. I really don't understand these arguments that CSAM scanning equals government control. Even without it in place, if any government wants that control badly enough, they will force Apple to comply. It makes zero difference whether the CSAM system exists or not.
 
I get a big laugh out of people such as yourself who claim that those who have concerns with this are just making up conspiracy theories and fallacies. Either you are trolling or you are very naive about how the real world works. Just in the last couple of months alone:

- Pegasus spyware from NSO Group can be purchased to break into any iPhone. It's been proven that many governments have this software, and someone got tracked and killed through its use.
- Apple and Google caved to the Russian supreme leader and removed opposition political apps to sway an election.
- Just yesterday on MacRumors: "...Apple Ignored Three Zero-Day Security Vulnerabilities Still Present in iOS 15"

I am sure this list can be expanded tremendously, but there is enough there to make my point. What I listed are not theories; they are facts. Apple has proven over and over again that they will buckle to governments that threaten to no longer allow sales in their country. It has been proven over and over again that Apple's phones are not secure and that fixing vulnerabilities is not very high on their priority list. It's one thing not to agree with or care about the issues that a lot of us are bringing up, but to chastise us as some kind of loons who are coming up with conspiracy theories is just ignorant.

Good, I'm glad I could brighten your day. There's a big difference between Apple removing an app from the Russian App Store in accordance with local laws (regardless of your opinion of those laws) and allowing a government access to iOS in order to spy on or otherwise abuse its citizens. Also, the argument put forth in the letter you linked was not "the system will be hacked" but rather that Apple will be pressured into allowing third party access to the scanning process. And there are indeed some on this forum that have actually asserted that Apple is using CSAM as a smokescreen to create a system for the express purpose of allowing government espionage on its citizens. If that's not a conspiracy theory, then I don't know what is.
 
And you think eliminating CSAM will be the end all be all? If the government wants it THAT BADLY, they will force Apple to comply - with or without CSAM. I really don't understand these arguments that CSAM === government control. Even without the CSAM in place, if any government wants that control that badly they will force Apple to comply. It makes zero difference if CSAM exists or not.
What kind of reasoning is that? There is a big difference between a government forcing a private company to invent something to spy on people and a private company already having the capability, with the government merely requesting the data it obtains. The biggest thing is accountability. Eventually these things come to light. If it ever gets found out that a government "forced" a private company to "invent" something to spy on people, it would be bad news for everyone involved. Completely different from telling me right off the bat, "I will be scanning your photos." If you continue using the product, it implies that you are okay with it, and you have nothing to bitch about later.
 
Good, I'm glad I could brighten your day. There's a big difference between Apple removing an app from the Russian App Store in accordance with local laws (regardless of your opinion of those laws) and allowing a government access to iOS in order to spy on or otherwise abuse its citizens. Also, the argument put forth in the letter you linked was not "the system will be hacked" but rather that Apple will be pressured into allowing third party access to the scanning process. And there are indeed some on this forum that have actually asserted that Apple is using CSAM as a smokescreen to create a system for the express purpose of allowing government espionage on its citizens. If that's not a conspiracy theory, then I don't know what is.
Yes. You are a ray of sunshine to my day. I disagree; there is no difference, because, as you said, Apple removed the app from the Russian store in accordance with local law. Because Apple is providing them with a mechanism to do so, if a new local law states that the government is permitted to compare hashes of photos of anything they deem relevant, Apple would have no choice but to follow the law. Like you said, regardless of our opinion of those laws, that's how authoritarian governments work.

I was pointing out articles that relate to problems with Apple's chosen method of scanning. There is not just one point of failure; there are many: from authoritarian governments creating laws to give themselves access, to hacks that could get you targeted, to vulnerabilities that expose you.

When it comes to conspiracy theories: do I believe Apple is conspiring with governments to create this back door? I am going to side with no, but on the other hand, it's quite plausible. There have been many public instances where Apple and the authorities have clashed over access and putting in a back door. Apple always came out swinging on the side of privacy. Now this CSAM thing comes out of the blue, and I hear crickets instead of Tim Cook's recurring speech on how privacy is a fundamental right. How did we go from billboards and doubling down on privacy to an on-device scanner that alerts the authorities? It may not be true that this is Apple's way of compromising under government threats, but I'm not going to call it a crazy thought, because I wouldn't be surprised in the least if it turned out to be true.
 
I disagree; there is no difference, because, as you said, Apple removed the app from the Russian store in accordance with local law. Because Apple is providing them with a mechanism to do so, if a new local law states that the government is permitted to compare hashes of photos of anything they deem relevant, Apple would have no choice but to follow the law. Like you said, regardless of our opinion of those laws, that's how authoritarian governments work.

I'm sorry, but I must insist there is a HUGE difference between removing an app from the App Store and granting a third party access to local storage. Also, if a government actually passed a law like that - requiring access to everyone's personal phone to scan their photos - they could pass such a law at any time they wanted to, even if Apple abandons their current CSAM scanning plans. They could legally force them to implement it. If it came to that, I truly believe Apple would pull out of that country based on their core beliefs. And even if they were only motivated by money, they'd be foolish not to pull out of that country, as complying with such an unreasonable request would surely cause FAR more financial damage in the long run through bad press than the lost sales from that country.

Apple always came out swinging on the side of privacy. Now this CSAM thing comes out of the blue and I hear crickets instead of Tim Cooks re-occuing speech on how Privacy is a fundamental right. How did we go from billboards and doubling down on privacy to an on device scanner that alerts authorities.

Did I not already cover this? Stop spreading this misinformation. The on-device scanning alerts APPLE (not the authorities), and only if 30+ CSAM matches are uploaded to iCloud. The matches are then manually reviewed to be 100.0% sure that they contain CSAM, and even then the authorities are not alerted by Apple - Apple submits the report to NCMEC, and THEY work with law enforcement.

And the whole rationale for on-device scanning is precisely BECAUSE of Apple's dedication to privacy. Rather than decrypting billions of photos in the cloud to scan them, they are scanned on your phone, where Apple can't see what's going on (i.e. it's private). Again, Apple sees NOTHING - no scan results, period - UNLESS you have uploaded 30+ matched CSAM images to iCloud, and even then the only info they see concerns only those photos (not all your photos).
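That "sees nothing below the threshold" property can be illustrated with a minimal sketch (purely illustrative Python; the voucher structure and names are my own assumptions, not Apple's protocol, which uses threshold secret sharing to enforce this cryptographically rather than by a simple count):

```python
# Illustrative sketch only: the server can learn which photos matched
# (and nothing else) only once the number of matched "safety vouchers"
# reaches the threshold; below it, the scan results stay opaque.

THRESHOLD = 30

def server_view(vouchers):
    """What the server can see, given an account's uploaded vouchers."""
    matched = [v for v in vouchers if v["matches_known_hash"]]
    if len(matched) < THRESHOLD:
        return None   # below threshold: no scan results are visible at all
    return matched    # only the matched photos' vouchers become reviewable
```

In the real design the gate isn't an `if` on the server's honesty but the math of secret sharing: fewer than 30 shares simply cannot be decrypted.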
 
I'm sorry, but I must insist there is a HUGE difference between removing an app from the App Store and granting a third party access to local storage. Also, if a government actually passed a law like that - requiring access to everyone's personal phone to scan their photos - they could pass such a law at any time they wanted to, even if Apple abandons their current CSAM scanning plans. They could legally force them to implement it. If it came to that, I truly believe Apple would pull out of that country based on their core beliefs. And even if they were only motivated by money, they'd be foolish not to pull out of that country, as complying with such an unreasonable request would surely cause FAR more financial damage in the long run through bad press than the lost sales from that country.



Did I not already cover this? Stop spreading this misinformation. The on-device scanning alerts APPLE (not the authorities), and only if 30+ CSAM matches are uploaded to iCloud. The matches are then manually reviewed to be 100.0% sure that they contain CSAM, and even then the authorities are not alerted by Apple - Apple submits the report to NCMEC, and THEY work with law enforcement.

And the whole rationale for on-device scanning is precisely BECAUSE of Apple's dedication to privacy. Rather than decrypting billions of photos in the cloud to scan them, they are scanned on your phone, where Apple can't see what's going on (i.e. it's private). Again, Apple sees NOTHING - no scan results, period - UNLESS you have uploaded 30+ matched CSAM images to iCloud, and even then the only info they see concerns only those photos (not all your photos).
I am sure there already are laws in some countries that allow the government to spy on their citizens, and because Apple now has the means to do so, they must comply. I agree with you that, now that they've made it, governments may force them to implement it regardless of what they decide. That is a can of worms they opened themselves. Do you think the leader of North Korea cares about his citizens' right to privacy? There is no hope there, but I am sure there are a lot of governments in between that have no problem using the tools they have to spy on their citizens. Also, just because they already can spy on things doesn't mean Apple should give them another tool to snoop even deeper.

You seem to think that Apple, a corporation, has morals. Like all public corporations, their sole mission is to make money for their shareholders. CEOs that don't end up doing this don't last very long. For marketing purposes, Apple has absolutely given the impression that they have high morals, but this reputation they spent decades developing is quickly eroding. I don't know if we are allowed to add links here, so I won't, but just look up Apple's morals on censorship. Before Apple started selling phones in China, they were vocally opposed to censorship. Then they started selling millions of phones in China, making it impossible for them to "pull out", as you call it, without huge ramifications for their bottom line. The Chinese government, knowing this, demanded Apple change many things they vowed they never would, including censorship. So much for those high morals. This scanner thing won't be any different.

I am not spreading misinformation; you are just getting hung up on semantics. I know the device scanning alerts Apple, and I never said otherwise. In this particular case, the "authority" making a judgement is Apple. The point is, I don't want the stuff I buy to scan for anything or report to anyone... period. Now this is where you are spreading misinformation. Saying "rather than decrypting billions of photos in the cloud" implies that the photos in iCloud are encrypted. They are not. Which makes the rest of what you said about this being more private irrelevant. Just to make it clear, I don't think they should be scanning iCloud data for anything either. I already cancelled my 2TB subscription, and I won't be using it, because my expectation when paying for such a service is privacy and security.
 
I agree with you that, now that they've made it, governments may force them to implement it regardless of what they decide. That is a can of worms they opened themselves.

Actually that's not quite the point I was trying to make. Even if Apple hadn't "made" this, totalitarian governments could still demand that they create technology to do it.

You seem to think that Apple, a corporation, has morals. Like all public corporations, their sole mission is to make money for their shareholders.

At all costs? I don't think so. A corporation has executives and a board of directors. These are people. People have morals. To use an extreme example: if some government demanded Apple give them access to user data to identify LGBT people in their country so they could round them up and execute them - and if they don't, they will be banned from selling any of their products in that country - do you think Apple would give in because, according to you, they have no morals and their only mission is to make as much profit as possible? I don't think you actually believe that.

I am not spreading misinformation; you are just getting hung up on semantics. I know the device scanning alerts Apple, and I never said otherwise. In this particular case, the "authority" making a judgement is Apple.

Oh please. This is clearly damage control. No one reading your original comment is going to think you meant Apple - it was clearly a reference to civil authorities, which is what people mean when they use the term "authorities" in any legal context.

The point is, I don't want the stuff I buy to scan for anything or report to anyone... period. Now this is where you are spreading misinformation. Saying "rather than decrypting billions of photos in the cloud" implies that the photos in iCloud are encrypted. They are not. Which makes the rest of what you said about this being more private irrelevant.

Actually, they are:

Just to make it clear, I don't think they should be scanning iCloud data for anything either. I already cancelled my 2TB subscription, and I won't be using it, because my expectation when paying for such a service is privacy and security.

Then I guess you'll never use any cloud service. I hope you've set up your own private server, because that's the only way you're going to achieve your expectation. The iCloud terms of service never promised you the same level of privacy that you have on your Mac or iPhone/iPad, so I'm not sure why you expected that.
 
Oh, well done, EFF. Why don't you also do the "non-repairability" one while you're there? After all, it's all for the same cause.
 
Hell would have to freeze over before Apple backtracks on something they intend to do. Whether good or bad, they will do it out of sheer stubbornness and malice, rather than concede to the outcry. Rather like an insolent child.
 
By the way, why is Apple a safe haven for CSAM pictures, but not a safe haven for other kinds of illegal materials?
Maybe because CSAM photos are illegal to own while other types of photos may not be, and because laws are passed requiring service providers to report what's classified as illegal when it's found? Also maybe because companies only do what's required by the local laws where they operate, and maybe because doing what's unnecessary is a waste of resources and gets into trouble with shareholders?
 
and maybe because doing what's unnecessary is a waste of resources and gets into trouble with shareholders?
Then why go out of your way to actively scan for CSAM? It's unnecessary. The law doesn't require you to do it.
And if you're doing it as a publicity stunt, just to show off how much you care, then why wouldn't you care about other things while you're at it?
 
Then why go out of your way to actively scan for CSAM? It's unnecessary. The law doesn't require you to do it.
And if you're doing it as a publicity stunt, just to show off how much you care, then why wouldn't you care about other things while you're at it?
My take is that Apple is preparing to enable E2EE for iCloud Photos, which, IMHO, they currently cannot, as they cannot scan for CSAM on their servers if the data is E2EE. I believe this is why Apple's lawyers likely told the engineering team to stop their E2EE work.
 