30 images seems high (I know it could include false positives). And now that all this info is public, those losers will know how to get around it.

People who are into pornography of any type are usually addicted to it and collect hundreds or thousands of images, not just a dozen or so, so 30 seems reasonable to me. Cloud services have already been scanning for CSAM on their servers, so no one has "gotten around" this before or now. They still won't be able to store CSAM collections on iCloud without being detected. They could have 100,000 CSAM images on their phone and not use iCloud for photos, and Apple will never know about it, precisely BECAUSE this is not the "mass surveillance" some people are twisting it into.
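To make that gating concrete, here is a rough Swift sketch of the behavior being described: photos are only checked on their way to iCloud Photos, and nothing is surfaced below the 30-match threshold. Every name and type here is invented, and Apple's published design actually uses private set intersection plus threshold secret sharing, so no plaintext counter like this exists on the device or the server; this is an illustration of the gating logic only.

```swift
import Foundation

// Hypothetical illustration of the threshold gate described above.
// Apple's real design uses private set intersection and threshold
// secret sharing, so no plaintext match counter exists anywhere;
// every identifier below is made up.

struct PhotoHash: Hashable {
    let value: Data
}

struct KnownImageDatabase {
    let hashes: Set<PhotoHash>

    func contains(_ hash: PhotoHash) -> Bool {
        hashes.contains(hash)
    }
}

final class UploadGate {
    /// Matches required before anything at all is surfaced.
    static let reportThreshold = 30

    private let database: KnownImageDatabase
    private var matchCount = 0

    init(database: KnownImageDatabase) {
        self.database = database
    }

    /// Called only for photos being uploaded to iCloud Photos.
    /// Photos that never leave the device never reach this code.
    func willUpload(_ hash: PhotoHash) {
        guard database.contains(hash) else { return }
        matchCount += 1
        if matchCount >= Self.reportThreshold {
            flagForHumanReview()
        }
        // Below the threshold, nothing is reported to anyone.
    }

    private func flagForHumanReview() {
        // Placeholder: in the published design, Apple can only decrypt
        // the accumulated "safety vouchers" once the threshold is met.
    }
}
```

Even in this simplified form, the point stands: a phone full of photos that never touch iCloud never enters this code path at all.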
 
I'm shocked that 99% of them don't actually bother getting to know how it works. They even talk about backdoors without knowing how that would even be possible, lol. But sure, hop on the trend and say you don't like this feature.

Apple’s system doesn’t need a back door. It has a massive front door that’s wide open.

Want to know what’s on someone’s phone? Okay well just replace this CSAM database with whatever other database you’re interested in. Done.
 
Weird, because I haven't seen a single coherent explanation of why it's a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them onto Apple's servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
Right, Clark, thanks for the explanation. Guess Edward has no clue what he's talking about, but we have privacy gurus like yourself to enlighten us. From a human rights issue to "it should be fine because it's Apple": hypocrisy at its finest.
 
They can educate everyone as much as possible, but I think the social court has already made its emotional ruling.
It's not an emotional ruling; let's just say people aren't as stupid as they were when they accepted surveillance after 9/11, which didn't solve a **** tbh.

It doesn't matter how much Apple wants to disguise it: it's surveillance, simple as that, and on your device, not even on their cloud services.

Stop advocating for and trusting a company that really doesn't give a **** about you or us.

Also add Face ID as a plus. It's a beautiful device for any Gestapo-agency wannabe.
 
I'm shocked that 99% of them don't actually bother getting to know how it works. They even talk about backdoors without knowing how that would even be possible, lol. But sure, hop on the trend and say you don't like this feature.
Given a full week of Apple FAQs and clarifications, I don't think even they know how it actually works, so asking the public to understand it is unrealistic. I'm shocked at how many people view privacy concerns as "screeching voices" and blindly trust.
 
Weird, because I haven't seen a single coherent explanation of why it's a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them onto Apple's servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
I honestly can't tell if you're joking. Apple bends over backwards whenever the CCP threatens to cut off all that sweet, sweet Chinese revenue. Do you really expect them to suddenly grow a backbone when the CCP starts demanding that Tiananmen Massacre, Winnie the Pooh, and pro-democracy content be added to the "flagged content" list?
 
Apple:

Don't even start down the road of searching through users' content on their devices for "illegal things."**

There is no end to that once you begin.

**By "illegal things" we mean an ever-changing, subjective list as determined by any number of potential parties.
 
Keep dropping documents and interviews, Apple, as if we don't completely understand how any of this works.
We get it; we're still not OK with it.
Keep treating this like a case of "you're privacy'ing wrong" if you still feel so set on dying on that particular hill of public opinion.
 
Lol, wut?

What, this comes as a surprise to you? When's the last time you read a news story about someone busted for child porn where it was revealed they had only 10 images? Unless they were just getting started when they got caught (or successfully hid most of it), that's rarely the case. Normally it's at least hundreds of images that are found on their devices, and almost always more than 30.
 
People who are into pornography of any type are usually addicted to it and collect hundreds or thousands of images, not just a dozen or so, so 30 seems reasonable to me. Cloud services have already been scanning for CSAM on their servers, so no one has "gotten around" this before or now. They still won't be able to store CSAM collections on iCloud without being detected. They could have 100,000 CSAM images on their phone and not use iCloud for photos, and Apple will never know about it, precisely BECAUSE this is not the "mass surveillance" some people are twisting it into.
Do you have access to their code? Do you even know how it works internally? Or do you just trust a company ruled by a government that brainwashes people like you into giving biased opinions like that?

Stop the BS about "think of the children" and start using your brain on the real problem here: surveillance.
 
I honestly can't tell if you're joking. Apple bends over backwards whenever the CCP threatens to cut off all that sweet, sweet Chinese revenue. Do you really expect them to suddenly grow a backbone when the CCP starts demanding that Tiananmen Massacre, Winnie the Pooh, and pro-democracy content be added to the "flagged content" list?
So you'd rather they do the scanning on their own servers in secret, instead of on the device itself, where we will know instantly if they start comparing against taboo Chinese photos, because security researchers have access to phones?


I honestly don’t get your point one bit.
 
This is a distraction strategy run by PR to sell the idea that there is confusion.

This will continue until Tim has to step in and announce that it's dead and they are rethinking their strategy.

I am here for all of it.
This is getting more offensive the longer they drag on with more excuses and interviews.
 
I think Sally Mann had more than 30 photos in her Immediate Family exhibition; according to many, they would all fall foul and all count as child abuse. It is just a bad idea. AND all the kerfuffle about Pegasus.

Are those photos in the database of known kiddy porn? I don't think so. Apple isn't looking for a type of photo; they are matching against specific verified photos.
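For anyone unclear on the distinction, here is a minimal Swift sketch of "matching against specific verified photos," as opposed to classifying what a photo depicts. Apple's actual perceptual hash is called NeuralHash; everything else below is a made-up stand-in, not Apple's implementation.

```swift
import Foundation

// Invented sketch of exact-set matching against known images.
// Only the name NeuralHash is real; these stand-ins are not it.

typealias Fingerprint = Data

/// Stand-in for a perceptual hash: copies and near-identical variants
/// of the same image yield the same fingerprint; different images,
/// however similar in subject matter, do not.
func perceptualHash(of imageBytes: Data) -> Fingerprint {
    Data(imageBytes.prefix(32)) // placeholder, not a real perceptual hash
}

/// Fingerprints of specific catalogued images, not categories of content.
func loadVerifiedDatabase() -> Set<Fingerprint> {
    [] // stub: the real list ships as blinded hashes inside the OS
}

let knownFingerprints = loadVerifiedDatabase()

func isKnownImage(_ imageBytes: Data) -> Bool {
    // A new photograph, whatever its subject, matches only if that
    // exact image was catalogued in the verified database.
    knownFingerprints.contains(perceptualHash(of: imageBytes))
}
```

So Sally Mann's exhibition prints would only match if those exact images had been catalogued in the database, which is the point being made above.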
 
Weird, because I haven't seen a single coherent explanation of why it's a problem if your own device scans your photos for child porn, and only does so if you are trying to upload them onto Apple's servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.

Then actually go read some of the threads on the subject or one of the many critiques from privacy advocates. To not find “a single coherent explanation” requires actively remaining ignorant of the conversation.
 
I just don't want Apple to be scanning iCloud, period. It's a way to look into and go through our private data. What if information gets leaked to the government or to criminals? Who's held responsible for that?

Find an alternative way to catch criminals.
They are not scanning iCloud. Not at all. This sort of false accusation is what the hubbub is about. People don’t understand how this system works.
 
And they still won't give us an explanation as to why they won't fully encrypt iCloud services end to end, without any of this encryption-bypassing technology. Going that route would be the rational (and legal) thing to do if you're a so-called privacy-advocating company.
 