If Apple really wanted to save the children, it would ban cameras from iPhones and make the screen a black-and-white dot-matrix display showing only text. Oh, and of course there would be no internet modem.

The problem is not the users, it's the technology.
 
I don't care about the "children". Make the law against that sort of thing truly terrifying, like being boiled alive when caught. Law is about deterrence, not about trying to scan through photos. Those f**kers will do it anyway with a different phone. The rest of us pay the price. Knowing Apple, they will screw it up anyway.
They're not "scanning through photos" they are matching hashes from known hashes that are on a database.

Are people so stupid that they don't know this? It's well documented.
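For what it's worth, the distinction being drawn here fits in a few lines. This is only a toy sketch: it uses an exact SHA-256 of the file bytes where Apple's system uses NeuralHash, a perceptual hash, and the "known hashes" set is invented for illustration.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of *known* images (in reality supplied by
# child-safety organisations); this hex string is a placeholder, not real data.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_database(photo_path: str) -> bool:
    """Return True if the photo's hash is already in the known-bad database.

    Note: this never inspects what the picture depicts; it only checks whether
    the bytes hash to a value that is already on the list.
    """
    digest = hashlib.sha256(Path(photo_path).read_bytes()).hexdigest()
    return digest in KNOWN_HASHES
```

The real system's perceptual hash is designed to survive resizing and recompression, which is exactly why later posts argue about false positives.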
 
I keep reading that it's not Apple's role to play police here and hunt criminals. At the same time you are asking them to police other countries' governments.

Let's be real. If the Chinese government wants to scan people's photos, they will issue a law that all devices store backups on Chinese servers. And then Apple can either comply or leave the market. It's delusional to think Apple can actually control a foreign government.
First of all, we are not talking about a democratically elected "foreign government" here, one founded on Western principles. We are dealing with a dangerous authoritarian regime that is killing its own people, hunting down and assassinating dissidents domestically and internationally, keeping its citizens under surveillance 24/7, etc. If anything, under these circumstances, since Apple preaches being all about "human rights" and individual freedom, they shouldn't even negotiate with such a disgusting regime. Maybe sell some iToys in their market? Sure, but don't give so much power and wealth to that regime by choosing to produce all of your products there when Apple knows exactly that the people in charge are pure dictators who do not respect human rights.
 
They're not "scanning through photos"; they are matching hashes against known hashes in a database.

Are people so stupid that they don't know this? It's well documented.
No. You are uninformed, casually rude and, on top of that, arrogant. It's well documented.
Do your research instead of ignoring reality. Apple is a global company with unprecedented power.
Nobody else in big tech lives in Apple's comfort zone. Nobody in the tech industry has this reach over data. If I want to ignore Google, I can use an instance of https://searx.me/. If I want to ignore Facebook, I can use https://joinmastodon.org/. If I want to avoid the Twitter cesspool of self-promotion and self-importance, there are ways to go on the open web: RSS and personal blogs.
Apple has it all, because of closed-source software and hardly repairable hardware. You can easily root an Android phone and install an alternative OS.
Apple's phones you can only jailbreak.
This comfort was granted by users' trust.
What Happens On Your iPhone, Stays On Your iPhone... Right? https://bit.ly/2VmkYjj

Matching hashes against a third-party, non-auditable database provided by a government-funded corporation is on-device processing in conflict with the idea of privacy (4th Amendment) and a clear breach of users' trust.

P.S. Apple trolls are everywhere. Apple PR is on overload these days.
 
Personally I think Apple is going to do this regardless. They may change how they present it and how much they tell the user, but they clearly want this to happen.

The only reason I think they've put it on pause is iPhone 13 sales. Once the iPhone 13 is out and iOS 15 is on enough devices, they'll just push it out. It's not like anyone at that point, once on iOS 15, is going to say no to software updates from then on out.
That was exactly my thought when I heard them say they would put a pause on it: it was fear over iOS 15 adoption numbers.

If, at the end of the day, with all the hashes, security protocols, and promises, someone can physically review your pictures, then that is the very definition of a backdoor.
 
They're not "scanning through photos"; they are matching hashes against known hashes in a database.

Are people so stupid that they don't know this? It's well documented.
What I find stupid is that people are still talking about how the spyware works instead of whether it should exist on your device at all. Zero people who are against this care about how the code works; they just don't want the code on their devices. Full stop. The entire issue goes away if Apple just sets up an intermediate server to scan hashes on the way to iCloud servers, or some other process, as long as it takes our devices out of the loop. Don't be distracted by the trolls and fanboys. This is a very simple, singular issue.
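The relay idea in this post would look roughly like the sketch below. It is only a toy with invented names (`handle_upload`, `flag_for_review`, `store_in_icloud`) and an exact SHA-256 where the real system uses a perceptual hash; the point is just that the check lives on a server the user uploads through, not on the phone.

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # database held server-side only, never shipped to devices

def flag_for_review(photo: bytes) -> None:
    """Placeholder for whatever review/reporting process would apply."""

def store_in_icloud(photo: bytes) -> bytes:
    """Placeholder for the actual storage call."""
    return photo

def handle_upload(photo: bytes) -> bytes:
    # The device just uploads; matching happens off-device, on the intermediate server.
    if hashlib.sha256(photo).hexdigest() in KNOWN_HASHES:
        flag_for_review(photo)
    return store_in_icloud(photo)
```

The trade-off, raised later in the thread, is that a server able to hash your photos must be able to see them, which cuts against end-to-end encryption.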
 
They're not "scanning through photos"; they are matching hashes against known hashes in a database.

Are people so stupid that they don't know this? It's well documented.

Could you tell me how Apple is able to inform parents when teens receive nude photos, if Apple doesn't scan every photo rather than just matching known hashes? I'm quite sure teens don't send known/hashed CSAM pics to each other.

Also, would it be OK for your postman to read your mail to be sure you aren't planning any illegal activity? Or to open your packages to be sure you haven't ordered any illegal stuff?
 
Could you tell me how Apple is able to inform parents when teens receive nude photos, if Apple doesn't scan every photo rather than just matching known hashes? I'm quite sure teens don't send known/hashed CSAM pics to each other.

Also, would it be OK for your postman to read your mail to be sure you aren't planning any illegal activity? Or to open your packages to be sure you haven't ordered any illegal stuff?
That's a different program Apple was working on entirely; it uses an AI algorithm to look for nudity. It's the part Apple blamed for the consumer confusion over the CSAM thing. It's a separate issue; some don't agree with it either, but it really needs its own thread. It has zero to do with the issue being discussed here.
 
Also, would it be OK for your postman to read your mail to be sure you aren't planning any illegal activity? Or to open your packages to be sure you haven't ordered any illegal stuff?
Indeed. Or if the DPD delivery driver pushes past you at your door, any time of day or night, and does a shakedown of your house, top to bottom. You know, just in case you might have broken a law.
 
They're not "scanning through photos"; they are matching hashes against known hashes in a database.

Are people so stupid that they don't know this? It's well documented.
Are you so oblivious that you think that matters in terms of privacy? A hash is nothing more than a perceptual summary that captures the essence of the statistical structure (perceptual features) of the image. I and many others don't want Apple scanning either raw images or summaries of images on our iPhones.

Also, I guess you missed the bit where Apple said a human would review photographs if there were enough hits (I believe the threshold is 30). Apple's entire strategy is based on the false-positive rate being low and errors being independent of each other. Apple cannot discuss the creation of the hash in detail because it would trigger an arms race of pedophiles editing CSAM images, so we cannot know the rate of false positives, but I guarantee you the latter assumption will not be true. People often take a series of pictures that are nearly identical; if one is a false positive, the others in the sequence are also likely to be false positives. And even if somehow, by magic, the false positives are independent of each other, the more pictures you store on iCloud, the higher the risk of triggering the human review. One more thing: surely smart pedophiles will simply keep the number of matching pictures on their iPhone at the threshold minus one (29). Then they can never trigger the CSAM system.
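To put rough numbers on that independence argument: the sketch below uses an assumed per-image false-positive rate, Apple's publicly stated review threshold of about 30, and a crude "burst" model in which near-duplicate shots either all match or none do. None of these figures come from Apple; the only point is how much the tail probability moves once errors are correlated.

```python
import math

# Illustrative assumptions only: Apple has not published a per-image false-positive
# rate, and the ~30-match threshold comes from its public threat-model summary.
THRESHOLD = 30        # matches needed before human review
FP_RATE = 1e-6        # assumed per-image false-positive probability
LIBRARY = 20_000      # photos a user uploads to iCloud
BURST = 10            # near-identical shots assumed to succeed or fail together

def poisson_tail(lam: float, k: int) -> float:
    """P(X >= k) for X ~ Poisson(lam), summed directly so tiny tails don't underflow."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k, k + 60))

# Case 1: every photo is an independent trial (the working assumption).
p_independent = poisson_tail(LIBRARY * FP_RATE, THRESHOLD)

# Case 2: photos arrive in bursts of near-duplicates, so one unlucky scene
# contributes BURST matches at once.
p_correlated = poisson_tail((LIBRARY // BURST) * FP_RATE, math.ceil(THRESHOLD / BURST))

print(f"independent errors: {p_independent:.2e}")  # astronomically small
print(f"correlated errors : {p_correlated:.2e}")   # still tiny, but dozens of orders larger
```

Both numbers are negligible under these made-up inputs, but the gap of dozens of orders of magnitude is the point: the headline error rate depends entirely on the independence assumption.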

Apple's proposal is what happens when you let engineers run amok without input from social scientists, legal scholars or statisticians. It is a staggeringly stupid and dangerous idea no matter how well intentioned it was.
 
Could you tell me how Apple is able to inform parents when teens receive nude photos, if Apple doesn't scan every photo rather than just matching known hashes? I'm quite sure teens don't send known/hashed CSAM pics to each other.
...
I don't like Apple's proposal either, but I think the child safety features are triggered when an image is transmitted from one phone to another (e-mail, text). That's a different kettle of fish from scanning iCloud images locally on the iPhone, because it is probably done server-side and the child protection features will be selected by the adult owner of the child's iPhone. I personally don't object to that, but I do object to the CSAM surveillance.
 
Originally this was supposed to just compare the hash of photos to the hash of existing known child porn as it went into or out of encryption. I don't really have a problem with that. Active AI scanning of actual images seems like a great idea, but its encryption backdoor requirements and near-instant potential for misuse by governments worldwide are shocking.

If you don't understand this, just ask yourself whether you think someplace inside Apple there will be a server full of child porn waiting to be compared to images on your phone, or whether those images and services will be provided by governments, who would then need direct, unencrypted, instant access to your device. Scary.
Well, they're actually going to distribute to every device a database of hashes against which any picture being uploaded to iCloud is checked. Thus, the whole process takes place on your phone before encryption. The database will be Apple's, but it will be a third party deciding which hashes to include, and Apple will have no insight into the data these hashes represent.

The scary part is actually Apple's claim that any positive match will then be checked by an Apple employee, meaning that someone at Apple has access to the iCloud data which was until recently touted as perfectly private, protected by your own secret key unknown to Apple.
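A loose sketch of that order of operations, under heavy simplifying assumptions: the names (`SafetyVoucher`, `device_side_upload`, `server_side_review`) are invented, SHA-256 stands in for NeuralHash, byte reversal stands in for encryption, and the real protocol's blinded hash matching and threshold secret sharing are not reproduced here.

```python
import hashlib
from dataclasses import dataclass

KNOWN_HASHES: set[str] = set()   # database shipped to the device; contents opaque to Apple

@dataclass
class SafetyVoucher:
    matched: bool      # did the on-device check hit the database?
    ciphertext: bytes  # the photo, encrypted before it leaves the phone

def device_side_upload(photo: bytes) -> SafetyVoucher:
    """Match first (on the device, against the plaintext), encrypt after."""
    matched = hashlib.sha256(photo).hexdigest() in KNOWN_HASHES  # stand-in for NeuralHash
    ciphertext = photo[::-1]  # placeholder "encryption", not real crypto
    return SafetyVoucher(matched=matched, ciphertext=ciphertext)

def server_side_review(vouchers: list[SafetyVoucher], threshold: int = 30) -> bool:
    """Only past the threshold does a human at Apple review the matched images."""
    return sum(v.matched for v in vouchers) >= threshold
```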
 
I strongly suspect the government is involved in this. Still, the fact that they have at least delayed it bodes well. People need to hold off on the iPhone 13 because of this so Apple gets the message.
Completely agree about government involvement/pressure!

What worries me is that either:
1) They’ll silently implement it anyway (either willingly or through government coercion)
2) The government will implement an even more terrifying program in its place — one that Apple can’t circumvent

And once it’s implemented, we all know the US government will be issuing FISA warrants left and right to add new hashes to the algorithm for “undesirable” things they deem inappropriate…. And they may not even divulge to Apple what those hashes represent (could be more kiddie porn, could be searching for a book or poster they don’t ideologically approve of). And before people come out with the left or right wing arguments — this isn’t a political thing. Snowden showed us this is beyond the control of policy makers or political oversight.

And that’s just the US. Think of what others would like to do with it…
 
absolutely!
It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse. People need to get it into their heads that if they are online in any way, there is no such thing as complete privacy. Someone will be able to get into whatever you think you are protecting. Your location, everything... it's all tracked. You're fighting for an illusion.
Absolutely not. Apple has promised that anything stored in iCloud is encrypted with the user's own private key - meaning only encrypted data, and no real user data, is stored in iCloud.

If Apple is to check positive matches, this means someone at Apple must have access to the images that were matched. Meaning either the whole privacy narrative was a hoax, or you don't know when your phone might send your private pictures to Apple for human analysis.
 
In one of the articles MR has published on the matter (I'm not able to find it at present), I remember Apple saying that because images are encrypted, it would take a lot of computing power and programming to proactively scan images on iCloud servers; it is much easier, simpler and quicker to scan for image hash values on a user's device, where there are only a few image files to scan rather than millions. Having to scan the servers on a daily basis would slow them down.

As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.
The reason is that Apple wanted to still be able to employ E2EE for iCloud, which they have kept postponing. They ultimately want to be able to say, "We can't look at your stuff even on our servers." They can't do that if they scan on their servers; it would either reveal that their encryption can be broken (by them, no less) or that they hold a key, so it's not truly E2EE. Their solution was to do it on the device, before the photo was encrypted and sent to iCloud servers. That way they could say they don't know what is on iCloud, but they know it's not CSAM. They want E2EE on iCloud badly; it fits into their whole privacy ethos in a way the CSAM detection does not. They were so focused on threading the needle that they didn't consider the consequences.
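The tension described here fits in a few lines. This is a toy, dependency-free sketch (XOR with a random pad stands in for real encryption, SHA-256 for a perceptual hash); the only point is that under true E2EE a server only ever sees ciphertext, so any hash comparison has to happen on the device before encryption.

```python
import hashlib
import secrets

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """One-time-pad-style XOR; a stand-in for real encryption, not for actual use."""
    return bytes(b ^ k for b, k in zip(data, key))

photo = b"holiday photo bytes"
key = secrets.token_bytes(len(photo))   # under true E2EE, this never leaves the device

ciphertext = toy_encrypt(photo, key)

# Server-side scanning under E2EE: the server can only hash ciphertext,
# which tells it nothing about the original photo.
print(hashlib.sha256(ciphertext).hexdigest() == hashlib.sha256(photo).hexdigest())  # False

# On-device scanning: hash the plaintext *before* encrypting, then upload only ciphertext.
device_hash = hashlib.sha256(photo).hexdigest()
print(device_hash)
```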
 
Apple's proposal is what happens when you let engineers run amok without input from social scientists, legal scholars or statisticians. It is a staggeringly stupid and dangerous idea no matter how well intentioned it was.
This is another big reason for me to abandon the Apple ecosystem; it is a systemic problem. The quality of their software design, and ultimately of their future products, is suffering from a well-known problem in the industry: marketing overreach combined with corrupt management and low-quality personnel. When you forget the core values of the company and run wild with politics and wishful thinking, mistakes like this are guaranteed to happen all the time. The fact that they continue to resist public pressure and are just "delaying it" is further proof that this company has reached the limit of its expansion and no longer deserves trust.

This "feature" was built in isolation from reality, by engineers and managers with a "yes, sir/madam" mentality, in a boardroom of "highly successful" Apple leaders full of themselves, possibly with representatives from government agencies, and the logical question at the end: "How are we going to sell it?" As a fight for the common good and "saving the children" from predators. A golden classic.
And to push for the normalization of surveillance, let's give parents picture control over iMessage. People will like it and will have no problem with active scanning in the future and the lack of encryption in Apple services.
 
That's a different program Apple was working on entirely; it uses an AI algorithm to look for nudity. It's the part Apple blamed for the consumer confusion over the CSAM thing. It's a separate issue; some don't agree with it either, but it really needs its own thread. It has zero to do with the issue being discussed here.


Re-read the original post: ”The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.”

The topic is for both features.
 
This is the only photo left in my iCloud! And I have cancelled my upgrades!

[image attachment]


I was thinking of this one, and adding a known hash to it.
 
Apple has been doing on-device scanning of photos for years, where they actually look at the content of the image so as to classify it as having donuts, mountains, or Jimmy. Now they propose to scan image hashes, not the image itself, for known child pornography, and this is pushing people over the edge.

I suppose the difference is they don't notify authorities if you have reached an arbitrary threshold of donut pictures. But if the fear is that this technology could be abused by bad-acting nation-states, the framework already exists and has for some time.
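To make the contrast concrete, here is a rough sketch with invented function names (neither is an Apple API): the scene tagging Photos has done for years produces labels that only feed the local search index, whereas the proposed check produces a match result that travels with the iCloud upload and can eventually be reviewed by a person.

```python
import hashlib

def index_for_search(photo: bytes) -> list[str]:
    """Stand-in for on-device scene tagging ("grass", "dog", ...).
    The labels stay in the local Photos index and are never reported anywhere."""
    return ["grass", "dog"]  # placeholder output from a local classifier

def csam_match(photo: bytes, known_hashes: set[str]) -> bool:
    """Stand-in for the contested feature: this result leaves the device with the
    upload and, past a threshold of matches, can trigger human review."""
    return hashlib.sha256(photo).hexdigest() in known_hashes
```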
This right here has always bothered me about this argument against CSAM detection. My iPhone knows about landscapes, dogs, pools, etc. Yet people never once batted an eye about it. Like, literally, there could already be a back-door plan handing over your information to covert governments. I personally don't think that's happening, but part of the tooling is already there.

I'm not against discussion for or against CSAM detection, but I really wish I could be in the room where people are crying foul over this CSAM business and say, "Dude, take out your phone and type in 'grass'. How many photos popped up with grass in them? Why are you now complaining, and do you understand the difference?"
 
This right here has always bothered me about this argument against CSAM detection. My iPhone knows about landscapes, dogs, pools, etc. Yet people never once batted an eye about it. Like, literally, there could already be a back-door plan handing over your information to covert governments. I personally don't think that's happening, but part of the tooling is already there.

I'm not against discussion for or against CSAM detection, but I really wish I could be in the room where people are crying foul over this CSAM business and say, "Dude, take out your phone and type in 'grass'. How many photos popped up with grass in them? Why are you now complaining, and do you understand the difference?"

But all this time you have had all the rights to your content, your images… What is different is that Apple starts acting like everyone is a criminal and every phone should be scanned. Then they start taking control, being your nanny, watching and warning about content, and informing others about the messages you get.

It is not only scanning for illegal material; it is also scanning messages, emails and photos sent to iCloud, and also opening a door for misuse.

What about this situation: a person works in a particular job and others don't like the results they will see. They buy prepaid phones and send some CSAM content to that person to get rid of him or her. By the time the investigation is done properly, the damage has already happened. Or you don't like a teacher, so you send some CSAM photo from a prepaid phone. Or you don't behave as wanted in some country, so hashes of perfectly normal photos are used to follow your activity. The potential for misuse of the feature is large enough to start boycotting Apple.

Want to try being gay in the Middle East when Apple activates this feature?
 