Ha ha, really :) There were only 3 points, so which one(s) were incorrect then? Are you saying there is no confusion at all, yes/no? Are you saying there is no scaremongering, yes/no? Or is it "they are scanning your phone", yes/no? And I do apologise if I have misunderstood, but I did get the impression they were only analysing pictures that had been uploaded to the iCloud Photos library. Do you think they are actually looking at the contents of my phone, whether uploaded or not?
I'm only trying to grasp the situation correctly :-(

Three false statements in eleven words. That's not a record you should be proud of.

There is little in the way of confusion about Apple's approach, which has been well described on numerous occasions.
There is no scaremongering. There is, however, repeated assertion that it's simply a bad approach for Apple to take, for very good reasons.
You do misunderstand. Your iPhone will scan photos for CSAM content on-device, before transmission to iCloud, if and when Apple enable CSAM scanning.
 
It's too late. The tool has already been created and Apple has propagated the idea. It's just a matter of time until some government creates its own version.
 
No, I do not think that should be taken for granted. For one thing, the system is supposed to trigger only when there are around 30 pictures with hash matches, to reduce false positives. That means moderately careful CSAM-consumers are safe even when they occasionally do not pay attention and let such a picture into their iCloud. (How long until governments demand a lowering of the threshold? "Apple allows 29 CSAM-pics per user!" Bad PR.)
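The arithmetic behind such a threshold can be sketched. Assuming, purely hypothetically (these are not Apple's published figures), a library of 10,000 photos and a 1-in-10,000 chance of a spurious hash match per photo, the probability of reaching the threshold is a binomial tail, well approximated by a Poisson distribution:

```python
from math import exp, lgamma, log

def poisson_tail(lam, t, terms=200):
    """P(X >= t) for X ~ Poisson(lam), summing the upper tail directly
    so tiny probabilities are not lost to floating-point cancellation
    in 1 - CDF."""
    return sum(exp(-lam + k * log(lam) - lgamma(k + 1)) for k in range(t, t + terms))

lam = 10_000 * 1e-4           # expected false matches per library: 1.0 (made-up numbers)
print(poisson_tail(lam, 1))   # at least one false match: ~0.63, quite likely
print(poisson_tail(lam, 30))  # reaching a 30-match threshold: ~1e-33, astronomically unlikely
```

Lowering the threshold from 30 toward 1 would move an account from the second regime to the first, which is why the threshold, and any pressure to reduce it, matters so much for false accusations.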

Then after triggering it is when things get really murky. Apple's reviewers are supposed to provide another safeguard against false positives, but they only get to see degraded variants of the user pics, and they do not get to see the CSAM-originals from the database. How will they decide when their review-version is not blatantly obvious? Are these pixels a teenager or a young adult? Discard, or forward to NCMEC and let them sort it out?

Which leads to NCMEC, the big unknown. A semi-private organization, shielded from transparency requirements and oversight, yet legally the sole US authority over its database of alleged CSAM pictures, and in practice the leading authority worldwide. To my knowledge there is no independent auditing regarding the "quality" of that database. However, when NCMEC gets pictures from companies, they decide whether to forward them to the relevant authorities - and from the Swiss federal police we know that 90% of these NCMEC notifications are not actually CSAM and are discarded. So NCMEC is apparently quite clueless about CSAM - not a good sign for their database. They just like to spam police agencies. The police have to review every such notification, tying up their resources. Investigating actual CSAM cases takes time, and arguably NCMEC's spam may hamper actual investigations.

Unfortunately NCMEC's spam is useful for agencies that demand further erosion of privacy. E.g. the German federal police regularly touts the inflated numbers from NCMEC as a reason why lawmakers should mandate generous data retention for ISPs. Basically: "We get so many reports, often ISPs have deleted critical data by the time we start investigating a case!"

TLDR: Even when ignoring concerns about privacy and future expansions, I remain unconvinced that the proposed system would not do more harm than good when it comes to fighting CSAM.
Thank you for contributing to the discussion. I don't think I would care if they were to lower the trigger to 1, but Apple has stated they will refuse demands of the government. So there's no answer that works here, but if they think lowering the threshold is bad they will refuse (being honest with their system).

Can you please provide sources for your statements in the 2nd paragraph. Specifically the parts that state they see degraded variants and also do not get to compare them against the CSAM-originals. If true I don’t find it all that concerning (personally) but I don’t want to dismiss your concerns either.

Also, can you provide sources for your claims in the 3rd paragraph about NCMEC. Keep in mind that Apple has already stated they do not get their CSAM references from just one agency; they get them from multiple agencies. This is stated in their documents to help prevent outside governments from controlling a reference source. Also, I'm not sure you are understanding the difference between CSAM and NCMEC's incoming reports. CSAM in the database is already verified illegal imagery, meaning it was already investigated, the criminals dealt with, and the photos added to the database. What NCMEC gets might be CSAM but, as you say, might not. However, Apple isn't using the images NCMEC receives; they are only using CSAM that has already been vetted. The stuff coming in to NCMEC may become CSAM and will in turn be added to Apple's search, but only after it has been verified by the investigating bodies that caught the criminal.

Again, if Apple's system is honest it will have an effect. Since Apple wants it to be effective, if outside forces try to manipulate the information they'll likely just improve their processes and add more layers to make or keep it working. This also means they can still say no to any government agency even if they get a gag order. When you're as big as Apple, the government can say "you need to do what we say or we'll remove you from our country", but when a 2-trillion-dollar company says no, the government is probably standing there with a gun loaded with blanks. They may ask or even threaten, but it's unlikely they'll pull the trigger given the ********* it would cause.

Apple is made up of people as well who also are concerned just like you and will voice their opinions if something is morally wrong. However this doesn’t mean it can’t happen. I just don’t think Apple is trying to be malicious on this and trying to force their hand would likely not happen (my opinion). Apple has also fought this fight before with the US government and said piss off and sorry.
 
When you're as big as Apple, the government can say "you need to do what we say or we'll remove you from our country", but when a 2-trillion-dollar company says no, the government is probably standing there with a gun loaded with blanks. They may ask or even threaten, but it's unlikely they'll pull the trigger given the ********* it would cause.
When you are a country as large as China, they are not standing there with a gun loaded with blanks. Apple has stated they will follow the laws of the countries they operate in. Apple's resistance to US demands was contested in the courts, not refused outright. Apple will follow a final US court ruling, or the penalties for non-compliance could be ruinous even to Apple. In the end Apple's morality will give way to the power of the law, which in theory does reflect the cultural norms and general morality of the country making those laws.

In general, the only way Apple can legally refuse to do something required by law in most jurisdictions is to assert, and then prove, that they are unable to do what's ordered. Having created the technology and an implementation for on-device surveillance means they can no longer make that claim.
 
No, they’re not. Cloud providers only scan files when they hit their servers. Apple is scanning the files on your phone.
You are using semantics here to defeat the point. Apple's proposed system ONLY activates if you connect the phone with iCloud. Even then, it only scans the hashes for matches. While there are real reasons to worry about the implications for abuse and weaknesses, some of the posters here should stop depleting the world's tin foil supplies.
 
Thank you for contributing to the discussion. I don't think I would care if they were to lower the trigger to 1, but Apple has stated they will refuse demands of the government. So there's no answer that works here, but if they think lowering the threshold is bad they will refuse (being honest with their system).


If this is implemented, Apple will change that "30" number. Apple stated that they were using "30" initially because there was uncertainty around the severity of false positives. As time goes on and they can better pin down the actual incidence of false positives, that number was to be adjusted.
 
A lot of confusion/scaremongering here; they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?
They need to find another way to catch offenders. It's like reporting every car's speed in the pursuit of catching offenders. No way.

The same goes for terrorists or any "threat" they present to justify spying and surveillance.
 
Thanks 1284814, I was just asking; we are not talking about brain-scan watchlists, so isn't that you scaremongering?
Yes, I don't want people to know my bank details, but what is there on photographs that you are concerned about? I'm not being judgemental, just asking.
As for scanning devices, I thought this was the position: "Yes, stopping iOS 15 from scanning your photos is rather straightforward. All you have to do is stop uploading your photos to iCloud and you are good to go. Apple has confirmed that the check is performed only if you have opted to upload your photos to iCloud"
I am not sure what you are referring to when it comes to fearmongering. The thing is simple: I don't want the code for this scanning crap on my phone. And this has nothing to do with what I am trying to hide, which is exactly what I said in my previous comment. It is the equivalent of a rando asking me private questions; they don't have to get into illegal stuff.
Lastly, I have been watching a trend of tech companies assuming governmental roles, and going even further than governments would, because tech is an ever-evolving landscape. I am not buying the "for the children" crap from any of them, including Apple. I don't mind the scanning if Apple is hosting the photos. It has to stay server-side, not on the device itself. Simple. Disabling iCloud has nothing to do with my problem. Besides, Apple, as well as the three-letter agencies, could arrest me and force me to give them access to my device if I am (a criminal). Apple is a tech company, not a government or a doctor; they should act like one and do the scanning server-side like the government asked.
 
You are using semantics here to defeat the point. Apple's proposed system ONLY activates if you connect the phone with iCloud. Even then, it only scans the hashes for matches. While there are real reasons to worry about the implications for abuse and weaknesses, some of the posters here should stop depleting the world's tin foil supplies.

Apple's image comparison does scan actual images...

I haven't been following the conversation between you and MuppetGate, so I'm not going to enter the argument one way or the other between the two of you. However, I need to point out that this constant misunderstanding, that Apple is not scanning images and is only comparing hashes, is wrong.

You probably know this, but just in case you don't: take a RAW photo of something in your garden, or whatever, process it in Lightroom / Capture One / DxO etc., and export it to a JPG image. Now increase the saturation a little (even by a fraction that you cannot perceive) and export another JPG. Now change the image to greyscale and export yet another JPG. Finally, produce an MD5 hash of each JPG file. Lo and behold, they all have a different hash.
Most people know this, and I expect you know this too.

However, Apple's scanning will say all those JPG images have the same 'hash'. Indeed, if you rotated the image, reduced the resolution, changed the color profile, and exported at a higher JPG compression, their 'hash' would still be the same. This is because their hash is made from the content of the image, not the bits and bytes of the file. In order to do that, their software needs to open the image, load it, process it, analyse the content, and finally produce a hash. It identifies people and objects in images. They then use that hash to compare against their hash database. So a big fat YES, they are scanning what is IN the image. When people say "they are only comparing hashes" they are missing the point that in order to do that, you have to produce a hash first.

Perhaps this doesn't matter. However, the comment 'they are "only" comparing hashes' is simply wrong.
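The distinction the post draws, a byte-level file hash versus a content hash, can be sketched with a toy "average hash". To be clear, this is a deliberately simplistic stand-in for illustration only; Apple's NeuralHash is a neural-network-based perceptual hash, not this:

```python
import hashlib

def ahash(pixels):
    """Toy content hash: one bit per pixel, set if the pixel is brighter
    than the image's mean. Derived from what the image shows."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def file_hash(pixels):
    """Byte-level hash of the raw pixel data, like MD5-ing the file."""
    return hashlib.md5(bytes(p for row in pixels for p in row)).hexdigest()

# A 4x4 greyscale "image" (values 0-255) and a slightly brightened copy.
img = [[10, 200, 30, 180],
       [90, 40, 160, 20],
       [200, 15, 70, 120],
       [35, 140, 25, 190]]
brighter = [[p + 10 for p in row] for row in img]

print(file_hash(img) == file_hash(brighter))  # False: the bytes changed
print(ahash(img) == ahash(brighter))          # True: the content hash survives the edit
```

Either way, the point stands: to produce such a hash at all, the software has to decode and analyse the image content first.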
 
A lot of confusion/scaremongering here; they are not scanning phones. They are looking at photos that are on Apple servers. How can anyone object to this unless you have something to hide? Why would you object when it helps children?

If only you could step out of the soap bubble and think about the issue on a larger scale, considering the many possible scenarios in which the thing can be misused.
 
Ok I’ve heard enough.


I thank all of you for at least giving me the moral green light to bring this issue to the intergovernmental arena in 2023. Yes, that is how long it will take, and action will not happen, if successful, until after 2025.


You are not alone and you have power. This is not solely an Apple thing, but with international co-operation it will have limits. I didn't say end, but have limits. Thanks for this green light; nothing is for sure, but at least I have a track record (BTW I'm not fat, it was a nickname a girlfriend gave me).


Cheers!
 
You are using semantics here to defeat the point. Apple’s proposed system ONLY activates if you connect the phone with iCloud....
For now. And what happens if an authoritarian government takes the technical documents Apple posted in defence of their proposed 'surveillance, but with privacy' scheme and creates their own malevolent system? There are children in countries ruled by authoritarians who can be harmed as well. And the range of perceptual features that machine algorithms can recognise in data is immense - it spans from the ethnic identification of Uyghurs to the identification of gay people just in the domain of facial recognition alone.

I don't believe Apple is in some deep-state conspiracy. I do believe their engineers have been allowed to go off half-cocked without thinking the implications of their invention through. We can ill afford that. We have to think several steps ahead or the future will be very bleak indeed.
 
However, if Apple plans to have access to offline content stored physically on one's Apple device, then it is a whole new level of privacy invasion.

Apple does not plan to have access to offline content. The proposed software would run "client side"; that means your side, not Apple's side (server side). And you could turn it off if you don't want it.

Whether you believe that’s a good idea or not, you should at least understand what Apple is talking about, not listen to people who are unintentionally (or perhaps intentionally?) misrepresenting it. Otherwise, you’re just arguing about a strawman.
 
Seeing all the comments and the continued narrative of CSAM still makes me think: well, where is the alternative? If I decide to stop using Apple, what do I use? Android? Windows? Etc.
 
Seeing all the comments and the continued narrative of CSAM still makes me think, well where is the alternative? If I decide to stop using apple what do I use? Android? Windows? Etc.

That is a very good question. The answer will vary.
iPhone to Android, to Linux, to one of the non-Google Androids, stay on iOS but keep a close eye .....

Not an easy decision by any means.
 
Seeing all the comments and the continued narrative of CSAM still makes me think, well where is the alternative? If I decide to stop using apple what do I use? Android? Windows? Etc.
The big differentiator for Apple was its solid and proven privacy stand and a lot of people put a lot of weight on that for their buying decisions. Now that that has been effectively dropped, Apple is just another provider among many, so decisions can be made on other criteria as to who is best for what is needed. It opens up the options a lot.

So far only Apple plans on putting spyware on personal devices that can call the cops on you if it finds something illegal and that Apple doesn't like. The others haven't gone that far but with Apple leading the way they will likely follow. I guess I'll stick with Apple until it actually happens, then decide what to do then.
 
The big differentiator for Apple was its solid and proven privacy stand and a lot of people put a lot of weight on that for their buying decisions. Now that that has been effectively dropped, Apple is just another provider among many, so decisions can be made on other criteria as to who is best for what is needed. It opens up the options a lot.

So far only Apple plans on putting spyware on personal devices that can call the cops on you if it finds something illegal and that Apple doesn't like. The others haven't gone that far but with Apple leading the way they will likely follow. I guess I'll stick with Apple until it actually happens, then decide what to do then.

The day that does happen I’ll instantly sell my device and go a different direction.
 
If this is implemented, Apple will change that "30" number. Apple stated that they were using "30" initially because there was uncertainty around the severity of false positives. As time goes on and they can better pin down the actual incidence of false positives, that number was to be adjusted.
Okay, so the more reliable it is the more likely they’ll lower it. Seems fine unless I’m missing something.
 
When you are a country as large as China, they are not standing there with a gun loaded with blanks. Apple has stated they will follow the laws of the countries they operate in. Apples resistance to US demands were contested in the courts, not refused outright. Apple will follow a final US court ruling or the penalties for non-compliance could be ruinous even to Apple. In the end Apple's morality will give way to the power of the law, which in theory, does reflect the cultural norms and general morality of the country making those laws.

In general the only way Apple can legally refuse to do something required by law in most jurisdictions is to assert then prove that they are unable to do what's ordered. Having created technology and an implementation for on device surveillance means they can no longer make that claim.
I think (not know) Apple will defy even China’s demand on something like this. Only time will tell
 
That is a very good question. The answer will vary.
iPhone to Android, to Linux, to one of the non-Google Androids, stay on iOS but keep a close eye .....

Not an easy decision by any means.
Emphasis on your last point, it won't be easy by any stretch. It almost seems inevitable, and there are not many places to turn to. Aren't most of these companies already running some sort of software like this? I feel like the further we move into the future with technology, the less privacy we have.
 
The big differentiator for Apple was its solid and proven privacy stand and a lot of people put a lot of weight on that for their buying decisions. Now that that has been effectively dropped, Apple is just another provider among many, so decisions can be made on other criteria as to who is best for what is needed. It opens up the options a lot.

So far only Apple plans on putting spyware on personal devices that can call the cops on you if it finds something illegal and that Apple doesn't like. The others haven't gone that far but with Apple leading the way they will likely follow. I guess I'll stick with Apple until it actually happens, then decide what to do then.
That was indeed a huge and vital part of them separating themselves from the bunch, heavily marketing privacy. I've read that many people haven't upgraded to iOS 15 because of the CSAM news. Have you upgraded?
 
That was indeed a huge and vital part of them separating themselves from the bunch, heavily marketing privacy. I've read that many people haven't upgraded to iOS 15 because of the CSAM news. Have you upgraded?
I've upgraded as Apple hasn't started doing their on-device surveillance and I do have some hope they won't. I'll likely sell my Apple stuff and move on if they actually start doing it. Unfortunately there may be no place to go if Apple starts this, the others will follow and may be forced to by law.
 
Seeing all the comments and the continued narrative of CSAM still makes me think, well where is the alternative? If I decide to stop using apple what do I use? Android? Windows? Etc.

Right now, I'd argue the best real world choices for privacy are probably Linux for computers, and deGoogled Android. But...these choices may or may not work for a given person.

I use Linux--but it helps that all my needs are met by the available software. One thing that I've considered if my needs change is having a secondary system just to handle certain tasks I can't do with Linux.

I currently use a feature phone, and it's good enough for what I actually need. But I can say I've heard some people switched to an iPhone from a deGoogled Android phone. They recognized that the deGoogled Android was better for privacy, but the iPhone worked better for their particular needs.
 