How do you know that cat images won't have a greater probability of causing a mismatch?

The images you sent by email (?) will only get into the CSAM database if they are forwarded to NCMEC and NCMEC determines, by visual inspection, that it's child pornography and decides to add it to their CSAM database.

At this point, Apple has no way of making that image part of their system. Apple has no access to NCMEC's database.
But the FBI does.
What will Apple do when they receive a national security letter with a gag order ordering them to report certain pictures?
Have they promised not to scan your phone or MacBook for anything but CSAM in the future? (or even the next x years?)
No!
And they're not going to tell us that because they know we won't like the answer.
 
The code to make sure even Apple doesn't read your emails is also built into the OS. This is amateur tech talk: if you think the code being in the OS means it will scan non-iCloud data, then by that logic Apple can already read your private data. 😆 I mean... c'mon man.

The OS also stores your fingerprints and Face ID data. Do you think Apple has access to them just because that's built into the OS? This is hilarious.
You clearly don't understand what you're talking about or how software works. You don't need code to not do something until you have code to do something. For example, I don't need code that says "don't share photos with iCloud" until there is code to share photos with iCloud. Apple would need to add code to the OS in order to read your emails or any other user content. In iOS 15, the code to scan user content will be present in the OS and will need to be told what to scan or not scan. It will only be targeted at photos set to be uploaded to iCloud, and at Messages if you are under the set age, and the initial scope will be narrow. You are defending something you don't understand in ways that don't make sense. It is pointless to continue discussing.
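To make the "the code has to exist before it can be told what to scan" point concrete, here is a tiny hypothetical sketch (every name below is invented; this is not Apple's implementation) of what the "only photos queued for iCloud upload get hashed" gate amounts to: a condition in code sitting in front of the matching routine, not a separate piece of machinery.

Code:
import Foundation

// Hypothetical sketch -- none of these types or checks come from Apple's code.
// It only illustrates that "scan only iCloud-bound photos" is a policy check
// in front of the matching code, and that widening the scope would mean
// shipping code with a different predicate.
struct Photo {
    let id: UUID
    let isQueuedForICloudUpload: Bool
}

struct ScanPolicy {
    var iCloudPhotosEnabled: Bool

    // Matching runs only when both conditions hold.
    func shouldScan(_ photo: Photo) -> Bool {
        iCloudPhotosEnabled && photo.isQueuedForICloudUpload
    }
}

let policy = ScanPolicy(iCloudPhotosEnabled: false) // iCloud Photos turned off
let localOnly = Photo(id: UUID(), isQueuedForICloudUpload: false)
print(policy.shouldScan(localOnly)) // false -- nothing is hashed

Which is really the crux of the disagreement above: the capability ships in the OS either way, and what keeps it narrow is that predicate plus the policy around it.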
 
I know some super loyal people are defending apple here, but try to be objective. If Google outlined the same policy and laid out the same exact measures, would you still come to their defense?

Google is probably scanning Google Photos already. If they moved the scanning to the local device it wouldn't change anything for me.

Where Google does its scanning is the least of my problems with Google.
 
Because, as has been pointed out many times in responses to your posts, this is a proof of concept for future abuses that are NOT about CSAM. Seems pretty coherent to me.
🤦‍♂️
But a lot of the examples people give of future misuse are things this system is pretty bad at.

Things like finding people with pictures of protests or pictures indicating that someone is gay.

The only people who should be a bit worried about future misuse are people living in China and Russia.
 
But a lot of the examples people give of future misuse are things this system is pretty bad at.

Things like finding people with pictures of protests or pictures indicating that someone is gay.

The only people who should be a bit worried about future misuse are people living in China and Russia.
Or the US, or Europe, etc., etc.
Freedom is under attack everywhere.
 
You mean like when Apple forced people who don't use iCloud to upload their photos without their consent, by having Photo Stream enabled by default? (Without asking the user's permission...)

No idea what you're referring to here. Link? And "force" means there was no option to stop it, which I highly doubt was the case.

It doesn't matter how you try to twist it. Apple has been pushing a very specific message about privacy all these years.

If you do not mind, please answer this. Why do you think Apple should protect children from child pornography? Do you think it's Apple's role?

If yes, should it protect them from searching other topics, like suicide, guns, etc.? I would really appreciate it if you would answer those questions, please.

First of all, there are two different topics here. 1. Safety in Messages (a parental control feature). 2. CSAM detection for iCloud photo uploads. Even with the former, from the screenshots and writeups I've seen, it's still not stopping children from viewing potentially offensive images sent to them via Messages, but it does notify their parents that they viewed it. Whether to activate this feature is between the parent and the child.

As for CSAM detection, that has nothing to do with "protecting children from child pornography." That is simply to prevent the storage and distribution of child porn collections on iCloud.
 
So that big billboard with Apple's ad was missing the "* E"?

The "big billboard ad" applies to local storage on your iPhone, not cloud storage. That's why it said "what happens on your IPHONE" (not iCLOUD).
 
If they don't want the images on their servers, they should implement the scan on their servers instead of adding spyware to iOS 15.
People using iCloud (I don't store any photos in iCloud and don't use iCloud Keychain) give up some of their privacy because they store their data on a system they don't own and because it's not end-to-end encrypted.
Apple just needs to get their fingers off of devices I own!

Imagine the whine-fest of epic proportions here if Apple routinely scanned photos on customer iCloud accounts. And Apple said, as you suggest, "you need to give up some of your privacy." People here would turn that phrase into Apple's slogan.

If I were Apple I would not want those images on my servers to begin with. No way. I suspect the server companies that Apple contracts with (such as AWS) would not want them either.

Simply elect to not put your photos on iCloud and your devices will not be touched. Easy.
 
Craig has to go. He’s so tone deaf and out of touch, it’s unbelievable.

First, he saw no problem with implementing a spyware functional architecture. This debate has been going on for decades, and even Google/Facebook know that such a functional architecture goes too far, is too easy to abuse, and is incompatible with a free society. Being ignorant, willfully or otherwise, is unacceptable for someone in his position.

Second, he tried to silence and bully us with the "screeching minority" leak. Yes, the zealots at NCMEC wrote that, but there's no way that comment was used without the consent and concurrence of Apple's senior leadership. Silencing and bullying your customers is not only bad business, it's unacceptable, it's unethical.

Third, he's gaslit us multiple times during his clarification interviews. He's claimed we're "confused," despite most of his opponents being long-term privacy and security experts. E.g., the guy who started this, Matthew Green, is a computer scientist and encryption researcher at Johns Hopkins. If you're a techie, you've been reading the original papers and the analysis from experts like him, and you are not confused.

There are plenty of people who are indeed confusing the three features (mostly news orgs trying to beat each other to the scoop) and others who are confused about perceptual hashes and how this works at a deep technical level (mostly proponents who immediately jumped in to defend Apple without having read any of the technical docs). Those generally are NOT the screeching minority, Craig.

#FireCraig
 
As for CSAM detection, that has nothing to do with "protecting children from child pornography." That is simply to prevent the storage and distribution of child porn collections on iCloud.
Then why do they say they're only scanning new uploads?
 
So… Were y'all okay with Apple scanning your iCloud library before this announcement? They've been doing it for a while, since 2019 at least.

I knew about that already, and yes, I'm okay with it -- iCloud is Apple's servers and they can scan them all they want. I didn't turn off iCloud Photos until the on-device scanning was announced...
 
Because existing uploads have already been scanned.
You mean on the iCloud servers? If so, then they're already preventing storage and distribution, no? Why did they report only 265 pictures to the NCMEC last year? Did they detect CSAM without reporting it to the NCMEC?
 
No idea what you're referring to here. Link? And "force" means there was no option to stop it, which I highly doubt was the case.



First of all, there are two different topics here. 1. Safety in Messages (a parental control feature). 2. CSAM detection for iCloud photo uploads. Even with the former, from the screenshots and writeups I've seen, it's still not stopping children from viewing potentially offensive images sent to them via Messages, but it does notify their parents that they viewed it. Whether to activate this feature is between the parent and the child.

As for CSAM detection, that has nothing to do with "protecting children from child pornography." That is simply to prevent the storage and distribution of child porn collections on iCloud.

Apple had Photo Stream activated by default for quite a while! Even for users who were not using iCloud Photos. You had to manually disable it. That was without the user's permission. Many medical photos, for example (or other sensitive material), were uploaded to their servers. You are more than welcome to highly doubt anything you want.

You haven't answered my questions but it was expected anyway.
 
Imagine the whine-fest of epic proportions here if Apple routinely scanned photos on customer iCloud accounts. And Apple said, as you suggest, "you need to give up some of your privacy." People here would turn that phrase into Apple's slogan.

If I were Apple I would not want those images on my servers to begin with. No way. I suspect the server companies that Apple contracts with (such as AWS) would not want them either.

Simply elect to not put your photos on iCloud and your devices will not be touched. Easy.

Get your facts straight. Apple has already been scanning photos for CSAM on iCloud since 2019. No whine-fest here. There is a massive difference between scanning data on their own servers (iCloud) and on a device owned by the client.
 
The problem is that Apple can easily turn it on in the background without telling you

No they cannot, at least without getting caught.

The perceptual CSAM hash database is included, in an encrypted form, as part of the signed operating system. It is never downloaded or updated separately over the Internet or through any other mechanism. This claim is subject to code inspection by security researchers like all other iOS device-side security claims. Since no remote updates of the database are possible, and since Apple distributes the same signed operating system image to all users worldwide, it is not possible – inadvertently or through coercion – for Apple to provide targeted users with a different CSAM database.
Apple will publish a Knowledge Base article containing a root hash of the encrypted CSAM hash database included with each version of every Apple operating system that supports the feature. Additionally, users will be able to inspect the root hash of the encrypted database present on their device, and compare it to the expected root hash in the Knowledge Base article. That the calculation of the root hash shown to the user in Settings is accurate is subject to code inspection by security researchers like all other iOS device-side security claims.
This approach enables third-party technical audits: an auditor can confirm that for any given root hash of the encrypted CSAM database in the Knowledge Base article or on a device, the database was generated only from an intersection of hashes from participating child safety organizations, with no additions, removals, or changes.

Source: https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

If you wanna know more about Apple's Security Research Program, check it here.
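For the curious, the auditability claim in that quote boils down to: anyone can hash the database blob on their device and check the result against the number Apple publishes. A minimal sketch of that comparison, assuming (purely for illustration) a plain SHA-256 digest in place of Apple's actual root-hash construction:

Code:
import Foundation
import CryptoKit

// Hypothetical sketch: compare a locally computed digest of the encrypted
// CSAM hash database against the value published in the Knowledge Base
// article. Apple's real root hash is not necessarily a bare SHA-256; this
// only shows what "users will be able to compare the root hash" amounts to.
func databaseMatchesPublishedHash(blob: Data, publishedRootHashHex: String) -> Bool {
    let digest = SHA256.hash(data: blob)
    let localHex = digest.map { String(format: "%02x", $0) }.joined()
    return localHex == publishedRootHashHex.lowercased()
}

// Stand-in data; real values would come from the device and the KB article.
let fakeDatabaseBlob = Data("encrypted-database-contents".utf8)
let published = SHA256.hash(data: fakeDatabaseBlob)
    .map { String(format: "%02x", $0) }.joined()
print(databaseMatchesPublishedHash(blob: fakeDatabaseBlob, publishedRootHashHex: published)) // true

Because the same signed OS image (and therefore the same database) goes to everyone, a mismatch anywhere would be visible to any researcher who checks.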
 
The problem is, all you need after this as a tyrannical government is to insert certain hashes as CP and push them out to tech companies to detect... it's not like Apple is going to verify by looking at that filth.
Let's just speculate about how politics works: Apple is faced with an ultimatum from the highest level: "either we strengthen the resistance against your 'walled App Store' (with propaganda that can easily be fanned, also at MacRumors), or we annoy you with child porn prevention until our ideas are implemented." What can Apple do about it?
 
You clearly don't understand what you're talking about or how software works. You don't need code to not do something until you have code to do something. For example, I don't need code that says "don't share photos with iCloud" until there is code to share photos with iCloud. Apple would need to add code to the OS in order to read your emails or any other user content. In iOS 15, the code to scan user content will be present in the OS and will need to be told what to scan or not scan. It will only be targeted at photos set to be uploaded to iCloud, and at Messages if you are under the set age, and the initial scope will be narrow. You are defending something you don't understand in ways that don't make sense. It is pointless to continue discussing.
As I mentioned before, your whole argument is built on a conspiracy-theory narrative rather than actual proof of privacy being breached.


For all new folks reading, let's just reiterate:

1. Apple says it will not expand scanning categories beyond CSAM. If you don't agree with that, you don't trust them. Move on to a new ecosystem.

2. Apple says it will not allow governments to expand what can be scanned on the phone. If you don't agree with that, you don't trust them. Move on to a new ecosystem.

3. The device scans hashes of your images in transit to iCloud. Your privacy remains intact unless you are a pedophile, because nothing is shared with Apple until a user meets the 30-positive-image-hash threshold. Apple does not receive ANY communication about what the device scanned unless there are 30 positive image hashes. (A sketch of how such a threshold can be enforced follows after this post.)

4. If Apple scanned image hashes only in iCloud, it would have the user information for each hash that was scanned, unlike on device, where user information is only sent to Apple if you are a pedophile.

That's it.

All the slippery-slope arguments don't fly because they're speculation, and they're illogical because the same folks making this argument didn't apply the same fervor when cloud data from all tech companies was being openly shared with authorities since the early 2000s. The same argument could be made there for categories beyond CSAM.
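On point 3's 30-image threshold: the mechanism Apple describes uses threshold secret sharing, where each positive match contributes one share of a key and the server learns nothing until it holds at least 30 of them. The toy Shamir sketch below (over a small prime field, not Apple's actual construction) demonstrates the "below the threshold, nothing is recoverable" property:

Code:
import Foundation

// Toy Shamir secret sharing -- NOT Apple's construction, just a demonstration
// of the property the safety-voucher design relies on: with fewer shares than
// the threshold, the secret (think: the key protecting the match details)
// cannot be reconstructed.
let prime: Int64 = 2_147_483_647 // 2^31 - 1, so intermediate products fit in Int64

func modPow(_ base: Int64, _ exp: Int64, _ mod: Int64) -> Int64 {
    var result: Int64 = 1
    var b = base % mod
    var e = exp
    while e > 0 {
        if e & 1 == 1 { result = (result * b) % mod }
        b = (b * b) % mod
        e >>= 1
    }
    return result
}

// Modular inverse via Fermat's little theorem (valid because the modulus is prime).
func modInverse(_ a: Int64, _ mod: Int64) -> Int64 {
    modPow(a, mod - 2, mod)
}

struct Share { let x: Int64; let y: Int64 }

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int64, threshold: Int, count: Int) -> [Share] {
    // Random polynomial of degree threshold-1 with constant term = secret.
    let coefficients = [secret] + (1..<threshold).map { _ in Int64.random(in: 1..<prime) }
    return (1...count).map { i -> Share in
        let x = Int64(i)
        var y: Int64 = 0
        var xPower: Int64 = 1
        for c in coefficients {
            y = (y + c * xPower % prime) % prime
            xPower = (xPower * x) % prime
        }
        return Share(x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the polynomial's constant term.
func reconstruct(from shares: [Share]) -> Int64 {
    var secret: Int64 = 0
    for (i, share) in shares.enumerated() {
        var numerator: Int64 = 1
        var denominator: Int64 = 1
        for (j, other) in shares.enumerated() where j != i {
            numerator = numerator * ((prime - other.x) % prime) % prime        // (0 - x_j)
            denominator = denominator * ((share.x - other.x + prime) % prime) % prime
        }
        let term = share.y * numerator % prime * modInverse(denominator, prime) % prime
        secret = (secret + term) % prime
    }
    return secret
}

// One share ("voucher") per matched image; the "key" is just a number here.
let key: Int64 = 123_456_789
let vouchers = makeShares(secret: key, threshold: 30, count: 40)

print(reconstruct(from: Array(vouchers.prefix(30)))) // 123456789 -- threshold met
// With only 29 shares, interpolation yields an unrelated value: the key, and
// therefore the matched content, stays unreadable below the threshold.
print(reconstruct(from: Array(vouchers.prefix(29)))) // (almost certainly) not the key

That property is also what point 4 leans on: below the threshold, the server only holds vouchers it cannot open, so it learns nothing about which images matched.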
 
Let's just speculate about how politics works: Apple is faced with an ultimatum from the highest level: "either we strengthen the resistance against a closed App Store (with propaganda that can easily be fanned, also at MacRumors), or we annoy you with child porn prevention until our ideas are implemented." What can Apple do about it?
What was your argument for this question before August 5, 2021, when it comes to cloud data scanning and government requests?
 
No they cannot, at least without getting caught.

Source: https://www.apple.com/child-safety/...del_Review_of_Apple_Child_Safety_Features.pdf

If you wanna know more about Apple's Security Research Program, check it here.
Let's just speculate about how politics works: Apple is faced with an ultimatum from the highest level: "either we strengthen the resistance against a closed App Store (with propaganda that can easily be fanned, also at MacRumors), or we annoy you with child porn prevention until our ideas are implemented." What can Apple do about it?
 