For the umpteenth time, some posters here are trying to use the red herring of the technicalities of Apple's intended software. IT IS NOT ABOUT THAT.

The child abuse angle is a very convenient excuse, and it will not aid the fight against child abuse or child pornography.

This is about SURVEILLANCE.
Exactly. It's about mass surveillance of a billion iPhone users and some Mac users.
(And Pegasus just showed how to exploit iOS; I am sure this feature would also be exploited if Apple really makes this happen.)
It would have been bad enough if they put this spyware just in iCloud (which I could live with, since I don't use iCloud / Keychain), but putting it ON A DEVICE I PAID FOR IS A NO-GO.
Personally I just downgraded from iOS 15 beta to iOS 14 and will defer my yearly iPhone upgrade cycle until Apple stops putting spyware into iOS.
 
Another thought: what triggered Apple into this?

A company like Apple is not a law enforcement entity; it wouldn't fit their business model. Morals (which we all share) aside, this doesn't make business sense to me -- I wonder if there has been private gov't pressure to "comply" with certain requirements (i.e. backdoors) and this is a convenient excuse.

Over the years, it feels as if various entities use the topic of "child porn" or some other nefarious activity as an ingress to ulterior motives -- a psychological method: get the public riled up about something really awful, then sneak in your real agenda, unseen.

Sounds paranoid, I realise, but I suspect this may be closer to the truth here.

I've also read that Apple is modifying Siri and other mechanisms so that if you search for certain things, like CSAM content, it will redirect you to other resources.

Anyone else see the obvious here? Both of these would be incredibly good adjuncts to larger surveillance capabilities.

To ensure "participation" by the public, Apple will invest more in the UI/UX of their newer products to entice you to upgrade, at which point you won't have a choice but to participate in having your privacy violated, without your consent.

This all sounds like a raging dumpster fire with an Apple logo on the front.
 
One false positive is too many for such a severe accusation…. If it flagged you then nothing else matters, you're toast; with some crimes, being accused is just as bad as being convicted, and the media would eat you for lunch.
Please educate yourself on the subject before making ill-informed comments.

It requires 30 matches for them to even begin to investigate. 30 matches = you undoubtedly have child porn on your phone as 30 false positives would be virtually impossible.
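For what it's worth, here is a back-of-the-envelope sketch of that claim. The per-image false-match rate and the library size are assumptions picked purely for illustration (Apple has not published a per-image rate), and it treats matches as independent events, which a later post in this thread questions:

```python
# Back-of-the-envelope only: probability that an account with n photos accumulates
# at least k false matches, assuming each photo false-matches independently with
# probability p. Both p and n below are assumed values, not Apple's published figures.
from math import exp, fsum, lgamma, log

def log_binom_pmf(n: int, i: int, p: float) -> float:
    """log of C(n, i) * p**i * (1 - p)**(n - i), kept in log-space to avoid overflow."""
    return (lgamma(n + 1) - lgamma(i + 1) - lgamma(n - i + 1)
            + i * log(p) + (n - i) * log(1 - p))

def prob_at_least(n: int, k: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p); the tail decays so fast that ~200 terms suffice."""
    return fsum(exp(log_binom_pmf(n, i, p)) for i in range(k, min(n, k + 200) + 1))

# Assumptions: a 20,000-photo library and a 1-in-a-million per-image false-match rate.
print(prob_at_least(20_000, 30, 1e-6))  # on the order of 1e-84 under these assumptions
```

Under assumptions like these the 30-match tail is vanishingly small; the real argument, made further down the thread, is whether false matches are actually independent.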
 
And by the way, Apple has designed its own silicon chips. Who knows what hidden capabilities these will have as an adjunct to enable other forms of surveillance. "Nothing to see here." :)

Depending on what we find in their code/design/etc, I wonder if someone will find a way to interfere with these processes, so that gibberish is returned to their systems. Hmmm.
 
Depending on what we find in their code/design/etc, I wonder if someone will find a way to interfere with these processes, so that gibberish is returned to their systems. Hmmm.
I take it as given that security researchers all over the world - as well as the bad guys and intelligence agencies - are going to have a very, very close look at this once it's rolled out.
If at all possible they are going to develop a proof of concept circumventing or somehow misusing the feature.
I am looking forward to experiencing the ensuing ****storm if researchers find an exploit. And of course Apple's response will be even more interesting. Speaking of "told you so".
Let's wait and see.
 
I've read that Apple suggests it protects privacy more than Google or Facebook by virtue of it only being photos and because it's on the hardware... That's complete tosh.

Other systems do often scrutinise the whole content of your cloud storage, but they are SERVER BASED. Apple are trying to sell this as more private because AT PRESENT your hardware will only scan photos, and that is very unlikely to stay the case. Perhaps Apple forgets that whilst others scan content, they don't do it by making customers download an operating system with this facility embedded in it, which bypasses System Integrity Protection, has access to anything Apple wants it to, and also has access to your unique device identification amongst other things.

Server-based systems might scan content, but you know that when you use them, and you have the opportunity not to use them. That is not the case with HARDWARE-based surveillance on equipment Apple doesn't even own, and where they could still have run NeuralHash on photos only, via the server.

So we have Apple's word that only photos are of interest so far, and they use that to suggest it's more private than server-based systems....wrong.

Those server-based systems are not embedded in your operating system without you even having a say in the matter, nor can they be modified to do all sorts of things WITHIN THE CONFINES OF YOUR OWN DEVICE.
 
Can you even imagine how devastating it would be if you were one of those 3 false positives? Not to mention the new viruses that could be developed to plant images onto unsuspecting innocent people's devices.

Apple wouldn't know since the threshold is around 30 for each account.

Also, how many people have close to 100 million photos in iCloud?
 
Admittedly not a cyber security expert, but why does Apple decide to allow people in possession of CSAM up to 30 images of child pornography before acting? I would assume that if the system is so perfect, then just one example of CSAM should be reported, no?

1. In the US it's not illegal to possess fewer than 3 images.

2. The system isn't perfect, which is why they have introduced a high threshold so it's extremely unlikely an account is flagged by mistake, and they have added human review on top of it (a toy sketch of how such a cryptographic threshold can work follows below).

3. The purpose of the system isn't to eradicate such material, only to reduce its spread by the people who spread or collect the most using Apple's platform, and to make iCloud a dangerous place to store such images in large numbers.

It's very pragmatic and not principled.
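Since the 30-match threshold keeps coming up: Apple's technical summary describes enforcing it with threshold secret sharing, so the server cannot reconstruct the decryption key for the safety vouchers until at least the threshold number of matches exists. Below is a toy Shamir-style sketch of that general idea; the prime, parameters and function names are illustrative only, not Apple's actual construction (which layers this on top of private set intersection):

```python
# Toy t-of-n Shamir secret sharing over a prime field -- an illustration of how a
# cryptographic threshold can work in principle, NOT Apple's actual construction.
import random

PRIME = 2**61 - 1  # a Mersenne prime; fine for a toy example

def make_shares(secret: int, threshold: int, num_shares: int):
    """Split `secret` into points on a random polynomial of degree threshold - 1."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x: int) -> int:
        acc = 0
        for c in reversed(coeffs):  # Horner's rule, mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares) -> int:
    """Lagrange interpolation at x = 0; needs at least `threshold` distinct shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, threshold=30, num_shares=100)
print(reconstruct(shares[:30]) == 123456789)  # True: 30 shares reveal the secret
print(reconstruct(shares[:29]) == 123456789)  # almost surely False: 29 shares reveal nothing
```

The point of the construction is exactly what the thread is arguing about: below the threshold the server learns nothing, and at or above it everything unlocks at once.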
 
So we have Apple's word that only photos are of interest so far, and they use that to suggest it's more private than server-based systems....wrong.

If you don't trust Apple you can't really use any of their hardware, software and services since they could have been lying and hiding things from you the whole time.

The CSAM detection system doesn't change this.
 
There is no way Apple is going to truly end to end encrypt iCloud, especially after this announcement.
don't be so sure, apple may be doing this on-device scanning as a way to keep authorities happy when they roll out e2e, it will serve to silence legislators who predictably scream "what about the kiddie porn!"
 
A veritable media disaster for Apple, and yet they are still clinging to a system that is not in the best interests of customers, not in the best interest of Apple and not even in the best interest of child safety.

Hypocrisy?

BRUSSELS, July 2 (Reuters) - Europe's tech chief Margrethe Vestager on Friday warned iPhone maker Apple (AAPL.O) against using privacy and security concerns to fend off competition on its App Store, reasons CEO Tim Cook gave for not allowing users to install software from outside the Store.

Apple: "You have control over what you share".

Not if Apple proceeds with its CSAM scanning within system software.

If Apple has allowed apps from its own store to compromise privacy, what hope is there of preventing Apple itself from doing so, when its scanning bypasses System Integrity Protection?

Then we see surveillance of workers:

What is so coincidental is that Apple has been under pressure for some time to introduce a backdoor for security services in the UK, EU and USA, amongst others.

But apparently everyone, including the media, has misunderstood how wonderful this new software ON YOUR HARDWARE is. Or perhaps the media has understood it perfectly well and, like many of us, they simply don't like it!
 
Before on-device scanning, Apple could tell the government they could not comply with court orders (because they didn't have the software to do it).

After on-device scanning, what is Apple going to tell governments?
This was my thought also; it is proof of concept. But remember, Apple does comply with orders it is capable of responding to; when data is encrypted (like access to the phone itself), Apple says "we can't do anything".

But yeah, sure: if this still rolls out and Apple doesn't change its mind, and we see another database added for something else, like terrorism, etc., then we will know the game is over.

Quoting Bruce Schneier, who is quoting Tim May on the Four Horsemen of the Infocalypse:

Back in 1998, Tim May warned us of the “Four Horsemen of the Infocalypse”: “terrorists, pedophiles, drug dealers, and money launderers.”

Of these, child abuse is the issue that screams the loudest (for good reason), and Apple has now found a way to answer that complaint, especially if they are getting ready to roll out e2e encryption.

The other "horsemen" are harder to justify putting databases on phones for, since none of them are visually oriented; CSAM is unique in that regard. We know the kinds of "data" pedophiles are looking for, whereas in the case of terrorists, money launderers and kidnappers, the incriminating material is considerably more abstract and arcane to build a useful database from.
 
And you're not even addressing nefarious hackers who could add images to your account. Hackers can still take photos from careless users; no doubt they can add them too…. It might even become a service, doing this to your enemies for a price.
They can add all the dirty photos they want and none will be a match unless the photos exist in the CSAM database and the hashes match... how would they manage that?

This protocol of Apple's doesn't suddenly make it easier to add photos to people's phones.

You have the ability to look at your photos, so you would see anything added, and if your phone has a good passcode no one can get in.

We could extend the paranoia of this argument about stuff being added to your phone endlessly, right?
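To make that point concrete, here is a deliberately crude sketch of the matching step. It uses plain SHA-256 set membership as a stand-in; the real system uses the perceptual NeuralHash plus a blinded database and private set intersection, but the logic the post describes is the same: an image only counts as a match if its hash is in the known database.

```python
# Crude stand-in for the matching step: only images whose hashes appear in the known
# database are flagged. The real system uses the perceptual NeuralHash and a blinded,
# private set intersection protocol, not plain SHA-256 set membership.
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in hash; NeuralHash would map *visually similar* images to the same value.
    return hashlib.sha256(image_bytes).hexdigest()

known_database = {image_hash(b"known-illegal-image-bytes")}  # hypothetical entry

def flagged(library: list) -> list:
    """Return the hashes from the user's library that appear in the known database."""
    return [h for h in (image_hash(img) for img in library) if h in known_database]

library = [b"holiday-photo", b"cat-picture", b"random-planted-image"]
print(flagged(library))  # [] -- nothing matches unless it is actually in the database
```

So simply planting arbitrary "dirty" photos on someone's phone does nothing here; an attacker would have to plant actual known CSAM, which is a different (and much older) threat than anything this system introduces.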
 
I think they drank their own Kool-Aid. The cryptography that they developed is fairly impressive. Unfortunately it doesn't change the basic fact that they have created the means for a new level of surveillance, and it's also so complex that it's difficult to explain to the general public.
Hashes and checksums are as old as time. There’s nothing new or "impressive" here that they've developed. Any developer is capable of doing this and this method is in fact used all the time to make sure files didn’t get modified or corrupted during upload or download.

What’s new here is that Apple is now eager to babysit its customers.
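As an aside, the everyday integrity check the quoted post refers to looks roughly like this; the file name and expected digest are placeholders:

```python
# Ordinary file-integrity check: hash the downloaded file and compare the digest
# against a published value. The path and expected digest below are placeholders.
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "0123abcd..."  # placeholder for the digest published alongside the download
print(sha256_of("downloaded.dmg") == expected)
```

That really is old, boring technology; what is new in Apple's system is not the hashing but what gets done with a match.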
 
If you don't trust Apple you can't really use any of their hardware, software and services since they could have been lying and hiding things from you the whole time.

The CSAM detection system doesn't change this.
I trust until there is a blatant betrayal of that trust, such as Apple installing a general-purpose AI surveillance engine that can scan my storage, monitor me for illegal content and report me to the police if any is found. That they currently limit it to a small subset of illegal things is a policy decision, and doesn't take away from the fact that they have put spyware on my equipment that provides no benefit to me, that has a chance of harming me, and that I can't remove.
 
don't be so sure, apple may be doing this on-device scanning as a way to keep authorities happy when they roll out e2e, it will serve to silence legislators who predictably scream "what about the kiddie porn!"
Why do you suddenly believe Apple is going to encrypt? Apple had the chance in 2018 and backed down.
 
Hashes and checksums are as old as time. There’s nothing new or "impressive" here that they've developed. Any developer is capable of doing this and this method is in fact used all the time to make sure files didn’t get modified or corrupted during upload or download.

What’s new here is that Apple is now eager to babysit its customers.
What is new is that Apple will now report you to the police if they find you doing something suspicious. I already get the babysitting from my watch telling me when to breathe, exercise, go to bed - I've learned to accept that, and I can turn it off.
 
Weird, because I haven't seen a single coherent explanation of why it's a problem if your own device scans your photos for child porn, and only does so if you are trying to upload onto Apple's servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
This is the beginning of Minority Report. I dislike people owning child pornography as much as anyone here. But having a phone I own, or a server I rent, scanned is not the right way. There is no guarantee that it stops at a good cause. China will force Apple to scan for protestors or whatever. Where is that line?
 
There are about 1 billion iPhone users.
Apple could easily be scanning 500,000 images a day once this rolls out.

It's also very easy to make the false assumption that, if the chance of one of your photos being a false match is (say) 1 in 1 million, then the chance of two of your photos matching is 1 in a trillion, as if they were independent events like throws of a fair die.

False matches are not going to be independent: Apple's own documents explain that their "NeuralHash" system is designed to produce the same hash for visually similar images so that it won't be fooled by cropping, scaling, changes in image quality etc. So, let's say, by sheer bad fortune one of your photos triggers a false match because some element of it matches a known CSAM image. Given that one of your photos contains something that triggers a match, it is highly likely that you have other photos in your collection containing the same/similar visual element: even before digital made it free, a keen amateur photographer might use most of a 36 exposure film trying to get a single shot. Or maybe the false trigger is a poster on your living room wall... So that "one in a million" chance of a single random match in a random sample can easily turn into 'more likely than not' for subsequent matches.

Apple also scanned 100 million images of a general nature, which resulted in 3 false positives.

That clearly indicates the probability of any one account having on the order of 30 such false positives is extremely low. It also suggests that having one false positive doesn't really increase the probability of having several more and reaching the threshold; if that were the case, you should have seen clusters and a much larger number of false positives.
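Taking that figure at face value as a per-image rate, and assuming independence (the very assumption the quoted post challenges), a rough calculation looks like this; the library size is an assumption chosen for illustration:

```python
# Rough calculation only. It treats the quoted empirical result (3 false positives in
# 100 million images) as a per-image rate and assumes matches are independent -- the
# assumption questioned above, since false matches may well be correlated in practice.
from math import exp, factorial

p = 3 / 100_000_000   # per-image false-match rate implied by the quoted test
n = 50_000            # assumed size of a large photo library (illustrative)
lam = n * p           # expected number of false matches for such an account

def poisson_tail(lam: float, k: int, terms: int = 50) -> float:
    """P(X >= k) for X ~ Poisson(lam); the tail converges after a handful of terms."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k, k + terms))

print(f"expected false matches per account: {lam:.4f}")           # 0.0015
print(f"P(>= 30 false matches), independence assumed: {poisson_tail(lam, 30):.3e}")
```

If matches are correlated (the living-room-wall example in the quoted post), the independence assumption breaks down and the true tail probability could be many orders of magnitude larger, which is exactly the disagreement in this exchange.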
 

Or, a slightly different, but more general fallacy when it comes to confusing "1 in X chance of a random match" with "1 in X chance of a false accusation" is:


In addition to the CSAM detection system failing, the following safeguards would also have to fail:
A. Apple's human review of the derivatives of the images
B. NCMEC's human review
C. The law enforcement agents' review

Only at that stage would a search warrant be issued and police show up at anyone's home.

The famous case you're pointing to involves medical conclusions, which by their very nature are uncertain. Also, court systems have learned from those cases.

In possession cases there will be concrete evidence of possession. I'm not even sure Apple's derivative of the image would be admissible in court. If the search conducted by the police didn't turn up any evidence, the case would probably be dismissed.
 
Why do you suddenly believe Apple is going to encrypt? Apple had the chance in 2018 and backed down.
I have read from several influential folks that this may be a prelude to E2EE, like this from John Gruber:

Which in turn makes me wonder if Apple sees this initiative as a necessary first step toward providing end-to-end encryption for iCloud Photo Library and iCloud device backups.

Plus, it makes sense: Apple has a legal responsibility to report child abuse material, and if they encrypt end-to-end, the Justice Department may well use that as grounds to file a lawsuit against them for shirking their responsibility to look out for CSAM, which could break Apple's ability to provide any encryption going forward.

This is Apple's way of saying "we are actively looking out for these materials and also protecting our users' privacy with E2EE".
 
For the umpteenth time, some posters here are trying to use the red herring of the technicalities of Apple's intended software. IT IS NOT ABOUT THAT.

The child abuse angle is a very convenient excuse, and it will not aid the fight against child abuse or child pornography.

This is about SURVEILLANCE.

There is no skirting around the fact that this is surveillance, and constant posts arguing that it is safe surveillance or necessary surveillance don't alter it.

You should look up Safari Safe Browsing.

It analyses every URL you browse and Apple might send information about "your" URL to a third party without asking you.

Now, you can turn it off, just like you can the CSAM Detection system.
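For context, Safe Browsing style lookups are normally built on hash prefixes rather than raw URLs. A heavily simplified sketch of the idea (the real protocol canonicalises each URL into several host/path expressions and keeps a versioned local prefix database; the domain below is made up):

```python
# Heavily simplified sketch of the hash-prefix idea behind Safe Browsing style checks.
# Real clients canonicalise each URL into several host/path expressions, keep a
# versioned local prefix database, and only contact the provider on a prefix hit.
import hashlib

PREFIX_LEN = 4  # bytes; short prefixes keep the local list small

def url_hash(url: str) -> bytes:
    return hashlib.sha256(url.encode("utf-8")).digest()

# Hypothetical local list of prefixes of known-bad URL hashes.
local_prefixes = {url_hash("http://malware.example/bad")[:PREFIX_LEN]}

def needs_server_check(url: str) -> bool:
    """True if the URL's hash prefix is on the local list, in which case the client
    would ask the provider for the matching full hashes -- the only case where
    anything derived from the URL leaves the device."""
    return url_hash(url)[:PREFIX_LEN] in local_prefixes

print(needs_server_check("https://www.apple.com/"))      # False: nothing is sent anywhere
print(needs_server_check("http://malware.example/bad"))  # True: prefix hit, full hashes requested
```

Whether that counts as "sending information about your URL to a third party" is, much like the CSAM debate, a question of how much you trust the hashing and the party holding the list.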
 
Are you saying with Google Workspace they don't scan your Gmail, Google Drive and Google Photos for CSAM?

In general they do. However, there are at least a couple of ways one can make sure no scanning is taking place, although disabling CSAM scanning is definitely not the intended purpose of these options. When it comes to privacy (or adjustments in general), Google Workspace seems to have an option for everything, which can make it an exhausting experience to set up, especially if you have special requirements. Anyway, if one wants to disable CSAM scanning on Google Workspace, one can set up the account as HIPAA-covered and sign the BAA. Another option is to set the fundamental data storage location to a country which has very strict privacy laws and doesn't allow scanning. However, these choices can make the user experience suboptimal, since all the data must be encrypted, or the data sits in a single location only, which at least theoretically can be a security risk.

Anyway, Google Workspace is very flexible, which allows good privacy for users who handle confidential data. However, Google's business services are crazy about options and settings. They probably have settings to control settings. It can get a bit frustrating.
 