He definitely seems to skirt most of those questions, which is interesting. I still trust Apple to do the right thing for privacy, and I'm glad it's checking hashes rather than scanning image content… but we're definitely in troubled waters with the mob mentality going around tech forums and the general media right now. Going to be an interesting next few weeks/months.

It is scanning your images. How else can it compare them to a hash? How else can it compare an image if it's been altered?
Doing the right thing for privacy would mean not developing and deploying spyware.
 
As he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a battering ram to demand access to iCloud Photos. This feature does not preclude that there is CSAM stored in iCloud Photos. All Apple can claim is there is less CSAM in iCloud Photos.

If PR approved this disaster, firings must commence.
No, if the photo is flagged as CSAM content, it will NOT be uploaded; hence, no CSAM content on iCloud servers. Plus they're already scanning for this stuff in iCloud content - it's just that now it'll be flagged with your CPU cycles.
 
It is scanning your images. How else can it compare them to a hash? How else can it compare an image if it's been altered?
Doing the right thing for privacy would mean not developing and deploying spyware.
Scanning and hashing are not the same. The hashing program calculates a result from all of the pixels; it's not looking at photos for faces, etc.
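For what it's worth, here is a minimal sketch of that distinction in Swift. It uses an ordinary SHA-256 digest rather than Apple's actual NeuralHash, so treat it purely as an illustration of "bytes in, digest out":

```swift
import Foundation
import CryptoKit

// Minimal sketch, not Apple's NeuralHash: a hash reduces the raw bytes of an
// image to a fixed-size digest. Nothing about faces, scenes or subjects is
// "seen" by this step - only bytes go in, and a digest comes out.
func digest(ofImageBytes bytes: Data) -> String {
    SHA256.hash(data: bytes).map { String(format: "%02x", $0) }.joined()
}

let original = Data("pretend these are image bytes".utf8)
let altered  = Data("pretend these are image bytes!".utf8)  // a one-byte alteration

print(digest(ofImageBytes: original))
print(digest(ofImageBytes: altered))   // completely different digest
```

Note that the second digest is completely different even though the "image" barely changed, which is exactly why a plain checksum can't answer the "what if it's been altered?" question, and why Apple describes a perceptual hash designed to give matching results for visually similar images.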
 
It is scanning your images. How else can it compare them to a hash? How else can it compare an image if it's been altered?
Doing the right thing for privacy would mean not developing and deploying spyware.

It is not scanning your images. It is creating a hash of your images, then taking that hash, and comparing that to the hashes of known images that are illegal.

That is a huge difference which has to be stressed, because the above here is feeding the FUD going around. Now, that said, I'm not saying that I agree or disagree with what Apple is doing, but let's get the facts straight and debate the facts.
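To make the "hash, then compare" step concrete, here's a rough sketch of the matching idea (the hash values are placeholders I made up, not entries from any real database):

```swift
import Foundation

// Hypothetical illustration only: the device compares a fingerprint of the local
// photo against a set of fingerprints of already-known illegal images. The values
// below are placeholders, not real database entries.
let knownBadHashes: Set<String> = [
    "a3f1c2...",   // stand-ins for entries in the on-device hash database
    "9c07bd...",
]

func matchesKnownImage(photoHash: String) -> Bool {
    // Pure membership test: a non-matching photo's hash reveals nothing about its content.
    knownBadHashes.contains(photoHash)
}

print(matchesKnownImage(photoHash: "a3f1c2..."))  // true  -> would count toward the threshold
print(matchesKnownImage(photoHash: "ffffff..."))  // false -> no match, nothing learned
```

In the real design the comparison uses private set intersection, so the match result isn't revealed in the clear the way this lookup reveals it; the plain Set here is only meant to show the shape of the idea.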

BL.
 
It is not scanning your images. It is creating a hash of your images, then taking that hash, and comparing that to the hashes of known images that are illegal.

The problem is that it's building a system, on your device, to do this.
At the moment it's running hash comparisons against an opaque database of known CSAM imagery.

Nothing other than Apple policy (or being compelled by a local jurisdiction) is stopping that from changing once the tool is built and installed on every user's device.

That...is the outrage here.

It's not at all about CSAM and this specific situation.
It's building the capability to do this - at all - against any database - on users' devices.

The only thing standing in the way here is Apple policy
(and they've already shown, and admitted, that it has limitations).

They shouldn't build the tool into users' devices - at all.
 
By using the iPhone you agree to all programs and software Apple deems appropriate in the operation of your iPhone.
Read the agreement when starting up a new iPhone - the one where, if you don't agree, you can't proceed.
 
It is scanning your images. How else can it compare them to a hash? How else can it compare an image if it's been altered?
Doing the right thing for privacy would mean not developing and deploying spyware.
“Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

“Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.”


They don't know the contents of the image unless it's a match - hence the double-blind process they developed. And I don't know much about the technology, but a simple guess is that a history of changes (hash changes) is kept in the photo's metadata so it can see the original hash.
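For anyone trying to picture the threshold part of that quote, here's a very loose sketch with made-up names and numbers. The real system uses threshold secret sharing, so below the threshold Apple cryptographically cannot read the vouchers; a simple counter stands in for that here:

```swift
import Foundation

// Conceptual sketch only. "SafetyVoucher", the field names and the threshold value
// are placeholders, not Apple's real types or parameters.
struct SafetyVoucher {
    let matchedKnownHash: Bool
    let encryptedPayload: Data   // stand-in for the encrypted data about the image
}

let reviewThreshold = 30   // hypothetical number, chosen only for illustration

func accountCrossesThreshold(_ vouchers: [SafetyVoucher]) -> Bool {
    // In the real design, crossing the threshold is what makes decryption possible at all;
    // below it, the voucher contents stay unreadable to Apple.
    vouchers.filter { $0.matchedKnownHash }.count >= reviewThreshold
}
```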
 
With private set intersection and threshold secret sharing you can keep things secret from Apple until a threshold is reached. If the threshold is never reached Apple can't read the data even if it is in its possession.
No. If the threshold is never met they know I am under the threshold. That's data that invades my privacy.
Yes, if you have iCloud Photo Sharing turned on. But that's the purpose of this setting. Sharing information from your device to the cloud.
No. The purpose of iCloud is for me to have access to my data and the option to share the content I wish to share. The purpose of iCloud is not to grant a private organization access to my data.
 
No. If the threshold is never met they know I am under the threshold. That's data that invades my privacy.

No. The purpose of iCloud is for me to have access to my data and the option to share the content I wish to share. The purpose of iCloud is not to grant a private organization access to my data.
It might be your intended purpose, but the agreement you agreed to says differently. If you don’t like what’s going on, you are going to have to look for a new device and new services. Not saying that to be an ass, just being factual.

CalyxOS and Linux are both pretty good alternatives.
 
It might be your intended purpose, but the agreement you agreed to says differently. If you don’t like what’s going on, you are going to have to look for a new device and new services. Not saying that to be an ass, just being factual.

CalyxOS and Linux are both pretty good alternatives.
I didn't agree to have my device scanned when I invested hundreds of thousands of dollars into Apple's products. If they want to refund me then we can start having that discussion, but nothing short of paying for both my out-of-pocket expenses and the time invested will be enough.
 
I didn't agree to have my device scanned when I invested hundreds of thousands of dollars into Apple's products. If they want to refund me then we can start having that discussion, but nothing short of paying for both my out-of-pocket expenses and the time invested will be enough.
Yeah, you actually did. Even before this current topic, your device was constantly scanning content. Good luck getting your money back.
 
You realize none of the people you are supporting will actually make the trains run on time, right?
I’ve no idea what you are driveling on about. Go take the EULAs, the Privacy Policy and the technical documents for CSAM to your attorney and go cry about how Apple owes you money - let’s see how far you get.

I’ll get my popcorn.
 
I think we'd all be fine if they stuck to doing only server-side iCloud scanning, as they have been.
If Apple were to ever implement E2EE, this would no longer be possible.

I think this step is to satisfy the requirement of ensuring questionable content does not get stored on their cloud servers once they start E2EE - probably a proof of concept for them.

I'm thinking that Apple couldn't implement E2EE for their cloud servers because of such laws?
 
At the risk of repeating posts I've already made: it is a complete red herring for posters to comment about 'nothing to fear'. It is nonsense to see this action by Apple as anything other than a form of SURVEILLANCE, with all its implications for the very PRIVACY Apple has espoused to protect time and time again - and has even invoked in its legal battles against those seeking to use Apple to get access to PRIVATE information.

It has nothing to do with a tag on a photo or anything technically related to CSAM; it has everything to do with Apple deciding to be Big Brother and deviating from its publicly stated aims over decades.

This is NOT about keeping kids safe. Apple does NOT have access to every database of potentially dodgy pictures, either in the US or anywhere else in the world, making the claims they make ridiculous. In fact it may well make the job of the agencies set up to detect these criminals even harder, as the criminals take countermeasures. It's no good Tim Cook appearing in the media over refusing to assist the FBI and other agencies against potential terrorist activities, then seeking to engage in SURVEILLANCE using children as an excuse, when in fact it makes life harder for all agencies involved in crime fighting, including fighting the abhorrent abuse of children.

Apart from this, does anyone really believe that a dedicated paedophile - a group known to be ultra-deceitful, setting up secret dark web information exchanges etc. - would be stupid enough to leave information unencrypted and then upload it to the cloud?

Set aside the argument of what is INTENDED with this potential volte-face by Apple and concentrate on what it actually is and what it is not.

It is Apple turning its publicly stated policies of many years on their head; it is Apple engaging in SURVEILLANCE which infringes PRIVACY.

Forget that it might have good intentions, because that is not the point; nor is arguing about how Apple intend to go about checking these particular photographs at present, because that is not relevant to the overall picture of what Apple are doing.

The Road to Hell is Paved with Good Intentions... Apple have started to go down that road!

Now any company, including Apple, can pick the next emotive target - one where few could argue against the good intentions - but which extends and extends the sphere of control over our data and, by consequence, over our lives.

So these arguments that Apple designed it to do this or do that, so it's OK... COMPLETE RUBBISH. They have nothing to do with what is actually happening with the Big Picture.

It's SURVEILLANCE and PRIVACY that are the ultimate points here, not how safe this particular implementation of SURVEILLANCE is or is 'designed' to be.

Now whilst the intention may be honourable, it doesn't alter what it is - and the intentions are irrelevant anyway, because ANY law breaker of any sort, unless they are lacking in mental capacity, will just encrypt data or not use the cloud. On top of that, Apple have been economical with the truth if they suggest they have access to government files in order to check anything. They haven't - not in the US, not in the UK, and I doubt in most other countries, except perhaps China, where I'm sure Apple's current actions will be loudly applauded!
 
I didn't agree to have my device scanned when I invested hundreds of thousands of dollars into Apple's products. If they want to refund me then we can start having that discussion, but nothing short of paying for both my out-of-pocket expenses and the time invested will be enough.

I think the risk of something like this was always there when buying a smartphone. People who are concerned need to not use iCloud.
 
At the risk of repeating posts I've already made: it is a complete red herring for posters to comment about 'nothing to fear'. It is nonsense to see this action by Apple as anything other than a form of SURVEILLANCE, with all its implications for the very PRIVACY Apple has espoused to protect time and time again - and has even invoked in its legal battles against those seeking to use Apple to get access to PRIVATE information.

It has nothing to do with a tag on a photo or anything technically related to CSAM; it has everything to do with Apple deciding to be Big Brother and deviating from its publicly stated aims over decades.

This is NOT about keeping kids safe. Apple does NOT have access to every database of potentially dodgy pictures, either in the US or anywhere else in the world, making the claims they make ridiculous.

Apart from this, does anyone really believe that a dedicated paedophile - a group known to be ultra-deceitful, setting up secret dark web information exchanges etc. - would be stupid enough to leave information unencrypted and then upload it to the cloud?

Set aside the argument of what is INTENDED with this potential volte-face by Apple and concentrate on what it actually is and what it is not.

It is Apple turning its publicly stated policies of many years on their head; it is Apple engaging in SURVEILLANCE which infringes PRIVACY.

Forget that it might have good intentions, because that is not the point; nor is arguing about how Apple intend to go about checking these particular photographs at present, because that is not relevant to the overall picture of what Apple are doing.

The Road to Hell is Paved with Good Intentions... Apple have started to go down that road!

Now any company, including Apple, can pick the next emotive target - one where few could argue against the good intentions - but which extends and extends the sphere of control over our data and, by consequence, over our lives.

So these arguments that Apple designed it to do this or do that, so it's OK... COMPLETE RUBBISH. They have nothing to do with what is actually happening with the Big Picture.

It's SURVEILLANCE and PRIVACY that are the ultimate points here, not how safe this particular implementation of SURVEILLANCE is or is 'designed' to be.

Now whilst the intention may be honourable, it doesn't alter what it is - and the intentions are irrelevant anyway, because ANY law breaker of any sort, unless they are lacking in mental capacity, will just encrypt data or not use the cloud. On top of that, Apple have been economical with the truth if they suggest they have access to government files in order to check anything. They haven't - not in the US, not in the UK, and I doubt in most other countries, except perhaps China, where I'm sure Apple's current actions will be loudly applauded!
Very well put. The supporters of this initiative are completely blind to the real intent.
 
The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.
This is not as reassuring as he might think.

And secondly, the system requires the threshold of images to be exceeded...
That threshold can be changed by Apple on a whim (see the sketch after this post).

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity.
These criteria can easily be changed by outside, i.e. legislative, pressure.

And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.
We have to trust Apple on that, which I'm no longer really inclined to do.
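To put the worry in those replies concretely: everything quoted above boils down to parameters and policy that users can't inspect or verify. A purely hypothetical sketch (none of these names or values come from iOS):

```swift
import Foundation

// Hypothetical illustration of the concern in the replies above: the safeguards
// are just parameters and policy flags. None of these names exist in iOS; they
// are invented to show how little would need to change.
struct ScanningPolicy {
    var matchThreshold: Int          // the "threshold of images to be exceeded"
    var requireHumanReview: Bool     // the manual-review stage
    var hashDatabases: [String]      // today a CSAM list; tomorrow whatever gets shipped
}

var policy = ScanningPolicy(matchThreshold: 30,
                            requireHumanReview: true,
                            hashDatabases: ["NCMEC"])

// Nothing technical stops a later update (or legal pressure) from doing this:
policy.matchThreshold = 1
policy.requireHumanReview = false
policy.hashDatabases.append("some-other-list")
```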
 
If you are careful enough to never turn on iCloud Photos (which is sometimes enabled by default), yes. Your phone still does the scanning, but any vouchers (if generated) will not be communicated to Apple. At least for now. Since iOS is not open source, you will never know.

And the problem remains that the scanning still takes place locally, on everyone's iPhones worldwide.
And at some point, there will be pressure on Apple to enable the scanning and reporting even if iCloud Photos isn't enabled. After all, the code is already present in iOS, it's only a status flag, you'd only object if you have something to hide, and it's for the children. You want to protect the children, don't you? And then, in some countries, the database will be expanded to other "undesirable" imagery. But that's okay, because it isn't happening here, and who cares if a bunch of "foreigners" end up in reeducation camps. Maybe that won't extend to the US, so we'll be okay. Maybe not.

I believe it was Mark Twain who said, "Half of the results of a good intention are evil."
 