Well, Apple won't see the images at all until there's a hash match, and at least 30 matches in total.

At that point a person from Apple will review the images.

But it should never get to that point at all.

To your point, yes, it either is an image in the database or it isn't; there should be no "judgement" on Apple's part, since the CSAM hashes are designed to stay the same even when the image has been altered.

So there is no way any of your pictures, even your nude baby-in-the-bath or romping-naked-through-the-sprinkler pictures, will get flagged, because they cannot produce a hash that matches the CSAM material.
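If you're wondering how "Apple can't see anything until 30 matches" can be enforced by cryptography rather than by promise, Apple's technical summary calls the mechanism threshold secret sharing. Here's a toy sketch of the general idea in Python; the field size, the one-share-per-matched-photo layout and the stand-in "account key" are my own illustration, not Apple's actual code.

# Toy threshold secret sharing (my own sketch, not Apple's implementation):
# each matched photo's safety voucher would carry one share of a per-account
# key; fewer than T shares reveal nothing, any T shares reconstruct the key.
import random

P = 2**61 - 1   # prime modulus for the toy finite field
T = 30          # threshold: shares needed to reconstruct

def make_shares(secret, n, t=T, p=P):
    # Random polynomial of degree t-1 with f(0) = secret; share i is (i, f(i)).
    coeffs = [secret] + [random.randrange(p) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, p) for k, c in enumerate(coeffs)) % p
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares, p=P):
    # Lagrange interpolation at x = 0 over GF(p).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % p
                den = den * (xi - xj) % p
        secret = (secret + yi * num * pow(den, -1, p)) % p
    return secret

key = random.randrange(P)          # stands in for the account's voucher decryption key
shares = make_shares(key, 100)     # pretend 100 photos matched, one share each
print(reconstruct(shares[:30]) == key)   # True: 30 shares recover the key
print(reconstruct(shares[:29]) == key)   # almost certainly False: 29 shares are useless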
OK, so let's assume there are 30 matches and a 25-year-old who looks 15 is the subject. Apple would not verify the age of the subject?
 
They already decrypt users' pics on-server all the time, for example when you view them in a web browser on icloud.com, or to hand them over to law enforcement when legally required. It would be another story if they used E2E encryption, but they don't. As it is, scanning on the device isn't any more private, but opens the door to an entirely new level of surveillance.

“All the time” = only 265 reported CSAM cases in 2020, a thousand-fold fewer than comparable companies.

And those 265 came precisely from the iCloud web app, when users tried to email those pics out, not from dragnet scanning of iCloud Photos on the server.

Subpoenas are another matter.
 
The point is that they can decrypt your pictures anytime they want, given that it is not E2E encryption. No privacy is gained by scanning on the device.

If they had introduced E2E encryption at the same time, at least there would be a potential benefit for the user to weigh against the intrusion into your device. As it is, it is entirely adversarial with no advantage for the user. People don't like it if their own devices are turned against them, and rightfully so.
 
Sorry, but it's complete rubbish to say it's safer on the user's equipment! Of course it's not; it gives the potential for modification and access to over 1,000,000,000 users' unique identifiers, plus anything else the software might be modified for.
Did you read the technical docs? The database is signed as part of iOS 15, and the hash of the database will be published by Apple in a knowledge base article, so there is no way they can alter it unnoticed; much like a blockchain, it is in full view of everyone, including users.

The same system in the cloud is completely opaque: we have no idea what is being matched, where the images came from, or whether anything has been added or removed.

It would be infinitely easier for a bad actor to insert non-CSAM material into cloud-side scanning, and we wouldn't even know it.

Apple has chosen to put the database on every phone so it can be seen and not altered or messed with in any way.
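If Apple does publish the database hash in a knowledge base article as described above, checking your local copy would look something like this sketch. The file path, the published digest and even the choice of SHA-256 are placeholders of mine, not details Apple has confirmed.

import hashlib

def sha256_of_file(path):
    # Stream the file in 1 MB chunks so a large database doesn't need to fit in memory
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Both values below are hypothetical placeholders for illustration only
PUBLISHED_DIGEST = "digest-from-apples-knowledge-base-article"
local_digest = sha256_of_file("/path/to/on-device/hash-database")
print("database matches the published hash:", local_digest == PUBLISHED_DIGEST)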
 
See, this is what I have always found a bit confusing. As a developer, I would definitely like to study the code and play around with what we will call a "black box". I would love to put an image in a mock CSAM database and adjust it in Photoshop to see what the threshold is. It's just developer curiosity, and it does help me understand how something works more than human words ever could. How can the same subject at a different angle not be a match? I mean, the edits must be VERY, VERY small for a match to occur at this point. If I took the Photoshop warp tool and modified part of the image slightly, would that be enough for the image not to match?

Then wouldn't a person with child porn just play with modifications until their child porn isn't detected, making the whole solution pointless?
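You can get a rough feel for that experiment today with a simple public perceptual hash such as dHash, used here as a stand-in for NeuralHash (which is a learned, neural-network hash and will tolerate different edits, so this is only an analogy). The file names and the distance threshold of 10 are arbitrary choices of mine; Pillow is required.

from PIL import Image

def dhash(path, size=8):
    # Shrink to a (size+1) x size grayscale grid and compare neighbouring
    # pixels; the 64 comparison bits form the fingerprint.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

# File names are placeholders: an original photo and a copy nudged with the
# Photoshop warp tool. Small Hamming distances would count as a "match" here.
d = hamming(dhash("original.jpg"), dhash("warped.jpg"))
print("hamming distance:", d, "-> likely match" if d <= 10 else "-> no match")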
 
Yes, and NONE of the others are doing it via software installed on the USER's own hardware, which is why so many media organisations and tech companies are kicking off about that aspect of it, and even Apple employees have emailed their concerns about it being installed on our hardware.

What “media organisations”?
Tech companies like Apple’s arch-enemies Epic and WhatsApp/Facebook?
How many employees?
That’s not proof of anything.
There are people with different ideas about this; that doesn’t mean they’re right or wrong. Then there are those misrepresenting the facts, or overreacting, or being unreasonable (the battery concerns.. lol), or being unfair.
 
1) You can turn off Spotlight
2) Spotlight isn't designed to report you if it thinks something on your computer is "suspicious"
1) I can’t on iOS/iPadOS
2) How do you know? It’s not open source.
3) Even if Spotlight doesn’t do anything nefarious today, how do you know it will not in the future if Apple is pressured to silently update it to send out all the indexing data?
 
Ask yourself the following questions:
If Apple suggests NeuralHash is less invasive than other companies' surveillance, ask yourself why Apple isn't putting the check on its own servers like other companies do, since it would still convey the improved 'privacy' they say NeuralHash confers.

Instead, why would they require 1,000,000,000+ users to download software from their servers and install it on their own hardware, with electricity and processing costs for both parties, instead of Apple just putting one piece of software on its own servers?

They belatedly tried to sell NeuralHash's benefits, but it would convey the same privacy 'benefits' running on their servers instead of on your devices.

The fact that they are doing this on YOUR hardware when they don't have to demonstrates, in my opinion, that it will end up being far more intrusive: it has the access it needs because Apple, as the author, can bypass System Integrity Protection on its own devices. They are not going the route of using YOUR hardware for no reason, especially as the costs of doing so are far in excess of running it on their own servers.
There is speculation that some markets will get iCloud encryption where the user gets the encryption keys. Right now, Apple holds the keys.

I say some markets because I doubt China or Russia would allow that.
 
Sorry, but it's complete rubbish to say it's safer on the user's equipment! Of course it's not; it gives the potential for modification and access to over 1,000,000,000 users' unique identifiers, plus anything else the software might be modified for.

On the server it could only serve the general NeuralHash function, which, as Apple suggests, is designed to be anonymous anyway.

It doesn't even make financial or ecological sense to have it on 1,000,000,000+ devices: that's the same number of downloads, that much processing time, electricity, etc., whereas installing it at the cloud end requires one piece of software.

It would be the same NeuralHash tools, so anyone suggesting it conveys more privacy on your own device just doesn't understand Apple software, nor the fact that System Integrity Protection stops other companies from changing system files, whereas Apple, as the author, can bypass it.

When the costs of putting it on individual hardware are greater, and they are, then it's there for a reason, and I do not believe it should be on users' hardware.

Five paragraphs of ranting yada yada. Without addressing my very simple question in the slightest.

Shocked. No...totally shocked.
 
If there's no CSAM on your device, then how would Apple ever see any of your images? I don't feel bad for anyone that has that kind of stuff in their possession. This fact makes this system 100% private to those that don't own CSAM.
Yeah, because nation-states and other well-funded groups never spend resources on exploiting new frameworks. Apple is now saying databases from other countries will be included, another expansion before it's even released. Every iOS and iPadOS device will have this forcibly installed, I'm assuming worldwide, because everybody is guilty until proven innocent by a process with no consumer challenge or notification, until there is a knock on your door. But hey, complicated software/human processes work 100% of the time with no unintended consequences.
 
OK, so let's assume there are 30 matches and a 25-year-old who looks 15 is the subject. Apple would not verify the age of the subject?
It’s literally science fiction that 30 matches depicting the same 15yo subject would escalate to human review.
You don’t understand how this works.
 
Yes, and NONE of the others are doing it via software installed on the USER's own hardware, which is why so many media organisations and tech companies are kicking off about that aspect of it, and even Apple employees have emailed their concerns about it being installed on our hardware.
Yes, that's true. I have the same question as to why do it user-side when it can be done server-side. It's been speculated by smarter people than me that maybe Apple wants to turn on encryption where users would control their keys. But if that were really the case, I don't know why Apple doesn't get in front of that question.
 
You can turn off photo uploads
Don't say it when you don't know it. It's totally unclear at this point what the program will do when it detects stuff on your phone. But it's still irrelevant to the issue that the software is on your phone scanning for stuff regardless of whether you send it to iCloud. Let's see a toggle cutting off the scan; I doubt you will find one.
 
It’s literally science fiction that 30 matches depicting the same 15yo subject would escalate to human review.
You don’t understand how this works.
Yes, I do understand how it works. I am asking about the HUMAN REVIEW process, not the MATCHING process. What all goes into the human review process? Will they be checking driver's licenses for the subject to determine whether they are underage or not? I DON'T KNOW THAT DETAIL. Which is why... I am asking. Seriously, you have had this attitude towards me for weeks when all I am doing is asking questions so I can learn the process. I am not saying I know everything, which is why I am posting these questions, so I do know.
 
OK, so let's assume there are 30 matches and a 25-year-old who looks 15 is the subject. Apple would not verify the age of the subject?
If there are 30 matches, the person who is uploading and getting matched is uploading child porn, period, because the false positive rate is so low; even 1 in a billion would be a long shot, and Apple is talking about 1 in a trillion.

Unless you have one of the images in the database (which is itself the intersection of two different databases, where images are cross-matched to ensure that at least two different agencies deemed them to be child porn), you are not going to get a match.

You can upload 100 pictures of your 2-year-old daughter running naked through the lawn sprinklers and you won't get a match.
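To put rough numbers on that (the per-image rate and library size below are my own illustrative assumptions, not figures Apple has published): even with a generous per-image false match rate, the chance of 30 independent false matches in one library collapses to essentially nothing.

from math import lgamma, log

p_false = 1e-6       # assumed per-image false match probability (illustrative only)
n_photos = 100_000   # assumed size of the photo library
lam = n_photos * p_false   # expected number of false matches (0.1 here)

# With lam far below 30, P(at least 30 false matches) is dominated by the
# Poisson probability of exactly 30; estimate it on a log scale because the
# value underflows ordinary floating point.
log10_tail = (30 * log(lam) - lam - lgamma(31)) / log(10)
print(f"P(>= 30 false matches) is roughly 10^{log10_tail:.0f}")   # about 10^-62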
 
Just to be clear, there is only ONE issue: Apple is installing software on YOUR hardware to scan YOUR data. It's not about anything else. Anyone who says otherwise is missing the big picture. It's 100% different from anything else being done in the industry; the closest breach would probably be the Alexa thing, but that's not as serious a breach compared to what Apple is doing. Both are invasions of privacy, though.

Apple is installing software that scans your data but DOES NOT RELAY the results of said scan to Apple UNTIL you have uploaded the pictures to their servers ANYWAY.

It's a bit more nuanced than "they're scanning my local data"; we'll agree to disagree about this till the end of time.
 
If there are 30 matches, the person who is uploading and getting matched is uploading child porn, period, because the false positive rate is so low; even 1 in a billion would be a long shot, and Apple is talking about 1 in a trillion.

Unless you have one of the images in the database (which is itself the intersection of two different databases, where images are cross-matched to ensure that at least two different agencies deemed them to be child porn), you are not going to get a match.

You can upload 100 pictures of your 2-year-old daughter running naked through the lawn sprinklers and you won't get a match.
I am asking about the human review process, not the matching process. Have any details about how that process works been publicly stated? Will they be checking driver's licenses for the subject's age?
 
Yes, I do understand how it works. I am asking about the HUMAN REVIEW process, not the MATCHING process. What all goes into the human review process? Will they be checking driver's licenses for the subject to determine whether they are underage or not? I DON'T KNOW THAT DETAIL. Which is why... I am asking. Seriously, you have had this attitude towards me for weeks when all I am doing is asking questions so I can learn the process. I am not saying I know everything, which is why I am posting these questions, so I do know.
No. That's not Apple's job.

Again, read Apple's document about how the system works. It has been covered at length in this thread.
 
Yes, I do understand how it works. I am asking about the HUMAN REVIEW process, not the MATCHING process. What all goes into the human review process? Will they be checking driver's licenses for the subject to determine whether they are underage or not? I DON'T KNOW THAT DETAIL. Which is why... I am asking. Seriously, you have had this attitude towards me for weeks when all I am doing is asking questions so I can learn the process. I am not saying I know everything, which is why I am posting these questions, so I do know.
They are looking for exact matches at all stages, both the automated stage and the human review stage. That’s how reporting to NCMEC works.
 
No. That's not Apple's job.

Again, read Apple's document about how the system works.
Geez, people, I am just asking questions here; what is the damn problem? Apple did NOT go into specifics about how their manual review process will work. I have read the documents. I have watched videos on the topic. I have listened to podcasts on the topic. But it's all been a vague "Apple will manually review the pictures". HOW? What will they be doing? If the goal is to prevent child porn, shouldn't they be reviewing the driver's license of a child subject in a nude picture? Even if it's not in CSAM, shouldn't that be their job, to make sure child porn is stopped?

I am tired of repeating myself every third post. I HAVE READ THE DOCUMENTATION. It doesn't go into specifics. I cannot see the algorithm. I cannot play around with the code to really understand it. It's like people on this site are SEVERELY AGAINST asking questions.
 
Did you read the technical docs? The database is signed as part of iOS 15, and the hash of the database will be published by Apple in a knowledge base article, so there is no way they can alter it unnoticed; much like a blockchain, it is in full view of everyone, including users.
Auditability of the on-device database is a (small) step forward (interestingly, that was not in the initial documentation they put out). But it still doesn't give you as a user any insight into which images are actually included in the database, because the system is specifically designed to prevent that. They now say the database will be open to independent audits, but again, that's just policy, which they can change (or be forced to change) at any time.
 
I am asking about the human review process, not the matching process. Have any details about how that process works been publicly stated? Will they be checking driver's licenses for the subject's age?
No, they will be checking for a false positive. If they get a match (1 in a billion or a trillion), they will look at the photo, and if it looks like child porn they will send it to NCMEC, who presumably will confirm it by visual comparison against the actual image it matched.

But you will need 30 matched images before any of this happens, so unless you are uploading actual, seen-before-and-logged-by-NCMEC child porn, you will not have any matches and the human review process is irrelevant.
 