if it looks like child porn they will send it to ncmec
This is the part I haven't seen detailed anywhere. Again, what if the subject is 25 but looks 15? Will Apple make the judgement call that it's child porn? 15 is still a child where I live.

Again, I am just trying to learn the process. I guess I need to keep saying this over and over and over again to you people to avoid the hate responses.
 
Let's just be real here. You can get addicted to pretty much anything. Porn addiction is like alcohol addiction or video game addiction.

It also depends on timing. Did someone download 30 (or even 100, for example) different pieces of content, but do so over the course of 10-20 years? Internet in this country (US) is, on average, still very bad. Even though I had 500 Mb/s internet, I still downloaded everything from iTunes/Apple TV for offline viewing because it was better that way. Even with hundreds to a thousand dollars' worth of high-quality routers, modems and infrastructure in my house, streaming on Netflix or iTunes or even YouTube was a pain most of the time. I have Gigabit internet now, which has improved things dramatically.

I am not saying people can't be addicted. But people can be addicted to pretty much anything, and it's an equal problem. If it impacts your life and your work, then it's a problem. I know a couple of people that are in SEVERE debt due to their gambling addiction. I know some people that almost lost their jobs due to their video game addiction and were sneaking off to play their games while at work.

I mean, I have a 24 TB NAS with about 8 TB of movies/games/TV shows downloaded from iTunes, Steam and other sources. Because I had to deal with crappy internet for the longest time, it was beneficial to keep this stuff downloaded. Plus it saves on the monthly bandwidth!

Of course people can be addicted to anything, but with porn, it's far more common to be addicted to it because the sex drive is a powerful thing and hard for most people to overcome once they open that Pandora's box. Like I said before, find me a news article of someone caught with a child porn collection of fewer than 30 images. It's normally at least hundreds. My whole point was simply that Apple's now-revealed 30-image threshold makes sense. The person I was originally responding to acted like that number was too high. I still stand by what I originally said with the word "addicted", but in hindsight I would have just left that word out if I had known so many people were going to focus on it and miss the forest for the trees. But then again, I guess they would have just found something else to latch onto to miss the point, because some on this forum can't concede anything to Apple here and think everything they do is wrong.
 
Auditability of the on-device database is a (small) step forward (interestingly, that was not in the initial documentation they put out). But it still doesn't give you as a user any insight into which images are actually included in the database, because the system is specifically designed to prevent that. They now say that the database will be open to independent audits, but again that's just policy, which they can change (or be forced to change) at any time.
well yes, we are dealing with a lot of trust here

apple creates the database by combining images from 2 different agencies, and they only use the images that appear in both, so that everything in the database has been deemed child porn by 2 different agencies

they then put the database on every phone in the world that runs ios15, and the database is signed with the operating system

apple publishes a hash of the database which users and third parties can inspect, that database will sit there unaltered until presumably apple releases another os and loads a new database with a new hash

it makes changing the database impossible until they replace it with a completely new one

but yeah we are placing a lot of trust in apple

to me it's better than facebook or google, where i have zero knowledge of the database they are using on their servers
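
for what it's worth, checking a published hash is the easy part for users and third parties. a rough sketch of what that verification could look like (the file name and the published digest here are made up, apple hasn't said exactly how the database will be exposed):

Code:
import hashlib

# hypothetical path and digest - Apple hasn't said exactly how the on-device
# database will be exposed or where the published hash will live
DB_PATH = "CSAMHashDB.bin"
PUBLISHED_SHA256 = "put-the-digest-apple-publishes-here"

def sha256_of_file(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MB chunks
            h.update(chunk)
    return h.hexdigest()

print(sha256_of_file(DB_PATH) == PUBLISHED_SHA256)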
 
Apple is installing software that scans your data but DOES NOT RELAY the results of said scan to Apple UNTIL you have uploaded the pictures to their servers ANYWAY.

It’s a bit more nuanced than “they’re scanning my local data”, we’ll agree to disagree about this till the end of times.
There is nothing nuanced about scanning stuff I own on my hardware…. It's so cut and dried I am not sure why any argument exists; you're just saying you don't mind and I'm just saying I do….. but the facts of what they are going to do are clear…. Even if you take out the slippery slope stuff, it's still a clear breach
 
Of course people can be addicted to anything, but with porn, it's far more common to be addicted to it because the sex drive is a powerful thing and hard for most people to overcome once they open that Pandora's box. Like I said before, find me a news article of someone caught with a child porn collection of fewer than 30 images. It's normally at least hundreds. My whole point was simply that Apple's now-revealed 30-image threshold makes sense. The person I was originally responding to acted like that number was too high. I still stand by what I originally said with the word "addicted", but in hindsight I would have just left that word out if I had known so many people were going to focus on it and miss the forest for the trees. But then again, I guess they would have just found something else to latch onto to miss the point, because some on this forum can't concede anything to Apple here and think everything they do is wrong.
Oh I understand. And I do think a lot of people here are making this a much bigger issue than it is. While I am not happy about this, and I take privacy very seriously, I do believe Apple is doing this the best way it can with privacy in mind. Personally, if you have 1 picture like that on your phone, I think you have a problem (even adult content).
 
hey folks, try an experiment

do you have an image of a tree or some flowers or a garden that you took?

not an image you downloaded from the web, an image you took, a tree or trees, a garden, flowers, a beach whatever

go to tineye.com and upload the image and see if you get any hits, tineye searches 45 billion images in less than a second

see if you get any false positives ... i get zero matches no matter what i upload
 
To be clear, 1 in 1 trillion is the chance of having 30 false positives and 0 actual positives.

Then there’s a slightly higher chance of having 29 false positives and 1 actual positive.

And a slightly higher chance of having 28 false positives and 2 actual positives.

And a slightly higher chance of having 27 false positives and 3 actual positives.

etc.

But even owning ONE of the actual CSAM pics is super illegal.

So the only case that we should consider “unfair human review” is the extra-improbable 1-in-1-trillion 30/0 situation.

29/1, 28/2, 27/3 etc. combinations are totally fair for Apple to look at and report. There are no "judgment calls" to make, just picking out the exact matches to report and discarding the others.
 
no they will be checking for a false positive, if they get a match (1 in a billion or trillion) they will look at the photo, if it looks like child porn they will send it to ncmec who presumably will confirm it by doing a visual analysis against the actual image it matches
One wrinkle here is that legally NCMEC is the only organisation in the US that is allowed to be in possession of CSAM. So technically it would be illegal for Apple to collect the photo in readable form from the user's device (and they also don't have NCMEC's image database). The way they are trying to get around that is by collecting a "visual derivative" of the photo instead of the photo itself. They haven't said exactly what that is, but presumably it's some kind of low-resolution or blurred version of the photo. How exactly that will be evaluated by Apple's reviewers is unclear.
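
Purely as a guess at what such a "visual derivative" could be (Apple hasn't published the actual transformation), something as simple as a downscale plus blur would fit the description:

Code:
from PIL import Image, ImageFilter  # requires the Pillow library

def visual_derivative(path, max_side=128):
    # Illustrative only: one plausible reading of "low-resolution or blurred
    # version of the photo". Not Apple's actual algorithm.
    img = Image.open(path)
    img.thumbnail((max_side, max_side))             # downscale, keep aspect ratio
    return img.filter(ImageFilter.GaussianBlur(2))  # mild blur on top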
 
There is nothing nuanced about scanning stuff I own on my hardware…. It's so cut and dried I am not sure why any argument exists; you're just saying you don't mind and I'm just saying I do….. but the facts of what they are going to do are clear…. Even if you take out the slippery slope stuff, it's still a clear breach

A scan is not a scan if its output is cryptographic gibberish and it sits on my phone till the end of times without being relayed to Apple, sorry. It’s a courtesy pre-scan. It’s half-a-scan. Locally, it’s nothing.
 
One wrinkle here is that legally NCMEC is the only organisation in the US that is allowed to be in possession of CSAM. So technically it would be illegal for Apple to collect the photo in readable form from the user's device (and they also don't have NCMEC's image database). The way they are trying to get around that is by collecting a "visual derivative" of the photo instead of the photo itself. They haven't said exactly what that is, but presumably it's some kind of low-resolution or blurred version of the photo. How exactly that will be evaluated by Apple's reviewers is unclear.
Oh that is interesting. I don't know a whole lot about the legalities of this either, so I am curious whether the low-quality version is enough to get around this issue. I would assume it is, otherwise Apple wouldn't be doing it.
 
One wrinkle here is that legally NCMEC is the only organisation in the US that is allowed to be in possession of CSAM. So technically it would be illegal for Apple to collect the photo in readable form from the user's device (and they also don't have NCMEC's image database). The way they are trying to get around that is by collecting a "visual derivative" of the photo instead of the photo itself. They haven't said exactly what that is, but presumably it's some kind of low-resolution or blurred version of the photo. How exactly that will be evaluated by Apple's reviewers is unclear.
right, it'd be nice to know more clearly how that will work. presumably apple would not be culpable because they don't know what the image is until they see it, make a human decision that it is a false positive or a match, and then refer it to ncmec, but again there need to be 30 false positives to get to that stage

apple did a trial run on 100 million non-child porn images and got 3 false positives

as i have said upthread, try uploading an original image to tineye.com, which searches 45 billion images, and see if you get a false positive
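
as a rough back-of-the-envelope, taking that 3-in-100-million rate at face value and assuming an independent chance per photo and a 20,000-photo library (my numbers, not apple's published model):

Code:
from math import exp, factorial

def poisson_tail(k, lam, terms=50):
    # P(X >= k) for X ~ Poisson(lam); the terms shrink so fast that a short
    # partial sum is plenty here
    total, term = 0.0, exp(-lam) * lam**k / factorial(k)
    for i in range(k, k + terms):
        total += term
        term *= lam / (i + 1)
    return total

false_rate = 3 / 100_000_000                        # the trial-run figure above
library_size = 20_000                               # assumed photo library
expected_false_matches = false_rate * library_size  # 0.0006
print(poisson_tail(30, expected_false_matches))     # ~1e-129

in other words, on those assumptions, hitting the 30-image threshold purely by accident basically never happens.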
 
right, it'd be nice to know more clearly how that will work. presumably apple would not be culpable because they don't know what the image is until they see it, make a human decision that it is a false positive or a match, and then refer it to ncmec, but again there need to be 30 false positives to get to that stage

apple did a trial run on 100 million non-child porn images and got 3 false positives

as i have said upthread, try uploading an original image to tineye.com, which searches 45 billion images, and see if you get a false positive
Well the other side of this is, you need two items for a comparison. So what is Apple comparing a flagged image TO? If they can't legally have CSAM, how can they do a comparison?

Is there a documented breakdown of the review process? I am curious to know what the actual employee would be doing.
 
Well the other side of this is, you need two items for a comparison. So what is Apple comparing a flagged image TO? If they can't legally have CSAM, how can they do a comparison?

Is there a documented breakdown of the review process? I am curious to know what the actual employee would be doing.
they wouldn't have the actual child porn image, only the hash which would match the hash of the user uploaded photo

presumably apple would look at the user uploaded photo and if it appeared to be child porn then they would refer it to ncmec

again, they would need to have 30 matching images from any one user, which would be highly, highly unlikely unless it all was in fact child porn
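
neuralhash itself isn't public, but the hash-only idea is the same as any perceptual hash. a rough sketch using the generic imagehash library, with a made-up blocklist entry, just to show that the comparison never needs the source image:

Code:
from PIL import Image
import imagehash  # generic perceptual hashing, standing in for NeuralHash

# made-up blocklist entry: the matching side only ever holds hashes,
# never the images those hashes were derived from
known_bad_hash = imagehash.hex_to_hash("d1c4a39e5b7f0a2c")

candidate = imagehash.phash(Image.open("photo.jpg"))
if candidate - known_bad_hash == 0:  # Hamming distance between the two hashes
    print("hash match - forward the visual derivative for human review")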
 
presumably apple would look at the user uploaded photo and if it appeared to be child porn then they would refer it to ncmec

You probably don't know, but at this point it's just a judgement call, right? So Apple will make a judgement on someone's age if they are close to the 15-25 range? Or is the borderline age (16, when the legal age where I live is 18) not what CSAM is addressing?

I know this is all hypothetical, and I know the chance of someone having 30 false matches is low. But it's NOT zero, otherwise why even have a manual review process at Apple?
 
People who are into pornography of any type are usually addicted to it and collect hundreds or thousands of images, not just a dozen or so, so 30 seems reasonable to me. Cloud services have already been scanning for CSAM on their servers, so no one has "gotten around" this before or now. They still won't be able to store CSAM collections on iCloud without being detected. They could have 100,000 CSAM images on their phone and not use iCloud for photos, and Apple will never know about that, precisely BECAUSE this is not "mass surveillance", as some people are twisting it.
If apple is already scanning on their end on iCloud servers, why scan on device too?
 
How is this any different than the police searching your home without a warrant because “don’t worry, if you don’t have anything then we’ll ignore the rest of what we find”
I believe Apple is under legal obligations to make sure CSAM content is not on their servers - thus the scanning of iCloud data (which it does currently). I honestly do believe Apple is doing this with as much focus on privacy as possible.

A good comparison: you have a lot of guns at your house and you do have permits/licenses for those guns. So everything is legal, right? But you cannot bring them with you into an airport.

This is ONLY applicable to images being uploaded to iCloud anyway. It's part of the iCloud pipeline.
 
Don’t say it when you don’t know it…. It’s totally unclear at this point what the program will do when it detects stuff on your phone…. But it’s still irrelevant to the issue that the software is still on your phone scanning for stuff regardless of whether you send it to iCloud …. Let’s see a toggle cutting off the scan…. Doubt you will find that
What are you talking about? Apple confirmed that disabling iCloud Photos disables the scanning completely. Also, the “scanning” ONLY TAKES PLACE DURING THE UPLOAD PROCESS.
 
How is this any different than the police searching your home without a warrant because “don’t worry, if you don’t have anything then we’ll ignore the rest of what we find”
You’re inviting them in though by toggling on iCloud Photo Library.
 
This is the part I haven't seen detailed anywhere. Again, what if the subject is 25 but looks 15? Will Apple make the judgement call that it's child porn? 15 is still a child where I live.

Again, I am just trying to learn the process. I guess I need to keep saying this over and over and over again to you people to avoid the hate responses.
Is that 25-year-old in the database of existing CSAM images? If not, then there's not a chance of it being a match.
 
This is the part I haven't seen detailed anywhere. Again, what if the subject is 25 but looks 15? Will Apple make the judgement call that it's child porn? 15 is still a child where I live.

Again, I am just trying to learn the process. I guess I need to keep saying this over and over and over again to you people to avoid the hate responses.
The Apple reviewer wouldn't need to make that determination. Any positive matches would be to images from the NCMEC database, where the images are already catalogued and the subjects are known to be underage.

False positive matches are purely coincidental. They are no more likely to happen to porn-like images than to any other kind of image; they are chance mathematical collisions with completely random images. (Apple claims their testing shows there is a less than 1-in-a-trillion chance of this happening to you, per year. That's about 1 occurrence to somebody, somewhere, every 1000 years.)

Because of this, the Apple reviewer has almost nothing to do. They would only need to glance at the collection of thumbnails and decide whether the collection looks suspect or whether it's that 1-in-1000-years event and looks completely random.

It's possible that when this event happens, in the year 2785 or so, the false positive collection could coincidentally contain legal adult porn and get passed to NCMEC as 'suspect' even though it's legal. But that would be an incredibly unlucky coincidence. Even if that did happen, NCMEC would get the report, check the thumbnails against the real images on record, and immediately realise they aren't a match. I guess they might want to investigate whether the subjects are underage anyway, but because they only have the low-res derivative to work from, it's probably not going to be productive.
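
For what it's worth, the 1000-year figure is just the trillion-to-one rate spread across a guess at the number of accounts (roughly a billion here, which is my assumption, not Apple's number):

Code:
p_per_account_per_year = 1 / 1_000_000_000_000       # Apple's claimed false-flag rate
accounts = 1_000_000_000                             # assumed iCloud Photos accounts
events_per_year = p_per_account_per_year * accounts  # 0.001
print(1 / events_per_year)                           # ~1000 years between events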
 
The Apple reviewer wouldn't need to make that determination. Any positive matches would be to images from the NCMEC database, where the images are already catalogued and the subjects are known to be underage.

False positive matches are purely coincidental. They are no more likely to happen to porn-like images than to any other kind of image; they are chance mathematical collisions with completely random images. (Apple claims their testing shows there is a less than 1-in-a-trillion chance of this happening to you, per year. That's about 1 occurrence to somebody, somewhere, every 1000 years.)

Because of this, the Apple reviewer has almost nothing to do. They would only need to glance at the collection of thumbnails and decide whether the collection looks suspect or whether it's that 1-in-1000-years event and looks completely random.

It's possible that when this event happens, in the year 2785 or so, the false positive collection could coincidentally contain legal adult porn and get passed to NCMEC as 'suspect' even though it's legal. But that would be an incredibly unlucky coincidence. Even if that did happen, NCMEC would get the report, check the thumbnails against the real images on record, and immediately realise they aren't a match. I guess they might want to investigate whether the subjects are underage anyway, but because they only have the low-res derivative to work from, it's probably not going to be productive.
Well, if it won't happen, what is the point of the manual review process at Apple in the first place? Especially if they cannot actually compare it with the CSAM images. Why not just send it for review to those that DO have access to those images?
 
I posted this in another thread; I actually think it fits better here. But I always like more thoughts.


Let me ask this. Since this is currently only happening in the United States, if that…

and this is not turned on for phones from Europe or other countries.

What happens if I come from Italy into the United States with an image that may not be tagged in a European database but is tagged in an American one?

Or vice versa, when the eventual rollout in different countries happens. I assume they will have different standards, hashes, images etc., unless this is one giant worldwide database that everything is singularly pulled from.

Honestly curious how that will work. If it is illegal to have a homosexual image in country “a” and a visitor comes from country “b” where it is legal, will they get flagged? Arrested? Will it be reported to their own jurisdiction or to local law?
 
Don’t say it when you don’t know it…. It’s totally unclear at this point what the program will do when it detects stuff on your phone…. But it’s still irrelevant to the issue that the software is still on your phone scanning for stuff regardless of whether you send it to iCloud …. Let’s see a toggle cutting off the scan…. Doubt you will find that
It is perfectly clear what Apple is saying about this. The entire matching process is wrapped in a form of encryption which reveals absolutely nothing about the image. Your phone does not know if there was a match. Nothing is 'detected'. iCloud allows your phone to upload it as normal.

It is only later, when Apple's servers periodically examine the safety vouchers you uploaded, that they might discover that 30 or so of those vouchers collectively provide a decryption key for those vouchers. This cannot happen on your device. This is why there won't be a toggle: the system is useless without the upload and the periodic analysis on iCloud servers.
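
The "collectively provide a decryption key" part is essentially threshold secret sharing. A toy illustration of that general idea (not Apple's actual construction, which wraps this inside its private set intersection protocol):

Code:
import random

PRIME = 2**127 - 1  # toy field size, big enough for a demo secret

def make_shares(secret, threshold, num_shares):
    # Any `threshold` shares reconstruct the secret; fewer reveal nothing.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, num_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# e.g. each matching photo's voucher carries one share of the account key;
# only once 30 shares exist can the server put the key back together
shares = make_shares(secret=123456789, threshold=30, num_shares=40)
assert reconstruct(shares[:30]) == 123456789
assert reconstruct(shares[:29]) != 123456789  # 29 shares are not enough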
 
I posted this in another thread; I actually think it fits better here. But I always like more thoughts.


Let me ask this. Since this is currently only happening in the United States, if that…

and this is not turned on for phones from Europe or other countries.

What happens if I come from Italy into the United States with an image that may not be tagged in a European database but is tagged in an American one?

Or vice versa, when the eventual rollout in different countries happens. I assume they will have different standards, hashes, images etc., unless this is one giant worldwide database that everything is singularly pulled from.

Honestly curious how that will work. If it is illegal to have a homosexual image in country “a” and a visitor comes from country “b” where it is legal, will they get flagged? Arrested? Will it be reported to their own jurisdiction or to local law?
All iPhones globally will contain the same hash database. Users will be able to verify this on their devices. The hash DB will contain only entries which are provided by child safety organisations in at least two different countries; only images flagged in both countries will appear in the DB. The DB is part of the OS, not a separate download. This is their safeguard against manipulated DBs and coercion. Apple insist that there is only one version of iOS because they don't have the means to create multiple different versions. This is the defence they used in the San Bernardino case against the FBI: there is one iOS and you can't make us develop a separate one for your projects.
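
A toy picture of the two-source rule described above (the hash strings are made up):

Code:
# Only entries vouched for by BOTH child safety organisations get shipped.
agency_a = {"9f2c41d7", "a41b03ee", "77de90c2"}  # hypothetical hash values
agency_b = {"a41b03ee", "77de90c2", "c0ffee00"}

shipped_db = agency_a & agency_b  # {"a41b03ee", "77de90c2"}
print(sorted(shipped_db))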
 