Seems like Apple really IS trying to please their CCP masters in the mainland. I definitely won't be updating to the new iOS and will be looking at other manufacturers for mobile devices if Apple decides to follow through with this awful idea. I thought they were all about privacy and protection of information. Yes, child porn and exploitation need to be addressed and dealt with. No, this isn't the way to go about it. Aside from opening the can of worms that leads to scanning of all other types of pictures and abuse of power, let's say someone intentionally trying to get a person in trouble sends a child porn photo to their device. The owner didn't ask for it and doesn't even want it, but it was sent to them. Framed, essentially. And what about adult movies with role playing, where one or the other is 'young' looking and acts the part? Definitely not my taste and not my interest, but there's nothing illegal about that in many locales. How does the scanning software distinguish between genuine child porn and two consenting adults playing out fantasy roles? Like I and many others have stated, Apple has good intentions, but this is a truly awful way to go about it.
 
It doesn't seem like anybody read any details about what Apple is actually doing here (not a surprise). This appears to include Snowden and the EFF (also not much of a surprise).

Apple is not scanning all your pictures looking for certain types of content that "might" be images of child sexual exploitation or abuse.

Rather, there is a database of known child sexual exploitation images maintained by law enforcement. Cryptographic hashes are generated for each of these known images/files.

The only thing Apple is scanning for is files that match the cryptographic hashes of known images of child sexual abuse. They are not looking at your images using machine learning, or anything close to that.
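For what it's worth, the exact-match flavour of what this post describes is trivial to sketch. Here's a minimal Python illustration, assuming plain SHA-256 and hypothetical file/folder names (Apple's real system uses its NeuralHash rather than a cryptographic hash, as discussed further down the thread):

```python
import hashlib
from pathlib import Path

# Hypothetical illustration: build a set of SHA-256 digests for a folder of
# "known" files, then check whether a candidate file matches any of them.
# Only exact byte-for-byte copies match; a single changed pixel changes the hash.

def sha256_of_file(path: Path) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def build_hash_database(known_dir: Path):
    return {sha256_of_file(p) for p in known_dir.iterdir() if p.is_file()}

def is_known(candidate: Path, database) -> bool:
    return sha256_of_file(candidate) in database

if __name__ == "__main__":
    db = build_hash_database(Path("known_images"))      # hypothetical folder
    print(is_known(Path("photo_from_phone.jpg"), db))   # hypothetical file
```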

If this is something you feel is worthy of criticism, go to town. But criticize what they are actually doing, not some inflated imaginary version of what they're doing.

Oh, and as per yesterday's article on the subject -- Apple and all other major tech companies have already been doing this for years. It's nothing new.

I don't know; it sounds pretty fuzzy to me. Apple's summary:
NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image.
Sounds like they are looking for features of the photo to make sure cropping or slight pixel changes don’t throw it off.
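A toy way to see the difference is to compare a simple feature-based hash with a cryptographic one. The sketch below uses a basic "average hash" on a fake 8x8 grayscale grid (not NeuralHash, purely illustrative): a tiny pixel tweak leaves the perceptual hash essentially unchanged while the cryptographic hash changes completely.

```python
import hashlib

# Toy comparison of a cryptographic hash vs. a perceptual (feature-based) hash.
# The "image" is just an 8x8 grid of grayscale values; real perceptual hashes
# (aHash/pHash/NeuralHash) work on decoded image features, but the idea is the same.

def average_hash(pixels):
    """64-bit hash: each bit says whether a pixel is above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    return bin(a ^ b).count("1")

image = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
tweaked = [row[:] for row in image]
tweaked[0][0] += 3          # a tiny pixel-level change

print(hamming(average_hash(image), average_hash(tweaked)))   # 0 or 1: perceptually "the same"
print(hashlib.sha256(bytes([p for row in image for p in row])).hexdigest()[:16])
print(hashlib.sha256(bytes([p for row in tweaked for p in row])).hexdigest()[:16])  # completely different
```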
 
Apple's a hypocrite! First they release App Tracking Transparency (ATT) to ensure our privacy, and now they're scanning our iPhones for child nudity and ?????
I bought an iPhone, not Big Brother watching over me.
I wouldn't be surprised if down the line we find out your Face ID has been given to agencies, etc.
This is the beginning of the end for Apple, as much as I hate saying it.
Always remember that regardless of what you do on your iPhone, nothing is private!
 
So you'd rather they be analyzing all your photos on iCloud, where they can decrypt and view them, vs. on your phone, where they can't? I'm honestly not understanding your logic here. This new method is far more privacy-friendly.

Can they not do the same hash comparison in the cloud and then gather image data based off of that?
 
Now I understand why Apple and Google pushed so hard against Ubuntu OS mobile phones being released in the United States. God forbid there's a competitor that actually offers privacy.
 
I appreciate Apple's sentiment, but I can't imagine why Apple is pursuing this functionally anencephalic idea.

OK - so the scanning process matches your private photos with templates of child abuse photos using processing and memory local to your device. First, how much memory and battery is that going to chew up?

iPhones already scan every photo today for content, faces, etc. Creating an additional hash isn't resource-heavy at all. Also, iPhones do most of their image processing while connected to power and while you aren't using your phone.
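As a rough sanity check of the "hashing is cheap" claim, here's a quick sketch. It times an ordinary SHA-256 over a photo-sized buffer, not Apple's NeuralHash, and the numbers are purely illustrative; the point is that per-image hashing is milliseconds of work.

```python
import hashlib, os, time

# Hash a 12 MB buffer (roughly the size of a large photo) and time it.
# On any recent machine this takes a few milliseconds.
data = os.urandom(12 * 1024 * 1024)

start = time.perf_counter()
digest = hashlib.sha256(data).hexdigest()
elapsed = time.perf_counter() - start

print(f"hashed {len(data) / 1e6:.0f} MB in {elapsed * 1000:.1f} ms -> {digest[:16]}...")
```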

Unless, of course, the template matching does not require an exact match. In that case, there will inevitably be false positives.

It's probably locality-sensitive hashing, which doesn't require an exact match. Google is the leader here and even presented a paper (in China!) in 2008 on how to do it, and they have had implementations for at least 10 years.

The probability for a false positive is very low. And Apple has additional controls.
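For anyone curious, here is a toy sketch of the locality-sensitive idea: SimHash-style random hyperplanes over made-up feature vectors. It is purely illustrative, not Apple's or Google's actual scheme, but it shows why no exact match is needed: near-duplicate features produce nearly identical bit signatures, while unrelated content lands far apart.

```python
import random

# Toy locality-sensitive hashing via random hyperplanes (SimHash-style):
# similar feature vectors fall on the same side of most hyperplanes, so
# their bit signatures differ in only a few positions.
random.seed(0)
DIM, BITS = 64, 32
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def signature(vec):
    return [1 if sum(p * v for p, v in zip(plane, vec)) > 0 else 0 for plane in planes]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

base = [random.gauss(0, 1) for _ in range(DIM)]
similar = [v + random.gauss(0, 0.05) for v in base]      # slightly perturbed "features"
unrelated = [random.gauss(0, 1) for _ in range(DIM)]

print(hamming(signature(base), signature(similar)))      # small (often 0-2)
print(hamming(signature(base), signature(unrelated)))    # around BITS / 2
```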

If your photo erroneously 'matches' a template, some human at Apple will review it. So some person could be looking at your private photos, perhaps of you, your kids or your partner, that have been misclassified. If that doesn't define creepy, I do not know what does.

You have to have several matches. People who download child pornography usually have thousands, or even hundreds of thousands, of pictures. Apple could easily set the threshold to 50 to reduce the probability of false positives dramatically.

Yes, someone at Apple would be looking at the photos which have been flagged. Apple already has this ability today if they want it. And if they're served with a search warrant, they turn everything over if needed.
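On the threshold point above, a back-of-the-envelope sketch (with entirely made-up numbers, not Apple's actual figures) shows why requiring multiple matches crushes the account-level false-positive rate:

```python
from math import exp, factorial

# If each photo independently has some tiny false-match probability p, the
# number of false matches in a library of n photos is roughly Poisson with
# mean n*p, and the chance of reaching a threshold t collapses as t grows.
# All numbers below are illustrative assumptions only.

def poisson_tail(lam, t, terms=200):
    """P(X >= t) for X ~ Poisson(lam), via a truncated upper-tail sum."""
    return sum(exp(-lam) * lam ** k / factorial(k) for k in range(t, t + terms))

p = 1e-6                 # assumed per-image false-match rate
n = 100_000              # photos in the library
lam = n * p              # expected number of false matches (0.1 here)

for t in (1, 5, 30):
    print(f"threshold {t:>2}: P(account falsely flagged) ~ {poisson_tail(lam, t):.2e}")
```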


And that is if the system works as advertised. What prevents the templates from being changed to detect memes, flags, political slogans, etc., allowing governments to abuse this type of surveillance as they have abused every other form of surveillance made available to them? Also, when is Apple going to start scanning our audio material for keywords (oh, I forgot, the NSA does that), or scanning our text documents for verboten thoughtcrime, etc.?

Google and Facebook have been doing this for a decade. How many governments are forcing Google and Facebook to do as you describe?
 
There goes my usual strongest argument for iOS vs. Android. :(
I've been saying this forever. Apple's "privacy first" thing is almost pure marketing. You probably are better protected than on Android, with third-party apps roaming wild and carriers installing stuff, but that's more due to the locked-down nature of the platform than Apple really caring about privacy more. If it ever interferes with their profits, they'll pick the profits, just like Google does.
 
A - This is a mess, but Apple's fanatics will applaud no matter what.

B - I take for granted that any nation state is able to unlock or spy on any gadget. Assuming otherwise would be naive.
 
What will stop that very photo from becoming "a known child abuse image"? You do know the database is constantly being updated, right?

In your examples the children are probably too old, since the database, AFAIK, is a worst-of-the-worst database and contains imagery of very young children.

But if a teenager creates child pornography and shares it (even inadvertently), well, it's child pornography being shared, which is exactly what the system is designed to stop.

The smart solution would be to delete those images from the phone.
 
Can they not do the same hash comparison in the cloud and then gather image data based off of that?

I'm sure they could, but--again--that means they're scanning files they actually have access to and then there's a chance for abuse. With this new system, as explained on Apple's website, it will be impossible for Apple to know which photos have been flagged until that specific threshold is reached, because that info is coming from a scan happening on the user's end, not Apple's. Again, from Apple:

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.

So the irony here is Apple is actually doing what they've always done, but in a MUCH more private manner due to the checks and balances, yet people are losing their **** as if this is somehow a breach of privacy. Crazy! I guess they hear the words "scan" and "on your phone" and envision an Apple employee watching live scan results from your phone and browsing through them. Of course that's not AT ALL what's happening. Apple remains completely ignorant of any of your images until that threshold number of flagged files is uploaded to iCloud, and only then is it revealed to them which files were flagged as matches so they can review and confirm.
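The "threshold secret sharing" piece quoted above is a standard cryptographic building block. A textbook-style toy below (Shamir's scheme over a prime field, not Apple's actual construction) shows the property being relied on: below the threshold the shares don't reveal the hidden value, and at the threshold it falls out exactly.

```python
import random

# Toy Shamir secret sharing over a prime field. Any t shares reconstruct the
# secret; fewer than t do not. Requires Python 3.8+ for pow(x, -1, P).
P = 2 ** 61 - 1          # a Mersenne prime, large enough for a demo

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

secret = 123456789
shares = make_shares(secret, t=5, n=10)
print(reconstruct(shares[:5]))   # 123456789 -- threshold reached
print(reconstruct(shares[:4]))   # garbage -- below the threshold
```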
 
We could secretly search everyone's house every so often to take a peek at its contents, just in case something is illegal. If nothing is illegal, they won't notice. Just add a disclaimer when you buy the lock.

"Apple stops child pornography," said nobody but the marketing department trying to sell a mass surveillance feature.
 
Did we not all accept our fates when we filled our homes, vehicles and lives with "connected to all things" technology that requires us to sign up for accounts and apps, handing over our personal data? These devices are tracking our movements, searches, behaviour, etc.

The surveillance state was secretive, now we willingly pay our hard earned cash to buy gadgets and be a part of it just to watch online videos, chat on social media, stream music and tv etc.

The illusion of privacy is a long time dead.
 
I understand the details.

But if I don't have access to the source content that was used to create the hash database (which I shouldn't have, because it's child porn), how can I trust the database? How can I know they don't sneak that picture of Tiananmen Square in '89 in there, to find out if I'm secretly against the Communist Party? The only proper thing to do is design the system so that nobody can judge what's on my phone. I don't even use iCloud Photos, because I know it's not encrypted and it could get hacked and leak at any time. But what they're doing is opening a channel where my phone can decide whether Apple or the authorities should know about the private things I do on my device. Because the phone software is essentially a black box, I can't just install something like Little Snitch to prevent that data from being sent to Apple.

It also doesn't matter that Apple vows to only share content if it indeed contains child porn. They can just be served a court order to share all the content anyway and then have to comply. This breaks the whole "but we don't even HAVE the data" argument that is one of the main reasons to get an iPhone in the first place. If the iPhone can, on its own, "decide", for whatever reason, to send such an "alert", it can and will be abused.
The scenarios you mentioned shouldn't be an issue because, again, that material wouldn't be in the database and thus wouldn't be flagged. The database would only contain known, authentic child pornography. In theory.
Can they not do the same hash comparison in the cloud and then gather image data based off of that?
Yes, I don't understand it either. If iCloud Photos isn't end-to-end encrypted anyway, why not do it in the cloud and let people opt out if they don't want their photos x-rayed?
Google and Facebook have been doing this for a decade. How many governments are forcing Google and Facebook to do as you describe?
Facebook complies with a lot of law enforcement requests to obtain chat conversations under court orders. Apple, with iMessage, can say that they do not have access to those conversations, which is one of the big reasons to use iMessage over Facebook Messenger in the first place. If Apple now says that they'll put software on your phone that will read your messages and alert an outside party to whatever dirty or politically inconvenient business you're doing, the whole thing breaks down. Unacceptable.
 

I get what you're saying, and I appreciate the rational discussion here. But the opportunity for abuse exists in iCloud Photos already. Nobody hand-picks which photos are uploaded or backed up to iCloud Photos (most don't, anyway). We trust they won't abuse the access they already have simply because they say they won't. That's a baseline assumption that was already in place. Now that CSAM scanning is being implemented, we still have to simply trust they're not abusing their access to iCloud photos, whether flagged by the CSAM system or not.

So while I get that doing it the way Apple is doing it looks a bit better from a privacy perspective, it's already based on an assumption of trust on our part. The only difference now (and dare I say the sacrifice of doing it this way?) is that they're forcing it to happen on our "servers" instead of on theirs. And it's still unclear to me if this hash comparison happens even if a user is not using iCloud Photos. I still revert back to my house analogy from the previous comment, which, again, wasn't perfect, but IMHO illustrates the general concept and the problem people have with this (excluding those who haven't actually read about it).
 
Do you have any idea how the database is being updated? How often would it be? What would stop law enforcement from adding those photos immediately after the case? Sure, if those individuals are stupid enough, they deserve to get caught. That still doesn't invalidate the huge potential for overreach.

The development of this database started in 1999.

It used to be maintained by the Department of Justice, but I believe it's now being managed by U.S. Immigration & Customs Enforcement (ICE). NCMEC (a private organization) is also important. I guess anyone can contribute, but it's usually different law enforcement organizations.

The police don't have direct access to the database. Access is limited to a few police officers, most of them from the FBI or the Department of Justice, who run searches for other law enforcement agencies.
 
It's probably locality-sensitive hashing, which doesn't require an exact match. Google is the leader here and even presented a paper (in China!) in 2008 on how to do it, and they have had implementations for at least 10 years.

Google's technology literally can't tell the difference between a black person and a gorilla, but sure, this tech will be perfect...

 

Sidenote: major props to those who work in this area to fight all of this filth. I can't imagine the long-term emotional damage it does to someone...
 
The database contains mostly prepubescent children in explicit sexual poses or acts. Teenage sex and nudity photos are not the target for this system.
So this comment sort of helps me understand what Apple is doing. And maybe I am still wrong. Or maybe I need more coffee.

The government has a database of known child p word images. If your Apple device happens to have one of these images and you are distributing it, the system flags it, and the match is then turned over to law enforcement. Sort of like antivirus: the antivirus company has a database of known malicious files, and as files pass through the system, if a file matches or contains a certain signature, it is flagged as potentially harmful or malicious.

So in a way I guess this is child p word antivirus for Apple products.
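The antivirus comparison maps pretty directly onto code. Below is a minimal sketch of that flow, with a hypothetical folder name, an empty placeholder signature set, and a made-up reporting threshold: hash each file, collect the ones whose hashes appear in a known-bad set, and only escalate if the hit count crosses the threshold.

```python
import hashlib
from pathlib import Path

# Antivirus-style signature check: compare file hashes against a set of
# known-bad digests and escalate only past a threshold. All names and
# numbers here are hypothetical placeholders.
KNOWN_BAD = set()        # hex SHA-256 digests of known files would be loaded here
THRESHOLD = 30

def scan(folder: Path):
    hits = []
    for path in folder.rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_BAD:
                hits.append(path)
    return hits

if __name__ == "__main__":
    matches = scan(Path("Photos"))               # hypothetical folder
    if len(matches) >= THRESHOLD:
        print("escalate for human review:", matches)
    else:
        print(f"{len(matches)} hits, below threshold; nothing reported")
```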
 
It's well intentioned, but it's just a bad idea. It could and should be used for actual child porn suspects, but not for every single iCloud user writ large, as the potential for images to be falsely identified by the software is all but admitted by the existence of the human review process. And presumably child porn involves anyone under the age of 18, so for edge cases where images are not obviously depicting little kids, how could human reviewers possibly know for certain the age of the person depicted? There are plenty of non-porn images on the internet of women who are over 18 but can look convincingly younger in a photograph, intentionally or not. And vice versa. I just don't think they have an answer for that, and until they do, there is no reason for confidence that it will all "just work" in a draconian implementation like this, or that someone innocent of uploading child porn to iCloud will never need to explain their perfectly legal iCloud images.

The program and the database have existed for 22 years, and you probably weren't aware of them because they avoid the problem you describe.

AFAIK, the database contains pretty young children in very explicit poses or acts, not high school teenagers.
 
This best fits the current evidence. My guess: Apple was forced by various authoritarian governments to add scanning of documents stored locally on devices as a cost of doing business in those countries. Apple knew the surveillance would be discovered, so they are trying to dress up the complete loss of privacy as "think of the children" and hope people accept their spot under the microscope without complaint.
This. Excellent thought there.
(You may very well be electric, but you certainly are no potato.) :)

There was a recent report of a person being arrested on child porn charges because they had anime in their possession... just run of the mill anime, not any of those dodgy sub-genres. Even on the surface, this latest move by Apple should cause people to be concerned.
 
It's usually the same people who think Google is going to take all the data they collect for marketing ads and use it to blackmail them specifically, like Google even cares who they are.
It's usually the same people who think NSO Group is going to take all the data they collect from Pegasus and use it to blackmail them specifically, like NSO Group even cares who they are.

See what I did there...
 
The thing is, this doesn't only work for pictures. It's no problem to scan voice conversations for "illegal words or phrases", or maybe for whatever counts as the wrong political view at the time, etc.

So once this is implemented, this is the road we will all go down. There will be zero privacy on Apple's devices, or on those of any other brand that follows.

My bet is that Apple and the others have gone too far now after making this public.

People who have lived in failed states don't use smartphones or even landline phones for anything that is private. Now this will expand to the rest of the world.

Would you rent an apartment knowing anything you do is watched by the owner? Surely not!

I will not use my iPhone and iPad anymore once this new software is part of the OS. Not because I have suspicious pictures (I have almost no pictures at all, because I don't take many) or am planning crimes.

But really, this idea alone makes using an iPhone, or any other smartphone, a really bad idea.

I might continue using my Fire TV for streaming, and maybe my iPad for online shopping and some unavoidable emails.

But I will not use smartphones anymore. Looks promising for Nokia now.

Wouldn't it be ironic if Nokia destroyed Apple 15 years after Apple destroyed Nokia?

Thank you Tim Apple, you had one job!
 
It's the parents' prudish approach, since they have to turn it on.



Since Google started doing this in 2008, I believe, and Facebook in 2011, isn't it a bit late to be worried now?

Google and Facebook are scanning material published on their services. They are not processing all your pictures client-side. There is a massive privacy difference between a service scanning data after it has been distributed or transferred to it, outside the user's control, and scanning material client-side while it sits on the user's own device.
 