As long as we're being conspiratorial, why stop there?

It's probably been enabled since the late '70s, and all of our false positives triggered by baby bath photos are being kept in sealed indictments until we retire, because they're really just after our Social Security money.

Ah, sarcasm. When you're desperate and have nothing else.
 
Imagine if the Chinese government could one day use this system to check who has the Tank Man image in their cloud storage and deport that person to a murder camp the next day without question.

Apple can only enforce the local law. If the law is different in another country, will it enforce that law for its citizens? Say everyone agrees that child abuse is bad. But in Russia, where homosexuality is pretty much a crime, anything labeled "LGBT propaganda aimed at minors", such as an informative book about an LGBT subject, could be called "child abuse" for political reasons and thus be illegal. Would Apple play international judge and pick and choose what it considers right and wrong based on its own morals, or would it strictly abide by the respective laws of each country, even if they go against Apple's initial "good intentions"? What happens when a government puts pressure on Apple to hand over control of this system to them "or else"? Will they do the right thing, or will there come a point where money matters more? (Hint: money eventually always takes priority over morals.)

It sounds good but it gets messy the more questions you ask, which is not a good omen.
Apple is going to do whatever it thinks will maximize shareholder wealth regardless of the ethical issues involved.
 
This is indeed the bullshit that will happen.

The funny thing is, the people who actually have ill intent aren't stupid enough to put their stuff on iCloud.

So all this does is put regular people under mass surveillance, with the risk of being flagged for something innocent.

And I'm sure the US government will later extend these mass surveillance capabilities to other things.
I agree that implementing this functionality is a bad idea, but you’re misinformed on how this implementation works. This functionality doesn’t use computer vision to, for example, detect children in a state of undress. Rather, it creates a derivative of each photo uploaded to iCloud and compares it to similarly created derivatives of known CSAM images by checking their hashes.

A photo taken by a parent of their kid in the bathtub or whatever will almost definitely not match a photo in the database; even in the vanishingly small chance that there is a false match, you would have to meet a threshold (reportedly around 30) of matched images and a human would review the matches before forwarding the matter to authorities.

The implementation is fine server-side; it works and has put a lot of bad people in jail while allowing people to simply not use a cloud service if they don’t want their photos scanned. Building it into the OS is another matter — how likely is Apple to resist government pressure to add custom databases including non-CSAM content or reapply it for features other than iCloud Photos? I think we know the answer.
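To make the threshold logic concrete, here's a minimal sketch, assuming hypothetical names and plain SHA-256 in place of Apple's actual NeuralHash and encrypted safety-voucher machinery:

```python
import hashlib

# Hypothetical stand-ins: the real system reportedly uses NeuralHash and
# encrypted "safety vouchers", not a plain SHA-256 over raw bytes.
KNOWN_HASHES: set[str] = set()   # hashes of known CSAM, supplied by NCMEC et al.
REVIEW_THRESHOLD = 30            # reported match count before human review

def derivative_hash(image_bytes: bytes) -> str:
    """Hash a (notionally normalized) derivative of the image."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_escalate(uploaded_images: list[bytes]) -> bool:
    """True only once enough uploads match the known-hash database."""
    matches = sum(derivative_hash(img) in KNOWN_HASHES
                  for img in uploaded_images)
    return matches >= REVIEW_THRESHOLD  # then a human reviews the matches
```

The point of the threshold is that a single accidental collision does nothing; only an accumulation of matches ever reaches a human reviewer.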
 
What? Snowden revealed how the US government uses tech products to spy on people. That is a fact.

CSAM scanning is another spying method, but this time Apple will be monitoring for the US government, and you've got to be really naive to think the US government won't later extend it, given its horrible track record when it comes to spying.
So, you’re saying that, knowing how the US government CURRENTLY uses tech products to spy on people, they are currently NOT using tech products to spy on people when it comes to unencrypted images stored in cloud image repositories. So, they’re currently using tech products to spy, but, where it makes the conspiracy convenient, they’re absolutely not in one very specific area?

The government currently (supposedly) doesn’t require Apple to monitor for the US government because the government is currently using tech products to spy on people. BUT, the government, with its tech spying, has just been WAITING to hand off the tech spying to Apple and, I suppose, Google, Samsung, Microsoft, etc.? Or, is it that the government ONLY wants to hand off the tech spying to Apple in this case and most certainly not anyone else?
 
Totally. And people would prefer to protect their rights against a perceived breach of trust (with no facts to back it up) than the rights of children we know are being hurt. Shows exactly where their minds are at. Selfish and disgusting, IMO.
I’m no fan of Antonin Scalia, but he was exactly correct when he said that, with respect to U.S. law, the criminality of a few is sometimes insulated in order to protect the privacy of us all. The people trafficking in this stuff need to be locked up, but that doesn’t mean that every measure taken against them is acceptable in the face of real risks for collateral damage to everyone’s privacy.
 
The issue here is that they're using AI to generate the hashes; it isn't simply a scan for identical photo hashes. You don't need any AI to compare hashes, so what they're doing and how effective it is are all speculation.

Bottom line is this system would use AI on your phone to scan your content, somehow. Given how primitive AI seems to be at recognizing objects when using the camera, my speculation would be that it would make lots of mistakes.

And how long before this same technology is used to identify terrorism, and how effective and prone to mistakes/bias would that be? Imagine what they would train the AI with to find terrorists--brown-skinned young men?

Just not a good idea to have our tech reporting on us to the government. The government has access to amazing technology itself, let them use their own resources and not use our computers against us.
 
I agree that implementing this functionality is a bad idea, but you’re misinformed on how this implementation works. This functionality doesn’t use computer vision to, for example, detect children in a state of undress. Rather, it creates a derivative of each photo uploaded to iCloud and compares it to similarly created derivatives of known CSAM images by checking their hashes.

A photo taken by a parent of their kid in the bathtub or whatever will almost definitely not match a photo in the database; even in the vanishingly small chance that there is a false match, you would have to meet a threshold (reportedly around 30) of matched images and a human would review the matches before forwarding the matter to authorities.

The implementation is fine server-side; it works and has put a lot of bad people in jail while allowing people to simply not use a cloud service if they don’t want their photos scanned. Building it into the OS is another matter — how likely is Apple to resist government pressure to add custom databases including non-CSAM content or reapply it for features other than iCloud Photos? I think we know the answer.
They're using AI to generate the hashes, not simply hashing the image file and doing a lookup. The old style of hashing the image and comparing it to known hashes would require no AI. I don't know if that's a great idea, but this implementation is more concerning because we don't know how it works at all; it's not just reporting identical photos.

Traditional method: Image > hash function > hash > compare to known bad hashes

Apple method: Image > AI scan > AI output > hash function > hash > compare to known bad hashes of AI output
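In code terms, the difference being described looks something like this; a sketch only, with a hypothetical `derive` step standing in for whatever model Apple actually uses:

```python
import hashlib
from typing import Callable

def traditional_match(image_bytes: bytes, known: set[str]) -> bool:
    # Image > hash function > hash > compare: any changed byte
    # (recompression, cropping, grayscaling) yields a different hash.
    return hashlib.sha256(image_bytes).hexdigest() in known

def derivative_match(image_bytes: bytes, known: set[str],
                     derive: Callable[[bytes], bytes]) -> bool:
    # Image > AI scan > AI output > hash function > hash > compare:
    # `derive` is a hypothetical model meant to map visually similar
    # images to the same output before hashing.
    return hashlib.sha256(derive(image_bytes)).hexdigest() in known
```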
 
People are judging this based on its current purported modus operandi but forgetting that everything changes, nothing stays the same. Today it’s child abuse. Tomorrow it’s anti public policy memes. And anyone who thinks this can’t happen has probably been in a coma for the past two years.
 
But history will tell us if CSAM scanning was a good decision or not, when a Snowden 2.0 tells us what the US government is doing now.
Actually, I can tell you right now, no future Snowden required. :) The government will be using newer, even more advanced technology to spy on people. And will, very likely, still NOT be handing off their surveillance to any tech companies.
 
Let parents enable it to protect their children’s devices.

That’s the solution.

If others upload abusive images to iCloud, perform the hash detection automatically on the server and report them to the FBI.

Done.

 
People are judging this based on its current purported modus operandi but forgetting that everything changes, nothing stays the same. Today it’s child abuse. Tomorrow it’s anti public policy memes. And anyone who thinks this can’t happen has probably been in a coma for the past two years.
They're going to be scanning everything. Not just Apple, obviously. The future is everything being analyzed a million times over... and it will be very helpful in many cases. Like, you want a picture of your friend Jimmy? Tag him once and the AI can find all the old pics of him, etc. But again, I don't think it's a great idea to have these scanning features reporting back to the government.
 
That’s not how end to end encryption works. The whole reason for end to end encryption is to keep the middle man transmitting the data from seeing it.
So, your idea is that, in these countries, the governments ABSOLUTELY do not have the keys to decrypt OR the ability to request any images they want to see, right now? All image repositories can be decrypted by the service providers so they can perform CSAM scans today, so they'll decrypt for their own purposes but NOT decrypt for the government if the government asks, even though they'll hand over the matched CSAM images when it asks?
 
Apple will always do what’s right for customers. Ignoring this serious and disturbing problem will never be an option for Apple.

Those who are doing nothing wrong have nothing to fear.

This is about protecting young children from disgusting abuse.
No, this is about creating a technology that can be used to perform all the worst case scenarios from 1984. I think we can begrudgingly agree that Apple has the right to scan images on their own service - iCloud. But we will NEVER agree upon Apple creating a process to scan people’s personal devices and report them to the authorities.

Are you seriously saying that you can’t see anything wrong with doing this?

Famous last words there, in bold.
 
Let parents enable it to protect their children’s devices.

That’s the solution.

If others upload abusive images to iCloud, perform the hash detection automatically on the server and report them to the FBI.

Done.

I think this is correct, too. Like, we have to have some privacy and control over our own information. I think what's on our own hard drives/SSDs we should control. Once we upload it to Apple's servers, then I think we lose that control.

And soon enough everything will be networked storage probably, so we may have to revisit the idea of whether it makes sense on Apple's servers too, but that's for a later date.
 
And there we have perfectly summed up everything that’s wrong with capitalism and particularly surveillance capitalism.
Ironic that the first example everyone uses for obvious abuse is your beloved communist China - surveillance communism to the 1000th degree.

If you get to claim that REAL communism has never been tried, I'd contend that we don't have capitalism; we have corporatism (corporation-based, not the classical definition), with the government bought and paid for by corporations, instead of REAL capitalism.
 
Ironic that the first example everyone uses for obvious abuse is your beloved communist China - surveillance communism to the 1000th degree.

If you get to claim that REAL communism has never been tried, I'd contend that we don't have capitalism; we have corporatism (corporation-based, not the classical definition), with the government bought and paid for by corporations, instead of REAL capitalism.

From where do you get the idea that I love China? China is also as capitalist as they come, with a totalitarian regime. China is a great example of what the U.S. will become, the way things seem to be going, after the next election.

And you're right that we don't have real capitalism in the West. That was tried in the third world. And that's why it is now called the third world.
 
I understand your distrust, but I don't believe hashing would benefit the government, because it searches for a specific file. Intelligence doesn't work like that; searching for links between pieces of evidence does, specific to what the system has been designed for. I don't know how it would help the government when governments already have the ability to spy on people's phones/computers.
This is key. The method used for hashing CSAM literally adds nothing to the volumes of information any government already has on anyone through far more effective methods that we don’t even know about.
 
They're using AI to generate the hashes, not simply hashing the image file and doing a lookup. The old style of hashing the image and comparing it to known hashes would require no AI. I don't know if that's a great idea, but this implementation is more concerning because we don't know how it works at all; it's not just reporting identical photos.

Traditional method: Image > hash function > hash > compare to known bad hashes

Apple method: Image > AI scan > AI output > hash function > hash > compare to known bad hashes of AI output
No. A visual derivative is created of the image before hashing in order to thwart trivial workarounds to prevent a hash from matching (making a color photo grayscale, cropping, etc.). AI isn’t needed for that step and isn’t used for that. Computer vision is far too imperfect to be used for a case like this, where too many false matches can ruin someone’s life.

Apple did implement a computer vision-based feature along these lines, but it’s strictly on-device, unrelated to iCloud Photos, and only for children in Family Sharing. It warns them about sensitive photos in iMessage.
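To illustrate what a "visual derivative" buys you, here's a toy average-hash sketch. This is the generic perceptual-hashing idea, NOT Apple's NeuralHash (which uses a neural network), and it assumes the Pillow library:

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Toy 'average hash': downscale, grayscale, threshold on the mean.

    Illustrates why a derivative survives recompression, resizing, or
    grayscaling, while a byte-level hash of the file would not.
    """
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | int(p >= mean)  # 1 bit per pixel
    return bits

def hamming(a: int, b: int) -> int:
    """Bit distance between two hashes; small means visually similar."""
    return bin(a ^ b).count("1")
```

A color photo and its grayscale or recompressed copy produce (nearly) identical hashes under a scheme like this, which is exactly the property a byte-level hash lacks.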
 
British journalist Julia Somerville was famously detained by the Metropolitan Police for taking a snap of her kid in the bath.
Some overzealous clerk saw baby bath photos and jumped the gun? The entire lot should have been taken to court.
Saving an iCloud photo in 2022 is in some ways similar to having a photo developed at Boots in 1995.
Not remotely the same.
 
The hard targets are never that stupid.
So, your thinking is that cloud providers should not scan for such images because no one would ever use cloud providers for this purpose, yes? What’s more likely is that they don’t use the services because they know they’re being scanned.
 
No. A visual derivative is created of the image before hashing in order to thwart trivial workarounds to prevent a hash from matching (making a color photo grayscale, cropping, etc.). AI isn’t needed for that step and isn’t used for that. Computer vision is far too imperfect to be used for a case like this, where too many false matches can ruin someone’s life.

Apple did implement a computer vision-based feature along these lines, but it’s strictly on-device, unrelated to iCloud Photos, and only for children in Family Sharing. It warns them about sensitive photos in iMessage.
So where does the AI come in?
 