Do they have a warrant?
They don’t have a warrant when they scan and then catch hundreds of child porn hoarders every month using Windows devices.
On-device AI on the receiving end?

Can someone explain to me how Apple can scan a child’s iMessages for nudes if iMessage is end-to-end encrypted? Is encryption turned off for child accounts?
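The answer to the quoted question is essentially what the reply above guesses: the check runs on the child's device after the message has already been decrypted there, so end-to-end encryption is neither weakened nor turned off, and the classifier result never leaves the phone. Below is a rough Python sketch of that idea; `decrypt_on_device`, `looks_sexually_explicit`, and `receive_message` are all hypothetical stand-ins, not Apple's code, and the "crypto" is a toy placeholder.

```python
# Illustration only: every name here is a hypothetical stand-in, not Apple's
# actual implementation. The point is *where* the check happens.

def decrypt_on_device(ciphertext: bytes, device_private_key: bytes) -> bytes:
    """End-to-end encryption is unchanged: only the recipient's device can
    decrypt the message. (Toy XOR placeholder for the real Messages crypto.)"""
    return bytes(c ^ k for c, k in zip(ciphertext, device_private_key))

def looks_sexually_explicit(image_bytes: bytes) -> bool:
    """Hypothetical on-device classifier; runs locally and sends nothing anywhere."""
    return b"nudity" in image_bytes  # placeholder heuristic, not a real ML model

def receive_message(ciphertext: bytes, device_private_key: bytes) -> None:
    image = decrypt_on_device(ciphertext, device_private_key)  # normal E2E decryption
    if looks_sexually_explicit(image):
        # Blur the image and warn the child; for child accounts, optionally
        # notify the parents. No classifier result is sent to Apple.
        print("Image blurred; warning shown on this device only.")
    else:
        print("Image displayed normally.")

device_key = b"\x07" * 32
ciphertext = bytes(b ^ 0x07 for b in b"photo containing nudity (demo)")
receive_message(ciphertext, device_key)
```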
Everything starts small. Today’s protection against child sexual abuse could be tomorrow’s political attack and targeted censorship. Even if I give Apple the full benefit of the doubt and assume everything they mention here is correct, the action itself is already alarming regardless of the cause. We have “attempted murder” as an offence in many countries, btw, which focuses on actions, not the end result.
Again, this is an ongoing issue which will evolve as time goes on. The only thing that will not change is change itself.
"Extremely high" isn't good enough, though, when taken in context with what they're doing. Apple is putting one foot into law enforcement by doing this. Their role should be protecting the privacy of the devices, not playing Chris Hansen (To Catch A Predator reference... ).
The way this should work is that law enforcement gets a warrant for Apple to do this hash scan on a user's device first, and then they implement it on a case by case basis.
This deserves a re-post:
Things are different now.
(not for me; if they really do this, no more iPhones for me after 12 years, simple as that)
Then it needs to be 100% opt in, with Apple having no way to enable it remotely.

It's an awful place to be in for Apple.
I can understand the anxiety and fear of parents. Nobody wants their kid to get airdropped a dick-pic from a random stranger on the train.
But it's also a slippery slope on the path to mass surveillance.
It's a sad thing that Apple deems this necessary.
I don’t know if you are just aloof

Won’t follow legal Federal requests from multiple agencies to unlock a suspected mass murderer’s phone due to “privacy and precedent” concerns, yet introduces on-device snooping that at best will expose private photos to a third party for “review.”
Remember when people threw a fit about Alexa and Siri recordings being listened to by employees of those respective companies? How did Apple think this was going to go over any better than that?
Furthermore, this feature has been explained as a way to notify parents if someone texts their child a nude, but yesterday it was explained by Apple that when these hash checks run they are only looking for previously identified illegal material that has been catalogued by federal agencies and victims’ rights organizations. How would a selfie spontaneously snapped in a bathroom and then sent to my daughter match that description? In order for this feature to work, their phone has to be actively scanning for all nude content, not just images previously flagged by the above groups. I’ve already seen comments that effectively say “oh, you’re only against this if you have something to hide,” which is ironic, as that is the same argument advocates of the “stop and frisk” policy used. Given how that turned out, you’d think they would know better.
Apple is going to get torn apart over this. Tim Cook has completely lost his mind.
What a foolish comment... that’s all you understood from this whole privacy concern? You think only sickos with kiddy porn are concerned?
If Apple can scan your photos and messages (for whatever reason), others can too... that means no privacy.
It’s like leaving your house door open for the police to enter anytime they want... guess what? Burglars can enter too!
It’s the human review process that bothers me the most. No clue who these people are, what their qualifications are, what exactly they are looking at and so on. Since ultimately they are the last in the process chain, they can have a profound effect on determining if an image constitutes child porn. A single human mistake can unfold into a horrific, life-changing event for someone whose image is incorrectly identified as child porn.

It compares your pics against a “certified“ CP catalogue.
It doesn’t try to figure out if your pics are actually CP on the spot.
The question should be more about how this “certified” catalogue is audited and stuff.
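For anyone unsure what "compares your pics against a catalogue" means in practice, here is a minimal Python sketch of hash-list matching. It is only an illustration: Apple's system uses a perceptual hash ("NeuralHash"), which also matches lightly edited copies, and a blinded database, neither of which the plain SHA-256 lookup below reproduces. The point is that only copies of images already in the catalogue can match; a brand-new photo hashes to a value that simply isn't in the set. The auditing question raised above is about who controls the `known_hashes` list, which the device owner cannot inspect.

```python
import hashlib

# Toy catalogue of hashes of previously identified images. In the real system
# this is a blinded database of perceptual hashes supplied by NCMEC and other
# child-safety organizations; plain SHA-256 keeps the illustration short.
known_hashes = {
    hashlib.sha256(b"previously catalogued image bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True only if this exact image is already in the catalogue."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# A photo that was never catalogued (e.g. a selfie taken a minute ago) cannot
# be in the set, so matching of this kind never flags it.
print(is_known_image(b"previously catalogued image bytes"))        # True
print(is_known_image(b"a brand-new photo nobody has catalogued"))  # False
```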
The point is it’s automatic, it’s been enabled silently for everyone who has iCloud enabled (which is most people - Apple prompts you to enable this when you set up a device).

Actually, nothing's changed. This only affects things you upload to iCloud (i.e. NOT your iPhone). Yes, the scan is happening on the device, but that info isn't passed along to Apple unless you upload them to their servers. And you'd be pretty stupid to do that if you have illegal images.
Ok my bad then. Sorry.

Again, I AGREE with you about the bigger picture. I'm talking about people who are spinning what Apple is actually specifically doing in THIS case.
It's the parents' prudish approach, since they have to turn it on.

Awful step by apple. And the 'alert your parents if you view a nude' is some awful overreach but typical of apple's strangely prudish approach.
The point is, this is indeed a slippery slope.
The difference here (that so many people seem to keep forgetting or didn't read carefully) is that nothing (beyond the scan) happens unless the user uploads a number of these flagged photos to iCloud. Once you do that, the photos are no longer just on your personal device, but on Apple's computers. That changes things drastically - you've now uploaded illegal images to someone else's computers (servers).
I have no problem with Apple checking that you’re not storing bad **** on their servers, but they should be getting your informed consent first. That means telling you exactly what they’ll be looking for and who they’ll be notifying if they find it (eg “we search for known pro-democracy images and notify the CCP when we find some”).
Given how willing Apple is to treat all of its customers as guilty until proven innocent for possessing illegal photos it just seems absolutely perverse that they refuse to assist in cases of mass murder with judge-approved probable cause.
An irreversible move. The sad thing at this point is that governments around the world now know Apple has developed tech for mass surveillance with pinpoint accuracy. Even if they cancel their plans now they can be forced to use it at some point covertly. If it can be used to scan for CSAM it can be used to scan for anything specific the government wants. Sigh
I'm not sure that's correct. If that was the case, then why scan on the device at all, and why not do all of the scanning in iCloud? Unless I'm misunderstanding you...
CSAM detection
Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.
To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.
Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
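To make the voucher flow above more concrete, here is a deliberately simplified Python sketch. It is not Apple's protocol: the real design uses NeuralHash, a blinded on-device database, and private set intersection, so neither the device nor Apple learns the per-photo match result; in the toy below the match bit is stored in the clear purely so the shape of "hash locally, attach a voucher to the upload" is visible.

```python
import hashlib
import os

# Hypothetical stand-in for the on-device database; the real device only holds
# a blinded, unreadable version of the known-CSAM hash list.
KNOWN_HASH_DB = {hashlib.sha256(b"known catalogued image").digest()}

def make_safety_voucher(image_bytes: bytes) -> dict:
    """Toy 'safety voucher' attached to each iCloud Photos upload.

    In the real design the match result and the image-derived data inside the
    voucher are cryptographically hidden (private set intersection plus
    threshold secret sharing); they are readable here only for illustration."""
    image_hash = hashlib.sha256(image_bytes).digest()
    return {
        "voucher_id": os.urandom(8).hex(),
        "matched": image_hash in KNOWN_HASH_DB,  # hidden from everyone in the real protocol
        "image_hash": image_hash.hex(),
    }

# The voucher rides along with the photo when, and only when, it is uploaded
# to iCloud Photos; photos that never leave the device upload nothing.
upload = {
    "photo": b"<encrypted photo blob>",
    "voucher": make_safety_voucher(b"some photo being uploaded"),
}
print(upload["voucher"])
```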
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.
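The "threshold secret sharing" piece is essentially Shamir secret sharing: a decryption key is split into shares, roughly one share being released per matching photo, and only when at least the threshold number of shares exists can the key, and therefore the voucher contents, be reconstructed. The sketch below is a textbook Shamir construction over a prime field, for illustration only; Apple's actual parameters and construction are not reproduced here.

```python
import random

# Minimal Shamir threshold secret sharing over a prime field (illustration only).
# With threshold t, any t shares reconstruct the secret; t-1 shares reveal nothing.
P = 2**127 - 1  # a Mersenne prime, large enough for a toy secret

def make_shares(secret: int, threshold: int, n_shares: int):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation of the polynomial at x = 0 recovers the secret.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # den^-1 via Fermat
    return secret

# One share per matching photo: only once the account holds at least
# `threshold` of them can the decryption key be recovered.
key = random.randrange(P)
shares = make_shares(key, threshold=30, n_shares=40)
print(reconstruct(shares[:30]) == key)  # True: at the threshold, the key is recovered
print(reconstruct(shares[:29]) == key)  # False (except with negligible probability)
```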
Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.
This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
For example - Tim Cook has been outspoken about his homosexuality, yet homosexuality used to be illegal in the United States. In many parts of the world, this is still the case. Should Apple be served with a subpoena to scan images and identify iPhone users whose photos demonstrate that the laws of the country in which their devices are being used are being broken, will they comply with that subpoena? Would the evolution of LGBTQ rights in the United States ever have happened if there had been tools such as these facilitating a legal government crack-down?