Won't follow a legal federal request from multiple agencies to unlock a suspected mass murderer's phone due to "privacy and precedent" concerns, yet introduces on-device snooping that at best will expose private photos to a third party for "review."
Remember when people threw a fit about Alexa and Siri recordings being listened to by employees of those respective companies? How did Apple think this was going to go over any better than that?
Furthermore, this feature has been explained as a way to notify parents if someone texts their child a nude, but yesterday Apple explained that when these hash checks run they are only looking for previously identified illegal material that has been catalogued by federal agencies and victims' rights organizations. How would a selfie spontaneously snapped in a bathroom and then sent to my daughter match that description? For this feature to work, their phone has to be actively scanning for all nude content, not just images previously flagged by the above groups. I've already seen comments that effectively say "oh, you're only against this if you have something to hide," which is ironic, as that is the same argument advocates of the "stop and frisk" policy used. Given how that turned out, you'd think they would know better.

Apple is going to get torn apart over this. Tim Cook has completely lost his mind.
 
Everything starts small. Today's protection of children from sexual abuse could be tomorrow's political attack and targeted censorship. Even if I give Apple the full benefit of the doubt and assume everything they mention here is correct, the action itself is already alarming regardless of the cause. We have "attempted murder" as an offence in many countries, by the way, which focuses on actions, not the end result.

Again, this is an ongoing issue which will evolve as time goes on. The only thing that will not change is change itself.

Again, I AGREE with you about the bigger picture. I'm talking about people who are spinning what Apple is actually specifically doing in THIS case.
 
This is how Minority Report starts. Scanning your iPhone, which contains a good portion of your life, and then extrapolating your searches, images, apps, and locations to put together a profile of you to see if you might do something wrong in the future.
 
"Extremely high" isn't good enough, though, when taken in context with what they're doing. Apple is putting one foot into law enforcement by doing this. Their role should be protecting the privacy of the devices, not playing Chris Hansen (To Catch A Predator reference... ).

The way this should work is that law enforcement gets a warrant for Apple to run this hash scan on a user's device first, and then they implement it on a case-by-case basis.

The difference here (that so many people seem to keep forgetting or didn't read carefully) is that nothing (beyond the scan) happens unless the user uploads a number of these flagged photos to iCloud. Once you do that, the photos are no longer just on your personal device, but on Apple's computers. That changes things drastically - you've now uploaded illegal images to someone else's computers (servers).
 
This deserves a re-post:

[image attachment]

Things are different now.
(Not for me. If they really do this, no more iPhones for me after 12 years, simple as that.)

Actually, nothing's changed. This only affects things you upload to iCloud (i.e. NOT your iPhone). Yes, the scan happens on the device, but that info isn't passed along to Apple unless you upload the photos to their servers. And you'd be pretty stupid to do that if you have illegal images.
 
It's an awful place to be in for Apple.
I can understand the anxiety and fear of parents. Nobody wants their kid AirDropped a dick pic by a random stranger on the train.

But it's also a slippery slope on the path to mass surveillance.

It's a sad thing that Apple deems this necessary.
Then it needs to be 100% opt-in, with Apple having no way to enable it remotely.
 
Won't follow a legal federal request from multiple agencies to unlock a suspected mass murderer's phone due to "privacy and precedent" concerns, yet introduces on-device snooping that at best will expose private photos to a third party for "review."
Remember when people threw a fit about Alexa and Siri recordings being listened to by employees of those respective companies? How did Apple think this was going to go over any better than that?
Furthermore, this feature has been explained as a way to notify parents if someone texts their child a nude, but yesterday Apple explained that when these hash checks run they are only looking for previously identified illegal material that has been catalogued by federal agencies and victims' rights organizations. How would a selfie spontaneously snapped in a bathroom and then sent to my daughter match that description? For this feature to work, their phone has to be actively scanning for all nude content, not just images previously flagged by the above groups. I've already seen comments that effectively say "oh, you're only against this if you have something to hide," which is ironic, as that is the same argument advocates of the "stop and frisk" policy used. Given how that turned out, you'd think they would know better.

Apple is going to get torn apart over this. Tim Cook has completely lost his mind.
I don't know if you are just aloof,

but

this applies to iCloud data, which Apple already shares with the feds or local cops if they ask for it.
What Apple does NOT give them is end-to-end encrypted data, which is not in the cloud.

iCloud Photos are NOT end-to-end encrypted.

This is exactly why celebrity photos from iPhones have been hacked.

This is exactly why Apple can share iCloud data with cops.

This is exactly why Apple cannot scan your photos if you have iCloud Photos turned off.
 
What a foolish comment. Is that all you understood from this whole privacy concern? Do you think only sickos with kiddie porn are concerned?
If Apple can scan your photos and messages (for whatever reason), others can too. That means no privacy.

It's like leaving your house door open for the police to enter anytime they want. Guess what? Burglars can enter too!

How would "others" get into your phone? Would you be handing your phone and login info over to "others" to give them a hand?

Can you list the specific steps that would be required?
 
It compares your pics against a “certified“ CP catalogue.
It doesn’t try to figure out if your pics are actually CP on the spot.

The question should be more about how this “certified” catalogue is audited and stuff.
It's the human review process that bothers me the most. No clue who these people are, what their qualifications are, what exactly they are looking at, and so on. Since they are ultimately the last step in the process chain, they can have a profound effect on determining whether an image constitutes child porn. A single human mistake can unfold into a horrific, life-changing event for someone whose image is incorrectly identified as child porn.
 
Actually, nothing's changed. This only affects things you upload to iCloud (i.e. NOT your iPhone). Yes, the scan happens on the device, but that info isn't passed along to Apple unless you upload the photos to their servers. And you'd be pretty stupid to do that if you have illegal images.
The point is that it's automatic; it's been enabled silently for everyone who has iCloud enabled (which is most people; Apple prompts you to enable this when you set up a device).

I have no problem with Apple checking that you're not storing bad **** on their servers, but they should be getting your informed consent first. That means telling you exactly what they'll be looking for and who they'll be notifying if they find it (e.g. "we search for known pro-democracy images and notify the CCP when we find some").
 
It's the human review process that bothers me the most. No clue who these people are, what their qualifications are, what exactly they are looking at, and so on. Since they are ultimately the last step in the process chain, they can have a profound effect on determining whether an image constitutes child porn. A single human mistake can unfold into a horrific, life-changing event for someone whose image is incorrectly identified as child porn.

Human review will happen only if:
1) the hashes match, and
2) this happens multiple times (a threshold whose exact value we don't know).

What is the compound probability of multiple "unlucky" false-positive matches?
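For a rough sense of scale, here is a back-of-the-envelope sketch in Swift. It is purely illustrative: the per-image false-positive rate, the library size, and the threshold below are all made-up numbers, since Apple has only published the aggregate "less than one in one trillion per year" figure.

```swift
import Foundation

// All three numbers below are assumptions for illustration only;
// Apple has not published the per-image false-positive rate or the threshold.
let p = 1e-6          // assumed chance that one innocent photo false-matches a known hash
let n = 20_000        // assumed photos uploaded to iCloud Photos per year
let threshold = 30    // assumed number of matches required before human review

// Treat each photo as an independent Bernoulli trial and compute the
// binomial upper tail P(X >= threshold), working in log-space so the
// tiny intermediate terms don't underflow.
func logBinomialPMF(k: Int, n: Int, p: Double) -> Double {
    let logChoose = lgamma(Double(n) + 1) - lgamma(Double(k) + 1) - lgamma(Double(n - k) + 1)
    return logChoose + Double(k) * log(p) + Double(n - k) * log(1 - p)
}

// The tail terms shrink very fast, so summing a few dozen of them is enough.
var tail = 0.0
for k in threshold...min(n, threshold + 50) {
    tail += exp(logBinomialPMF(k: k, n: n, p: p))
}
print(String(format: "P(an innocent account crosses the threshold) ≈ %.3e", tail))
```

With these made-up inputs the tail probability comes out astronomically small, which is presumably the point of requiring multiple independent matches; the real answer depends entirely on the actual per-image rate and threshold, neither of which Apple has disclosed.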
 
Awful step by Apple. And the 'alert your parents if you view a nude' feature is awful overreach, but typical of Apple's strangely prudish approach.
It's the parents' prudish approach, since they have to turn it on.

The point is, this is indeed a slippery slope.

Since Google started doing this in 2008, I believe, and Facebook in 2011, isn't it a bit late to be worried now?
 
The difference here (that so many people seem to keep forgetting or didn't read carefully) is that nothing (beyond the scan) happens unless the user uploads a number of these flagged photos to iCloud. Once you do that, the photos are no longer just on your personal device, but on Apple's computers. That changes things drastically - you've now uploaded illegal images to someone else's computers (servers).

I'm not sure that's correct. If that was the case, then why scan on the device at all, and why not do all of the scanning in iCloud? Unless I'm misunderstanding you...
 
I have no problem with Apple checking that you're not storing bad **** on their servers, but they should be getting your informed consent first. That means telling you exactly what they'll be looking for and who they'll be notifying if they find it (e.g. "we search for known pro-democracy images and notify the CCP when we find some").

Um, isn't that exactly what they're doing? If the law requires consent here, it will be in an updated TOS the user is required to agree to when they install the iOS update where this scanning takes effect. And they HAVE told us what they're looking for. Some of you are sounding like Alex Jones level conspiracy theorists here. Again, I understand the concerns about the bigger picture and what this step might mean for the future, but it sounds like Apple is being quite transparent about this (I mean, isn't this news all over the place now and everyone's talking about it, including many threads on just this site? Not exactly something they're hiding, LOL!). https://www.apple.com/child-safety/
 
Given how willing Apple is to treat all of its customers as guilty until proven innocent of possessing illegal photos, it just seems absolutely perverse that they refuse to assist in cases of mass murder with judge-approved probable cause.

They do help. If the police have a search warrant, Apple will turn over most of what is in iCloud, which includes a lot if a user is using iCloud for backups.
 
One interesting feature that requires iCloud Photos to be active: facial recognition in HomeKit Secure Video.

And I suppose this kind of service will exist on the iGlasses too.

It’s like being in the “iCloud Photo AI“ club gives us some powers but also some responsibilities (like accepting that photos get a “CP or not” automated score).
 
An irreversible move. The sad thing at this point is that governments around the world now know Apple has developed tech for mass surveillance with pinpoint accuracy. Even if they cancel their plans now, they can be forced to use it covertly at some point. If it can be used to scan for CSAM, it can be used to scan for anything specific a government wants. Sigh.

It's not Apple who created CSAM scanning. This kind of functionality has been deployed for many years.
 
I'm not sure that's correct. If that was the case, then why scan on the device at all, and why not do all of the scanning in iCloud? Unless I'm misunderstanding you...

I think everyone just needs to read about it from the horse's mouth:

CSAM detection

Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.

To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.

Apple’s method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.

Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated.

This innovative new technology allows Apple to provide valuable and actionable information to NCMEC and law enforcement regarding the proliferation of known CSAM. And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account. Even in these cases, Apple only learns about images that match known CSAM.
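To make the flow above concrete, here is a deliberately oversimplified Swift sketch. It is NOT Apple's implementation: there is no NeuralHash, no private set intersection, and no threshold secret sharing here, and the `loadKnownHashDatabase` helper, the `SafetyVoucher` struct, and the threshold value are all invented for illustration. It only shows the general shape of the process described above: hash each outgoing photo on the device, compare it against a set of known hashes, and treat nothing as reviewable until an account crosses a match threshold.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the transformed hash database Apple says it ships on device.
func loadKnownHashDatabase() -> Set<String> {
    // In the real system this is an unreadable set of NCMEC-provided hashes; empty here.
    return []
}

let knownHashes = loadKnownHashDatabase()

// Placeholder "hash" of a photo about to be uploaded. The real system uses a
// perceptual hash so near-duplicates still match; SHA-256 is only used here
// to keep the sketch self-contained.
func placeholderHash(of photo: Data) -> String {
    SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
}

// Simplified voucher: in Apple's design the match result is encrypted and
// unreadable below the threshold; here it is just a plain boolean.
struct SafetyVoucher {
    let matched: Bool
}

// On-device step: produce a voucher for each photo headed to iCloud Photos.
func voucher(for photo: Data) -> SafetyVoucher {
    SafetyVoucher(matched: knownHashes.contains(placeholderHash(of: photo)))
}

// Server-side step, heavily simplified: nothing is surfaced for human review
// unless the account's matches cross the threshold.
let reviewThreshold = 30   // assumption; Apple has not published the real number
func accountNeedsHumanReview(_ vouchers: [SafetyVoucher]) -> Bool {
    vouchers.filter(\.matched).count >= reviewThreshold
}
```

The two simplifications above (a plain boolean and a plain hash set) are exactly what private set intersection and threshold secret sharing exist to avoid in the real design: neither the device nor Apple can read individual match results until the account crosses the threshold.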
 
For example - Tim Cook has been outspoken about his homosexuality, yet homosexuality used to be illegal in the United States. In many parts of the world, this is still the case. Should Apple be served with a subpoena to scan images and identify iPhone users which demonstrate that the laws of the country in which their devices are being used are being broken, will they comply with that subpoena? Would the evolution of LGBTQ rights in the United States ever have happened if there had been tools, such as these, facilitating a legal government crack-down?

Apple can already scan images that are in iCloud Photo Library and iCloud backups today. Apple does hand over the entirety of your iCloud content when served with a subpoena. Most of it is readable to Apple.
 