Do you know of any other phone manufacturer that does this on their hardware?

Android does similar things.

Every URL used by an app (including browsers) is analysed by the system, and if it's found to be potentially harmful, it is sent to Google.

Most Android phones also have a system service that will scan part of the file system and delete files found to be malicious.

But Google does most of its scanning on its servers. Almost every Android user outside China uses Google Photos and gets their images scanned. Google also reports through the same system.
 
What you are describing is not perceptual hashing. You seem to be describing some kind of machine learning.

Perceptual hashes for photos are designed not to change much when an image undergoes minor modifications such as compression, light cropping, colour correction, or changes to hue and brightness.

It's not good for finding pictures with a similar "feel", which is what you are describing.
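
To illustrate, here's a toy difference-hash (dHash) in Python - my own minimal sketch, not Apple's NeuralHash. A mild brightness change barely moves the hash, while an unrelated image lands about half the bits away:

```python
import numpy as np

def dhash(img: np.ndarray, size: int = 8) -> np.ndarray:
    # Block-average down to size x (size+1) cells, then emit one bit per
    # horizontally adjacent pair: is the right cell brighter than the left?
    h, w = img.shape
    ys = np.linspace(0, h, size + 1, dtype=int)
    xs = np.linspace(0, w, size + 2, dtype=int)
    cells = np.array([[img[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                       for j in range(size + 1)] for i in range(size)])
    return (cells[:, 1:] > cells[:, :-1]).ravel()   # 64 bits

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(0)
img = rng.random((256, 256))                  # stand-in greyscale image
brighter = np.clip(img * 1.2, 0, 1)           # minor edit: +20% brightness
unrelated = rng.random((256, 256))            # a completely different image

print(hamming(dhash(img), dhash(brighter)))   # small: only a few bits flip
print(hamming(dhash(img), dhash(unrelated)))  # large: roughly 32 of 64 bits
```

Note it is only robust to edits that preserve the coarse structure of the image - it has no notion of two photos sharing a "feel".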

Anyway, Apple is using hyperplane locality-sensitive hashing. AFAIK it doesn't require any dataset to learn from.
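
A minimal sketch of that family of algorithms (random-hyperplane LSH, SimHash-style - my assumption of the general construction, since only the broad strokes are public): each hash bit just records which side of a fixed random hyperplane a feature vector falls on, so nothing is learned from any dataset:

```python
import numpy as np

rng = np.random.default_rng(42)
DIM, BITS = 128, 64
planes = rng.standard_normal((BITS, DIM))    # fixed random hyperplanes

def lsh_bits(v: np.ndarray) -> np.ndarray:
    # One bit per hyperplane: which side does the vector fall on?
    return planes @ v > 0

v = rng.standard_normal(DIM)                 # stand-in image descriptor
near = v + 0.05 * rng.standard_normal(DIM)   # slightly perturbed version
far = rng.standard_normal(DIM)               # unrelated descriptor

print(np.count_nonzero(lsh_bits(v) != lsh_bits(near)))  # only a few bits flip
print(np.count_nonzero(lsh_bits(v) != lsh_bits(far)))   # about half flip
```

In Apple's described pipeline the input vector is a neural-network embedding of the image; the hashing step itself, as above, needs no training data.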

In fact, Apple never got access to the CSAM database, so Apple would have had a problem training its algorithms on it.
As far as I understand Apple has the perceptual hashes of those 200K known CSA-pictures, and those hashes are different from the traditional pixel-perfect hashes, so Apple's hashing algorithm will have seen the pictures. Maybe they just gave NCMEC a Mac with a hashing GUI, told them to point it at a directory with the 200K pics and press Start, and then hand over the resulting hash-db to Apple. I agree that it does not look like they did training on the pictures, but I think it would have been fairly easy to have NCMEC do the training.

Regarding the tolerance of the perceptual hashing, I think Apple's whitepaper leaves a lot of room for interpretation there. They do speak of semantically similar images at one point, which is worrying.
 
Don't like the idea of clock cycles and battery being used for purposes I didn't opt into, no matter how good the cause... And the fact that this tech can (and probably will) at some point be abused for other purposes should have made it a big no-go if Apple is serious about privacy.
If you agreed to the iCloud TOS, then you did opt in.
 
Wrong.

Fourth Amendment protections apply to searches conducted by private parties who act as “agents” of a government. How do you know if a private citizen or company acts as a government agent? The legal definitions and tests vary by court, but generally, a court will consider the degree of control that the government exercised over the private party’s search and whether the private party had an independent reason, unrelated to law enforcement, to conduct the search (such as a business justification).

This issue is arising increasingly in criminal cases in which online service providers turn over information that the government ultimately seeks to use as evidence of a crime. If a service provider is found to have conducted a warrantless search as a government agent, the criminal defendant may be able to prevent the court from considering not only the evidence that the service provider gave to the government, but any subsequent discoveries due to that initial evidence.

It is not a warrantless search when you have consented to the search. As soon as you clicked Agree on the terms and conditions, you consented to the search.

And you can argue that when dealing with child pornography, it is justifiable to search in order to preserve evidence.
 
This system will work if these people all have the same iconic pictures in the Photo Library on their phones. But it will not work if you have a bunch of general pictures from a protest.

If the Chinese government wants to catch these people, there are much more efficient technologies available from Apple TODAY.

The Chinese government could force Apple to do this much more efficiently today, and Apple wouldn't have to change ANYTHING on the iPhones.
... Or photograph a brochure on a wall containing one of the targeted pictures. Or receive such a targeted picture via iMessage. Object detection models are pretty good nowadays.
 
Are you ok with explosives-sniffing machines at airports just because a few people may bring explosives?
Are you ok with Facebook, Google, etc. routinely scanning their platforms for illegal stuff just because a few people will store illegal stuff?
No, I'm not OK with that mass surveillance - just as I'm not OK with the government or my landlord regularly searching people's homes without a warrant.

And there is a difference:
Scanning for explosives in airports is an effective preventative measure against terrorists blowing up a plane.
Scanning for known child sexual abuse material does not prevent children from being abused.
 
No, I'm not OK with that mass surveillance - just as I'm not OK with the government or my landlord regularly searching people's homes without a warrant.

And there is a difference:
Scanning for explosives in airports is an effective preventative measure against terrorists blowing up a plane.
Scanning for known child sexual abuse material does not prevent children from being abused.

Who is going to search people's homes regularly without a warrant? Talk about blowing things out of proportion.

You are using iOS software; you agreed to the terms and conditions; you are consenting to the search. If you have a problem with that, then don't use an iPhone or don't update.
 
Because of a fully anonymous and local (local if negative) check against a CP database?

They will unsubscribe if they don’t understand how this works or own CP. Or out of principle because everything is a slippery slope.
I'll unsubscribe from every Apple service. Instead of Apple Music I'll use Spotify.
I don't have CP. Apple lied to us.
The Hated One and Louis Rossmann are right.
 
Scanning for known child sexual abuse material does not prevent children from being abused.

That’s not how that works, otherwise owning CP would be legal because “well, it’s already happened anyway, it’s not like me looking at these pics will prevent those children from being abused, it’s already happened anyway”.

I don't advise you to use that as a defense in court.
 
So it is OK with Tim if millions of users each have one photo of child sex abuse. Odd ethical standard.
That's what's called making a realistic trade-off. If you have one pic, it can be a fluke, an accident, an error. If you have 40,000, there's no discussion. Apple wants to ban those users, rightfully so.
 
I applaud Apple's effort to curb child pornography. If you are this worried about on-device search, what are you hiding?
While I applaud Apple’s efforts, I don’t think it’s right to say if you have nothing to hide, you shouldn’t be worried about privacy.

As far as the details go, it’s obvious Apple has implemented this system with privacy in mind.
 
Unfortunately this is not the way it works. What Apple is doing is actually neural matching (they call it neural hashing because hashed data is matched, which can be done). They have used approx. 200,000 images to teach the AI. All the images are scanned, and if they fall within a threshold, a report is made. Effectively they are not looking for exact matches but for similarities.

That's not how NeuralHash works. It's not AI.

They are in fact looking for exact matches and derivatives of that exact photo.

If you take two photos of the same sex act two seconds apart, the photos will be so different that they shouldn't match.
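
To make "exact matches and derivatives" concrete, here's a hedged sketch of threshold matching against a database of known hashes (the hash values are made up). Apple describes its lookup as exact - the hash itself is meant to absorb benign edits - but the matching criterion is the same near-duplicate idea:

```python
from typing import Iterable

def hamming(a: int, b: int) -> int:
    return (a ^ b).bit_count()   # Python 3.10+; else bin(a ^ b).count("1")

def is_known(h: int, known: Iterable[int], radius: int = 4) -> bool:
    # Match only within a small Hamming radius: derivatives of the SAME
    # photo (recompressed, resized, recoloured) land a few bits away;
    # a different photo of a similar scene does not.
    return any(hamming(h, k) <= radius for k in known)

known_db = {0x8F3AC2D10B775E19}                # hypothetical known-image hash
print(is_known(0x8F3AC2D10B775E1B, known_db))  # 1 bit off -> True
print(is_known(0x1334D5F89ABCDEF0, known_db))  # unrelated -> False
```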
 
Who is going to search people's homes regularly without a warrant? Talk about blowing things out of proportion.
Why not? Cloud storage providers such as Dropbox, Apple iCloud and Google Drive provide personal storage space for my own personal things. That's the digital equivalent of renting space. Storing my photos with them isn't the same as publishing them to the whole world on Instagram.

And searches without cause are just that.
You are using iOS software; you agreed to the terms and conditions; you are consenting to the search. If you have a problem with that, then don't use an iPhone or don't update.
Apple forces me to update my iOS software as part of fixing the defects and flaws in their software.
Also, if I have my iPhone exchanged/repaired under warranty, they'll routinely return it to me with newer software.

They're not forcing me to enable iCloud Photos (though they do force me to enable iCloud to use many of their advertised features and functionality). It's just a very small step down the slippery slope to enable the surveillance system for everyone, regardless of whether they use iCloud Photos or not.

I'm not even sure the on-device scanning in iOS 15 will be limited to iCloud Photos users, as Apple doesn't explicitly say so.
That’s not how that works, otherwise owning CP would be legal because “well, it’s already happened anyway, it’s not like me looking at these pics will prevent those children from being abused, it’s already happened anyway”.
Deadly violence against people is illegal (with very few exceptions).
Merely possessing or looking at visual depictions of illegal violent acts is not.

The Hollywood movie industry even makes tons of money by professionally staging and recording graphic violence and distributing such material.

In this case, I don't think the law needs to be changed. I'm all for banning child porn and possession of such being illegal and prosecutable. The existence of visual depictions of illegal acts does not justify instituting mass surveillance to me. Especially when such systems are potentially abusive and dangerous.
 
Why not? Cloud storage providers such as Dropbox, Apple iCloud and Google Drive provide personal storage space for my own personal things. That's the digital equivalent of renting space. Storing my photos with them isn't the same as publishing them to the whole world on Instagram.

And searches without cause are just that.

Apple forces me to update my iOS software as part of fixing the defects and flaws in their software.
Also, if I have my iPhone exchanged/repaired under warranty, they'll routinely return it to me with newer software.

They're not forcing me to enable iCloud Photos (though they do force me to enable iCloud to use many of their advertised features and functionality). It's just a very small step down the slippery slope to enable the surveillance system for everyone, regardless of whether they use iCloud Photos or not.

I'm not even sure the on-device scanning in iOS 15 will be limited to iCloud Photos users, as Apple doesn't explicitly say so.

Deadly violence against people is illegal (with very few exceptions).
Merely possessing or looking at visual depictions of illegal violent acts is not.

The Hollywood movie industry even makes tons of money by professionally staging and recording graphic violence and distributing such material.

In this case, I don't think the law needs to be changed. I'm all for banning child porn and possession of such being illegal and prosecutable. The existence of visual depictions of illegal acts does not justify instituting mass surveillance to me. Especially when such systems are potentially abusive and dangerous.
Being in possession of child porn is in fact illegal in the U.S.
 
As far as I understand Apple has the perceptual hashes of those 200K known CSA-pictures, and those hashes are different from the traditional pixel-perfect hashes, so Apple's hashing algorithm will have seen the pictures. Maybe they just gave NCMEC a Mac with a hashing GUI, told them to point it at a directory with the 200K pics and press Start, and then hand over the resulting hash-db to Apple. I agree that it does not look like they did training on the pictures, but I think it would have been fairly easy to have NCMEC do the training.

Regarding the tolerance of the perceptual hashing, I think Apple's whitepaper leaves a lot of room for interpretation there. They do speak of semantically similar images at one point, which is worrying.

That's probably how the hashes were generated: Apple gave NCMEC software to generate the hashes.
Also, hyperplane LSH is not something Apple invented; they are relying on other people's work. The general way such an algorithm works is known.

The algorithms used for photos are of course optimised for photos as a media type and won't work for audio, but you don't need CSAM material to tweak them or get good results. You might get slightly better results by testing on the actual photos, but you really just need a variety of photos, preferably with some people in them.
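
A hedged sketch of that evaluation loop, with a toy average-hash standing in for the real algorithm - any innocuous test images would do:

```python
import io
import numpy as np
from PIL import Image, ImageEnhance

def ahash(img: Image.Image, size: int = 8) -> np.ndarray:
    # Toy average hash: one bit per cell of an 8x8 downscale.
    g = np.asarray(img.convert("L").resize((size, size)), dtype=float)
    return (g > g.mean()).ravel()

def jpeg_roundtrip(img: Image.Image, quality: int) -> Image.Image:
    buf = io.BytesIO()
    img.convert("RGB").save(buf, "JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf)

# A synthetic test image keeps the example self-contained; in practice
# you would loop over a folder of ordinary photos.
x = np.linspace(0, 1, 256)
pattern = (np.outer(np.sin(7 * x), np.cos(11 * x)) * 127 + 128).astype("uint8")
test = Image.fromarray(pattern)

base = ahash(test)
for name, variant in [
    ("jpeg q50 ", jpeg_roundtrip(test, 50)),
    ("half size", test.resize((128, 128))),
    ("brighter ", ImageEnhance.Brightness(test.convert("RGB")).enhance(1.3)),
]:
    print(name, int(np.count_nonzero(base != ahash(variant))), "bits differ")
# All three edits should cost almost no bits; unrelated photos average ~32.
```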

If Apple wanted to catch all kinds of child pornography, there are other algorithms that would be much more effective. Which algorithms? The ones they have already implemented in the Photos app for several years.

And those algorithms would be much better at finding people who wear MAGA hats, have photos of Confederate flags, handguns and other guns, photos taken inside Congress, etc.
 
Why not? Cloud storage providers such as Dropbox, Apple iCloud and Google Drive provide personal storage space for my own personal things. That's the digital equivalent of renting space. Storing my photos with them isn't the same as publishing them to the whole world on Instagram.

And searches without cause are just that.

As soon as you upload your files and photos to a cloud storage service, they are no longer private. It is not the digital equivalent of renting space.

When you upload files to cloud storage, you are storing your files on a third party's server. When a third party comes into play, it is no longer private. It is like a private conversation between two people: once a third party hears it, it is no longer private.

It is well defined by law.

Apple forces me to update my iOS software as part of fixing the defects and flaws in their software.
Also, if I have my iPhone exchanged/repaired under warranty, they'll routinely return it to me with newer software.

They're not forcing me to enable iCloud Photos (though they do force me to enable iCloud to use many of their advertised features and functionality). It's just a very small step down the slippery slope to enable the surveillance system for everyone, regardless of whether they use iCloud Photos or not.

I'm not even sure the on-device scanning in iOS 15 will be limited to iCloud Photos users, as Apple doesn't explicitly say so.

Apple does not force you to update to iOS 15. They will provide software updates for iOS 14 for a while. It is up to you to decide what to do.

Talking about blowing things out of proportion: we are far from a surveillance state.
 
Deadly violence against people is illegal (with very few exceptions).
Merely possessing or looking at visual depictions of illegal violent acts is not.

The Hollywood movie industry even makes tons of money by professionally staging and recording graphic violence and distributing such material.

In this case, I don't think the law needs to be changed. I'm all for banning child porn and possession of such being illegal and prosecutable. The existence of visual depictions of illegal acts does not justify instituting mass surveillance to me. Especially when such systems are potentially abusive and dangerous.

Whatever, it is dumb to say “punishing possession does nothing to prevent the actual child abuse”. It's false, and that's not what lawmakers think in most places. Of course possession incentivises the production of more material of that genre. And the correct example you should have used is snuff films, not staged violence.

Now you're moving the goalposts by saying “I get that punishing possession would help those kids, but I don't feel it's worth it ENOUGH to implement this kind of mass surveillance”. That's your opinion. Maybe you feel the risk of being blown up in a plane is more immediate for you in particular than the risk of being a victim of child abuse.
 
However you slice it, it's matching against a limited, known database, and it's said to make an error every 1176 years (I converted Apple's one-in-a-trillion claim into years in a previous post). It's not “scanning” to figure out on the spot whether an image constitutes a completely new offence.
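
Back-of-envelope version of that conversion (the account count here is my own assumption, since the earlier post's inputs aren't shown):

```python
# Assumption: ~850 million active iCloud accounts (illustrative figure,
# not from Apple). Apple's stated bound: one-in-a-trillion chance per
# year of incorrectly flagging any given account.
accounts = 850_000_000
p_false_flag_per_account_per_year = 1e-12

flags_per_year = accounts * p_false_flag_per_account_per_year
print(1 / flags_per_year)   # ~1176 years between errors across all users
```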

That’s sniffing blindfolded, not looking at our pics.

I still don't understand why they can't just do this exact same process in iCloud instead of on my phone...
 