Hmmm, I wonder which company has touted itself as a protector of privacy and has a LOT of billboards around saying that they are focusing on privacy...

No contradiction here.
Being for privacy doesn’t mean becoming literally Silk Road or Megaupload.
 
Oversimplified, I think; a flagged image gets checked for a false positive. At this point there is a window for Apple employees to access your photo library for 'review'.
Completely false. Apple employees never get to look in your library. When you trigger enough positives to cross a threshold, a reviewer gets to see a visually derived image of only the offending pictures. This is not a flexible Apple policy or a server-side variable, it's a baked-in feature of the encryption that happens on your device.
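The threshold idea described above can be illustrated with a toy Shamir-style secret-sharing scheme: each flagged photo contributes one share, and the decryption secret is only recoverable once the number of shares reaches the threshold. This is only a sketch of the threshold concept; Apple's actual protocol layers this under private set intersection, and the parameters here are made up.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret

def split_secret(secret, k, n):
    """Split `secret` into n shares; any k of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        # Evaluate the random degree-(k-1) polynomial at x
        y = sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at x=0 recovers the secret (needs k shares)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

secret = 123456789
shares = split_secret(secret, k=3, n=10)  # one share per flagged photo
assert reconstruct(shares[:3]) == secret  # threshold reached: decryptable
assert reconstruct(shares[:2]) != secret  # below threshold: unrecoverable
```

Below the threshold, the server holds shares that are information-theoretically useless, which is what makes the "baked into the encryption" claim more than a policy promise.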
 
The solution to remediate this would be to have NCMEC release the list of hashes for open audit and review; it's not like you can reconstruct the images from the hashes. If people found that non-CSAM images were being added to the database, it would be uncovered very quickly. The biggest issue with this system is that the hash list is presumed to contain only illegal images, yet is controlled by government-funded entities. Apple's response to this is that they check images first before reporting to law enforcement, and that response isn't good enough.
That's a splendid idea; maybe not an open audit, but perhaps a trusted third party that does process and security reviews professionally?

Also, a third party would at a minimum need
1 - access to review the source code of the implementation
2 - disclosure of Apple's review process once the positives threshold has been exceeded, with a full audit trail of their staff's reviews and independent, unannounced on-site spot checks of process compliance
3 - review of results and outcomes from scans compared to analytical predictions (summaries should suffice), and
4 - full access to test systems to independently verify matches, probe for false-positive rates, hunt for security flaws, and do an end-to-end check of the implemented solution against the source code and formal specification.

1-4 would have to be a continuous commitment from Apple; then maybe we could regain some trust in their commitment to privacy?
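Point 4's probe for false-positive rates could in principle be run empirically: feed the matcher random images and count spurious hits. A minimal sketch, assuming a hypothetical Hamming-distance matcher over 64-bit perceptual hashes (the `matches` function, hash width, and thresholds are all illustrative, not Apple's actual NeuralHash parameters):

```python
import random

def matches(image_hash, database, max_distance=2):
    """Hypothetical black-box matcher: flag if the 64-bit hash is within
    a small Hamming distance of any database entry."""
    return any(bin(image_hash ^ h).count('1') <= max_distance
               for h in database)

random.seed(0)
database = [random.getrandbits(64) for _ in range(500)]  # stand-in hash list

# Monte Carlo probe: how often does a random, unrelated image match?
trials = 5_000
false_positives = sum(
    matches(random.getrandbits(64), database) for _ in range(trials)
)
rate = false_positives / trials
print(f"empirical false-positive rate: {rate}")
```

With random 64-bit hashes the expected rate is astronomically small, which is exactly the kind of claim an auditor would want to confirm against the real implementation rather than take on faith.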
 
I could have images of drugs and guns on my device, but it isn't illegal to own those images, and it would be impossible to say whether, because of those images, I was doing anything illegal.
In my country, it's illegal to have pornographic materials. And the wording of the law includes "pornoactions," which was defined as "indecent acts" and ranges from bikinis to kissing. There was a case about a couple (I forget the details) who had a private video that was stolen and leaked. That couple was fined and jailed for longer than the people spreading the video itself.

You can kind of get an idea of the implications if a black-box mass-scanning system from a private company is implemented in a country with these kinds of laws.
 
Oh god. I’ll have to retract my earlier retraction now.

Apple: “The same set of hashes is stored in the operating system of every iPhone and iPad user”.

So they will be embedding these hash codes in the OS for sure. With many millions of known pictures, and a non-trivial hash code for each one, that's a big chunk of data, probably hundreds of megabytes, that will come along with your iOS 15 update. And it can only grow in size year on year.
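That "hundreds of megabytes" guess is easy to sanity-check with back-of-envelope arithmetic (the hash count and width below are assumptions, not Apple's published figures):

```python
# Rough size of an on-device hash database (illustrative numbers).
num_hashes = 10_000_000      # assumed: "many millions" of known images
bytes_per_hash = 32          # assumed: a 256-bit hash per entry
size_mb = num_hashes * bytes_per_hash / 1_000_000
print(f"{size_mb:.0f} MB")   # 320 MB under these assumptions
```

Whether this lands at tens or hundreds of megabytes depends entirely on the hash width, the number of entries, and how the list is compressed.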
 
In my country, it's illegal to have pornographic materials. And the wording of the law includes "pornoactions," which was defined as "indecent acts" and ranges from bikinis to kissing. There was a case about a couple (I forget the details) who had a private video that was stolen and leaked. That couple was fined and jailed for longer than the people spreading the video itself.

You can kind of get an idea of the implications if a black-box mass-scanning system from a private company is implemented in a country with these kinds of laws.

I agree, but should we just outright ban this implementation because it may be implemented in other countries? Shouldn't the pushback come if Apple tries to implement it in your country? Don't you think Apple will be aware of the pushback and outrage if they try to implement it in your country?
 
Again, more I think/could/maybe/if/can

It's like saying if Apple let me put an RTX 3080 in my Mac Pro, they would dominate the desktop gaming market...
That's a completely false comparison, and I hope you understand that. Every action starts with may/could/if, and if people are silent and do not pay attention, it escalates to the point where it is too late for any action. Look at history, look at governments. Heck, look even now at COVID restrictions in some countries.
It is not a maybe. Apple is, for sure, implementing code that is capable of scanning files on the device. No more, no less. Which files is irrelevant.
 
Portugal a functioning democracy? 🤣

That score is heavily influenced by high abstention rates in elections. In terms of civil liberties, electoral process and pluralism, and functioning of government, we're classified as a full democracy, well ahead of the US in all three dimensions.

We are also under the General Data Protection Regulation, and we have some of the best standards of internet privacy in the world (ahead of most of the EU).

The point is... no, my government is not going to turn full-on dictator and collude with Apple to access my photos. That's a tin-foil-hat-level prediction right there.
 
Oh god. I’ll have to retract my earlier retraction now.

Apple: “The same set of hashes is stored in the operating system of every iPhone and iPad user”.

So they will be embedding these hash codes in the OS for sure. With many millions of known pictures, and a non-trivial hash code for each one, that's a big chunk of data, probably hundreds of megabytes, that will come along with your iOS 15 update. And it can only grow in size year on year.

My question is about the images being used for testing: will Apple use the original images?

My iPhone stores optimised versions of photos, not the originals. Downloading the original photos to scan for CSAM is going to require hundreds of GB of data.

And that would be repeated every time the CSAM database is updated.
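The "hundreds of GB" figure is plausible under simple assumptions; the library size and average file size below are made up purely for illustration:

```python
# Illustrative only: how "hundreds of GB" could arise (assumed numbers).
photos = 50_000              # assumed: photos in an iCloud library
avg_original_mb = 4          # assumed: average original photo size
total_gb = photos * avg_original_mb / 1000
print(f"{total_gb:.0f} GB")  # 200 GB under these assumptions
```

Of course, the real cost depends on whether scanning happens against originals at all, or against hashes computed when the photo is first uploaded.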
 
I agree, but should we just outright ban this implementation because it may be implemented in other countries? Shouldn't the pushback come if Apple tries to implement it in your country? Don't you think Apple will be aware of the pushback and outrage if they try to implement it in your country?
Who said this should be banned? Did I say that? At the same time, are you implying that such a system without checks and balances is okay?

Like I said in another comment, the balancing act on things like this usually involves transparency for public audits and an appeal process. But I don't see that here. It's just a black box.
 
Yeah, that terrible agenda of protecting innocent children. How dare they push that onto us adults.
How is Apple invading your privacy? A computer is using a set of AI instructions to check whether features of your photos match a separate database, all while encrypted. There is no employee or middleman "scanning" your photos visually and taking an interest in that restaurant you visited a while back.
What do you mean? It's like "Police use a set of AI instructions to check whether features of your face, while you are at home, match a separate terrorist database"... And of course they start with child material to justify it to the masses; it's the perfect Trojan horse.
 
Anyone who believes Apple will stick to their script of only scanning what they said they will scan needs to have 'FOOL' tattooed on their forehead, because it is a well-known fact that tech companies lie not only to consumers but to governments. Google, Facebook, Apple, Microsoft, and Amazon have all been caught lying at one time or another, sometimes on numerous occasions. It's in a company's nature to tell the consumer they are doing one thing when in fact they are doing the complete opposite.

A few months or even years from now, iCloud users will start to complain that their stored pictures have been erased, and Apple will deny it's their fault. Then some security researchers will get involved, and they will report that Apple's CSAM scanning has quietly been scanning everyone's iCloud account and accidentally erasing users' pictures due to a glitch in the scanning process. Then people will say, 'Hold on, Apple, you said you weren't going to do this type of invasive scanning on users' iCloud accounts.' Apple will then issue a PR release saying, 'Sorry, we will update our procedures so something like this does not happen again.'

It's going to happen; you just know it will.
 
What do you mean? It's like "Police use a set of AI instructions to check whether features of your face, while you are at home, match a separate terrorist database"
Ah, so all this only applies when you are outside your home?

Also, are you unaware that CCTV cameras are already capable of matching faces against known criminal databases? That it is literally one of the roles of a police officer to prevent crime, and that they could do so by many means? Do you fear their body cams being used to record your actions?
 
So, you oppose your law enforcement? CCTV? Speed cameras? Didn't know the Chinese invented those...
Judging by your comments, you obviously support all kinds of restrictions and invasions in the name of a false "greater good". I do not think we can find common ground here. I do not support CCTV implementation all over the place, as it is very often used with malicious intent. Speed cameras don't bother me, as they take a photo only if you speed, though I also find them somewhat annoying. Law enforcement is a completely different level of discussion...
 
You are missing the point here. It's not about CSAM. It's about the fact that it can be used for anything, and it's on-device.

I'm not missing anything… read the quote… complete idiots are posting concerns about their innocent baby pics being viewed by an Apple employee.

There is no greater chance of that happening now than there was the moment iCloud storage was created.

Saying anything other than what Apple has stated is and isn’t possible with this tech is nothing more than conjecture and fear mongering at this point.

They already track you in fifty different ways with similar privacy gates in place, and I don't see a single person on here complaining about those or when they were introduced.

Apple literally introduced and touted a feature where they scan your photos to tell you what type of plant or dog is in the picture and no one said jack squat about privacy being invaded.

They use much more advanced tech to scan against a predetermined database of images to help stop the flow of child pornography in the world, and people are up in arms about it.

EDIT: Fixed multiple spelling errors
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
Then we should all be grateful this foray into law enforcement without warrants is being brought to light. They should all stop.

This is being pushed by organizations with a noble cause, so can we get some data on how often they are catching people using this? What is the trade-off in convictions per year?
 
That score is heavily influenced by high abstention rates in elections. In terms of civil liberties, electoral process and pluralism, and functioning of government, we're classified as a full democracy, well ahead of the US in all three dimensions.

We are also under the General Data Protection Regulation, and we have some of the best standards of internet privacy in the world (ahead of most of the EU).

The point is... no, my government is not going to turn full-on dictator and collude with Apple to access my photos. That's a tin-foil-hat-level prediction right there.
Six years ago, Poland was also a functioning democracy (for better or worse). Look at it now... They use CCTV and Google data to accuse people of breaking COVID laws, after they put a total ban on abortion.
 