> Maybe you do, but your data doesn't.
How's that relevant? The issue is not where the data is physically stored. The issue is whether governments can potentially, maybe, perhaps collude with Apple to make this a precedent for abuse.
> Maybe you do, but your data doesn't.
Hmmm, I wonder which company has touted itself as a protector of privacy and has a LOT of billboards around saying that they are focusing on privacy...
> The issue is whether governments can potentially, maybe, perhaps collude with Apple to make this a precedent for abuse.

Dear lord... this keeps getting better and better...
> Over simplified, I think; a flagged image gets checked for a false positive. At this point there is a window for Apple employees to access your photo library for 'review'.

Completely false. Apple employees never get to look in your library. When you trigger enough positives to cross a threshold, a reviewer gets to see a visually derived image of only the offending pictures. This is not a flexible Apple policy or a server-side variable; it's a baked-in feature of the encryption that happens on your device.
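For anyone arguing past each other about the mechanism, here is a toy sketch of the threshold idea. It is only a sketch: Apple's published design uses private set intersection with threshold secret sharing, so the server learns nothing at all below the threshold, and the hash function, database entries, and threshold value below are placeholders, not real values.

```python
import hashlib

# Placeholder database and threshold; the real values are not public.
KNOWN_HASHES = {hashlib.sha256(b"known-image-1").hexdigest()}
THRESHOLD = 30  # number of matches required before any human review

def image_hash(image_bytes: bytes) -> str:
    # Stand-in only: the real system uses a perceptual hash (NeuralHash),
    # not a cryptographic one.
    return hashlib.sha256(image_bytes).hexdigest()

def reviewable_matches(library: list[bytes]) -> list[bytes]:
    matches = [img for img in library if image_hash(img) in KNOWN_HASHES]
    if len(matches) < THRESHOLD:
        return []      # below the threshold, nothing is revealed at all
    return matches     # only the matching images become visible to a reviewer
```

The point the sketch makes: nothing outside the matching set is ever reachable by a reviewer, and nothing at all is reachable below the threshold.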
> But one guy tries to blow up a plane with his shoes 20 years ago... and EVERYBODY has to take off their shoes at the airport forever...

I know. Security is a terrible thing, and removing shoes is an equally distressing experience. Personally I wear mine to bed, lest I have to bend over and untie them.
> The solution to remediate this would be to have NCMEC release the list of hashes for open audit and review; it's not like you can reconstruct the images from the hashes. If people find that non-CSAM images are being added to the database, it would be uncovered very quickly. The biggest issue with this system is that the hash list is presumed to only contain illegal images, yet is controlled by government-funded entities. Apple's response to this is that they check images first before reporting to law enforcement, and that response isn't good enough.

That's a splendid idea. Maybe not an open audit, but perhaps a trusted third party who does process and security review professionally?
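For what it's worth, the audit idea is easy to sketch. Assuming a published hash list and a corpus of known-benign images (both hypothetical here, with a cryptographic hash standing in for the perceptual one the real database uses), anyone could check for non-CSAM entries without ever reconstructing an image, since the hashes are one-way:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for the perceptual hash the real database uses.
    return hashlib.sha256(image_bytes).hexdigest()

def audit(published_hashes: set[str], benign_corpus: dict[str, bytes]) -> list[str]:
    """Return the names of known-benign images that wrongly appear in the list."""
    return [name for name, data in benign_corpus.items()
            if image_hash(data) in published_hashes]
```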
> I could have images of drugs and guns on my device, but it isn't illegal to own those images, and it would be impossible to say from those images alone whether I was doing anything illegal.

In my country, it's illegal to possess pornographic materials. The wording of the law includes "pornoactions," defined as "indecent acts," a category that ranges from bikinis to kissing. There was a case (I forget the details) about a couple whose private video was stolen and leaked. That couple was fined and jailed for longer than the people who spread the video.

You can get some idea of the implications if a black-box mass-scanning system from a private company is implemented in a country with these kinds of laws.
> Again, more I think/could/maybe/if/can.

That's a completely false comparison, and I hope you understand that. Every such action starts with a "may/could/if." If people stay silent and don't pay attention, it escalates to the point where it is too late for any action. Look at history; look at governments. Heck, look even now at COVID restrictions in some countries.
It's like saying, "If Apple let me put an RTX 3080 in my Mac Pro, they would dominate the desktop gaming market..."
> Mass surveillance for the sake of the greater good is very Chinese, and criticized by the same people applauding this very stupid idea…

So, you oppose your law enforcement? CCTV? Speed cameras? Didn't know the Chinese invented those...
> It is not a maybe. It is for sure implementing code that is capable of scanning files on the device.

We're all aware of that. What's your point?
> No more, no less. What files is irrelevant.

Actually, it is relevant. The device is scanning images only within iCloud Photos.
Portugal a functioning democracy? 🤣
Oh god. I’ll have to retract my earlier retraction now.
Apple: “The same set of hashes is stored in the operating system of every iPhone and iPad user”.
So they will be embedding these hash codes in the OS for sure. With many millions of known pictures, and a non-trivial hash code for each one, that's a big chunk of data, probably hundreds of megabytes, that will come along with your iOS 15 update. And it can only grow in size year on year.
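The back-of-envelope arithmetic is simple, though both inputs are assumptions: NeuralHash is reported to produce a 96-bit (12-byte) descriptor, the on-device entries are blinded, and the database size has not been published, so the entry count and per-entry size below are illustrative only.

```python
ENTRIES = 10_000_000      # assumed number of known-image hashes
BYTES_PER_ENTRY = 32      # assumed size of one blinded hash entry

total_mb = ENTRIES * BYTES_PER_ENTRY / 1_000_000
print(f"~{total_mb:.0f} MB")  # ~320 MB under these assumptions
```

Halve or double either assumption and the estimate moves between tens and hundreds of megabytes, which is why the claim above is hedged.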
> I agree, but should we just outright ban this implementation because it may be implemented in other countries? Shouldn't the pushback come if Apple tries to implement it in your country? Don't you think Apple will be aware of the pushback and outrage if they try to implement it in your country?

Who said this should be banned? Did I say that? At the same time, are you implying that such a system without checks and balances is A-okay?
Yeah, that terrible agenda of protecting innocent children. How dare they push that onto us adults.
> How is Apple invading your privacy? A computer is using a set of A.I. instructions to check if features of your photos match a separate database, all whilst being encrypted. There is no employee or middle-man "scanning" your photos visually and taking an interest in that restaurant you visited a while back.

What do you mean? It's like saying, "Police use a set of A.I. instructions to check if features of your face, while you are at home, match a separate terrorist database"... And of course they start with child material to justify it to the masses; it's the perfect Trojan horse.
> What do you mean? It's like saying, "Police use a set of A.I. instructions to check if features of your face, while you are at home, match a separate terrorist database"...

Ah, so all this only applies when you are outside your home?
> So, you oppose your law enforcement? CCTV? Speed cameras? Didn't know the Chinese invented those...

Judging by your comments, you obviously support all kinds of restrictions and invasions in the name of a false "greater good," so I do not think we can find common ground here. I do not support CCTV implementation all over the place, as it is very often used with malicious intent. Speed cameras don't bother me, as they take a photo only if you speed, though I also find them somewhat annoying. Law enforcement is a completely different level of discussion...
You are missing the point here. It's not about CSAM. It's about the fact that it can be used for anything, and it's on-device.
> We're all aware of that. What's your point?

Again, discussion is pointless. You refuse to acknowledge the basic long-term ramifications of implementing this code. You are allowed to see it your way.
> People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:
>
> Child Safety: Adobe's commitment to keeping children safe online and fighting the spread of child sexual abuse material. (www.adobe.com)
>
> That's just one example.

Then we should all be grateful this foray into law enforcement without warrants is being brought to light. They should all stop.
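The difference being argued over here is where the check runs. Services like the ones quoted scan on their own servers at upload time, whereas Apple's proposal moved the matching onto the device. A minimal sketch of that server-side pattern, with hypothetical function names and a cryptographic hash standing in for a perceptual one such as PhotoDNA, looks like this:

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # provider's copy of the known-image database

def flag_for_review(image_bytes: bytes) -> None:
    ...  # provider-side review queue (stub)

def store(image_bytes: bytes) -> None:
    ...  # normal upload path (stub)

def handle_upload(image_bytes: bytes) -> None:
    # The provider sees the plaintext image on its own servers and checks it
    # there; the user's device runs nothing.
    if hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES:
        flag_for_review(image_bytes)
    store(image_bytes)
```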
> That score is highly influenced by high abstention rates in elections. In terms of civil liberties, electoral process and pluralism, and functioning of government, we're classified as a full democracy, and well ahead of the US in all three dimensions.
>
> We are also under the General Data Protection Regulation, and we have some of the best standards of internet privacy in the world (ahead of the rest of the EU).
>
> The point is... no, my government is not going to turn full-on dictator and collude with Apple to access my photos. That's a tin-foil-hat-level prediction right there.

Six years ago Poland was also a functioning democracy (for better or worse). Look at it now... they use CCTV and Google data to accuse people of breaking COVID laws, after putting a total ban on abortion.