They do. But they are being scanned on the server, not on the device. That's the pivotal factor here.

The on-device stuff stays on-device (blurring photos, warnings about potentially sensitive material); the part that could be reported to authorities is server-side.
 
Give everyone keys to your house? You get that this is an opinion-less, memory-less, mindless bot checking for child abuse material, right? That’s nothing like giving a neighbor a key, that’s like taking a lie detector test where the only question is “do you traffic children?”
Apple is scanning items ON PHONE. ON PRIVATE PROPERTY. This is equivalent to allowing ANYONE to walk in to your house at ANY TIME to look through all your files and things to make sure you are not doing ANYTHING illegal. Warrantless searches.
 
I’m saying I don’t fear the AstraZeneca vaccine because of its rare adverse events, and in the same way I don’t fear Apple’s CSAM matching system: the one-in-a-trillion ballpark quote sounds perfectly plausible once you’re compounding multiple rare events.
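The compounding argument can be made concrete with a quick back-of-the-envelope calculation. The numbers below are illustrative placeholders, not Apple's published figures; the point is only that multiplying several modestly rare events quickly reaches trillion-scale odds:

```python
# Illustrative only: both numbers below are made up for the sake of
# the arithmetic, not Apple's actual published parameters.
per_image_fp = 1e-3   # chance a random photo falsely matches a known hash
threshold = 4         # independent matches required before any human review

# Assuming independent collisions, the chance an innocent account
# crosses the threshold is roughly the product of the individual rates.
account_fp = per_image_fp ** threshold   # about 1e-12, one in a trillion
print(account_fp)
```

Requiring several independent matches before anything is flagged is exactly the kind of compounding that makes a per-account rate far smaller than the per-image rate.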

In the past we had comparisons to cars... now it's to vaccines.... :oops:
 
Apple has been relentless in its stance on protecting user privacy, and I don't see this as any indication of the contrary. There are many reasons why some here might want to take a step back, breathe, and reconsider their reactions:

First, I'm quite confident that there are countless "features" that Apple does not reveal to the public. This should go without saying, considering the notoriety Apple has for secrecy. The fact that they chose to reveal this is, in itself, quite telling. Apple does not want to tarnish its pro-privacy image in any way; consider that for a moment while trying to think through what these two distinct features actually do.

Second, the peer-reviewed papers describing some of the cryptographic techniques being used are available for your perusal. The implementation described, if we are to trust that they are describing their actual techniques, in no way diminishes user privacy. The only caveat would be the iMessage alert to parents of children who have parental controls set on their devices, but the politics of surveilling your own children are outside the scope of the general argument this thread has belabored.

The second feature, where your iCloud Photos library is "scanned," does not diminish privacy either. First of all, an actual "scan" of the image does not take place; the image is simply run through a neural network that outputs a hash. For those of you unfamiliar, this trained neural network is essentially a function f(x) = h: it takes an input (x) and outputs a unique, unintelligible hash as a string of integers. It is the hash that is compared against hashes of known child abuse content.
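As a minimal sketch of that comparison, with one big caveat: Apple's NeuralHash is a proprietary perceptual neural network, so a cryptographic hash stands in for f(x) = h below, and all names and sample bytes are hypothetical:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for f(x) = h. The real system uses a perceptual neural
    # hash; SHA-256 here is only to show the compare-by-hash mechanic.
    return hashlib.sha256(image_bytes).hexdigest()

# Database of hashes of known abuse images (placeholder values).
known_hashes = {image_hash(b"known-bad-image")}

def matches_known(image_bytes: bytes) -> bool:
    # Only the hash is compared; the image content itself is never read
    # as an image, and an unmatched hash reveals nothing about the photo.
    return image_hash(image_bytes) in known_hashes

print(matches_known(b"family-vacation.jpg"))  # False
print(matches_known(b"known-bad-image"))      # True
```

The design point is that the comparison operates on opaque fingerprints, so nothing is learned about photos that don't match the database.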

The only real room for complaint here is if Apple were to update iOS to store hashes other than hashes of child abuse content; this would not be some server-side update that a government or some bad actor could manipulate. Apple would have to intentionally change the database of hashes in iOS to include other hashes, and even then, it would still only work against images you upload to iCloud that exactly match those hashes. In short, this is not blanket surveillance. If an analogy is needed, this is more akin to a customs agent looking through your items as you cross into a different country, but even less than that: a customs agent can open your suitcase and peek around, whereas this simply looks at a hash of a single item you are purposefully uploading to iCloud, not at everything on your phone, and not even at the files themselves.

Simple: don't upload child porn to iCloud and you won't be tagged for human review. Stop worrying that this enables some backdoor for governments to spy on you. It doesn't.
 
I’m out.

The same people again and again not acknowledging that for all intents and purposes Apple’s on-device_but_actually_cryptographically_frozen_until_it’s_uploaded scan is EQUIVALENT to server-side scan privacy-wise will soon be forgotten like tears in the rain.

What a sad example of sticking with an initial bad take.
 
Interesting thing to think about: if a child sends a nude of themselves, they would be guilty of distributing CP...

How long until this is expanded to proactively scan for questionable material?
As far as I'm concerned, that IS questionable material. Why in the world would a child send a nude of themselves? Something is wrong with the parents if that's going on.
 
(...) This tech is like that superpower, but using ML instead of fiction. (...) IE, a soulless robot does the checking, which has no memory, no prejudice, not even the slightest bit of interest or understanding of what is in your photos except that it doesn’t match a predetermined data set of child abuse material.

If a human came to check your home regularly that would be one thing, but having an autonomous system simply verify that you are not a child abuser is another thing entirely and you would do well to note the difference.
You should know by now that robots do have the prejudices of their programmers (and, in turn, of their bosses and their clients).
First you legitimize the control of your pictures for one purpose (detecting child abuse), and then they legitimize many other things for you (because they know better). This is called enlightened despotism. There is a very thin line here. You are partly right.

Hey, what about an algorithm on a smartwatch that detects when a woman is beaten by her husband, for instance? Do you think there will be enough police officers, judges, and prisons to respond to the automatically generated calls? What about an algorithm on the HomePod detecting aggressive screaming in a home? Do you think any country has the resources to handle this?

Whatever is done to punish child abuse is very welcome, but let's be cautious: we citizens should have the right to decide whether we want our pictures analyzed by robots or not. It must at least be an optional feature. If you open a YouTube account, you are under Google's rules. But if I purchase a $600+ pocket device, I need to feel that I'm controlling the device, not that the device is controlling me.
Not even Apple is willing to give you the right to choose over your digital sovereignty. So they start with the child-abuse rationale, which no one will dare disapprove of (I insist: whatever is done to pursue and punish child abuse is more than welcome, of course). Then they move on to punishing people who question vaccine mandates (nothing against vaccination on my side, but everything against not being allowed to question it), or to complicating the lives, even if only their lives on the internet, of people who object to Apple, Amazon, Google, Facebook, etc. not paying the same share of taxes as everyone else.
So, hey, just let them exercise their good and fantastic superhero will today, and guess who will be deciding for you tomorrow.
 
I’m out.

The same people again and again not acknowledging that for all intents and purposes Apple’s on-device_but_actually_cryptographically_frozen_until_it’s_uploaded scan is EQUIVALENT to server-side scan privacy-wise will soon be forgotten like tears in the rain.

What a sad example of sticking with an initial bad take.
Except... it's not? I am treated like a criminal now. It's no different from a warrantless search of my house in the HOPE that the police find something in SOME house SOMEWHERE.

And let's be real: it is SUPER easy to avoid if you are an actual criminal. It will not be enough, and Apple WILL change it so that even if you turn off iCloud Photos, it will still scan. Apple will get pressured to do so.
 
Apple is scanning items ON PHONE. ON PRIVATE PROPERTY. This is equivalent to allowing ANYONE to walk in to your house at ANY TIME to look through all your files and things to make sure you are not doing ANYTHING illegal. Warrantless searches.
Apple is actually not scanning anything. In the one case, your own phone generates a hash of individual items, and those hashes are then compared (by your phone) against hashes of known child abuse content when you intentionally try to upload a photo to iCloud Photos. In the other case, a child's phone that is under parental controls generates a likelihood score of a photo containing private parts when it is sent or received in iMessage. The child would be warned about the nature of the photo, and the parents would be notified that their child is doing something nefarious. Again, Apple doesn't scan anything, and it does not have free rein to look through all of your files.
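Those two flows can be sketched side by side. Everything below is hypothetical naming on my part (the real system reportedly uses a proprietary perceptual hash plus private set intersection, so the device never directly learns a match result); the sketch only shows the gating conditions the post describes:

```python
import hashlib

def hash_image(b: bytes) -> str:
    # Stand-in for the on-device perceptual hash (NeuralHash is proprietary).
    return hashlib.sha256(b).hexdigest()

def nudity_likelihood(b: bytes) -> float:
    # Stand-in for the on-device classifier; always scores "safe" here.
    return 0.0

def on_icloud_upload(photo: bytes, known_hashes: set, icloud_enabled: bool):
    """CSAM matching: runs only when a photo is being uploaded to iCloud Photos."""
    if not icloud_enabled:
        return None                         # nothing is hashed or compared
    return hash_image(photo) in known_hashes

def on_imessage_photo(photo: bytes, parental_controls_on: bool):
    """Communication safety: local score only; warn child/notify parents if high."""
    if not parental_controls_on:
        return None                         # feature entirely inactive
    return nudity_likelihood(photo) > 0.9   # 0.9 is an invented threshold
```

Note how each path is gated: no iCloud upload means no hash comparison, and no parental controls means no likelihood scoring.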
 
Simple: don't upload child porn to iCloud and you won't be tagged for human review. Stop worrying that this enables some backdoor for governments to spy on you. It doesn't.
Then why am I treated like a criminal? What gives Apple the right to become law enforcement? Why is Apple introducing this when it would NOT MOVE AN INCH to unlock the phone of a KNOWN terrorist, yet has now effectively introduced a backdoor that can be exploited (no software is 100% error-proof)?

I would be against police randomly searching my house, even though I am doing nothing wrong.
 
Apple is actually not scanning anything. In the one case, your own phone generates a hash of individual items, and those hashes are then compared (by your phone) against hashes of known child abuse content when you intentionally try to upload a photo to iCloud Photos. In the other case, a child's phone that is under parental controls generates a likelihood score of a photo containing private parts when it is sent or received in iMessage. The child would be warned about the nature of the photo, and the parents would be notified that their child is doing something nefarious. Again, Apple doesn't scan anything, and it does not have free rein to look through all of your files.
"Scan" is a broad term used by many people in these threads. It is basically "scanning" because it looks at my pictures, produces a hash, and checks it against the database, going through my pictures one by one and comparing them. Basically scanning in that regard.
 
For now it is. Once again: how does a user verify that the CSAM software is not running on the device when iCloud Photos is disabled? All we have is Apple's word; that is the problem.
I've got news for you: all you have is ANYBODY'S word (Google, Microsoft, Samsung, Sony) when it comes to using internet-connected products. Until you unplug your smart TV, cut all connections, shut down your desktop PC, kill the batteries on your laptops, and shut off your internet, you have zero idea who is big-brothering you. Just because your TV manufacturer hasn't made public announcements like Apple did doesn't mean they aren't spying on you in your own home.
 
Except... it's not? I am treated like a criminal now. It's no different from a warrantless search of my house in the HOPE that the police find something in SOME house SOMEWHERE.

And let's be real: it is SUPER easy to avoid if you are an actual criminal. It will not be enough, and Apple WILL change it so that even if you turn off iCloud Photos, it will still scan. Apple will get pressured to do so.
Nope, it's like sniffing the mail going out of your home for explosives. It's data that's about to be uploaded, not data sitting there in your home. The trick here is that physical objects can't be duplicated, whereas digital data can be local and departing for Apple's servers at the same time; that's how you people waste everyone's time with your stupid, unrelated real-life examples.
 
Do you use Facebook / Google services? Nothing new under the sun; it's simply sad for it to come from Apple.

But they are all in PRISM anyway.
Nope, and Google is just email for things I don't really care too much about (e-letters, game subscriptions, and whatnot). I use encrypted email with a public/private key pair for anything I want to stay private.

Facebook and Twitter are no go for me due to privacy.
 