Shorter: “If you're not breaking the law, you have nothing to fear.”

I sure am glad governments never change laws, never have poorly defined laws, arbitrary enforcement, or executive orders/mandates, etc., that might change my status as a law-abiding citizen at any moment.

Obviously this power could never be abused. Thank goodness. Go get those bad people with the pictures while the rest of us rest easy knowing they are not after us.
I have not broken a single law. Matter of fact, I work with law enforcement. (Police Department)

I just don't want Apple scanning my iPhone and going through my private data. This is not the Apple we know. Apple is up to something.
 
meh
They should at least be upfront about how they go along with court orders, instead of keeping up the facade that they refuse to cooperate with handing data over.
 
The vast majority of the General Public does NOT yet know about this issue.

It ONLY started hitting the wires last Thursday.
You mean the same vast majority of the public who posts every little detail of their life on Facebook and Instagram?
The vast majority of the public who looks up everything they ever needed to know, from the mundane to the extremely personal, on Google search?
The vast majority of the public who does the majority of their communications over Facebook Messenger, Twitter direct messages and SMS?
Do you know what Google, Instagram, Facebook, and Twitter all have in common? They've been doing this exact thing for years. The general public does not and will never care about this kind of stuff until it actually affects them, and at the moment it does not.
Sure, in the hypothetical future when hypothetical governments get their hypothetical hands on this type of technology and hypothetically force tech companies into hypothetically adding all sorts of restrictions, then we can be worried. But as for the current time, 99.999% of people do not care.
They're not gonna get a new phone because of this, they're not gonna turn off iCloud because of this, and they're definitely not gonna give up any of their social media because of this. They just don't care.
 
No. They scan an iPhone if it's set up as a child (0-12) account in a family. Or when you use iCloud storage for the Photos app AND are in the USA.

Turn off iCloud and they don't do the CSAM scan. But if you upload to Google or Dropbox, guess what they do?
Pretty sure they're not reporting to law enforcement. Don't think we have seen any reports out. At this point, it's a false accusation.
 
he states, it is clear that Apple will tolerate a single CSAM file within an iCloud Photos account. They designed it to do so. So what is the point of this? That fact alone gives law enforcement a
Read it again? If there was a single CSAM match, nothing would happen, no one would be notified.

And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service.
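To make the threshold point concrete, here is a rough sketch of the rule being described. It is not Apple's actual protocol (which uses threshold secret sharing so the server cannot even decrypt match results below the threshold), and the threshold value below is a made-up placeholder, not a figure from the interview:

```python
# Toy illustration of threshold-based flagging -- NOT Apple's real protocol.
# The only rule modelled here is "a single match, or any number of matches at
# or below the threshold, triggers nothing and notifies no one."

MATCH_THRESHOLD = 30  # hypothetical placeholder value

def should_trigger_review(match_count: int) -> bool:
    """Manual review only becomes possible once the threshold is exceeded."""
    return match_count > MATCH_THRESHOLD

for count in (0, 1, 5, 30, 31):
    outcome = "manual review possible" if should_trigger_review(count) else "nothing happens"
    print(f"{count} matching photos -> {outcome}")
```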
 
What if, and hear me out, instead of Apple scanning everyone's photos and messages over and over looking for unilaterally defined contraband, we let judges issue a document to the police, based on some existing compelling evidence, giving them the authority to search a specific device?
 
Sounds like BS bringing up encryption. Image analysis can only be done on unencrypted images. Every unique image has to be analyzed at least once to get the hash, but checking hashes is useless if the usual countermeasure is just to append random data to change the hash, and that will just give Apple more reason to do more scanning. Apple needs to stop flexing and go back to the drawing board to rethink this. Maybe something along the lines of only scanning images in flight when shared with strangers, while exempting sharing among family/established friends and images at rest on local and cloud storage.
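For what it's worth, the "append random data" countermeasure only defeats exact, byte-level hashes. Here is a quick sketch of that effect using SHA-256 purely for illustration; Apple's NeuralHash is a perceptual hash computed from image content, so it does not behave this way:

```python
import hashlib

# Byte-level (cryptographic) hashing: appending any data produces a completely
# different digest, which is why exact-hash matching is trivial to evade.
original = b"...pretend these are the raw image bytes..."
tampered = original + b"\x00 random padding"

print("original:", hashlib.sha256(original).hexdigest())
print("tampered:", hashlib.sha256(tampered).hexdigest())

# A perceptual hash derived from the image's visual content, rather than its
# raw bytes, would not be changed by trailing junk data in the file.
```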
 
Well, pretty much the same points made before. And as I suspected, they claim that this is actually good for privacy.

But as it turned out over the last couple of days, whether you consider this a problem or not seems to depend mostly on your outlook on Apple, your government, and your attitude towards privacy in general.

I'm still quite confused why they chose the on-device implementation instead of a server-sided one. They must have been aware that the former would cross the line for a lot more people than the latter.
 
The hash list is built into the operating system, we have one global operating system and don't have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.

And secondly, the system requires the threshold of images to be exceeded so trying to seek out even a single image from a person's device or set of people's devices won't work because the system simply does not provide any knowledge to Apple for single photos stored in our service.

And then, thirdly, the system has built into it a stage of manual review where, if an account is flagged with a collection of illegal CSAM material, an Apple team will review that to make sure that it is a correct match of illegal CSAM material prior to making any referral to any external entity.

And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM, and that we don't believe that there's a basis on which people will be able to make that request in the U.S. And the last point that I would just add is that it does still preserve user choice, if a user does not like this kind of functionality, they can choose not to use iCloud Photos and if iCloud Photos is not enabled, no part of the system is functional.

Wait, so he's confirming that Apple is forcing a US-centric policy/scan/hash list on all iPhones worldwide. That basically answers the question: yes, Apple can be forced to comply with one country, and it will affect everyone around the world. He didn't address the fact that Apple has advertised how the system can be tailored on a per-country basis.

As for manual review, we already know how Apple outsourced privacy-related things, including listening to confidential Siri recordings, to contractors, and what the integrity of those contractors is like.
Neuenschwander continued that for users who are "not into this illegal behavior, Apple gain no additional knowledge about any user's cloud library," and "it leaves privacy completely undisturbed."
Ah, so it's the "We have higher morality than you" kinda guy. If that's the case, open source the algorithm and have the NCMEC database open for public scrutiny. No? Does Apple have something to hide?
 
This is what I don't get. How is it scanning this content? Is it just general nudity? A link to a porn site? People keep mentioning there isn't anything to fear because it is just a hash, not a naked picture of your child running through the sprinkler. While they are two very different things, it is the tattling feature that is more concerning. People who do store illegal content in the cloud should get what's coming to them, because it is just plain stupid to trust anyone. But the iMessage scanning is, to me, and definitely should be to others, more alarming... that is what can be weaponised much more easily, I gather.
What? This is the feature you're concerned about?
It's literally using on-device machine learning to identify a picture with explicit content in it, like genitalia or a sexual act. And then it blurs the picture. That's literally it, that's all it does. And it's only for minors.
If you're worried about anything here, it should not be that.
It's literally using the same technology that Apple has been using for over a decade with faces.
Someone sends your child, who is under the age of 13, an explicit photo? The photo gets blurred and you get notified as the parent.
That's it. I don't know why you would even think of being concerned about that feature.
 
This is what I don't get. How is it scanning this content? Is it just general nudity? A link to a porn site? People keep mentioning there isn't anything to fear because it is just a hash, not a naked picture of your child running through the sprinkler. While they are two very different things, it is the tattling feature that is more concerning. People who do store illegal content in the cloud should get what's coming to them, because it is just plain stupid to trust anyone. But the iMessage scanning is, to me, and definitely should be to others, more alarming... that is what can be weaponised much more easily, I gather.
It's hard to understand at points (this interview was incoherent at times), but their argument is that they're looking at a piece of text (the hash) of photos one puts in iCloud to confirm if it's known child porn. The problems with this are what happens when a positive result occurs. Do they confirm it by looking at the photo? And what happens with all of the hashes of unknown photos? How do they confirm a false positive doesn't occur? Essentially we're told to trust them, even though they have access to everything in this process and can easily impersonate a user, even if they say they won't.

Apple has made it clear in the past that they don't work with governments even in cases where national security is at risk, but that has apparently changed. I don't see a case where they wouldn't report users who they find holding such data, but I don't trust a technology company to be able to handle this properly either.
 
In principle, I trust Apple's statements.
But: why do image analysis at all, even if it is local? I noticed the other day that even grain varieties are hashed. Or pets. It would be good if Apple asked its customers in a survey who is actually excited about this functionality. Technically, I'm impressed with how perfectly facial recognition works, but suddenly I realize that a hash has already been created for every face in my library. This can end badly, because it arouses the curiosity of people who are not interested in privacy. Then it can easily happen that hashes of, say, terrorist material get added on top of the CSAM stuff on every private computer, and with that one example the dam is quickly broken. Edward Snowden is not a crank, and he warned about exactly such scenarios a few days ago, because he knows this world pretty well.

And no, I don't think Apple will be strong enough to put this demon in its place either.
Never play with fire. They should know that in Cupertino.
 
What if, and hear me out, instead of Apple scanning everyone's photos and messages over and over looking for unilaterally defined contraband, we let judges issue a document to the police, based on some existing compelling evidence, giving them the authority to search a specific device?
Because these creeps hide in the shadows. They are your neighbour, your kids' teacher, a close family friend. Morality aside, if this person is so stupid as to have these images on their phone and let them be uploaded to the cloud, they are too dumb to bother protecting.
 
So like, TLDR: Apple just told those who the entire purpose of the system is to find, to turn off iCloud photos and they’ll have no issues continuing to abuse children. TF?
 
No. They scan an iPhone if it's set up as a child (0-12) account in a family. Or when you use iCloud storage for the Photos app AND are in the USA.

Turn off iCloud and they don't do the CSAM scan. But if you upload to Google or Dropbox, guess what they do?

For now.

Just wait. It won't be long until someone asks why it shouldn't happen to all iPhone photos regardless of iCloud status. After all, it is for the children, regardless of what it does to the privacy of everyone else in the world.
 
Again, I don't think Apple has a choice in the matter; it has already been decided for them by unelected government officials.
Apple waited until the last minute to make this public, before the developers and beta testers discovered this “feature” on their own.

I don't know if Apple realizes this, but by implementing CSAM detection it will be targeting a certain religious group whose religious practices and laws condone such behaviors, especially in the state of CA, signed and approved by the governor. Will they receive a religious exemption when positives are found? We will see.
When you really get into the issue, there are so many potential complications.

Thus it raises the question: is it really for this thing? Or is this just an excuse to get the system set up so other things can be done with it in the future, because, hey, why not.
 
It's hard to understand at points (this interview was incoherent at times), but their argument is that they're looking at a piece of text (the hash) of photos one puts in iCloud to confirm if it's known child porn. The problems with this are what happens when a positive result occurs. Do they confirm it by looking at the photo? And what happens with all of the hashes of unknown photos? How do they confirm a false positive doesn't occur? Essentially we're told to trust them, even though they have access to everything in this process and can easily impersonate a user, even if they say they won't.

Apple has made it clear in the past that they don't work with governments even in cases where national security is at risk, but that has apparently changed. I don't see a case where they wouldn't report users who they find holding such data, but I don't trust a technology company to be able to handle this properly either.
No one gets notified if there's just a single instance of matching.
There have to be several matches, enough to reach the threshold, before a manual review is triggered. And even then it's even less likely that the manual review will lead to anyone being reported.
As Apple has said, there's a one-in-one-trillion chance that false positives will lead to an account actually being reviewed.
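To give a rough sense of why a threshold pushes the false-flag probability down, here is a back-of-the-envelope sketch. The per-photo false-positive rate, library size, and threshold below are assumptions chosen for illustration, not figures Apple has published:

```python
from math import exp, factorial

# Assumed numbers, purely for illustration -- not Apple's published figures.
p = 1e-6         # assumed per-photo false-positive rate of the hash match
n = 20_000       # assumed number of photos in a library
threshold = 30   # assumed number of matches required before manual review

lam = n * p      # expected number of false matches across the whole library

# Poisson tail: probability that random false positives alone reach the threshold.
prob = sum(exp(-lam) * lam**k / factorial(k) for k in range(threshold, threshold + 40))
print(f"Expected false matches in the library: {lam}")
print(f"P(at least {threshold} false matches) ~ {prob:.1e}")
```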
 
If it only looks at the hash, then all someone would have to do is slightly crop or edit the image, and the hash is different.
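Worth noting that the hashes in question are perceptual rather than cryptographic, i.e. designed to survive small edits. Here is a toy "average hash" on a synthetic 8x8 grayscale grid; it is nothing like Apple's actual NeuralHash, but it shows the general idea of why a tiny change does not automatically change a content-based hash:

```python
# Toy perceptual "average hash" on an 8x8 grayscale grid. This is NOT Apple's
# NeuralHash -- just a sketch of why a content-based hash, unlike a byte-level
# hash, can survive small edits to an image.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of bit positions where the two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

# Synthetic 8x8 "image": left half dark, right half bright.
image = [[20] * 4 + [220] * 4 for _ in range(8)]

# Slightly edited copy: nudge two pixel values a little.
edited = [row[:] for row in image]
edited[0][0] += 10
edited[7][7] -= 10

h1, h2 = average_hash(image), average_hash(edited)
print("Hamming distance between the two hashes:", hamming(h1, h2))  # 0 -> still a match
```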
 