sooo…looks like some people in “your camp” literally consider it of zero personal benefit to fight CSAM as a society.

(incidentally some aspects of this debate sound eerily similar to what we have heard for almost 2 years now in the covid debate concerning individualism vs collectivity, “unchecked” freedom vs real-world we-need-to-care-for-each-other-as-a-society freedom, etc.)
 
Wow, this is a new one. So you are associating those who are concerned with their privacy with flat earthers.

Yup, if you don't have anything else to argue with, start calling names.

It continues to amaze me how people don't understand how analogies work. I'm simply pointing out the fact that SOME people will always continue to argue a point until the day they die, in spite of overwhelming contrary evidence. I just thought it was funny how the person whom I was replying to thought that more evidence would put the argument to bed. I used the example of flat-earthers, because it's a well-known example that proves that's not true. Apple, I, and many others have already explained how what Apple is doing doesn't lessen your privacy, but people either keep misunderstanding (whether genuinely or purposely) or resort to fallacious slippery-slope arguments to keep the argument alive.

P.S. I never called names.
 
looks like some people in “your camp” literally consider it of zero personal benefit to fight CSAM as a society.
Actually no; you're drawing a conclusion that doesn't follow. I said it's not my job to police it, but I also said Apple could freely scan my iCloud pictures. Now why would I say that if I didn't think that was valuable?

All I'm saying is they can't use my property to do the scan, and that's because of the potential for abuse.
 
Except I didn’t say that.
I just said that the sentence “not for my own benefit” sounds weird to me personally.
And literally 2 posts above yours we have a user that (legitimately) does not super duper care about abused children if they’re not his own, sooo…looks like some people in “your camp” literally consider it of zero personal benefit to fight CSAM as a society.
Neither do you, it seems. The 'solution' you're so ecstatic about ignores the child-abuse (CA) material already in iCloud.
 
Actually no; you're drawing a conclusion that doesn't follow. I said it's not my job to police it, but I also said Apple could freely scan my iCloud pictures. Now why would I say that if I didn't think that was valuable?

All I'm saying is they can't use my property to do the scan, and that's because of the potential for abuse.
So you care enough to accept server-side scanning but not enough to get over this so-called on-device scan technicality (even though, if one dives deep into how it actually works, it's for all intents and purposes a server-side verification based on a preliminary air-gapped scan that merely happens on-device out of convenience).
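To make that concrete, here is a deliberately tiny Python sketch of the division of labour being described. Everything in it is hypothetical (the function names, and SHA-256/HMAC standing in for NeuralHash and the private-set-intersection machinery, which this toy does not implement); the only point it illustrates is that the device computes a fingerprint and packages an opaque voucher, while the hit/no-hit verdict is only ever produced server-side.

import hashlib, hmac, os

# Hypothetical stand-in for NeuralHash; any fixed fingerprint function serves the sketch.
def perceptual_hash(image_bytes):
    return hashlib.sha256(image_bytes).digest()

# --- On-device phase: note that no match verdict is computed here. ---
def device_prepare_upload(image_bytes):
    return {
        "fingerprint": perceptual_hash(image_bytes),
        "voucher": b"<encrypted visual derivative + metadata>",  # opaque to the user
        "photo": image_bytes,  # heading to iCloud anyway
    }

# --- Server-side phase: the only place a hit/no-hit decision exists. ---
SERVER_BLINDING_KEY = os.urandom(32)  # never present on the device

def blind(fingerprint):
    return hmac.new(SERVER_BLINDING_KEY, fingerprint, "sha256").digest()

blinded_db = {blind(perceptual_hash(b"known-bad-image"))}

def server_verify(upload):
    return blind(upload["fingerprint"]) in blinded_db

upload = device_prepare_upload(b"holiday photo")
print(server_verify(upload))  # False -- the verdict was produced on the server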
 
It might be that Apple was being a good corporate citizen to fight CSAM, a truly horrific plague. It might be for privacy. The one thing it ain't is required by law.....

In summary, 18 U.S.C. § 2258A clearly states a company must act on "actual knowledge". Moreover, a company is "not required to monitor" and "not required to affirmatively search, screen or scan files and data".

So, the question is why?

Seriously? So just because something isn't required by law, companies that are proactive about not wanting CSAM on their servers are automatically suspect in their motives? What kind of twisted logic are you employing to arrive at such a conclusion? It's not required by law - so what? It's not prohibited by law.
 
Actually no; you're drawing a conclusion that doesn't follow. I said it's not my job to police it, but I also said Apple could freely scan my iCloud pictures. Now why would I say that if I didn't think that was valuable?

All I'm saying is they can't use my property to do the scan, and that's because of the potential for abuse.
As far as I'm concerned too.
Apple filed only 265 reports with the NCMEC in 2020; Facebook filed 20 million+.
If Apple really is so worried about CA, why don't they just scan iCloud photos? Why do they ignore the existing CA in iCloud?
Instead, they give us this tool that's WIDE OPEN to abuse by intelligence services.
 
The German government / Bundestag just asked Apple to drop this feature!
I am sure more will follow from other countries.

Apple should just put the mess in the cloud; then it would not scan private data on user devices (which is prohibited in many countries).

Currently it's just spyware / a backdoor completely controlled by a company; this is simply illegal (and dangerous if it gets hacked; think Pegasus).
 
Nobody ever actually challenges me on the fact that the actual verification of the scan results happens on-server, and only after you've already surrendered the pics to Apple's servers anyway, hence making it for all intents and purposes a server-side scan.

They seem to fall back on just not wanting any CPU cycles of their local device devoted to the preliminary air-gapped scanning. Unreasonable, weird, petty, unfair, dramatic, etc.; in one word: specious.


It is not a technicality; it is scanning, and the answer is yes. My device, my data; Apple's servers, their data. Since the goal is to keep CSAM off of Apple's servers, it's best to scan there.
 
The German government / Bundestag just asked Apple to drop this feature!
I am sure more will follow from other countries.

Apple should just put the mess in the cloud; then it would not scan private data on user devices (which is prohibited in many countries).

Currently it's just spyware / a backdoor completely controlled by a company; this is simply illegal (and dangerous if it gets hacked; think Pegasus).
Let's wait and see if Brussels welcomes it.
 
Nobody ever actually challenges me on the fact that the actual verification of the scan results happens on-server, and only after you've already surrendered the pics to Apple's servers anyway, hence making it for all intents and purposes a server-side scan.
I didn't call you on that because it doesn't matter to me; the scanning is done at least partially on-device, and that is not acceptable.
 
I didn't call you on that because it doesn't matter to me; the scanning is done at least partially on-device, and that is not acceptable.
Why?
Jealous of your CPU/NPU cycles?
“Your data” is already en route to Apple’s servers if you enable iCloud Photos, so it’s not just “your data” any longer; you’re just showing a weird affection for the copy of that data living on your local device (even though you’re giving Apple full access to a perfect copy of it). That shows an interesting psychological divide between the human understanding of data ownership and the fact that data can exist in multiple places as perfect copies. This conundrum could maybe be solved one day, when data is actually unique with provable ownership (on the blockchain).
 
Because it's mine.

Jealous of your CPU/NPU cycles?
No, not jealous, though my battery charge is important, usage-wise.

“Your data” is already en route to Apple’s servers if you enable iCloud Photos, so it’s not just “your data” any longer;
The data on my phone is mine, period. Once I transmit it elsewhere, they have their data, but mine is still on my device.

That shows an interesting psychological divide between the human understanding of data ownership and the fact that data can exist in multiple places as perfect copies. This conundrum could maybe be solved one day, when data is actually unique with provable ownership (on the blockchain).
I'm a corporate IT guy; of course I know exact copies of data can exist on multiple devices, and only the copies on my personal devices are "mine". That copy is actually part of the phone. I don't really have a problem with that, so blockchain really wouldn't buy me anything personally. It really is different to me.

I realize content creators might look at it differently, but I'm not a content creator. Everything that I have created is either public domain, or is owned by my employer.

(There's a use for blockchain for company data and e-currency.)
 
For the first few months of iOS 15, I'm confident that the database just contains CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agenda and force Apple's compliance to include "extra hashes" that are not part of CSAM....

Exactly. I hope this initiative sets a precedent for corporations to be more and more accountable for these technologies... however, if governments and institutions truly cared about this issue, they would invest heavily in addressing the root causes of child abuse instead of policing and punishing after the abuse has already happened. It doesn't make any sense whatsoever. This is not about protecting children; it is about control. I am glad the public and organizations are speaking up, and I hope the momentum is not lost.
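Incidentally, the auditability gap behind the "extra hashes" worry fits in a few lines. This is a hypothetical Python sketch (the names and the root-hash idea are illustrative, not Apple's published mechanism): an outside auditor can verify which fingerprint database shipped, but not what is in it, because the entries are one-way hashes.

import hashlib

def db_root_hash(entries):
    # Deterministic digest over the whole (sorted) fingerprint database.
    h = hashlib.sha256()
    for entry in sorted(entries):
        h.update(entry)
    return h.hexdigest()

shipped_db = [hashlib.sha256(s).digest() for s in (b"entry-1", b"entry-2")]
published_root = db_root_hash(shipped_db)  # the value a vendor could publish

# An auditor can confirm every device received the same database...
assert db_root_hash(shipped_db) == published_root
# ...but cannot invert the entries to learn what any of them depicts, so a
# quietly added "extra hash" is indistinguishable from a CSAM one from outside.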
 
For the first few months of iOS 15, I'm confident that the database just contains CSAM image fingerprints. However, as time passes (and as Corellium's interest wanes), other authorities will push their agenda and force Apple's compliance to include "extra hashes" that are not part of CSAM....

Reminds me of when the government said

"We're not listening to everyone's calls. We're only looking at metadata about everyone's calls ... and it's to catch the terrorists."
 
Good one! 🤣 You know, if we could demonstrate that the earth is a sphere, we could put all the flat-earth discussions to rest too.

Unfortunately, some people will discount any evidence that goes against their conspiracy theory (or their misunderstanding... or their purposeful twisting of the truth), often by claiming there was a payoff to falsify data/findings, etc.
And how do you propose to deal with flat earthers? Educating, bringing evidence, ignoring, even ridiculing don’t work.
There will always be people who don’t trust Apple. Nothing you can do about that.

If the CSAM feature turns out to be safe, I do expect that mainstream discussions about it will stop.
 
And how do you propose to deal with flat earthers? Educating, bringing evidence, ignoring, even ridiculing don’t work.
There will always be people who don’t trust Apple. Nothing you can do about that.

Precisely my point.

If the CSAM feature turns out to be safe, I do expect that mainstream discussions about it will stop.

You're more optimistic than I, but I hope you're right.
 
I wonder what the people behind the site would say about that feedback.
Actually, they partly covered themselves on the previous page:
Not quite the problem at hand, but close!

Trouble is, their description is "not even wrong" and that's always hard to refute. This particular site has chosen a very simplified tweet-friendly format, and I doubt that they're interested in changing their name to "yourpossiblyfallaciousargumenttropeis.com".

Other sites (e.g. Wikipedia) give a more verbose description and discuss the faults and exceptions of "slippery slope", and I've just found this, which pretty much echoes my point:

 
And when it happens, Tim Cook will get up on stage and, in his soothing southern drawl, claim to be the good guy as they had the best of intentions. They won't even lose any customers over it, because most people are oblivious to privacy (Amazon has sold 100 million Alexa-powered products), and the people that do care will have nowhere to go after the precedent is set and Google / Amazon / Microsoft have joined in.

You are right, but here is the thing: Apple will no longer have the "privacy" label. They will be alongside Google and Facebook. Those who care about their privacy should abandon their services or, just like Facebook users, accept surveillance.

The fall of the last of Big Tech to defend their users' privacy will hopefully create a shock in society where new and more serious attempts at building alternatives emerge, so the FOSS community wins... I hope...

So you don't think the below applies in this case?


I guess we'll have to wait and see, and hopefully Apple will be open about what they add to that hash list. If it can also be monitored by external initiatives such as Corellium, I think that's good.

This is no logical fallacy. This is simply logic.

EX.
-If we let all criminals in jail out -> crime incidents will increase
-If we lower the interest rate -> more people will take loans
-If we make a discount -> our sales will rise

Sure, we do not know what's going to happen in the future, but you can predict it.
 
motm95 said:
The vouchers include a “visual derivative” of the image — basically a low-res version of the image.
motm95 said:
Apple does not have the actual CSAM images - they cannot. Only NCMEC is allowed to have the actual images.
You're really not seeing the contradiction there?
OK, let's take "basically a low-res version of the image" to refer to the user's image, not the NCMEC's illegal-to-possess (even in 'low res' form) image. How are they going to "check" that? They can't compare it with the NCMEC's original (which they aren't allowed to have) and they can't compare it with the user's image (because the whole point of this is so that they don't need to 'open' your photos).

It might seem like a Gordian knot, but it's easy to cut.

Apple's representative just looks at the 30 or more images in the security vouchers, and if at least one of them looks like child pornography to that person, Apple reports it to NCMEC, as required by law.

At this point it doesn't matter if the photo from the user's photo library (on the phone) is a copy of, or merely looks like, an image in the NCMEC's database.

And unless Apple's judgement call is wrong, the system hasn't failed in the sense of reporting someone without such material.
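For those wondering how "30 or more" can be a hard gate rather than a mere policy promise: Apple's public description is of a threshold scheme, where each matching voucher carries a share of a decryption key and the visual derivatives only become readable once enough shares accumulate. Below is a toy Python illustration using textbook Shamir secret sharing; it is not Apple's implementation, and the parameter 30 is just the figure cited above.

import random

PRIME = 2**127 - 1   # a Mersenne prime; the field the shares live in
THRESHOLD = 30       # the match count cited in Apple's description

def make_shares(secret, n, k=THRESHOLD):
    # A random degree-(k-1) polynomial with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)    # the vouchers' decryption key
shares = make_shares(key, n=40)  # one share per matching voucher

assert reconstruct(shares[:THRESHOLD]) == key      # 30 matches: key recovered
assert reconstruct(shares[:THRESHOLD - 1]) != key  # 29 matches: it is not
# (the second assert holds with overwhelming probability, not by guarantee)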
 