Thank you! I didn't know "trials" was referring to the ongoing battle with Epic Games, etc.

Interesting read. This gets into the territory of whether bad actors in the Apple ecosystem (e.g. those trafficking in child pornography) are actually an Apple problem. Do companies have obligations regarding their consumers' behaviour?
Companies don't have an obligation to look for CSAM (only to report it if found), because then they would be considered government agents and run afoul of the 4th Amendment. But no company wants to be the place for CSAM. That's why all the big cloud providers (other than Apple's iCloud Photos) already scan for it.
 
It is unbelievable to me how much money and mental energy people are spending against what I consider to be a non-issue. All kinds of existing technology can be abused, yet we don't campaign for it to be eliminated. And it still seems that a huge number of people don't even understand that 1. the scanning would only be active if you enable iCloud for photos (or elect not to turn it off, as the case may be) and 2. Apple wouldn't be able to see anything on your phone through the scanning process. The only time any scanning information leaves your phone is if you upload an illegal image to iCloud, and even then Apple can't decrypt it until there are 30+ illegal images uploaded.
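For what it's worth, the "can't decrypt until 30+ matches" property described in Apple's technical summary rests on threshold secret sharing: each match releases one share of a decryption key, and fewer shares than the threshold reveal nothing. Below is a minimal sketch of that core primitive (plain Shamir secret sharing over a prime field). This is an illustration of the general idea, not Apple's actual construction, and all names here are my own.

```python
import random

# Field prime (2**127 - 1, a Mersenne prime); all arithmetic is mod this.
PRIME = 2**127 - 1

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    # Random polynomial of degree threshold-1 with constant term = secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    total = 0
    for j, (xj, yj) in enumerate(shares):
        num = den = 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        total = (total + yj * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total
```

With a threshold of 30, a server holding 29 shares learns nothing about the key; the 30th share is what makes decryption mathematically possible.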
 
Not to "whataboutism" this, but did the EFF protest like this when Google started doing it?
Why would they? It's not the same thing. They're about protecting our usage on our devices. In an odd way to think for the current discussion, the EFF should want to protect Apple and what they do on their servers too. Apple's choice, Apple's responsibility. Just like my phone is my choice and my responsibility.
 
Apple can and should still introduce the feature they intended for the Messages app to protect children from graphically violent and sexually explicit images. They can even extend that as an option for all ages. I wouldn’t want to receive gory images and videos as a cruel prank, for example.
 
It is unbelievable to me how much money and mental energy people are spending against what I consider to be a non-issue. All kinds of existing technology can be abused, yet we don't campaign for it to be eliminated. And it still seems that a huge number of people don't even understand that 1. the scanning would only be active if you enable iCloud for photos (or elect not to turn it off, as the case may be) and 2. Apple wouldn't be able to see anything on your phone through the scanning process. The only time any scanning information leaves your phone is if you upload an illegal image to iCloud, and even then Apple can't decrypt it until there are 30+ illegal images uploaded.
Maybe you should ask that of all the experts and privacy organizations worldwide who clearly do not consider this a non-issue. You should also ask Apple themselves, who decided to shelve it for now and at least consider the concerns of all those organizations. By your logic, and reading your relentless posts in other threads, all those experts and organizations with legitimate concerns must be lunatics with tin foil hats.
 
If you start from the position that all people are inherently good, then CSAM scanning could seem reasonable: why wouldn't we have full trust in established governments and corporations? The "good guys" always rise to positions of authority, and they would never hold harmful political ideologies or knowingly hurt others, especially people with dissenting thoughts.

On the other hand, if you start from the position that all people are inherently bad, and that bad guys routinely rise to the top, then laws and personal protections are required because ALL governments become corrupt. In this environment, CSAM scanning is the pinnacle of authoritarian control.

CSAM scanning is pitched as a way to "catch the bad guys" while simultaneously putting a government camera into every citizen's pocket. The political talking point is "stopping sexual exploitation", but that is merely the excuse for government overreach. Law enforcement wants to play God, scanning your data for sins and convicting based on the current norms of the day. Did a politically inconvenient event occur? It will be called "extremist" or "terrorist in nature", and photos from all phones in the vicinity can be pulled for evidence.

In the United States, our elected president is asking major communication platforms to restrict free speech, and they are complying. Twitter was blocking users from sending news story links via _private message_. Who needs end-to-end encryption when the messenger reads your message and says they won't deliver it? And you think CSAM scanning isn't going to be abused? First it will sound reasonable: "yes, that girl was kidnapped; we should absolutely put her face in the CSAM algorithm and see who has photos of her". Then it will be a little more far-fetched: "well, this guy was known to live in city X, so we will scan every citizen's phone pics in the city". Finally it will come to fruition: "anyone found with evidence of dissent or rebellion will be brought in for questioning." Because we don't have enough law enforcement to arrest everyone, people will live in fear that they might be targeted. Highly publicized arrests will maintain a heightened state of fear.

The world is corrupt and sinful. Maybe you don’t want to believe it, or you think that’s hyperbole, but I hope you consider it. Go read a history book and see man’s depths of depravity. Any opportunity for one man or government to exploit another will be taken, every time, without exception.

CSAM scanning starts with the premise that every user is an untrusted potential criminal. The problem is that corporations and governments are treated as completely trustworthy, able to make any inquiry of the system without your knowledge. What could possibly go wrong?
 
How did you come to that conclusion from your posted quote?

🤷🏼

Sounds like a straw man to me.

I was just wondering why the EFF isn't upset about CSAM scanning on the popular cloud services. Have they said anything about them? They hired an airplane for Apple... but what about everyone else?

Six months ago I hadn't even heard of CSAM scanning. But apparently ALL the big services have been doing it for years: Google, Facebook, Amazon, Microsoft, Dropbox, etc.

Yes... I know it freaks people out about on-device scanning. Point taken.

But scanning is still happening elsewhere. You can't really do anything on the internet without touching the services of these major corporations.

So... I was just curious about what the EFF thinks of other types of CSAM scanning. That's all.
 
The problem is that this same technology could be used for purposes far less honorable than catching pedophiles. The whole point is that back doors are not good, period. If you want to catch pedophiles, fine, but don't do it by putting in place massive data collection from people's smartphones or cloud storage, even if it's only the hash. If, for example, a hacker breaches your iCloud password, he could upload this type of indecent image, and you would have a hard time explaining to the cops that the photos are not yours. The same technology could also be used for different purposes if you substitute child abuse imagery with, say, images of political pamphlets. Virtually half the world's population lives under some level of totalitarian regime where only a few care about privacy or democracy, so the iPhone should be a fortress as far as personal information is concerned, IMO.
This technology can’t scan for “political pamphlets”. Not how it works.
 
Maybe you should ask that of all the experts and privacy organizations worldwide who clearly do not consider this a non-issue. You should also ask Apple themselves, who decided to shelve it for now and at least consider the concerns of all those organizations. By your logic, and reading your relentless posts in other threads, all those experts and organizations with legitimate concerns must be lunatics with tin foil hats.

Ah, appeal to authority strikes again. I prefer to deal with the actual facts on the ground presented by Apple, which are there for all to read. And for the record, I don't believe I've called anyone a "lunatic" (or anything remotely similar) regarding this issue. So kindly don't put words in my mouth, even by implication; or link me to a post where I said anyone in these organizations is a lunatic, so I can retract it, because that's not what I believe. Now, I DO believe there are some on this forum who wear tin foil hats, but that doesn't mean they're lunatics; just a combination of misinformed* and an overactive imagination. I have no problem with people expressing legitimate concerns, but then you have the folks who think there's some grand conspiracy between Apple and authoritarian governments to use CSAM scanning as a smokescreen to get a backdoor into our phones, yet they have ZERO evidence to back that up.

*There are STILL people on this forum who think what Apple is proposing is "surveillance" or "spyware", or that they're going to be flagged and arrested for innocent pictures, etc., which is patently false based on the information we've been given. That's what I mean when I say people are misinformed (and I'm assuming the best there, versus calling them downright dishonest and purposely twisting the facts to convince the uninformed).
 
Companies don't have an obligation to look for CSAM (only to report if found) b/c then they would be considered government agents and run afoul of the 4th amendment. But, no company wants to be the place for CSAM. That's why all the big cloud providers (other than Apple iPhotos) already scan for CSAM.
Interesting. I can see an argument for "only to report if found". But it looks to me like the CSAM process Apple was/is hoping to implement is more than "if found". They're actively seeking, no? That seems to go beyond "if found".

There is an argument for "no company wants to be the place for CSAM". But clearly, when one designs a system for great privacy, the company cannot assume that all of its consumers are moral actors. That would be a naive view of human beings (and perhaps Apple did have a naive view? Hard to imagine.). So just as this great privacy/security tool allows for CSAM, it also allows for other non-illegal but morally questionable activities.
 
This technology can’t scan for “political pamphlets”. Not how it works.
It can scan for whatever it wants; it's not that smart an AI. Say the Chinese Communist Party adds a photo hash of a pamphlet to their CSAM database and Apple acquiesces by referencing that database. Or they sneakily manage to add an entry to a foreign power's CSAM hash database. Your lovely picture of a protest banner then identifies you as a pedophile.
 
I'm really curious about this "sneak an image into the CSAM database" and the follow-up "sneak an image onto your phone".

Sounds like a lot of work. But clever! Like a spy movie! :p

But honestly... couldn't this already be done? Maybe not the phone part... but couldn't you add an image to the CSAM database and then sneak an image onto someone's Google Drive? Or Facebook account?

Facebook alone reported 20 million CSAM incidents. Maybe some of those were plants, framing someone.
 
I think Apple will skip this one. The risk for them is too great; it makes no sense. Why use the photo libraries of nearly a billion devices to catch a few criminals? Let the people who are paid by our taxes do their job without using our devices.
I don't think Apple will skip this thing; they will implement it without us knowing. With the limited sharing options in Photos, why you would even use it for CSAM is beyond me. I can't even share the family album with my family (perhaps intentionally). Will other parties, like Google Photos or OneDrive or even popular NAS services, be subject to these practices?
 
It can scan for whatever it wants; it's not that smart an AI. Say the Chinese Communist Party adds a photo hash of a pamphlet to their CSAM database and Apple acquiesces by referencing that database. Or they sneakily manage to add an entry to a foreign power's CSAM hash database. Your lovely picture of a protest banner then identifies you as a pedophile.
Who took this photo of this pamphlet and why would anyone else have that photo in their library? Same deal with this picture of a banner?
 
All I see on this message board: I want to help children and the fight against CSAM, as long as it doesn't involve me or have any effect on me.

The folks against this are either terribly selfish or they don't understand how hash values work. I'm thinking the latter, because I still have faith in humanity that most people want to help make the world a better place. There will not be false IDs of your family dog or your kid in the bathtub. This is not PhotoDNA or a skin-tone system; this is a hash value system.

This system will bring an unprecedented number of pedophiles to justice. The distribution and sharing of CSAM is simply out of control - the common person has no idea. Apple should just turn this thing on and let people whine and go buy an Android. Most won't do anything but complain.

Go learn about how SHA hash values work and then come back and try again.
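For anyone taking that advice, here is a minimal sketch of hash-database matching using nothing beyond Python's standard hashlib; the database contents are placeholders I made up. One caveat: Apple's proposed NeuralHash is a perceptual hash designed to keep matching after resizing or recompression, whereas SHA-256 matches only byte-identical files, so this illustrates the lookup idea rather than Apple's exact algorithm.

```python
import hashlib

# Toy "known hash" database: in real systems this holds hashes supplied
# by child-safety organizations, not hashes of strings like these.
known_hashes = {hashlib.sha256(b"example-known-image-bytes").hexdigest()}

def is_flagged(image_bytes: bytes) -> bool:
    """Exact-match lookup: only byte-identical copies are flagged."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_flagged(b"example-known-image-bytes"))   # True: exact copy matches
print(is_flagged(b"example-known-image-bytes."))  # False: one changed byte, totally different hash
```

The second call shows why cryptographic hashes alone can't cause "family dog" false positives: any change to the bytes yields an unrelated hash, which is also why real systems resort to perceptual hashing.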
 
I was just wondering why the EFF isn't upset about CSAM scanning on the popular cloud services. Have they said anything about them? They hired an airplane for Apple... but what about everyone else?

Six months ago I hadn't even heard of CSAM scanning. But apparently ALL the big services have been doing it for years: Google, Facebook, Amazon, Microsoft, Dropbox, etc.

Yes... I know it freaks people out about on-device scanning. Point taken.

But scanning is still happening elsewhere. You can't really do anything on the internet without touching the services of these major corporations.

So... I was just curious about what the EFF thinks of other types of CSAM scanning. That's all.
Ah! Your first statement there is a major clarification. TY! Your OP was phrased in a way that conveyed a very different meaning, IMO. 😄

Yeah, great point! I would suspect they have been outspoken before, but maybe not? You should reach out to them to see what they say. Let us know!
 