I've been thinking about this "slippery slope = conspiracy theory" argument that has been floating around, and I think it is a little silly. Follow me down this path.

First, we have Apple denying the FBI when it comes to unlocking the physical phone. See articles like this one for more details:

https://threatpost.com/apple-denies-fbi-request-to-unlock-shooters-iphone-again/151797/

Okay, now one of the key strengths they can argue is that they don't have the technical ability to unlock the phone. There is no way to do so; they have no way past the encryption.

But surely we can all rejoice. Apple wouldn't cave to governments, would they? That is their argument on CSAM, after all. It will go no further than the present state, because Apple will FIGHT! Fight any government that tries.

Unless it is China. Then they fold like a cheap suit. Back-door access at the data center? Sure. Anyway, you can read all about how far down the "slippery slope" they have gone in this piece from the New York Times:

https://www.nytimes.com/2021/05/17/technology/apple-china-censorship-data.html

Okay, so now we know that Apple will comply with governments to stay in business there.

Now to CSAM. It is clear that the law requires any of us to report it when we know it exists. We are not required to go look for it. But if we see it, we have to tell someone. Then due process can take over, and the authorities in question can get a search warrant.

Ah, but Apple is doing this on their servers with Mail and probably iCloud Photos (although I can find no evidence of photo scanning in the past or currently; it appears this current mess is their Apple-tastic way of trying to start). Okay, I don't like it, but I have been meaning to reduce my cloud exposure anyway. Also, they own the servers, so there is not much I can do except support end-to-end encryption efforts, like using Joplin for notes.

But now we have the critical juncture: Apple scanning my phone for hash matches against known CSAM. For the record, I oppose this so much that I have already switched to a Nexus, a Surface Pro 7, and my gaming PC. The fact that Android seems to care more about privacy at this point is sad and funny (and a squirrel...let's avoid that argument). And I have children of my own, but I will not accept false solutions at the cost of my freedoms (especially when it is a private company doing the searching, so Constitutional protections likely don't apply).

"Oh, but there is no way it will go further. You crazy slippery slope people! Take off your tinfoil hats!" You say.

Let's address this idea. Most people in favor of gun control, when pressed, admit that they want ever-tougher laws. Many don't even know the laws already on record, or what has already been done; they want the guns gone.

I would post links, but they are a little biased, and the logical construct is clear enough. The slippery slope argument holds with guns: no matter how much is made illegal, they want a little more.

Oh, and just in case you thought I would only talk about the left, you are mistaken. Look at the new abortion law in Texas. First-trimester access was the law of the land. It would go no further, right? No such thing as a slippery slope! Boom! Wrong. Here it is in all its ridiculousness:

https://www.texastribune.org/2021/05/18/texas-heartbeat-bill-abortions-law/

There are many, many examples of the slippery slope in action. Not just in a tinfoil hat way, but in the really real world.

So, now we come to Apple. When the Texas state government comes to them and says it wants access to check for abortions after six weeks (and it comes armed with a state law), Apple cannot say they don't have a way to do it. They built the way to do it. And rather than lose all that sweet TX revenue, they will cave, just as they did in China.

This is why we are upset. Yes, it is a slippery slope argument. Yes, it is still a valid one. It really sucks. There is an event next week that I would have stayed plastered to my chair just to watch. Now? Yawn. Nope.

P.S. Has it occurred to anyone yet that a hacker could get into your iCloud account and upload CSAM photos, which are then downloaded to your devices, read by the spyware, and boom! You are arrested a month later.

https://arstechnica.com/information...an-accessed-4700-icloud-accounts-620k-photos/

This could be you! Or it could be me; well, it could have been...
 
P.S. Has it occurred to anyone yet that a hacker could get into your iCloud account and upload CSAM photos, which are then downloaded to your devices, read by the spyware, and boom! You are arrested a month later.

Honestly though, this could happen with iCloud, or any other cloud service, as well.

But your overall point makes sense. The slippery slope scenario, while not likely, is also not impossible, and shouldn't be completely dismissed.
 
Honestly though, this could happen with iCloud, or any other cloud service, as well.

But your overall point makes sense. The slippery slope scenario, while not likely, is also not impossible, and shouldn't be completely dismissed.
Yes, but again, I can disable my iCloud account (and I have at least deleted all my photos, so they wouldn't show up on my device). And yes, this is why, if I ran a business, I would use end-to-end encryption to protect it.
 
That's silly. Apple doesn't know you from Adam. It would be totally stupid of them to blindly trust all their users when it comes to the use of THEIR (Apple's) servers. Therefore, everyone gets equal treatment in terms of accountability for what they upload.

You must be one of those people who go around recording themselves refusing to show their receipt at the store on the way out and arguing/fighting with the employee who's doing their job requesting it and claiming they're accusing you of theft.

I guess when police ask to see your ID on a traffic stop, they're accusing you of bad faith too, huh? I mean, shouldn't your verbal identification be enough?

Or how about when your bank asks for your ID to deposit checks or make withdrawals? How dare they not take you at your word, right?!

Need I go on? There are countless examples in everyday life of "trust, but verify" situations. And that's totally sensible and appropriate. That's a far cry from an accusation. If they wanted to accuse you of dishonesty, they wouldn't ask for proof/verification/etc.

There it is: the authoritarian catchword of 'accountability' used in a context in which I and most other users have done nothing wrong. Tell me: what, exactly, am I accountable for if I have not put CSAM images on my phone? Anyway, I assume that you would be OK with your phone broadcasting, without your permission, your ID, information about all of your financial transactions, and all of your electronic sales receipts to interested private parties and the police. After all, 'trust but verify'...

More melodramatic and alarmist language. I know many eat this up, but I don't.

Time will tell.
 
Until they get a court order, National Security Letter, or similar.

I believe if that situation ever happened, Apple would simply abandon the CSAM detection rather than give in to an oppressive government demanding to abuse that technology.

There's a difference between satisfying your obligations and playing at detectives.
I bet you don't download lists of stolen cars and do ANPR just to check them all, or whatever.

First of all, that kind of check wouldn't be relevant to our business, whereas CSAM detection for cloud services is highly relevant. But if we decided to go above and beyond to implement ANPR, I see no problem with that.
 
All you had to do was disable iCloud for photos 🤷‍♂️
For now, and that means it still lives on my phone, Mac, and iPad. Not to mention, it makes the phone less secure since it resides there; hackers can and already have taken advantage. So all YOU had to do is disable iCloud for photos. All I had to do is leave the garden. :)
 
There it is: the authoritarian catchword of 'accountability' used in a context in which I and most other users have done nothing wrong. Tell me: what, exactly, am I accountable for if I have not put CSAM images on my phone?

In that case, it's "authoritarian" for a bank to ask for ID to cash a check, right? Wrong. Funny how you failed to address my analogies.

If you use iCloud, you are indeed accountable to abide by the iCloud terms of service that you agreed to. If you don't want accountability, then don't use the service. Simple as that. No one's forcing you to.

Anyway, I assume that you would be OK with your phone broadcasting, without your permission, your ID, information about all of your financial transactions, and all of your electronic sales receipts to interested private parties and the police. After all, 'trust but verify'...

Huh? We're not talking about anything happening without your consent/permission. Again, you have every right NOT to use any cloud service.
 
Yep, until the next manufacturer implements that same type of thing. :p
There are options with other manufacturers that I don't have inside the garden. Android is open source. I can install Linux. And they sure haven't done "the same type of thing" on the device yet.
 
For now, and that means it still lives on my phone, Mac, and iPad. Not to mention, it makes the phone less secure since it resides there; hackers can and already have taken advantage. So all YOU had to do is disable iCloud for photos. All I had to do is leave the garden. :)

You're obviously free to make your own choices. I think you're basing them on unreasonable paranoia, but I hope whatever you moved on to brings you satisfaction.
 
I know I'd want to know if a match was found, so I echo his question: why hide it from me?

Because there's absolutely no reason you would need to know that. If you have 30+ matched images, it's almost certain you're in possession of illegal material, and you'll definitely find out about that once your account is locked. In the extremely, unbelievably unlikely event that you have 30 false positives, they'd be immediately dismissed under manual review and nothing would ever come of it.
 
But all along you have had the rights to your content, your images... What is different is that Apple is starting to act like everyone is a criminal and every phone should be scanned. Then they start taking control, playing nanny: watching and warning about content and informing others about the messages you receive.

It is not only scanning for illegal material; it is also scanning messages, emails, and photos sent to iCloud, and it opens a door for misuse.

Consider a situation: a person works in a particular job and others don't like the results they will see. They buy prepaid phones and send some CSAM content to the person to get rid of him or her. By the time the investigation is properly done, the damage has already happened. Or you don't like a teacher, so you send a CSAM photo from a prepaid. Or you don't behave as some country wants, so hashes of perfectly ordinary photos are used to track your activity. The potential for misuse of this feature is large enough to start boycotting Apple.

Want to try being gay in the Middle East when Apple activates this feature?
I think you do not understand how the hash system works, or iCloud Photos for that matter. Even if a prepaid phone sent a bunch of illegal photos to someone, Messages and email do not upload to iCloud Photos. You would have to manually add those to your photo library and also still have iCloud Photos turned on. Anyone could do this to someone right now, with or without this feature, and it would have the exact same effect.

Take a moment to understand how Apple's neural hash system works before spouting a scenario that literally would have zero effect with this feature.

Second, take a minute to understand what I wrote. Algorithms have existed on iOS for years that can identify all kinds of stuff in your images. Explain why folks who are against the CSAM neural hash system are okay with their photos being literally scanned for dogs, people, cats, boats, etc., but are not okay with the CSAM hash system? Seriously, please go do a bit of research on the features I described, and if you weren't aware of the former, that's okay. You can boycott that as well.
 
There are options with other manufacturers that I don't have inside the garden. Android is open source. I can install Linux. And they sure haven't done "the same type of thing" on the device yet.
Sure...one can go as deep as they want to try to get away from the inevitable.

Not that I am for this type of invasiveness, but once Apple opens the floodgates, there will likely be others that follow suit. Still, I am not jumping ship just yet.
 
Sure...one can go as deep as they want to try to get away from the inevitable.

Not that I am for this type of invasiveness, but once Apple opens the floodgates, there will likely be others that follow suit. Still, I am not jumping ship just yet.
That is fine. Everyone has to decide what that line is...it isn't an easy question to answer.
 
Pretty huge difference: one is an AI algorithm designed to help you find stuff on your device, the other is spyware designed to look for illegal content and report you to law enforcement... You're just not thinking this through. The AI algorithm is a sandboxed, on-device indexing tool; unless you opted in to help Apple improve it, it stays on your device.
How is the CSAM neural hash system not the same? It runs on-device only (in your control). It only compares hashes (not images) against known CSAM. It only uploads safety vouchers to iCloud Photos if you have iCloud Photos turned on (which you can control). It only triggers human review if you have at least 30 vouchers that match illegal hashes. It only gets reported to the authorities if, after human review, those images are confirmed to be known CSAM.

Outside entities can't inject CSAM hashes onto your phone, and Apple gets its database from several agencies, not just one. Even if someone somehow compromised all of those agencies, the hashes have to be exact matches (see the sketch below). So converting an image of the Confederate flag into a hash and then somehow getting it past all the security and onto an iPhone would only flag (not a pun) if the hashes were identical. If you were at a Civil War reenactment and happened to have a Confederate flag in the background, it would not produce the same hash as the false CSAM hash.

Known CSAM images that bad actors spam to your email, messages, and other forms of communication don't get added to iCloud Photos unless you add them yourself. So if you're dumb enough to do that, then you are probably the type of person this is intended to catch.

The AI algorithm is actually looking at your images, whereas the neural hash system converts images into hashes that can't be reversed back into the original photo. You could type in "confederate flag" or just "flag" and it will pull up flags. It's only a small step for that information to be passed off to bad actors. I personally don't think that is happening, but that second AI algorithm is much more effective at doing what people are afraid of than the CSAM system.
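
To make that distinction concrete, here is a minimal sketch of the kind of threshold gate described above. It is illustrative only: the function names and placeholder hash values are made up, and SHA-256 stands in for Apple's proprietary NeuralHash perceptual hash (the real system also blinds the database so the device never learns which images matched). What it shows is the logic: exact hash equality against a fixed database, and nothing escalates below the threshold.

```python
import hashlib

# Hypothetical stand-ins: placeholder hash values and SHA-256 instead
# of Apple's proprietary perceptual hash. Only the gating logic is shown.
KNOWN_CSAM_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}
REVIEW_THRESHOLD = 30  # matched images required before human review

def image_hash(image_bytes: bytes) -> str:
    """Toy hash. A real perceptual hash tolerates resizing and
    re-encoding, but a merely similar photo (a flag in the background
    of a reenactment shot) still maps to a different value."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(upload_queue):
    """Count exact hash matches against the known database."""
    return sum(1 for img in upload_queue
               if image_hash(img) in KNOWN_CSAM_HASHES)

def should_escalate(upload_queue) -> bool:
    # Below the threshold nothing is flagged at all; above it, the
    # matches go to human review, not straight to law enforcement.
    return count_matches(upload_queue) >= REVIEW_THRESHOLD
```

Note what is absent: the code never looks at what a photo depicts, only whether its hash equals one already in the database.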
 
Because there's absolutely no reason you would need to know that. If you have 30+ matched images, it's almost certain you're in possession of illegal material, and you'll definitely find out about that once your account is locked. In the extremely, unbelievably unlikely event that you have 30 false positives, they'd be immediately dismissed under manual review and nothing would ever come of it.
So I can find that picture and expunge it if it's really bad, or open a case with Apple if it isn't. That's why I'd want to know. I don't care if I'm allowed 30 matches; I want to know on every match (if I were to allow this thing on my phone at all).

I know the chances of it finding a bad picture on my phone are pretty much zero, since I would never put one there, but bad matches? Yes, that's possible.
 
So I can find that picture and expunge it if it's really bad, or open a case with Apple if it isn't. That's why I'd want to know. I don't care if I'm allowed 30 matches; I want to know on every match (if I were to allow this thing on my phone at all).

What, so criminals can be tipped off and hide/destroy the evidence?

I know the chances of it finding a bad picture on my phone are pretty much zero, since I would never put one there, but bad matches? Yes, that's possible.

Possible and likely are two different matters. Apple has stated the chances of an account being falsely flagged are less than 1 in 1 trillion per year. Even if you loosen that a thousandfold, to 1 in 1 billion, it's still incredibly unlikely.
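
To put some arithmetic behind the "possible vs. likely" distinction, here is a back-of-the-envelope sketch. The per-image false-positive rate and library size are assumed values chosen purely for illustration; Apple has not published the inputs behind its 1-in-1-trillion figure.

```python
from math import exp, factorial

p = 1e-6        # ASSUMED chance that one innocent photo false-matches
n = 10_000      # photos in a hypothetical library
threshold = 30  # matches required before human review

lam = n * p  # expected false matches across the whole library (0.01)

# P(at least `threshold` false matches), Poisson approximation of the
# binomial tail; terms beyond +50 are negligible here.
prob = sum(exp(-lam) * lam**k / factorial(k)
           for k in range(threshold, threshold + 50))
print(f"{prob:.2e}")  # on the order of 1e-93 for these assumptions
```

Even if the assumed per-image rate were a thousand times worse, the 30-match threshold keeps the per-account odds astronomically small; that is the work the threshold is doing.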
 
You haven't read much of the threads, have you?

The reason typing "grass" and having the phone show your pictures of grass is acceptable is that it's for our personal benefit and, most importantly, it doesn't turn you in. It's not the scanning per se; it's the scanning for the government that's the problem!
I know this is a personal choice and I can respect folks' differing opinions. However, Apple has been incredibly clear about how the process works. It's in your control. It compares hashes, not images. It gets human review to make sure that it is comparing hashes of known CSAM and that the matches are actual child pornography and not, say, a "vote for X" image.

It's near impossible to compromise the CSAM database without huge resources (see the sketch at the end of this post), and even if it were compromised, matches still get human review before anything goes to the agencies that prosecute those folks.

It's not susceptible to someone spamming your inbox or messaging apps with known child porn, because if a third party did that, those photos would not get added to the photo library automatically. There is no mechanism for it; the user has to add them manually.

Currently the API is not available to third parties, though I understand Apple has stated they would be open to that. If it were opened up as-is, then maybe it could be susceptible to an attack, but I'm willing to put money down that by that time the system will be much more secure and additional requirements will be imposed on third-party apps. Speculation, yes, but that's exactly what everyone who is against it is doing, and they don't address what it actually does. They just make up scenarios and then don't run those scenarios through the CSAM system to see that they fall apart and can't work.

Looking for CSAM thankfully will not be for my personal benefit, as I don't have kids, nor do I know anyone who is a victim. But as a decent human being (and I'm not saying you're not one), I'd gladly allow my photos to be scanned for CSAM to benefit those victims and prevent more.

The issue is that folks aren't taking enough time to understand the system, what protections it has, and how it won't and can't affect them unless they are doing something illegal with child porn. Apple has been clear that they will not bow to governments that want to use it maliciously. If you don't trust them on that, then why do you trust Apple not to search for "grass" on your phone and turn you in to the government? You have no more way of knowing they are using that system as intended than you do the CSAM system.
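
On the database-compromise point (the sketch mentioned above): Apple has said the deployed hash list is built from databases supplied by more than one child-safety organization, and as I understand it only hashes common to those sources are used. A tiny sketch with made-up placeholder hashes and database names shows why that blunts unilateral insertion:

```python
# Made-up placeholder hashes and source names. Only entries present in
# EVERY source database survive the intersection, so a single agency
# cannot unilaterally slip in a hash targeting, say, a political image.
agency_a_db = {"h1", "h2", "h3"}
agency_b_db = {"h2", "h3", "h4"}

deployed_db = agency_a_db & agency_b_db
print(deployed_db)  # {"h2", "h3"} -- "h1" and "h4" never ship
```

And even a hash that survived the intersection would still have to pass the exact-match requirement and human review described earlier.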
 