So he wasn't even discussing #5 ("Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks."), then.

Apple uses two hash technologies. It seems he wasn't looking at the last one.
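Since "hash" here means a perceptual hash rather than a cryptographic one, a toy sketch may help. This is a simple average-hash for illustration only; it is not NeuralHash (which is a neural network), and every name in it is made up:

```swift
// Toy average-hash (aHash). Illustrates the idea behind perceptual hashing:
// near-duplicate images produce hashes that differ in only a few bits,
// unlike cryptographic hashes where one changed pixel flips ~half the bits.

// Build a 64-bit hash from an 8x8 grid of grayscale brightness values (0...1).
func averageHash(_ brightness: [Double]) -> UInt64 {
    precondition(brightness.count == 64, "expects an 8x8 grid")
    let mean = brightness.reduce(0, +) / 64.0
    var hash: UInt64 = 0
    for (i, pixel) in brightness.enumerated() where pixel > mean {
        hash |= UInt64(1) << UInt64(i)   // set bit i when pixel is brighter than average
    }
    return hash
}

// Similarity = Hamming distance (number of differing bits).
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

// Hashes within a small distance are treated as the "same" image, so mild
// crops or re-encodes still match - which is also why collisions are possible.
let match = hammingDistance(0xF0F0_F0F0_F0F0_F0F0, 0xF0F0_F0F0_F0F0_F0F1) <= 4
print(match)   // true
```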

Two different intents.
If I am looking for weaknesses in a system, I will take specific functionality that is part of the underpinning and see if it has any gaps. If it does, then that piece gets flagged to be fixed or mitigated.

Yannic was looking specifically at a claim Apple made and has found an apparent gap. That is all it is at the moment. The fact that it fits into a growing narrative … especially the iOS 14.3 bit.

Apple's claim that they have it covered is lame at best. Show us. Have a peer review. Have additional discussions demonstrating proof. Something. A verbal claim alone is anything but proof.
 
For me it's the gateway to other invasions....

It probably started with Google: "we have AI scan your emails to better serve you ads." No human involvement, but I still don't want anyone doing that, nor do I trust that anyone, especially Google, is not harvesting other data from my personal emails.

Now Apple wants to scan hashes of pictures that will be sent to iCloud because they don't want CSAM on their servers (nor does anyone). Next to no human involvement until they think you're a pedo; then someone looks through the flagged pictures to see if they are CSAM. There is a chance a picture wasn't CSAM, but now some stranger has already looked at YOUR PERSONAL PICTURES. Might have been a picture of a flower pot, might be a pic of a far more personal nature; no way to know until it's too late.
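(For concreteness: per Apple's published overview, nothing is even surfaced for human review until an account crosses a match threshold. A toy sketch of that gate follows; the names are made up, and the real design uses threshold secret sharing rather than a plain counter Apple could read:)

```swift
// Toy sketch of the review-threshold gate described in Apple's overview.
// All names are illustrative. In the real design Apple cannot count matches
// directly; threshold secret sharing only makes match payloads decryptable
// once enough of them exist.

struct UploadedPhoto {
    let id: Int
    let matchedKnownCSAMHash: Bool   // outcome of the on-device hash comparison
}

let reviewThreshold = 30   // Apple has cited a threshold of roughly 30 matches

// Only when the count of matched photos reaches the threshold does anything
// get surfaced for human review; below it, nothing is flagged at all.
func needsHumanReview(_ photos: [UploadedPhoto]) -> Bool {
    photos.filter(\.matchedKnownCSAMHash).count >= reviewThreshold
}
```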

THEN...

Apple (or any other manufacturer) can decide at any moment that they want to compare hashes of pictures NOT being sent to cloud based services, just pictures on your device. Sure, there will be mass outrage but "think of the children".

THEN...

Once we are comparing picture hashes, then comes the question "what about video?" Pedos could be uploading CSAM videos to iCloud, so now we need to use "AI" to scan your videos too. Much harder to do, much more invasive, and much less accurate. In order not to falsely accuse anyone, humans will need to review YOUR PERSONAL VIDEOS!

THEN...

What about your security cameras from Ring, Amazon, Eufy, Logitech, etc.? All of these folks store video on their cloud systems, and surely they don't want illegal materials on their servers either, so now they start using "AI" to listen to the audio from your cameras for certain sounds, words, phrases, etc. Just to be sure no one is falsely accused, humans will need to review YOUR PRIVATE AUDIO AND VIDEO!

THEN...

The government(s) get involved and want to get info on dissidents/opponents.

See what can happen? It always starts small and with something that is difficult to come out against, like CSAM, but as technology gets better (like hashing audio or video) the invasions into your privacy get deeper and deeper.

Agree 100%, it's a slippery slope (an actual one, not the fallacy) into more-invasive tactics.

This also gets into the fact that this involves Apple using my device, CPU power, and battery life to do activities I don't consent to.
 
The fact that the scan is done on the user's phone, without their consent, and *prior* to uploading makes this a warrantless search that Apple is conducting as a fishing expedition on behalf of law enforcement.

Law enforcement cannot do this without a warrant which requires probable cause.

NCMEC is a private foundation, but is funded by the US Justice Department. Anything Apple refers to them will be reported to FBI or other agencies. It's also run by longtime infomercial hawker John Walsh, father of Adam Walsh.
Bingo - this is a point I made in a discussion on Reddit about this. People pushing back, suggesting this isn't a violation of the 4th and 5th Amendments, are missing that this is Apple acting as a de facto arm of law enforcement without probable cause or a warrant. Why would we want to allow a private entity *more latitude* than government (especially when they're acting on behalf of it)? Government violates our rights quite often enough (civil asset forfeiture ring a bell?) without making it easy for them to rope in private companies to get around that pesky Constitution.
 
This will all blow over once people realize that there isn't some mass of people getting arrested for false positives. Nobody except the sickest of people will even have their privacy invaded.

So yeah, switch to some other device or turn airplane mode on. Whatever you wanna do for that 1 in a trillion chance your 30 innocent photos are looked at by an Apple employee.
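Back-of-envelope on where a number like "1 in a trillion" could come from: assume (my assumption, not a figure Apple has published) a one-in-a-million false-match rate per image, then ask how likely ~30 independent false matches are across a big library:

```swift
import Foundation

// Back-of-envelope: probability an innocent account hits the review threshold.
// The per-image false-match rate is an ASSUMED value for illustration,
// not a figure Apple has published.
let falseMatchRate = 1e-6    // assumed: one false match per million images
let librarySize    = 10_000  // assumed: photos in a large personal library
let threshold      = 30      // the match threshold Apple has cited

// log of the binomial coefficient C(n, k), via the log-gamma function
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n + 1)) - lgamma(Double(k + 1)) - lgamma(Double(n - k + 1))
}

// P(X >= threshold) for X ~ Binomial(librarySize, falseMatchRate)
var tail = 0.0
for k in threshold...librarySize {
    let logTerm = logChoose(librarySize, k)
        + Double(k) * log(falseMatchRate)
        + Double(librarySize - k) * log(1 - falseMatchRate)
    tail += exp(logTerm)
}
print(tail)   // vanishingly small: demanding ~30 independent false matches
              // is what pushes the per-account odds toward negligible
```

Of course the result depends entirely on the assumed per-image rate, which outsiders can't verify; that is part of what's being argued about here.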

Panic much?
And, just like those cancer-inducing airport body scanners we were assured no one could capture images from (yet images were all over the internet within days/weeks of them going into service), we should believe nobody at Apple will be looking at, and possibly saving, people's private images… why?

It doesn’t take much to turn this from “think of the children!!” CP searching to looking for “Chinese tank man” images, or any other political-enemy purpose. Look at the Kafka-esque “no fly list” that shouldn’t even exist. Political enemies found their way onto the list, like the author of the critical book “Bush’s Brain” who found his way onto the secret list during Bush’s presidency.

I'd argue it's harder to abuse if it's done on device too, and it effectively will (if they implement E2E) make your photos in the cloud more secure - since there are tin foil hats here thinking folks are going to inject kiddy porn into their photos.

And if you mistakenly let someone have access to your phone while you go to the bathroom at a restaurant and they want to set you up - they visit a website with CP, press and hold on a few pics and save them to your Photos library (where they’re automatically uploaded to iCloud) and now you’re an offender and the government has a free path to prosecuting you.

And we've all seen just how "reliable" the FISA process is.

Ah, yes, the “rubber stamp” warrant, where the judge can’t even ask clarifying questions (if the topic is terrorism - not sure about CP) before rubber-stamp-signing it.
 
And if you mistakenly let someone have access to your phone while you go to the bathroom at a restaurant and they want to set you up - they visit a website with CP, press and hold on a few pics and save them to your Photos library (where they’re automatically uploaded to iCloud) and now you’re an offender and the government has a free path to prosecuting you.
What's to stop someone from doing that now?

Also, why would you go to the bathroom without your phone and also not have a password on it? Also, why are you hanging out with people that would try to frame you for CP?
 
Wait. I’m not understanding something. Can someone please explain this to me.

I'm about to become a father in 2 weeks. There are going to be pictures taken of my child, some clothed and some "nude". All parents take pictures of their babies just being babies.

So am I going to be arrested because I took a picture of my nude baby? Will I be considered a pedo because I took a pic of my own baby in the nude?

Please tell me I’m misunderstanding this whole thing because if that is the case then I’m burning my iPhone to ashes first thing in the morning.

Please can someone tell me I’m overreacting because I’m stupid and don’t understand this whole thing.

Thanks.
Hope you’re fine with random people viewing nude pictures of your baby… “just to be sure” they don’t constitute new CP.
 
Hope you’re fine with random people viewing nude pictures of your baby… “just to be sure” they don’t constitute new CP.
Are nudes of your new born baby in the database of known CSAM photos? If so, you have bigger things to worry about.
 
They have the capacity to scan every image on their cloud servers like every other cloud company. What's stopping them from scanning for anti-government images now?
It would be useful at this point to remind people that just coming up with a "cause" ("think of the children/CP!" in this case) doesn't mean we should accept it. There are far more "causes" than you have rights and privacy, and if you stand for it every time they make up a new one, you'll be out of privacy and rights long before they run out of "causes".
 
Are nudes of your new born baby in the database of known CSAM photos? If so, you have bigger things to worry about.
I'm amazed that people still don't get how this works.

BTW, I've been following these threads, and while I appreciate that you're still 'hanging in there', haven't you realized that this argument is futile? These threads are already dying, and I think we should let them. It's only the unwinnable arguments that are keeping them alive.
 
I'm amazed that people still don't get how this works.

BTW, I've been following these threads, and while I appreciate that you're still 'hanging in there', haven't you realized that this argument is futile? These threads are already dying, and I think we should let them. It's only the unwinnable arguments that are keeping them alive.
True, I guess I'm just hanging on to squash any misinformation. I noticed that so many people didn't read the actual documentation so they're just "guessing" at how it all works without really knowing. I'm just trying to keep the misinformation at a minimum.

I will let it die though. I've said enough.
 
But they could 😉

Isn't that what everyone is into these days anyway? The what-ifs and hypotheticals? 😉
You jest, but haven’t we seen basically everything privacy-related in the name of some extra security abused? It’s not as if people are simply imagining that “giving them an inch” turns into a mile…. Look at the warrantless surveillance of US citizens by the US government. If they’re willing to launch a fishing expedition on 300+ million people why *wouldn’t* they abuse any other tool they have access to?

Apple is not perfect. That is why projects have hypercare and bug fixes.
The one aspect Apple has yet to really explain is why client-side. Just basic risk analysis shows areas of high risk with this type of solution that Apple has not addressed, as far as we know or as far as they have indicated.

Hilarious anyone wants to act like Apple is beyond reproach on something like this. Apple, the “you’re holding it wrong” company…
 
True, I guess I'm just hanging on to squash any misinformation. I noticed that so many people didn't read the actual documentation so they're just "guessing" at how it all works without really knowing. I'm just trying to keep the misinformation at a minimum.

I will let it die though. I've said enough.
Yeah, let it go. It’s impossible to squash misinformation on a forum with so many active posters. Let the tin-foil hatters rage on while we enjoy our new Apple products.
 
The same people crying over CSAM are most likely the same people sharing their life details over Twitter, Facebook, TikTok, Snapchat, etc.

There's such a weird and strong desire to be afraid of or panicked about everything, coupled with hyper-outrage that leads people to lash out at anyone with a differing outlook.
 
Yeah, let it go. It’s impossible to squash misinformation on a forum with so many active posters. Let the tin-foil hatters rage on while we enjoy our new Apple products.
Seems to me that these days people enjoy the frustration caused by intentionally being unreasonable, making outlandish, crazy statements, and pushing a narrative just to upset others - more than ever before. There's a serious lack of middle-ground, non-extreme statements. It's either "What, you've got CP to hide?" or "APPLE HAS RUINED PRIVACY FOR EVERYONE! TIME TO LEAVE IT FOR ______" lol. It's also frustrating to me how many people join in with crazy statements who clearly haven't made much of an attempt to understand what's going on.

Thanks to a poster here, I watched a Stanford Internet Observatory talk on the subject and even I learned something new. Edit: (Me not being super smart).

But long story short - it makes it difficult to enjoy these forums :/, reminds me of what Facebook was like years and years ago when I left it.
 
What's to stop someone from doing that now?

Also, why would you go to the bathroom without your phone and also not have a password on it? Also, why are you hanging out with people that would try to frame you for CP?
Nobody was scanning my device or my iCloud for suspected CP before. Isn’t that what we’re discussing Apple implementing?

And the point isn't about how I should know all possible motives of anyone I'm ever around, which frankly is ridiculous and impossible; it's about how easily this could be used for nefarious purposes and how an innocent person could be set up. All it would take is you being up a ladder working or something and having to give your passcode to someone since you were indisposed so they could do something on your device, then them remembering it and using it to do what I described. Maybe someone who discovers their partner was cheating on them does it? It's not my job to keep coming up with scenarios, or for you to try and diminish them as if they're not realistic (they are). The fact it's so easy to come up with such hypotheticals means it, almost inevitably, *will* happen.
 
Are nudes of your new born baby in the database of known CSAM photos? If so, you have bigger things to worry about.
Between this iPhoto/iCloud scanning and iMessages scanning they’re also looking for NEW child porn, not just existing (was my understanding).
 
Yeah, let it go. It’s impossible to squash misinformation on a forum with so many active posters. Let the tin-foil hatters rage on while we enjoy our new Apple products.
You people are how we ended up with the ‘Patriot Act’ and warrantless blanket surveillance of everyone to begin with. Don’t act like you’re on the right side.
 
Even a genius like YOU?
I'm an old fart who is very far from a genius. That's why I usually keep my mouth shut. At least I can feign intelligence. :p <cough>.

I apologize; that statement was meant to say that even I learned something new, "even I" meaning me as an idiot (not a smart guy). (I modified the original post to make this clearer.)
 
Seems to me that these days people enjoy the frustration caused by intentionally being unreasonable, making outlandish, crazy statements, and pushing a narrative just to upset others - more than ever before. There's a serious lack of middle-ground, non-extreme statements. It's either "What, you've got CP to hide?" or "APPLE HAS RUINED PRIVACY FOR EVERYONE! TIME TO LEAVE IT FOR ______" lol. It's also frustrating to me how many people join in with crazy statements who clearly haven't made much of an attempt to understand what's going on.

Thanks to a poster here, I watched a Stanford Internet Observatory talk on the subject and even I learned something new. Edit: (Me not being super smart).

But long story short - it makes it difficult to enjoy these forums :/, reminds me of what Facebook was like years and years ago when I left it.

I have learned quite a bit over these last few days.
Has it changed my mind regarding Apple products? No. I have already been scaling back Apple due to other reasons. It does however give me another item to consider and keep an eye on.

Still a lot of questions regarding this direction taken by Apple.
Hopefully time will clarify “why”.
 
Trading posts with @Jayson A on another thread, this little tidbit tickled my couple of brain cells:

Apple is going to all this effort to try to keep CSAM off the iCloud. Yet, if I am reading things right, I can turn off Auto-Backup of Photos and just manually load the same photos onto the iCloud.

Doesn't that kind of defeat the whole prevention process?

I keep coming back to the “Why?” on Apple‘s design.
 
Trading posts with @Jayson A on another thread, this little tidbit tickled my couple of brain cells:

Apple is going to all this effort to try to keep CSAM off the iCloud. Yet, if I am reading things right, I can turn off Auto-Backup of Photos and just manually load the same photos onto the iCloud.

Doesn't that kind of defeat the whole prevention process?

I keep coming back to the “Why?” on Apple‘s design.
Are you talking about uploading them via the iCloud web app? I guess that would bypass the device-side scan, but do you really think they'd make it that easy to bypass? Of course, I'm speculating because I have no idea what they have installed on their servers. They could quietly scan everyone's photos on the server if they wanted since they hold the encryption key.
 
Between this iPhoto/iCloud scanning and iMessages scanning they’re also looking for NEW child porn, not just existing (was my understanding).
The iMessage thing is completely separate and works very differently. The iMessage system is opt-in by the parent, as a parental-control option, to ensure that any photos being sent from or received on the child's device don't contain nudity.

The way it works is: once an image has been downloaded to the phone, AI examines it and determines whether it's nudity, and if so it blurs the photo. Then if the child wants to look at it, they get a message saying it could be something unsafe, and if they proceed anyway, it warns them that the parent will be notified that the image was revealed. All of this happens only on device, with no way for Apple to know what the AI did or what the child received or viewed. The same thing happens when the child tries to send a nude picture: before the picture is sent, the AI attempts to intercept it and asks if they really want to send it, and so on.
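As a sketch of that flow (hypothetical types and names; the real feature is built into Messages and, as noted, never sends anything to Apple):

```swift
// Sketch of the opt-in Communication Safety flow described above.
// Types and names are hypothetical; everything runs on the child's device.

enum ChildChoice { case declined, viewedAnyway }

struct IncomingImage {
    let looksLikeNudity: Bool   // on-device classifier result; never leaves the phone
}

func handleIncoming(_ image: IncomingImage,
                    askChild: () -> ChildChoice,
                    notifyParent: () -> Void) {
    // Ordinary photos display as usual.
    guard image.looksLikeNudity else { return }

    // 1. The photo is blurred and the child is warned it may be sensitive.
    // 2. If the child chooses to reveal it anyway, the parent is notified
    //    (per the parental-control settings the parent opted into).
    if askChild() == .viewedAnyway {
        notifyParent()
    }
}
```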
 
Hilarious anyone wants to act like Apple is beyond reproach on something like this. Apple, the “you’re holding it wrong” company…
Indeed. This problem exists because Apple has screwed up badly for years, and instead of owning up to its failure(s), it has now pushed the problem onto its users by putting the scanner on the device.

Again, all of the people who have an issue with this are asking: why is Apple doing it in this manner?

Maybe this is a clue: https://www.nytimes.com/2020/02/07/us/online-child-sexual-abuse.html

2019 CSAM reporting (164 companies):
Facebook: 65,000,000
Google: 3,500,000
Yahoo: 2,000,000
Microsoft: >100,000
Twitter: >100,000
Dropbox: >100,000
Snap: >100,000
Apple: 3,000

This is all speculation on my part:
I think the child safety advocate groups were livid over the non-reporting by Apple and threatened to shame Apple publicly over it. In response, Apple rolled over and decided to go whole hog on invasive scanning (i.e. let's turn this around and be 'the best at CSAM reporting!!!' instead of the worst).

This is why the email from the NCMEC was so 'encouraging' to Apple to hold the line: the NCMEC knew Apple was crossing a threshold and that there would be a lot of opposition, but they aren't concerned about the ramifications of what Apple has implemented as long as they got their local CSAM scanning.

Again, look at how poorly Apple has done on this previously. Couldn't they have just scanned their cloud, like every other provider has done, in order to improve their terrible CSAM reporting? Instead, we get this abomination.

 