This helps, thanks! This is all I wanted clarification on; too bad it took so long. The idea might be preposterous, but as a developer these details help me understand how something works.

The scenario and Apple's claims seemed contradictory. Claiming that any modification would still result in a match, but that a merely similar picture wouldn't, sounds contradictory. Take a bathtub picture, for example: a legitimate photo versus an offending one, but with the same angle and style.
But that is the one point that is so clear in the white paper and that I was having difficulty getting you to understand… the hash method is so precise that the odds against simply taking a pic of a baby in a bath and having it match a database image are astronomically high, much less matching more than one, so it shouldn't be an area of concern.

The reason people on this forum aren't accepting this is that they either don't understand what one in one trillion really means or simply don't believe what Apple is saying about the odds. I can't really help with either of those…much… (outside of the comical, yet true, examples of odds in other cases. 🤣)
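
To give a feel for what odds on that scale even mean, here's a quick back-of-envelope sketch (my own illustrative numbers, not Apple's published analysis; Apple's one-in-a-trillion figure is about a whole account crossing its match threshold, not a single image). It computes the chance that two completely unrelated hashes, treated as random n-bit strings, land within a small bit distance of each other:

```python
# Back-of-envelope sketch: how likely is it that two unrelated images, whose
# perceptual hashes behave roughly like random n-bit strings, end up within
# Hamming distance d of each other? n = 96 and the distances below are
# illustrative assumptions, not Apple's actual parameters.
from math import comb

def random_collision_probability(n_bits: int, max_distance: int) -> float:
    """P(two independent uniform n-bit hashes differ in <= max_distance bits)."""
    return sum(comb(n_bits, k) for k in range(max_distance + 1)) / 2 ** n_bits

for d in (0, 5, 10):
    p = random_collision_probability(96, d)
    print(f"within {d:2d} bits: about 1 in {1 / p:.2e}")

# Even allowing 10 bits of slack, a single random pair collides with
# probability on the order of 10**-16. Matches are supposed to come from the
# hash being designed to stay close under edits to the same picture, not
# from random chance.
```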

 
That’s why they implemented CLIENT-SIDE analysis, so no data leaves your phone and Apple doesn’t access your photos.

You prefer server-side analysis where all your encrypted iCloud photos need to be decrypted first???

They're the first company to implement a privacy-oriented solution for keeping illegal CSAM content off their servers, and everybody complains. Go figure.
But that means Apple is “on my phone”…oooooooooo 🤣
 
I think it's too late. The mob has already made up its mind. For what it's worth, I think the idea behind the implementation is pretty sound and is being done for the most respectable of reasons, and ultimately, if you don't want it, you don't use iCloud to store your photos. I understand the outrage, though, because of the scope for abuse… and the fact that today it's just looking for CSAM images, but who knows what it could be used for in the future? But with all this power, surely they have some duty of responsibility to look out for those who are incapable of defending themselves.
Those with more awareness and experience know that it is not a question of "what if" it gets used for something different in the future; they know that the only reason Apple is doing this in the first place is to set the stage and pave the way for something else down the line.
 
But that is the one point that is so clear in the white paper and that I was having difficulty getting you to understand… the hash method is so precise that the odds against simply taking a pic of a baby in a bath and having it match a database image are astronomically high, much less matching more than one, so it shouldn't be an area of concern.

The reason people on this forum aren't accepting this is that they either don't understand what one in one trillion really means or simply don't believe what Apple is saying about the odds. I can't really help with either of those…much…

But then Apple throws a wrench in the concept by stating that any modifications would still get flagged. It is such a contradiction to say the image needs to match in such great detail, and then say that if it's modified it will still get flagged. What about a similar bathroom, a similar angle of the picture, things like that? I don't feel like the false positive rate is as good as they state if I can fire up Photoshop, replace the subject with the Hulk, and it still gets flagged. That is the discrepancy that is confusing.
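
To make the distinction concrete, here is a toy difference-hash (dHash) sketch. To be clear, this is a generic perceptual hash, not Apple's NeuralHash, and the file names are placeholders. The idea behind Apple's claim is that a perceptual hash encodes the coarse structure of one specific image, so re-saving, resizing, lightly cropping, or recoloring the same photo barely moves the hash, while a different photo of a similar scene has its own structure and lands far away. Whether NeuralHash really achieves that separation in practice is exactly what the collision examples later in this thread call into question.

```python
# Minimal difference-hash (dHash) sketch: a generic perceptual hash,
# NOT Apple's NeuralHash. File names below are placeholders.
from PIL import Image

def dhash(path, hash_size=8):
    """Shrink to (hash_size+1) x hash_size grayscale, then record whether each
    pixel is brighter than its right-hand neighbour: a 64-bit fingerprint of
    the image's coarse gradient structure."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a, b):
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")

# An edited copy of the SAME photo (recompressed, resized, brightened,
# lightly cropped) keeps the same coarse gradients, so it usually stays
# within a few bits of the original. A DIFFERENT photo of a similar scene
# (same bathroom, same angle) has its own gradients and typically lands
# dozens of bits away on a 64-bit hash.
original = dhash("bath_photo.jpg")           # placeholder file names
edited   = dhash("bath_photo_edited.jpg")    # edited copy of the same photo
similar  = dhash("other_bath_photo.jpg")     # different photo, similar scene
print(hamming(original, edited))    # small distance -> treated as a match
print(hamming(original, similar))   # large distance -> not a match
```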
 
Look, in the end Apple had the misfortune of introducing this right after the revelations about the abuse of the NSO Pegasus spyware. No wonder people are skeptical. By the way, what are you eating for lunch? 😏
I'd LIKE to say that I ate something exotic on the deck of a yacht while hanging out with Kanye in the Azores living my best life, instagram, selfie, besties, like whatever, etc. BUT......I had a boring ole turkey sandwich with side of chips and an apple while watching the local news on TV. Oh well.
 
Under that assumption, they could do whatever they want behind closed doors already. Hell, why even publish this stuff about CSAM, if they could just do it without telling people, as you say?
There is an answer to that, and it is quite complicated. The short version is: there is indeed a public and private version of reality.
 
It's about bashing Apple. No one yells at Google for their Gmail scanning for child sex abuse imagery. But Apple is for some reason different.
Well, Apple is different. They are the ones pretending to be concerned with privacy at all costs, and they have millions upon millions of followers who actually believe it, for no other reason than that they say so.
 
How long before the USA and/or China demands that Apple start scanning for content other than CSAM on all iPhones (not just the ones with iCloud Photo Library enabled), or it won't be allowed to sell devices in those markets? These two countries represent two-thirds of Apple's revenue and therefore have a lot of leverage over the company.
In my opinion, they already have, and already do.
 
Proponents of this system should think about the missing question in this FAQ: “Is the CSAM system effective?”.

The answer is “No”. It doesn’t prevent children from being abused. It doesn’t prevent the creation of new CSAM. It doesn’t even prevent CSAM from being shared.

Someone in possession of CSAM would simply disable iCloud Photos, which also disables the detection system. They can still share it using websites and apps, and they can even spread it through iCloud Drive by encrypting it before uploading.

In short, Apple introduces a system which is known beforehand to be completely useless for its intended purpose. So what is Apple’s real intention?
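
The "encrypt it before uploading" point is easy to illustrate. Here is a small sketch of my own (it assumes the third-party cryptography package and uses a placeholder file name): once a file is encrypted on the client, the uploaded bytes share nothing recognizable with the original, so no hash computed on the server side, cryptographic or perceptual, could ever match a database entry.

```python
# Sketch: why client-side encryption before upload defeats server-side
# hash matching. Assumes the third-party "cryptography" package;
# the file name is a placeholder.
import hashlib
from cryptography.fernet import Fernet

with open("some_photo.jpg", "rb") as f:      # placeholder file
    plaintext = f.read()

key = Fernet.generate_key()                  # key never leaves the uploader
ciphertext = Fernet(key).encrypt(plaintext)  # this is all the server sees

# Every byte of the ciphertext differs from the original, so any hash the
# host computes over it matches nothing in any database.
print(hashlib.sha256(plaintext).hexdigest())
print(hashlib.sha256(ciphertext).hexdigest())
```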
 
You want to see two pictures that do match on the hash? Here you go.

[Attached images: perchash_target.jpg and perchash_source.jpg]


Hey Mr. Cook, we got a match!
Yes, I am aware that both are NOT in that database, but they obviously match. So: if it is not allowed to have pictures of women in a white shirt, do not keep that butterfly in your library...

Pictures originate from:

If publishing the pictures themselves here is a problem, please remove this post.
 
I went back and read the previous comment more carefully; as you can understand, the first and most urgent part to answer was the one in which I was called out for being factually inaccurate.

I’ve read the analogy (yet to read the linked blog post) and it doesn’t make much sense honestly.

Apple is already inside your sacred space with both feet; they're already sleeping on your sofa. Since the OS is not open source, you don't know what's really happening under the hood.

They will occasionally ask you to participate in initiatives that are not in your direct interest, like submitting error telemetry (you can opt out of this with no loss of functionality).

They will for sure update the terms and conditions to ask for your consent to this too. This time, opting out will cost you access to their iCloud Photos service (you will still be able to back up your precious photos to iCloud Backup, from what I've gathered), but what are you gonna do: this time the stakes are pretty high, namely whether or not CSAM is hosted on their servers. Of course it can't be opt-out for people who want to store stuff on their (Apple's) property. But as far as what happens within the (not so) sacred space of the local device, they will just stick an unreadable post-it to the fridge, nothing more. That post-it note will sit there doing nothing and remaining unreadable. It's just a technicality like a thousand others, nothing much to get triggered about or to equate to unrelated examples.

As for "why are companies and banks sometimes asked by the state to police and report their customers", that's how it works. You can't bring €1M in cash to a bank, and you can't upload CP to cloud storage hosts. The fact that Apple is using a workaround that makes finding CP even more privacy-minded is frankly just a technicality. (And you can live a perfectly normal digital life by uploading those pics to another service, which is a bit different from the "you can't leave your home with the stolen items, period" example.)

Thank you, and you bring up a different framing of my scenario which helps somewhat with my perspective.

(Sidenote - I'm not the one that asked about banks, but agree with you on that.)

That said... Yes they're already in my space. And no, I don't know everything they're doing in there. I don't worry about that. I can only go off what they tell us and trust what they tell us based on their track record (one of your previous points if I recall). Also, as you stated, other initiatives have had the option to opt out with no functionality loss.

But to date, they haven't used my space in this way. I can't think of anything they're knowingly doing in "my space" that behaves like this.

I'm not even worried about the "slippery slope" or conspiracy theory based concerns. They're simply using my space to do more than they have in the past, and telling me they're going to put an unreadable sticky note on my fridge without my permission, and it will stay unreadable as long as I stop using one of their better services. I have nothing to worry about with that sticky note. But I like my fridge how I like it - don't put a sticky note on there without me being ok with it, just because someone else deserves to have that sticky note clutter up their fridge.

That's the mentality I think many are coming from, for better or worse... Is it subtle? Sure. Is it too dogmatic? Maybe. But it's a legitimate thought process.
 
Just because they claim to and make it part of their marketing does not mean they actually do.
Apple does care about their users' privacy in democracies. It's just that they shot themselves in the foot by introducing CSAM detection in the US.

There would be less of an outcry if Apple rolled this out in an authoritarian state such as China.
 
It's about bashing Apple. No one yells at Google for their Gmail scanning for child sex abuse imagery. But Apple is for some reason different.
Google does not install spyware on your device that sifts through your files. Everybody understands that if you choose to upload data to the cloud (i.e. someone else's computer) without first encrypting it, they can access it and have the right to keep illegal material off of their servers. What Apple has implemented is fundamentally different and new. Your own device is now treating you like a suspected criminal and may report you if Apple's secret algorithm thinks it sees something "suspicious". And yes, if they expand (or are forced to expand) the use of the infrastructure they built, it can absolutely undermine end-to-end encryption.

Apple's claims that they will "not accede" to government demands are ridiculous. Of course they will accede when either forced by law or threatened with severe financial repercussions. We have already seen that in China, for example.
 
You want to see two pictures that do match on a possible hash? Here you go.

[Attached images: perchash_target.jpg and perchash_source.jpg]

Hey Tim, we got a match!
Yes, I am aware that both are NOT in that database, but they obviously match.

Pictures originate from:

If publishing the pictures themselves here is a problem, please remove this post.

Oh great, thank you! This was my worry, but you put pictures to my thousands of words. Exactly this: an image can be a match even if it's unrelated. Especially with Apple's claim that modifications to the images will also be a match, this is even more concerning.
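
For anyone who wants to try this kind of check themselves, one common route is the third-party Python imagehash package. This is my own sketch, with the attachment file names from the quoted post used as placeholders; it is not NeuralHash and not necessarily what the poster above used, but it shows the general recipe: hash both pictures and look at the bit distance between them.

```python
# Sketch: checking whether two arbitrary pictures "collide" under a common
# perceptual hash. Assumes the third-party Pillow and imagehash packages;
# the file names are placeholders for the two attached pictures above.
from PIL import Image
import imagehash

h1 = imagehash.phash(Image.open("perchash_target.jpg"))
h2 = imagehash.phash(Image.open("perchash_source.jpg"))

distance = h1 - h2      # Hamming distance between the two 64-bit hashes
print(h1, h2, distance)
if distance <= 5:       # illustrative threshold, not Apple's
    print("These two pictures look like 'the same image' to this hash.")
```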
 
Proponents of this system should think about the missing question in this FAQ: “Is the CSAM system effective?”.

The answer is “No”. It doesn’t prevent children from being abused. It doesn’t prevent the creation of new CSAM. It doesn’t even prevent CSAM from being shared.

Someone in possession of CSAM would simply disable iCloud Photos, which also disables the detection system. They can still share it using websites and apps, and they can even spread it through iCloud Drive by encrypting it before uploading.

In short, Apple introduces a system which is known beforehand to be completely useless for its intended purpose. So what is Apple’s real intention?

I do wonder about this. That doesn't mean we shouldn't do anything possible to slow down / hinder / catch child predators. But if it can be turned off anyway (unlike server side scanning), how much does it truly help? Doesn't that essentially make it somewhat of an opt-out feature from the criminal's mindset?
 
But that means Apple is “on my phone”…oooooooooo 🤣

🤣

Breaking news: Apple is inside our iPhones! And we can't arbitrarily "remove code" as we like from an OS we only have a license to use and do not own! And maybe Apple will decide to add a couple more megabytes (for the hashes database) to the "System" storage space on the iPhone! Shocking!

Some people…
 
I have to say, I find it really disconcerting that this is the issue where suddenly so many people think Apple is overstepping. A million things you let go, but Apple decides to start doing what every other major provider does and stop allowing iMessage to continue to be a safe haven for child porn... and that generates outrage?

The reaction alone tells me this is probably something Apple really does need to do.
 
Proponents of this system should think about the missing question in this FAQ: “Is the CSAM system effective?”.

The answer is “No”. It doesn’t prevent children from being abused. It doesn’t prevent the creation of new CSAM. It doesn’t even prevent CSAM from being shared.

Someone in possession of CSAM would simply disable iCloud Photos, which also disables the detection system. They can still share it using websites and apps, and they can even spread it through iCloud Drive by encrypting it before uploading.

In short, Apple introduces a system which is known beforehand to be completely useless for its intended purpose. So what is Apple’s real intention?

I'm honored we have the Interpol director here on the forum; I didn't know.

Good evening sir.
 
I have to say, I find it really disconcerting that this is the issue where suddenly so many people think Apple is overstepping. A million things you let go, but Apple decides to start doing what every other major provider does and stop allowing iMessage to continue to be a safe haven for child porn... and that generates outrage?

The reaction alone tells me this is probably something Apple really does need to do.

Well, I don't think that's fair. I don't use Facebook or Twitter either, for other privacy reasons.
 
I do wonder about this. That doesn't mean we shouldn't do anything possible to slow down / hinder / catch child predators. But if it can be turned off anyway (unlike server side scanning), how much does it truly help? Doesn't that essentially make it somewhat of an opt-out feature from the criminal's mindset?

Also, a criminal is probably going to use a fake name, with a fake address and an iTunes gift card for payment. Does Apple even verify ID for accounts?
 