What I can’t figure out is WHY they are doing the checks on device if it’s only impacting images that go through Apple services like iCloud or iMessage. Why not just do the same check server-side and not add some random “backdoor”? All the cloud providers already do the same crap, so why are they moving the check onto devices --- are they really lacking compute that much?

In its current iteration, sure, it seems quite safe, but it seems pretty easy for China, for example, to pass a law where they control the hash list.

This is pure speculation on my part...

1. Apple doesn't want those images on their servers to begin with.

2. Transmitting those images to Apple's servers would be a separate crime the user has committed. Not that Apple would or should necessarily be concerned. But I have no doubt Apple's execs and legal team gamed this out in detail, likely with pressure from the government, and this is what they came up with.
 
It’s like trying to explain to someone that vaccines are good. People won’t understand things that negate their world view.
There’s a hard line and a big difference between known microbiology and theoretical microbiology.

It might be beneficial for you to explain to everyone your deep understanding of the accuracy rates for imaging things like viral particles (nanometer-scale objects) with the best available processes, like TEM.

You should probably also explain the actual capabilities of the research methods for the field in question, at the scale in question, and the exact (or best-estimated) margin of error for these best available tools and methods.

We will await your detailed findings in this area of study, and look forward to an impressive demonstration of your intellect regarding these basic capabilities/limitations of current microbiology/organic chemistry.
 
What I can’t figure out is WHY they are doing the checks on device if it’s only impacting images that go through Apple services like iCloud or iMessage. Why not just do the same check server-side and not add some random “backdoor”? All the cloud providers already do the same crap, so why are they moving the check onto devices --- are they really lacking compute that much?
I don't think it's a matter of compute. Even taking compute into the equation, this will cost most users maybe $0.01 per year --- it's some basic math performed only during upload to iCloud. If he's to be believed, Craig suggested that they had been working through a way to do this for a long time, and they finally found a way to increase user privacy while still being able to check for CSAM. It's a nuanced argument: scanning your entire library's actual photos in the cloud (Apple is looking at your actual photos in that case) vs. scanning locally (Apple is not directly looking at your photos) to create hashes to match against known CSAM cuts out a lot of Apple looking at your photos. iOS has already been scanning all of your photos for over a decade; now it also creates a hash during the upload-to-iCloud pipeline to check against CSAM hashes, so the only reason Apple would look at any of your photos (and even then, only a derivative) is if you are flagged for uploading >= 30 photos containing known CSAM.

In short: without this, Apple looks at ALL the photos you upload to iCloud; with this, Apple looks at derivatives of flagged photos only in the case that >= 30 of those photos are near-exact matches to known child porn. In other words, a lot less surveillance than what is currently taking place.
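To make the threshold logic concrete, here's a toy sketch in Python. This is my own illustration based on Apple's public description, not their actual protocol --- the real system uses private set intersection and threshold secret sharing, so neither the device nor Apple can even count matches below the threshold. Hash values here are made up.

```python
# Toy illustration of threshold-based flagging. The hash strings are
# hypothetical stand-ins; in Apple's real design the matching is
# cryptographically blinded, so nothing at all is learned about an
# account until the threshold is crossed.
KNOWN_CSAM_HASHES = {"a1b2c3", "d4e5f6"}  # hypothetical database of known hashes
THRESHOLD = 30                            # matches required before human review

def count_matches(upload_hashes):
    """Count uploaded photo hashes that match the known database."""
    return sum(1 for h in upload_hashes if h in KNOWN_CSAM_HASHES)

def eligible_for_review(upload_hashes):
    # Below the threshold, nothing about the account is surfaced at all;
    # at or above it, only derivatives of the matching photos get reviewed.
    return count_matches(upload_hashes) >= THRESHOLD
```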


In its current iteration, sure, it seems quite safe, but it seems pretty easy for China, for example, to pass a law where they control the hash list.
The algorithm's reliance on the intersection of hashes from multiple sovereign jurisdictions makes this a non sequitur.
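For anyone unclear on what that means in practice, a quick sketch (hash values made up for illustration): only hashes present in the databases of at least two separate jurisdictions would ever be eligible for matching, so a single government's additions go nowhere.

```python
# Hypothetical hash sets from two child-safety organizations in
# different jurisdictions (values invented for illustration).
ncmec_hashes = {"a1b2", "c3d4", "e5f6"}   # US database (NCMEC)
other_hashes = {"c3d4", "e5f6", "f00d"}   # second jurisdiction's database

# Only the intersection would ship on devices; "a1b2" and "f00d", each
# present in just one government's list, are excluded automatically.
deployed_hashes = ncmec_hashes & other_hashes
print(deployed_hashes)  # {'c3d4', 'e5f6'} (set order may vary)
```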

Giving them access to my data was not part of the terms of my original transaction.
Perhaps some good news: if you weren't using iCloud Photos before (implied by you not giving them access to your data), then you can continue to not use iCloud Photos and this system will not run in any way.
 
This is a distraction strategy run by PR to sell the idea that there is confusion.

This will continue until Tim has to step in and announce that it's dead and they are rethinking their strategy.
It is interesting that we have not heard anything from Tim Cook. Usually he's quite vocal about virtue signaling stuff.
 
There’s a hard line and a big difference between known microbiology and theoretical microbiology.

It might be beneficial for you to explain to everyone your deep understanding of the accuracy rates for imaging things like viral particles (nanometer-scale objects) with the best available processes, like TEM.

You should probably also explain the actual capabilities of the research methods for the field in question, at the scale in question, and the exact (or best-estimated) margin of error for these best available tools and methods.

We will await your detailed findings in this area of study, and look forward to an impressive demonstration of your intellect regarding these basic capabilities/limitations of current microbiology/organic chemistry.

Doctors Larry Brilliant and Anthony Fauci have decades of experience in infectious diseases, their transmission, and pandemics. Do you trust their views?
 
And how are hashes generated? Do you even understand? Or do you just look at Apple's explanation, see the buzzword "hash" and think it's all good?
They're generated from known database image entries whose content is known to the National Center for Missing & Exploited Children, so you would have to be a conspiracy theorist to think that people or governments will start hacking that database, when it's the same database that has been used by every single tech company since 2001. Where was this concern trolling for 20 years? Suddenly it's a concern if Apple does it?

There has been no talk of the database being hacked, nor any concern trolling about the risk of it being hacked, but suddenly, when Apple starts to implement this feature, the talk rises from folks who don't even know how the technology works and consider it some kind of government encroachment or backdoor.

The DB simply has hashes of known CSAM, not of Winnie the Pooh.
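For what it's worth, here's roughly the idea of a *perceptual* hash, as opposed to a cryptographic one --- a minimal difference-hash (dHash) sketch in Python using Pillow. Apple's NeuralHash is a neural-network-based perceptual hash, not this exact algorithm, but the principle is the same: visually similar images produce identical or near-identical fingerprints, and only the fingerprints get compared, never the photos themselves.

```python
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Minimal difference-hash: compares adjacent pixel brightness on a
    tiny grayscale thumbnail, yielding a 64-bit fingerprint."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())  # flattened row-major pixel values
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

# A photo's dhash() is compared against a database of hashes of known
# images; resizing or re-encoding the photo barely changes the hash.
```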
 
Since you don't believe me, this is from Apple's privacy chief, Erik Neuenschwander:



They're generated from known database image entries whose content is known to the National Center for Missing & Exploited Children, so you would have to be a conspiracy theorist to think that people or governments will start hacking that database, when it's the same database that has been used by every single tech company since 2001. Where was this concern trolling for 20 years? Suddenly it's a concern if Apple does it?

There has been no talk of the database being hacked, nor any concern trolling about the risk of it being hacked, but suddenly, when Apple starts to implement this feature, the talk rises from folks who don't even know how the technology works and consider it some kind of government encroachment or backdoor.

The DB simply has hashes of known CSAM, not of Winnie the Pooh.
Again, it is happening on your Apple device, scanning your personal photos, and again, you can’t trust their BS because you don’t have access to the source code and algorithm.

You are just trusting a company that used to call privacy a human right. Can you understand that all their rhetoric is just BS?
 
Again, it is happening on your Apple device, scanning your personal photos,
It used to happen on Apple's servers, where Apple would scan all of your personal photos. Now it happens only when you're uploading to iCloud: your device scans your photos (instead of Apple), generates hashes, and only allows Apple to see derivatives of your photos if you have >= 30 photo hashes matching known child porn in your upload.

and again, you can’t trust their BS because you don’t have access to the source code and algorithm.

You are just trusting a company that used to call privacy a human right. Can you understand that all their rhetoric is just BS?
To be fair, none of us has ever had their source code, so they could have full-scale surveillance running already, able to look for things a lot more easily than this convoluted algorithm does.
 
It is interesting that we have not heard anything from Tim Cook. Usually he's quite vocal about virtue signaling stuff.
Hmm, interesting observation. Cook is indeed the PR man when it comes to Apple privacy. Have they figured out that the more they discuss this in public, the worse it'll get for them? I think they might want to bury the PR on this and just proceed with it, hoping the public noise dies down.
 
Just the thought that every picture I add to the Photos app and every photo I take (a bird, a tree, a sunset) will get scanned and hashed in the background and verified not to be an illegal image makes me not even want to hold the iPhone in my hand. On iOS 15 it will feel toxic and stressful knowing all my images are always under scrutiny.
Take a picture of a turtle: instantly scanned and hashed. Picture of a flower: instantly scanned and hashed. How can anyone feel comfortable with that?
Geezus. What a way to poison a product.
 
It used to happen on Apple's servers, where Apple would scan all of your personal photos. Now it happens only when you're uploading to iCloud: your device scans your photos (instead of Apple), generates hashes, and only allows Apple to see derivatives of your photos if you have >= 30 photo hashes matching known child porn in your upload.

To be fair, none of us has ever had their source code, so they could have full-scale surveillance running already, able to look for things a lot more easily than this convoluted algorithm does.
It should stay like that --- on their servers --- not on my device. That’s the big problem.
 
It used to happen on Apple's servers, where Apple would scan all of your personal photos. Now it happens only when you're uploading to iCloud: your device scans your photos (instead of Apple), generates hashes, and only allows Apple to see derivatives of your photos if you have >= 30 photo hashes matching known child porn in your upload.

To be fair, none of us has ever had their source code, so they could have full-scale surveillance running already, able to look for things a lot more easily than this convoluted algorithm does.
Full-scale surveillance in the background? If so, then it'd be a super smart thing for them to divulge a little bit of how they've been doing that, wouldn't it? OMGoodness.
 
Full-scale surveillance in the background? If so, then it'd be a super smart thing for them to divulge a little bit of how they've been doing that, wouldn't it? OMGoodness.
Exactly this --- this whole spiel about user privacy has been a facade so that more people will buy into the system, just so they can slowly rip back the veil of what they've been doing all along: inspecting your selfies for MAGA hats.
 
Exactly this --- this whole spiel about user privacy has been a facade so that more people will buy into the system, just so they can slowly rip back the veil of what they've been doing all along: inspecting your selfies for MAGA hats.
Trust me, this is exactly how I'm cheating you. I love it.
 
Again, WTF? It's the SAME thing. If it's not ON your device, it's by definition OFF your device. And no one's saying you can't use it. If you don't like the terms, then use another cloud service that probably has the same terms. Best of luck to you.

👋
It’s not, and your inability to see the difference shows how open-mindedly you approach the topic.
 
Just the thought that every picture I add to the Photos app and every photo I take (a bird, a tree, a sunset) will get scanned and hashed in the background and verified not to be an illegal image makes me not even want to hold the iPhone in my hand. On iOS 15 it will feel toxic and stressful knowing all my images are always under scrutiny.
Take a picture of a turtle: instantly scanned and hashed. Picture of a flower: instantly scanned and hashed. How can anyone feel comfortable with that?
Geezus. What a way to poison a product.

Vote with your wallet. Easy.

Will you commit to doing that?
 
I don't think it's a matter of compute. Even taking compute into the equation, this will cost most users maybe $0.01 per year --- it's some basic math performed only during upload to iCloud. If he's to be believed, Craig suggested that they had been working through a way to do this for a long time, and they finally found a way to increase user privacy while still being able to check for CSAM. It's a nuanced argument: scanning your entire library's actual photos in the cloud (Apple is looking at your actual photos in that case) vs. scanning locally (Apple is not directly looking at your photos) to create hashes to match against known CSAM cuts out a lot of Apple looking at your photos. iOS has already been scanning all of your photos for over a decade; now it also creates a hash during the upload-to-iCloud pipeline to check against CSAM hashes, so the only reason Apple would look at any of your photos (and even then, only a derivative) is if you are flagged for uploading >= 30 photos containing known CSAM.

In short: without this, Apple looks at ALL the photos you upload to iCloud; with this, Apple looks at derivatives of flagged photos only in the case that >= 30 of those photos are near-exact matches to known child porn. In other words, a lot less surveillance than what is currently taking place.



The algorithm's reliance on the intersection of hashes from multiple sovereign jurisdictions makes this a non sequitur.


Perhaps some good news: if you weren't using iCloud Photos before (implied by you not giving them access to your data), then you can continue to not use iCloud Photos and this system will not run in any way.
It’s not about that. It’s about opening that Pandora’s box, and about how this system is going to “expand and evolve”. This is pure China stuff right here. And it sucks, because I love Apple. I even still have my Apple IIc. But unfortunately, this is the end of the line for me.
 
Trust me, this is exactly how I'm cheating you. I love it.
Correct response:

[attached image]
 
Honestly, the biggest news from today's article is that Apple requires >= 2 sovereign jurisdictions to agree about CSAM content for Apple to even consider matching against it. If true, this derails a lot of the arguments that it will be used by <insert regime here> for nefarious purposes. Apple messed up by not presenting this detail up front, as it's a huge reveal of one of the big layers of security built into the process.
That's actually much, much worse for me. Why would I want a foreign government reviewing my photos?
 
Just the thought that every picture I add to the Photos app and every photo I take (a bird, a tree, a sunset) will get scanned and hashed in the background and verified not to be an illegal image makes me not even want to hold the iPhone in my hand. On iOS 15 it will feel toxic and stressful knowing all my images are always under scrutiny.
Take a picture of a turtle: instantly scanned and hashed. Picture of a flower: instantly scanned and hashed. How can anyone feel comfortable with that?
Geezus. What a way to poison a product.
So I take it you have been outraged that iPhone AI has been scanning your private non-iCloud images to classify what types of images they are for the past 3 years? And were also outraged when Apple announced your phone will scan text in images?
Right?
 
It’s not about that. It’s about opening that Pandora’s box, and about how this system is going to “expand and evolve”. This is pure China stuff right here. And it sucks, because I love Apple. I even still have my Apple IIc. But unfortunately, this is the end of the line for me.
The issue with most of the posts like this --- "I used to love Apple, but not anymore" --- is: what was it that made you love Apple? Was it their "promise" of privacy before? If so, you should take a gander at what they're actually claiming to have done here. If you did trust them to some degree before, then look at it closely without an emotional response. If what they've done is as they've described, then this is an incredibly secure and privacy-centric way to provide iCloud Photos storage. Prior to this, all your photos were fair game for scanning. Now, it's only if you're uploading child porn.

The Pandora's box rhetoric is the most concerning, and is in my mind the reason why Apple keeps putting out PR about this --- people really don't understand what they're claiming to have done.

Again, this is only if what they're claiming is true --- they may have been lying to us all along...
 