A case could be made, without the outrage, that even iCloud data shouldn't be scanned, but that's not what the outrage is about. The primary objection of pundits and some users is that the scan occurs on the device rather than on iCloud.

So my question, or rather my confusion, is this:

If the only data available for Apple to scan on device is exactly the same as the iCloud data, why does it matter to users where the scan occurs? Especially since the scan is done by on-device AI and Apple is only contacted when there is a BULK of hashes on a device that match the ones on Apple's hash servers.
Users are objecting that the scan isn't occurring on servers. Well, it would be the exact same data as the data available to scan on the iPhone. All the other data is still locked away from Apple.
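
To make that concrete, here's a rough sketch in Swift of what threshold-based matching could look like. It's purely illustrative: the type and function names are mine, a plain SHA-256 digest stands in for Apple's NeuralHash, and the 30-image threshold is just the figure that has been reported.

```swift
import Foundation
import CryptoKit

// Rough sketch only: the names and logic here are my own, not Apple's actual
// CSAM-detection code. It illustrates the idea above: each photo queued for
// iCloud upload is hashed, the hash is checked against a database of known
// hashes shipped to the device, and nothing is flagged unless the number of
// matches on this one device crosses a threshold.

struct UploadScanner {
    let knownHashes: Set<String>   // database of known-image hashes (assumed here to be a simple string set)
    let threshold = 30             // roughly the figure that has been reported publicly

    // Stand-in for a perceptual hash like NeuralHash; SHA-256 is used here
    // only so the sketch compiles and runs on its own.
    func hash(of photo: Data) -> String {
        SHA256.hash(data: photo).map { String(format: "%02x", $0) }.joined()
    }

    // True only when a bulk of photos on this device match the database;
    // below the threshold, no one is ever contacted.
    func shouldFlagAccount(photosQueuedForUpload: [Data]) -> Bool {
        let matchCount = photosQueuedForUpload
            .map { hash(of: $0) }
            .filter { knownHashes.contains($0) }
            .count
        return matchCount >= threshold
    }
}

// Usage: with an empty database, nothing can ever match.
let scanner = UploadScanner(knownHashes: [])
print(scanner.shouldFlagAccount(photosQueuedForUpload: [Data("holiday photo".utf8)]))  // false
```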

Another question I would ask is why the outrage is aimed at Apple specifically for doing photo-hash-only scanning when, in fact, Google scans private emails and all other online content for more categories than CSAM, Microsoft scans all online storage for more than CSAM, Amazon scans private data on all its online drives, and FB and Twitter scan private DMs and Facebook Messenger chats for CSAM. So why is the outrage fixated on Apple?
Apple is not only scanning for a lot less, but they have also said they will not expand the category of what they scan or who they give the scans to. You could always argue you don't trust Apple, but if that's the case -- if you don't trust Apple -- then NO tech in the mainstream industry is better for you.

These questions above are different from why they are scanning to begin with, but about that I have a question as well: if Apple is not allowed to scan anything, how does it stop CSAM, which is the worst of society?
You know the answer already; you've been told several times. We don't like scanning on device: it has access to everything, not just iCloud content, and it can be expanded to scan anything. Apple can no longer use the excuse that it can't be done -- because it can, and quite easily. None of the other actors scan on device.
 
Getting the Pegasus software isn't easy.

Also, why not just post the picture to social networks? They will report it and it could be devastating for their public image.

Why go the roundabout way when you can post it to Facebook, Instagram and Twitter?
Why? Because I do intend to destroy your personal image and life: I get Pegasus (assuming I get the opportunity), load it on your phone, load 35 of these images onto your phone, activate iCloud, and away you go. You owned the images, so you are guilty.

Not panicking! But it can be done...
 
You can do the exact same thing today. It has nothing to do with this new hashing system.
 
I apologize for the cross-post but I thought this thread would find value in a recently revealed internal meeting transcript:

FBI: Help us break into iPhones.
Apple: No.
FBI: Do it.
Apple: We don’t have the keys.
FBI: Make an iOS version that we can use on the phone to bypass the keys.
Apple: No.
FBI: Terrorism.
Apple: No.
FBI: Child porn.
Apple: No.
FBI: We’ve arrested Jonah Federighi. He’s looking at 30 years on federal drug charges.
Apple: We’re listening.
FBI: We found a few cases of child porn being stored on iCloud. We are prepared to prosecute you for allowing it.
Apple: There is no such legal requirement.
FBI: That will change after televised hearings.
Apple: What do you suggest?
FBI: Search everyone’s phone for image violations and report offenders to us.
Apple: No one would tolerate that policy.
FBI: Do it quietly under the guise that images uploaded to your servers require additional scrutiny.
Apple: But we are charging people for that storage space and allow privacy in how people use it.
FBI: Do the analysis on the phone, not the servers.
Apple: They’re not our phones. We sold them to people. You have to have a warrant to search each one.
FBI: An inconvenient truth. If YOU do the search, no warrant is needed.
Apple: You really think this will make a difference to protect children?
FBI: Of course not. Offenders will use other means. This is about our political power and looking like we’re doing something. It’s an opportunity for you as well. It will help sell phones. It will help us down the road.
Apple: Down the road?
FBI: Once you enable a reporting tool behind the encrypted phone, we’ll be able to ask for other things later and you will no longer be able to say it’s not technically possible.
Apple: Sneaky.
FBI: We’re the government, and we’re here to help.
Apple: We’ll report only if 100 images match up to known hashes.
FBI: 5.
Apple: 50.
FBI: 10.
Apple: 40.
FBI: 20.
Apple: Let me talk to Tim.
FBI: OK.
Apple: 30. And we want all military personnel to be given iPhones.
FBI: Deal.
Apple: We’ll make a low key announcement and jumble it in with blocking nudity on kids phones in iMessage. No way anyone is going to object and stand up for pedophiles. We’re in.
FBI: Great. We’re working now with child protection services on the hashes. You can pick up Jonah at 3. As a sign of good faith, we will give you full access to a captured Tic-Tac.
Apple: The UFOs spotted by military planes over the ocean? You have one?
FBI: We do. But we don’t understand it. We could use your help with reverse engineering.
Apple: That won’t be necessary.
FBI: Why is that?
Apple: It’s one of our prototypes.
 
Can I make up fake conversations too?
 
So he wasn't even discussing point 5 ("Then Apple looks at it and decides what to do next. They either verify that it's CSAM or they discard it if it somehow got through all the other checks."), then.

Apple uses two hash technologies. It seems he wasn't looking at the last one.
That's totally wrong; that was yesterday. By this afternoon Apple has three and a half technologies for smoking hash, and right at this very moment a group huddle of PR people is doing mass quantities, trying to figure out the next part of the narrative.
Can I make up fake conversations too?
Yes, please go ahead, perhaps we can ask for a creative-writing forum and share plots!

As a reference point: for the past decade-plus that I've been participating here, this has been a super-useful resource for Mac Pro esoterica -- for me anyway, which is primarily where I used to engage -- and otherwise an echo chamber for how much we all love Apple. I believe the level of ♥️ and trust has significantly decreased, to the point where sentiment here is nearly identical to what I'm reading on Ars Technica, Reddit, and Hacker News. I think when Apple is being called out by the ACLU ... it kinda points to the fact that they've totally lost control of their own narrative and are hitting a new low. I don't see how any creative contributions here could be worse than what their PR department has already done.

It was a dark and stormy night...
 
I'm out.

Go ahead and think what you wanna think.

Apple has done nothing wrong, regardless of whether you think they will abuse it.

Until there's proof that they're spying on you, there is no problem.
 
Do you use that same standard for the government’s warrantless spying on US citizens?
 
Why? Because I do intend to destroy your personal image and life: I get Pegasus (assuming I get the opportunity), load it on your phone, load 35 of these images onto your phone, activate iCloud, and away you go. You owned the images, so you are guilty.

Not panicking! But it can be done...

A. But wouldn't you achieve the same thing by posting the image to Facebook, Instagram and Twitter? Immediately all their family, friends, colleagues and others will know about it, and Facebook and Twitter will discover it, close the account and report it.

B. If you put it in the iCloud Photo Library it might not be revealed to family and friends.

A results in immediate public shaming and being reported.
B results in being reported and maybe shaming later.

A seems much worse, so why go exclusively for B?
 
FBI: Once you enable a reporting tool behind the encrypted phone, we’ll be able to ask for other things later and you will no longer be able to say it’s not technically possible.

Things like this have never been technically impossible when you control the software. It might require making changes to the software, but that isn't an impossibility.

You seem to confuse

A. Technically impossible

and

B. Technically impossible unless changes to the software are made

The cases between the United States and Apple were about B. Apple could have done what the government wanted them to do. It's just that Apple didn't want to do it and felt the government didn't have a good legal case to force them. The 16 or so cases weren't really pushed through the legal system; the United States withdrew all of them, or chose not to appeal the one it lost.

So almost nothing is impossible if Apple wants it.
 
That's totally wrong; that was yesterday. By this afternoon Apple has three and a half technologies for smoking hash, and right at this very moment a group huddle of PR people is doing mass quantities, trying to figure out the next part of the narrative.

So it's even worse? He only covers one and is missing 2.5?

If Apple has 3.5 methods why doesn't the video discuss those 3.5 methods?
 
All it would take is you being on a ladder working or something and having to give your passcode to someone since you were indisposed, so they could do something on your device, then them remembering it and using it to do what I described.
Hahahaha... That's hilarious. "on a ladder working..." I LOVE IT!!!
This is absolutely ludicrous.

Maybe someone who discovers someone was cheating on them does it? It’s not my job to keep coming up with scenarios, or for you to try and diminish them as if they’re not realistic (they are). The fact that it’s so easy to come up with such hypotheticals means it almost inevitably *will* happen.
Hey, at least you've come up with a more real-life possibility with the revenge motive. In fact there are women who accuse men of P to stop them from seeing their children. That is a FACT. So I don't put it past somebody to want to do that, BUT it would mean they would then have to have the photos on THEIR device, which would then be easily traced back to them.

None of this setting up is going to happen to normal people. I wish people would stop thinking they are important. 99 percent of people are plebs and we are inconsequential. This isn't going to happen to normal folk. If this nefarious plot to slander somebody with CP is going to happen, it's gonna be against a politician or titan of industry etc. And even then I'd imagine it's going to be incredibly difficult to set up.
 
A. But wouldn't you achieve the same thing by posting the image to Facebook, Instagram and Twitter? Immediately all their family, friends, colleagues and others will know about it, and Facebook and Twitter will discover it, close the account and report it.

B. If you put it in the iCloud Photo Library it might not be revealed to family and friends.

A results in immediate public shaming and being reported.
B results in being reported and maybe shaming later.

A seems much worse, so why go exclusively for B?
Yes, you are right for the images we are talking about here. But it does not justify the auto-scan feature Apple wants to sell us as "on device". It nevertheless opens an unwanted backdoor, which can be activated for whatever content if an authority forces Apple to "obey country-specific regulations".
 
Hey, at least you've come up with a more real-life possibility with the revenge motive. In fact there are women who accuse men of P to stop them from seeing their children. That is a FACT. So I don't put it past somebody to want to do that, BUT it would mean they would then have to have the photos on THEIR device, which would then be easily traced back to them.

Huh? No they wouldn’t. They could use that person’s phone to visit a thing called a ‘website’ that contains said images and simply download some of them.

None of this setting up is going to happen to normal people. I wish people would stop thinking they are important. 99 percent of people are plebs and we are inconsequential. This isn't going to happen to normal folk. If this nefarious plot to slander somebody with CP is going to happen, it's gonna be against a politician or titan of industry etc. And even then I'd imagine it's going to be incredibly difficult to set up.
Thanks, but I’d rather not set things up so I can be framed for CP. Especially for a “feature” that does nothing about CP content or users other than to keep it off iCloud (which is already happening). Why does anyone think this tactic will stop actual CP users or content producers? Go after them and leave the rest of us alone.
 
Huh? No they wouldn’t. They could use that person’s phone to visit a thing called a ‘website’ that contains said images and simply download some of them.
So somebody is gonna go on another person's phone (which they'd have to know how to unlock for starters) and then KNOW where to find those photos. Photos which pretty much only exist deep, deep down on the dark web that normal folk have absolutely no idea about. They'd have to research it on... their phone or computer!

Thanks, but I’d rather not set things up so I can be framed for CP. Especially for a “feature” that does nothing about CP content or users other than to keep it off iCloud (which is already happening).
Set things up?!?! Haha. Your paranoia is hilarious. Nobody is going to frame you. Your sense of self-importance is truly stunning, to think that somebody would even attempt to try this on you. Has anybody tried anything like this on you up until this point? I reckon that's a big, fat No! So what's gonna change?

Why does anyone think this tactic will stop actual CP users or content producers? Go after them and leave the rest of us alone.
Now this we can agree upon.
 
Well, you got the answer to your question, I think! A huge amount of technical ignorance and fear is on display here, as in all the other discussions on this. The ‘outraged’ seem to fall into one of these categories:

“Apple has no right to look at photos on my device! If they start ‘scanning’ my private photos, who knows where it will end?”
My reply: Apple has been using on-device AI (e.g. face recognition) to analyse the content of your photos for years. If you don’t trust them now, why did you before? (FYI, hashing an image doesn’t involve AI at all. Correction: Apple's NeuralHash system does use AI to recognise matching images that might have undergone minor changes, like being 'slightly cropped or resized'.)

“This is a huge violation of my privacy! Apple has betrayed us all after telling us they take privacy seriously!”
My reply: Only if you store child pornography. For everyone else, nothing private gets shared at all. Your photos don’t even get analysed by Apple; they just get a meaningless string of characters to compare with those in the database, and anything short of an exact match is ignored. Even if they wanted to convert those hashes back to the original images, they can’t. That’s the whole point of a one-way hash.
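
To illustrate the "meaningless string of characters" point, here's a tiny self-contained Swift sketch (my own illustration, not Apple's implementation, with an ordinary SHA-256 digest standing in for NeuralHash). The comparison is a simple yes/no membership test on the digest, and the digest can't be run backwards into the photo.

```swift
import Foundation
import CryptoKit

// Illustrative only; the names are mine, not Apple's. What gets compared is a
// short fixed-length string derived from the photo, never the photo itself.

func digest(for imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// A pretend database of known digests (in reality shipped with the OS).
// This entry happens to be the SHA-256 of empty data, so the demo below matches.
let knownDigests: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

func matchesKnownImage(_ imageData: Data) -> Bool {
    // Yes/no membership test; a non-matching photo reveals nothing at all,
    // and there is no way to reconstruct an image from its digest.
    knownDigests.contains(digest(for: imageData))
}

print(matchesKnownImage(Data()))                          // true  (matches the pretend database)
print(matchesKnownImage(Data("holiday photo".utf8)))      // false (nothing is revealed)
```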

“This opens a back door for the bad guys to get all sorts of information from my device!”
My reply: No it doesn’t. It only allows matching of images against a database of known CSAM images. The final step in the process is human review by Apple to guard against the extremely unlikely event of a false positive.

”Oppressive regimes could potentially use this to find and abuse dissidents.” This is about the only valid concern I’ve heard, but it would require them to (1) find a way to add other non-CSAM images to the database, and (2) convince Apple to hand over the matching records. Apple has responded by making clear the process of maintaining the database, and categorically stating that they will refuse all such requests. Based on their solid record in the past (and the absolute PR disaster that such a failure would be) I see no reason to suspect that they would lie about this.

So basically, we are forced to place some trust in the people at Apple, but that has always been the case. For the record, I certainly don’t agree with everything Apple does, but personally, I don’t understand all the anger over this one, and the personal attacks on Apple leadership. Remember, these are people like you and me, with families, and social consciences. Before you join the lynch mob over your false sense of privacy violation, why don’t you pause to consider the violations of human rights that could be avoided if all tech companies took their responsibilities to society seriously.
 
Can I make up fake conversations too?
*sigh* Satire is lost on you, is it?

A_Joke_Son.jpg
 
But Apple has always scanned iCloud content when they handed it over to police and the feds. Where was this same outrage then? In this case they are moving the scanning of the exact same data onto the device with on-device AI. The content being scanned is the same, both when they shared iCloud content with police before and tomorrow when they let AI do the scan on the device.
Maybe you weren’t listening to the outrage? https://www.eff.org/deeplinks/2016/02/apple-court-dont-let-fbi-failure-undermine-global-security https://fixitalready.eff.org/apple/#/
 
To me, and judging from all the comments the same goes for others, it is less about looking for these pictures of children and more about what else Apple and others will start looking for on your phone using similar technology.
I also have negative thoughts about Apple going from a technology company to a tool for policing people's bad behavior.
 