The benefit is to society itself, as some of the people who store such images, and who are stupid and/or sloppy enough to keep them on iCloud, will be exposed.

Apple should not be trying to solve society's ills in this way. It's odd that they are suddenly interested in this, given their past hesitations about other crime-related issues.

I'm not convinced - at all - that the benefits outweigh the costs, the implications, and the potential ways this could go badly down the line, particularly with feature expansions and Apple being compelled to implement such technologies for nefarious or subjectively beneficial goals.
 
In that example the kid isn't the problem; the parents are. We need real-world consequences for parents who allow their kids to be exposed to stuff like this. They put their child in danger by giving them a device and not monitoring how it was used.

But I'm sure people will cry 'clearly you're not a parent' and 'being a parent is hard.' As it should be.
Well, you forget the social pressure.
I remember when my daughter was 9 (she's 20 now), a teacher asked why she wasn't in the WhatsApp school group. I replied that she doesn't have WhatsApp and only has an iPod. The teacher 👀 at me, said she couldn't understand that, and started arguing that everybody uses WhatsApp, blah blah blah. I'm resistant to that kind of blaming, but most people would simply bend and choose the easy route.
 
😒 It's hash data, not the actual picture...

MacRumors is going overboard with the sensationalism lately.
Imagine if your Apple Watch measured your blood sugar, and if it hit a particular number they sent your data to your insurance company, who then raised your rates? It's just sugar data, not a blood sample. Imagine a world where people would leave people alone…
 
The question, though, is whether there's a better reasonable alternative to the iPhone. Isn't it still more secure, with better privacy, than Android phones? I know there are some obscure alternatives, but then you lose apps and possibly a great camera as well.

So what are we supposed to do? We're stuck choosing between a bad option (Apple) and an even worse option (Google) :///

But if anyone has any suggestions, I'm all ears!
Nothing can be done. The only way to make them change is to hit them where it hurts and that's not going to happen.
"iPhone 13 purchase planned by 44% of US iPhone owners"
The general public doesn't care, isn't aware and has no ability to understand the consequences. Even a previous article on this site mentioned that Apple thinks only a minority of people are concerned about this issue and they're right. It's just fun to complain sometimes. :)
 
What I don't get is that they're still using this privacy and security high ground against Epic. They're now just talking out of both sides of their mouth. It's really getting worse and worse. I'm such a huge Apple fan and really in their ecosystem, but damn it, this makes me want to move back to Linux more and more. I moved from Linux to Apple a few years ago.
You are going to equate Epic with the distribution of illegal pictures?
 
Nothing can be done. The only way to make them change is to hit them where it hurts and that's not going to happen.
"iPhone 13 purchase planned by 44% of US iPhone owners"
The general public doesn't care, isn't aware and has no ability to understand the consequences. Even a previous article on this site mentioned that Apple thinks only a minority of people are concerned about this issue and they're right. It's just fun to complain sometimes. :)
The general public might not care, but I do. I've been an Apple fan for years. Not anymore.
 
Nothing can be done. The only way to make them change is to hit them where it hurts and that's not going to happen.
"iPhone 13 purchase planned by 44% of US iPhone owners"
The general public doesn't care, isn't aware and has no ability to understand the consequences. Even a previous article on this site mentioned that Apple thinks only a minority of people are concerned about this issue and they're right. It's just fun to complain sometimes. :)

I guess that's my question :)
If you're going to hit them where it hurts, by not buying the next iPhone for instance, then what should you get instead that has better privacy?
 
Well, you forget the social pressure.
I remember when my daughter was 9 (she's 20 now), a teacher asked why she wasn't in the WhatsApp school group. I replied that she doesn't have WhatsApp and only has an iPod. The teacher 👀 at me, said she couldn't understand that, and started arguing that everybody uses WhatsApp, blah blah blah. I'm resistant to that kind of blaming, but most people would simply bend and choose the easy route.

Reverse the burden of proof: ask her if she supports unsupervised minors using devices with cameras and an internet connection.
 
Imagine if your Apple Watch measured your blood sugar, and if it hit a particular number they sent your data to your insurance company, who then raised your rates? It's just sugar data, not a blood sample. Imagine a world where people would leave people alone…
Agreed, that would not be good. Except that's not the same analogy - do you know what hash data is and how it works with encryption? I'm not trying to be an Apple white knight, just providing factual information. We love our labels these days, so if I have to be labeled a white knight because of this, so be it lol.
 
It depends on your semantics of "scanning".

A hash function is applied to the picture on-device (that's an information-destroying function, so I wouldn't call it a scan). In case of a match, the picture is flagged with a "safety voucher", which probably contains the necessary encryption keys so that a dedicated Apple team can inspect those photos once the threshold has been exceeded.

So privacy is reduced only in those cases, and for those pictures, where a match was found. The probability of that happening for a non-CSAM picture is related to the "coarseness" of the hash algorithm; by using a finer hash function, Apple could further reduce this risk.

If the mechanism works as designed, then in my view the benefits outweigh the risk of the privacy intrusion. However, Apple should first prove it can make the mechanism tamper-proof before even thinking of opening it up to third parties.
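The match-then-threshold flow described above can be sketched in a few lines. This is only a toy illustration: the names, the SHA-256 hash, and the plaintext blocklist are all my assumptions for readability; the real system uses a perceptual hash and cryptographic private set intersection, so neither the device nor the server sees raw match results like this.

```python
import hashlib

# Hypothetical blocklist of known-image hashes (illustrative only).
BLOCKLIST = {
    hashlib.sha256(b"known-bad-image-1").hexdigest(),
    hashlib.sha256(b"known-bad-image-2").hexdigest(),
}
THRESHOLD = 2  # number of matches required before review is possible

def scan_library(images: list[bytes]) -> bool:
    """Return True only once the match count reaches the threshold."""
    matches = sum(
        1 for img in images
        if hashlib.sha256(img).hexdigest() in BLOCKLIST
    )
    return matches >= THRESHOLD

# A single match alone does not flag the account:
print(scan_library([b"known-bad-image-1", b"cat-photo"]))  # False
```

The point of the threshold is exactly what the post says: below it, no individual match is actionable.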
That said, the hash function can be used to reverse the original image, at least to some extent. Not all features are preserved, but critical ones likely will be.

It's just that the underlying algorithm was designed with adversarial attacks in mind and has some countermeasures built in to reduce the risk. In Apple's application here, there are two layers of encryption: one encrypts each slice of the original image, and the other packages the whole processed image hash. There is no need for more layers since only two parties are involved; if multiple parties were involved, there could be more slices and more layers.

Think of it this way: you are sending an image (presumably a CSAM image) to the Apple iCloud server. The image is first sliced into smaller pieces of manageable size, and a function computes a grid of numbers that closely relates to the original image. This grid of numbers gets hashed the same way the original image was reduced, until there is one final number that represents the original image - but a number alone doesn't mean much.

If Apple wants to see the content of the image, they'd need to know a sufficient number of slices of the original image to recreate what it once was. Since the algorithm is reversible, Apple can then use the number to walk back up the chain and recreate an image as close to the original as possible. However, if only some slices are reversed (because the others are encrypted), Apple would not be able to process the available information to figure out what the original image is.

Or, think of it another way: a letter torn into small pieces can still be recreated if enough pieces are found and the contents are distinguishable.

I believe what Apple is trying to achieve here is that unless they receive enough pieces of an image that closely resembles an existing CSAM image, what they receive is effectively as if they received nothing.
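The "enough pieces or you effectively have nothing" idea resembles k-of-n threshold secret sharing (Shamir's scheme), which Apple's public description does invoke for its vouchers. Below is a toy Shamir implementation over a prime field, purely as an illustration of that property; the field size, parameters, and API are my choices, not Apple's construction.

```python
import random

P = 2**31 - 1  # prime field modulus (a Mersenne prime)

def make_shares(secret: int, k: int, n: int) -> list[tuple[int, int]]:
    """Split `secret` into n shares; any k of them can recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    def poly(x: int) -> int:
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

shares = make_shares(secret=12345, k=3, n=5)
print(reconstruct(shares[:3]))  # 12345: any 3 shares recover the secret
```

With fewer than k shares, every candidate secret remains equally plausible, which is the mathematical sense in which "not enough pieces" reveals nothing.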
 
Or, perhaps Apple could walk their privacy talk and not slip a backdoor into iOS like they've been claiming they'd never do.

Slip a backdoor into iOS? There's nothing slipping into iOS; rather, a backdoor has always been in iCloud since the beginning.
 
They said it was for the children, and I didn't do anything.

They said it wouldn't apply to me, and I didn't do anything.

People said it could never happen here, and I didn't do anything.


It's amazing how history rhymes, if not repeats. And it would seem people are indeed doomed to repeat it.

The only privacy you have is the inside of your head (for now). Act accordingly...
People seem to forget that we are all human, with innate flaws that nature has baked into our traits, which only a select few can overcome to become powerful.

Btw, brain implants have been a thing for treating patients. But a plastic chip will probably enable the mass mind control that many dictatorships have dreamed of from day one.
 
That said, the hash function can be used to reverse the original image, at least to some extent. Not all features are preserved, but critical ones likely will be.

It's just that the underlying algorithm was designed with adversarial attacks in mind and has some countermeasures built in to reduce the risk. In Apple's application here, there are two layers of encryption: one encrypts each slice of the original image, and the other packages the whole processed image hash. There is no need for more layers since only two parties are involved; if multiple parties were involved, there could be more slices and more layers.

Think of it this way: you are sending an image (presumably a CSAM image) to the Apple iCloud server. The image is first sliced into smaller pieces of manageable size, and a function computes a grid of numbers that closely relates to the original image. This grid of numbers gets hashed the same way the original image was reduced, until there is one final number that represents the original image - but a number alone doesn't mean much.

If Apple wants to see the content of the image, they'd need to know a sufficient number of slices of the original image to recreate what it once was. Since the algorithm is reversible, Apple can then use the number to walk back up the chain and recreate an image as close to the original as possible. However, if only some slices are reversed (because the others are encrypted), Apple would not be able to process the available information to figure out what the original image is.

Or, think of it another way: a letter torn into small pieces can still be recreated if enough pieces are found and the contents are distinguishable.

I believe what Apple is trying to achieve here is that unless they receive enough pieces of an image that closely resembles an existing CSAM image, what they receive is effectively as if they received nothing.
I am a little bit confused by your answer:

1. A hash function is by definition a non-reversible function (look up "perfect hash" if you like).
2. There's no need for Apple to reverse-engineer the image anyhow - the "safety voucher" already contains a copy.

So sorry - I cannot follow you.
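Point 1 is easy to see concretely: a cryptographic hash maps input of any size to a fixed-size digest, so it necessarily destroys information and cannot be inverted. (A caveat: the perceptual hash Apple described is a different animal, designed so that visually similar images produce similar hashes - but it is still not an image you can simply "reverse".) A minimal demonstration with SHA-256:

```python
import hashlib

tiny = b"x"
huge = b"x" * 10_000_000  # ~10 MB of input

d1 = hashlib.sha256(tiny).digest()
d2 = hashlib.sha256(huge).digest()

# Both digests are 32 bytes regardless of input size, so megabytes of
# image data cannot possibly be recovered from them.
print(len(d1), len(d2))  # 32 32
print(d1 == d2)          # False: different inputs, different digests
```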
 
Or, you know, just don't have iCloud Photos turned on.
Or be like 99.999% of people and don't worry about features that will never apply to you.
You didn't read this bit, did you?

"Another possibility is that Apple's known CSAM detection system could be expanded to third-party apps that upload photos elsewhere than iCloud Photos."
 
Some countries, like Russia or Saudi Arabia, could make apps that scan for gay material and use it to persecute LGBTQ people, and China could do the same by scanning for anti-government photos, etc. Good luck with that!
 
Is it?
Personally, if I had a kid under the age of 13, and I had that restriction of sending or receiving explicit photos turned on, I would want it to be universal across all apps, first party and third party.
I wouldn’t want my kid sneaking onto Snapchat to send stuff or receive stuff they couldn’t send or receive on iMessage.
I don’t want my kid smoking meth. I won’t be following them around 24/7 to make sure they don’t, though.
 
Third-party apps? Come on now. Apple is doing the MOST now. Imagine if Facebook gets ahold of the photos/information. Doesn't WhatsApp belong to Facebook? SMH.
I get the concern, but your logic makes no sense - they're not talking about giving photos/info to third-party apps; they're more likely talking about offering scanning services for photos from third-party apps. Facebook (et al.) isn't getting any data out of this.
 
Or, you know, just don't have iCloud Photos turned on.
Or be like 99.999% of people and don't worry about features that will never apply to you.
I'm pretty sure it gets turned on automatically when you sign in. So you need to be fast enough to turn it off, and even then it might upload all of the metadata within seconds.
 