OK then....it must be true if Apple says so. Remember, only a very very very very small number of users had a butterfly keyboard problem....
Ok, so set the record straight then, how many users had a problem with the butterfly keyboard?
 
At least release the source code for these CSAM programs under the GPL. Along with every iterative update. So, people can review it and be sure it's only capable of doing what Apple claims.

After all they're just doing this for child safety right? Why would they care if somebody copied them?
This. It's a big problem when a system of mass scanning and judgment is operated as a collaboration between a private company and a black box. There's no transparency and no way for any scrutiny/audit to happen.
 
Yes there is an easy answer: Companies don't enforce the law.

Is privacy so rare that it is no longer an option?
Bingo. That's the idea, they want to make it so privacy and free speech are now not, you know, something that's guaranteed by the constitution no matter what, but they become "difficult choices" that "are not all black and white" and so on and so forth....

Death by a thousand small cuts.

Child porn today, slightly less disgusting offenses tomorrow, and suddenly it's terrorists (as defined by "everyone who disagrees with the government") and subversives ("anyone who doesn't toe the party line").

Welcome to China! Social credit score, mark of the beast, I mean chip implant, coming...

It's not a difficult choice at all - Apple makes computers; Apple is not a law enforcement or intelligence agency. What's difficult about that?

Some speculated that Apple is preparing for iCloud E2E encryption - well then they should roll that out at the same time. And in any case, that's not a good trade-off for users.

Not because of the child porn, but because you can't get a mortgage or travel on airplanes because you have committed thought crimes by thinking thoughts that are different from the government approved thoughts.
 
Ok, so set the record straight then, how many users had a problem with the butterfly keyboard?
I do not have that figure, but certainly enough that Apple set up a program for the victims, and after three or four years they decided they had better stop flogging a faulty product.
 
It's funny how poorly this is playing right now on the non-tech blogs, whereas actual tech blogs such as The Verge are doing proverbial journalistic gymnastics trying to portray this move in a positive light.
It's almost like the people who should be the most upset about these announcements are suddenly scared to criticize Apple.
 
But reading is hard…

And Oregon's governor signed a law dropping the requirement that one prove proficiency in reading, writing and math to graduate HS, so get ready for less of it.

 
Well... they are encrypted though. While it's true Apple still holds the keys, they clearly don't like to use them and do so only when forced by the government. I suspect they want to move to not holding the keys at all, so they can't even be forced to turn them over.

Exactly. Just because something is encrypted doesn't mean it is secure. Everything could be encrypted with ROT1 or something, but it wouldn't be secure.

As long as someone else holds the keys, there isn't really much point to encryption because at some point those keys will be hacked or otherwise compromised and then everyone's data will be at risk.
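A toy illustration of that point, using the ROT1 cipher mentioned above (purely hypothetical, nothing any real cloud service uses): data can be "encrypted" in a technical sense while offering no security at all, since anyone can reverse the shift without any key.

```python
def rot(text, shift):
    """Caesar-style shift cipher: rotate each ASCII letter by `shift` places.

    rot(text, 1) is ROT1 "encryption"; rot(text, 25) undoes it.
    Non-letters pass through unchanged.
    """
    out = []
    for ch in text:
        if ch.isascii() and ch.isalpha():
            base = ord("a") if ch.islower() else ord("A")
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)
    return "".join(out)

# "Encrypting" with ROT1 and trivially "breaking" it with the inverse shift:
ciphertext = rot("attack at dawn", 1)   # "buubdl bu ebxo"
plaintext = rot(ciphertext, 25)         # "attack at dawn"
```

There is no secret anywhere in this scheme, which is exactly the point: encryption only provides security when the key is actually secret and held by the right party.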
 
How many people have been caught using these methods? That is the first piece of info.

We have a National alert system for phones for lost children. Do you know how many children have been saved by that system from stranger abductions? Zero.

So how many people have been caught with these tools which are needed in the fight against this child exploitation? Especially after it has loudly been made public.
 
I don't think there was ever truly an option the moment people started connecting their computers to the internet.

Oh, but there is. PGP, SSL/TLS, SMTPS, Bitcoin's cryptography, etc. all provide privacy. It is really a question of OS developers not compromising away the right to privacy for the expediency of the moment.

Just to be clear also, Apple is scanning the images ON the device itself, then matching each hash against the CSAM image hash database. How long until there are complaints about battery life being impacted by the scan? Or storage being used to hold the database? (Not long, I'm sure, followed by class action lawsuits which will probably fail.)

Once they are scanning all the photos on your device to get the hash of each then they can track the photos on your device anywhere. A whole new level of surveillance.

What about if you have a copyrighted photo from Getty on your phone. How long until that is flagged as a copyright violation? (Even if it is fair use).
 
This. It's a big problem when a system of mass scanning and judgment is operated as a collaboration between a private company and a black box. There's no transparency and no way for any scrutiny/audit to happen.
Just like so many of the algorithms currently employed by YouTube, Facebook, and Twitter that turn out to be tailor-made to get the outcomes they like. But they tell you it’s the holy algorithm that has determined you are not complying.
 
I’m not quite sure what the goal of this feature is. Don’t get me wrong, anyone who has these kinds of pics is disgusting. But will this stop someone from being abused? Will this stop the CREATOR of the images?

This is like scanning all of our music and video files to see if we actually purchased them or just pirated it.

Please note I’m comparing the end result of my two examples. Please don’t twist this into saying I think these pics are equivalent to saving unpurchased music/video. In the end, how do either of these things prevent the creation of the material?

And this is how things start. Announce a feature that NOBODY can deny right now, but change it later. At that point it’s too late. So what is to prevent this same hash verification to include checking hash of pirated material?

And I don’t like how, half the time someone wants to discuss this, we’re accused of not being for saving children. You want to know what I want to see happen? Joe Somebody, the creator of these images, being put in jail or worse. Not Jim Somebody, who just so happens to come across these images and saves them. While utterly disgusting, Jim in this scenario wasn’t the one abusing the kid or the original creator of the image. So Joe in this scenario gets away and can do this to another kid.

Of course I do have some ignorance here. I don’t even know where these kinds of images originate. Nor do I want to!
 
After reading more, it appears they are using "perceptual hashes," meaning they scan the actual image: doing something like reducing it to a small square (e.g., 32 by 32 or 8 by 8), converting it to grayscale, computing the mean pixel value, then setting each bit to 0 or 1 depending on whether that pixel is above or below the mean. Something like pHash.

So changing one part doesn't impact the hash.

It is not a huge computational burden, but it occurs on the device.
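A minimal sketch of that "average hash" idea in Python, assuming the image has already been downscaled to an 8x8 grayscale grid (a real implementation would use a library like Pillow for the resize, and Apple's actual system is far more sophisticated than this):

```python
def average_hash(grid):
    """8x8 grid of grayscale values (0-255) -> 64-bit integer hash.

    Each bit is 1 if the corresponding pixel is brighter than the
    image's mean brightness, 0 otherwise.
    """
    pixels = [p for row in grid for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means the images
    are perceptually similar, even if their bytes differ."""
    return bin(h1 ^ h2).count("1")
```

This is why "changing one part doesn't impact the hash": tweaking a single pixel flips at most a bit or two, so the modified image still lands within a small Hamming distance of the original, unlike a cryptographic hash where any change scrambles the whole output.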
 
I’m not quite sure what the goal of this feature is. Don’t get me wrong, anyone who has these kinds of pics is disgusting. But will this stop someone from being abused? Will this stop the CREATOR of the images?

This is like scanning all of our music and video files to see if we actually purchased them or just pirated it.

Please note I’m comparing the end result of my two examples. Please don’t twist this into saying I think these pics are equivalent to saving unpurchased music/video. In the end, how do either of these things prevent the creation of the material?

And this is how things start. Announce a feature that NOBODY can deny right now, but change it later. At that point it’s too late. So what is to prevent this same hash verification to include checking hash of pirated material?

And I don’t like how, half the time someone wants to discuss this, we’re accused of not being for saving children. You want to know what I want to see happen? Joe Somebody, the creator of these images, being put in jail or worse. Not Jim Somebody, who just so happens to come across these images and saves them. While utterly disgusting, Jim in this scenario wasn’t the one abusing the kid or the original creator of the image. So Joe in this scenario gets away and can do this to another kid.
I agree. I would add that I really don’t think most child abusers are dumb enough to store CSAM on non-E2EE cloud services, and as you said, this will not prevent CSAM content creation. While the intention is noble, I don’t think Apple will catch many criminals this way (though I’m hoping they do). So everybody has to pay and sacrifice a bit of privacy, with risks of abuse by governments, for probably minimal results.
 
Sounds like a good reason to remove all photos from iCloud! I already keep mine on encrypted SSDs! “It’s all about the children”? If you believe that, I’ve got some swampland to sell you!
 
I agree. I would add that I really don’t think most child abusers are dumb enough to store CSAM on non-E2EE cloud services, and as you said, this will not prevent CSAM content creation. While the intention is noble, I don’t think Apple will catch many criminals this way (though I’m hoping they do). So everybody has to pay and sacrifice a bit of privacy, with risks of abuse by governments, for probably minimal results.

Don’t get me wrong. If there was significant proof that doing this would be a 100% guarantee that the creator would get caught, I would be all for it. But Apple announced this. Loudly. Everyone is talking about it. So if there was even the slight chance a creator is using their iPhone this way, they won’t for long now.

Where is the proof that this will even catch one person? I mean, it’s certainly not a 100% guarantee. So we are treating everyone as a criminal without any guarantee or numbers.

Some criminals are dumb sure, but there are plenty that are smart to avoid detection.
 
iMessage isn’t a social media platform…. It doesn’t need a report function.

“I also don't understand why Apple is pushing the CSAM scanning for iCloud into the device, unless it is in preparation for real encryption of iCloud backups.”

Bingo, in my opinion.
If Apple built a way to identify photos in the system, it’s not really end to end encrypted, is it? It’s a complete mockery of the concept. If Apple doesn’t backtrack on this 100% then I won’t be buying the iPhone 13. I’ll stick with my XS on iOS 14. No spyware (that I know of) built in.
 
Oh, but there is. PGP, SSL/TLS, SMTPS, Bitcoin's cryptography, etc. all provide privacy. It is really a question of OS developers not compromising away the right to privacy for the expediency of the moment.

Just to be clear also, Apple is scanning the images ON the device itself, then matching each hash against the CSAM image hash database. How long until there are complaints about battery life being impacted by the scan? Or storage being used to hold the database? (Not long, I'm sure, followed by class action lawsuits which will probably fail.)

Once they are scanning all the photos on your device to get the hash of each then they can track the photos on your device anywhere. A whole new level of surveillance.

What about if you have a copyrighted photo from Getty on your phone. How long until that is flagged as a copyright violation? (Even if it is fair use).
You should perhaps educate yourself more on this subject. What I highlighted is completely wrong. BUT you are free to dump any Apple products you use and move on to a more secure and private system from Google and Microsoft.
 
If Apple built a way to identify photos in the system, it’s not really end to end encrypted, is it? It’s a complete mockery of the concept. If Apple doesn’t backtrack on this 100% then I won’t be buying the iPhone 13. I’ll stick with my XS on iOS 14. No spyware (that I know of) built in.
You do what you want, but it’s the closest thing the US government (and/or FCACSE) is going to let you get to end to end encryption for cloud photo storage, at least with a large company like Apple (or Google, Microsoft, etc.)

You could also just turn off iCloud photos, which seems way more reasonable.
 