I keep hearing people say what I bolded in your post above, and I'm trying to figure out why you guys are so confused about this. Look at the very part of the article you quoted: "Apple then manually reviews each report to confirm there is a match." So unless you have confirmed child porn images on your device that you then upload to iCloud, your life will not be ruined because nothing will come of false positives. Why would Apple report an innocent image after review? Makes no sense. I don't understand your concern here at all.

That's a fair point. But that's placing a LOT of trust on them, and basically making Apple the police in that situation. They hold all the power in that moment of deciding whether or not to shut you down and report you, or not. I'm sure there are checks and balances in place to help prevent the innocent from being falsely flagged. But there's a gray area that gets close to crossing the line where you're naive if you think they'll get it right 100% of the time. And if they're wrong, it doesn't matter at that point. Your account is shut down and you've been reported. Let me give you a couple of scenarios... and I have middle school aged kids, so if you think this can't happen, you're unrealistic.

#1 - A middle school girl, we'll say 13 years old, sends a photo of herself topless to her 14 year old boyfriend. The boyfriend is an idiot and sends it to a few friends. Because of iCloud's photo backup, the photo gets saved, analyzed, and reported to this manual review group. Then suddenly they're sitting in front of a topless photo of a 13 year old girl. That's child porn. What do they do? And the boys that simply received a text message... they get reported / shut down?

#2 - A 16 year old boy takes a video of him and his 15 year old girlfriend... Because kids are idiots and don't lock their phones, a friend finds it and sends it to 10 other friends, b/c he's also an idiot. Then suddenly you have 12 teenagers with what is technically deemed child pornography on their phones. They should all be shut down and reported to NCMEC? (what ended up really happening in this situation was the police got involved, scared the living hell out of all kids involved, and taught them a very solid lesson on the dangers of photos/videos, revenge porn, etc etc. I do believe the kid that actually spread the message ended up getting in some legitimate trouble for it, too)

#3 - An adult is looking at pornographic pictures and saves one that ends up being of an underage girl - she's 17 but he thought she looked 21. He's now in possession of what could be considered child pornography, and for all we know it could be an image that matches something in the NCMEC database. Does his account deserve to get shut down and reported to NCMEC?

Personally I'm against porn in general, of all types and ages, because I think it's unhealthy, and also one of the primary funding sources for sex trafficking. But even I, who would be considered by most to be a prude in this area, still see overreach here and think this is an extremely slippery and dangerous precedent.
 
Here’s what I find problematic: How do they know the intentions of the photographer?

I have two kids, and like many, many other parents, when they were babies and toddlers I used to take pics of them playing in the house, outside, or in the bath in diapers or underwear, or even nude. Obviously the intention is to capture memories of my kids doing hilarious things, and to privately save them for future memories. But how would Apple know this? Unless I'm missing something, this is a grotesque overreach and one destined to scoop up innocent people.
You are missing something. They aren't looking at nude photos of your kids. They are looking to see if the photo hash of your nude kids matches that of known CP.
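To illustrate the distinction being made here, a minimal sketch of hash-based matching, assuming for simplicity an ordinary cryptographic hash (Apple's actual system uses NeuralHash, a perceptual hash, not SHA-256; the byte strings and the `known_csam_hashes` set below are entirely hypothetical placeholders):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Cryptographic hash: any byte-for-byte change yields a different digest."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of digests of known images (Apple matches against
# NCMEC-provided hashes; the contents of the images are never in the database).
known_csam_hashes = {sha256_of(b"known-bad-image-bytes")}

def is_flagged(photo_bytes: bytes) -> bool:
    # The check compares digests, never the pixels themselves.
    return sha256_of(photo_bytes) in known_csam_hashes

# A family photo that is not in the database never matches:
print(is_flagged(b"photo-of-my-kids"))        # False
# Only a copy of a known image matches:
print(is_flagged(b"known-bad-image-bytes"))   # True
```

The point of the reply above is exactly this: the system looks up digests in a list of known material, so a novel photo of your own kids, whatever it depicts, has no corresponding entry to match.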
 
Of course not. But do you not think it is odd that Apple has suddenly become very concerned about it enough to build an infrastructure to sniff your iCloud account?

It's not scanning anyone's iCloud; have a read up on how they are implementing it.

I don't find it odd at all, big companies like Apple will have been planning this for a long time.
 
I don't follow this slippery slope thought process. He is essentially saying he doesn't trust Apple not to abuse this functionality. You could literally apply this to anything if you believe Apple has ill-intentions.

If we believe Apple has some master plan to abuse this, then this opens up every part of their ecosystem to the same "it could be abused if they want to" statement.
That's not what he's saying. He's saying that if Apple puts this in there, then what's stopping governments from going "well, you put that in there, perhaps you can put THIS in there so we can check for things. Oh, and can you give us access just in case there are other crimes there, I mean, you DID put in this encryption back-door, so we know you can do that...so you can do this other thing too"

Then, of course, since there will be this backdoor, however small it is, there's room for it to be exploited by 3rd parties. Will we have headlines like "Zero day discovered using Apple's photo-scanning back-door to gain access...Apple has yet to patch this." and so on in the future?

Oh, and for this there will be defenders with "That's not what Apple is doing, there's no back door, relax. Just trust them." followed by someone doing exactly that.
 
I equate Apple’s version of security/privacy to a Las Vegas casino’s version of theft. Publicly, they’re on the side of the consumer, but when it comes to their own uses, all bets are off. If a taxi driver takes a long route to the airport (AKA long-hauling) to make a few extra bucks, the casinos make sure he loses his license. However, the odds at the tables are stacked in their favor. Figure out a way to count cards and you’re out. Same with Apple. Some app developer finds a way to game the system, and they’re out. However, Apple can create a permanent back door in iCloud backups that ONLY THEY can use for whatever purposes they deem necessary (i.e. security, or ‘cause covid).
 
So Apple doesn't need a warrant to do this, but law enforcement would??? What next, "no knock" warrantless searches of our Macs?

Does anyone else see the awful irony of this regarding their famous 1984 commercial?
 
The fact that it’s not a 1:1 hash comparison, and that it can find matches on cropped, color-adjusted, rotated, and transformed pictures, means there is a chance for legit pictures between two consenting adults to be flagged. And software is NEVER open to bugs, right? It’s just asking for disaster.
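The worry above comes from how perceptual hashing works: unlike a cryptographic hash, it is designed so that visually similar images produce similar (or identical) hashes. A toy sketch of the idea, using a trivial "average hash" over a 2x2 grayscale grid (Apple's NeuralHash is a neural-network-based hash, far more sophisticated; this only illustrates why transformed images can still match, and why near-matches are conceivable):

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits; a match is declared when this
    distance falls under some threshold."""
    return sum(x != y for x, y in zip(a, b))

original   = [[10, 200], [30, 220]]
brightened = [[40, 230], [60, 250]]   # same scene, color-adjusted
unrelated  = [[200, 10], [220, 30]]

h1, h2, h3 = map(average_hash, (original, brightened, unrelated))
print(hamming(h1, h2))  # 0 - the brightened copy still matches
print(hamming(h1, h3))  # 4 - the unrelated image does not
```

Because matching is "close enough" rather than exact, false positives are possible in principle, which is why Apple describes a match threshold and a human review step before any report.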
 
Awful step by Apple. And the 'alert your parents if you view a nude' is some awful overreach, but typical of Apple's strangely prudish approach.
I can understand how parents might want to know this (I mean, it is kids we are talking about here) but can also see your point.
 
The other article said the analysis all happens on device, not the cloud. So are they really creating a back door?

That said, this line concerns me from the first article:

“Apple then manually reviews each report to confirm there is a match, disables the user's iCloud account, and sends a report to NCMEC. Apple is not sharing what its exact threshold is, but ensures an "extremely high level of accuracy" that accounts are not incorrectly flagged.”

I’m all for stopping child porn, predators, sex trafficking, etc. (and regular porn for that matter, but that’s a rabbit trail for another discussion). But this feels like an overreach. I just can’t imagine there won’t be some false positives along the way, and this will ruin those people’s lives.
If Apple can scan for these images on device / iCloud, the technology can eventually be used to scan for anything. This should have never been built.

It’s a huge mistake for Apple and goes against everything they’ve stood for. And it will be a publicity nightmare for them since their focus is on privacy. Bad move.
 
I'm seriously considering NOT purchasing the 16" M1 MBP that is coming out this Fall after this stunt :confused: Cancelled my Apple TV+ subscription few minutes ago.
I’m considering selling all my stuff, and going back to third-party android ROMs and a Linux-based PC. Not immune to surveillance, of course, but it’s a step in the right direction (for me).
 