It has been stated that it's not a true 1:1 hash, and people aren't thinking about that. It's been stated it will detect crops, color adjustments, warps, transforms, rotations and more; therefore it's not a true 1:1 hash comparison. So how can they guarantee a similar photo of an adult won't get flagged? What if that 24-year-old looks 15? What happens then? I know many people who look like they're in their teens. Even under manual review that could be an issue. This isn't as simple as a hash match if they can also tell you manipulated the picture, which would produce a different hash, yet they still match it.
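To make the 1:1 point concrete: a cryptographic hash changes completely on any edit, while a perceptual hash is compared by distance and is built to survive crops, color tweaks and rotations. Here is a minimal sketch using a toy average-hash as a stand-in for Apple's NeuralHash; the pixel lists and the distance cutoff of 2 are illustrative values, not Apple's parameters.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash (aHash): 1 bit per pixel, set if the pixel
    is brighter than the image mean. Real systems (PhotoDNA, NeuralHash)
    are far more sophisticated, but the principle is the same:
    visually similar images yield similar bit strings."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 210, 25, 205]   # toy 8-"pixel" image
adjusted = [12, 190, 28, 215, 18, 200, 27, 198]   # same image, brightness tweaked

# Cryptographic hash: any change at all produces a completely different digest.
print(hashlib.sha256(bytes(original)).hexdigest() ==
      hashlib.sha256(bytes(adjusted)).hexdigest())          # False

# Perceptual hash: the tweaked image still matches within a distance threshold.
print(hamming(average_hash(original), average_hash(adjusted)) <= 2)  # True
```

This is exactly why near-duplicates survive editing, and also why two different images can, in principle, land within the match distance of each other.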

Hash maps are huge! The likelihood of a false positive is astronomically low, and then there's a threshold on top of that.
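To put rough numbers on "astronomical": with a per-photo false-match rate p and a threshold of t matches before anything happens, the chance an account trips the system by accident can be estimated with a Poisson tail. A sketch under made-up values for p, n and t, since Apple has published neither the per-photo rate nor the exact threshold, only a 1-in-1-trillion per-account target:

```python
from math import exp, factorial

def tail_prob(lam, t, terms=100):
    """P(X >= t) for X ~ Poisson(lam): the chance of at least t false
    matches when the expected count is lam. Assumes matches on different
    photos are independent, which is a simplification."""
    return sum(exp(-lam) * lam**k / factorial(k) for k in range(t, t + terms))

p = 1e-6            # hypothetical per-photo false-match rate
n = 20_000          # photos in the library
t = 30              # hypothetical match threshold before human review
print(tail_prob(n * p, t))   # ~4e-84 under these assumptions
```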
 
It's well intentioned, but it's just a bad idea. It could/should be used for actual child porn suspects, but not for every single iCloud user writ large, since the potential for images to be falsely identified by the software is all but admitted by the existence of the human review process. And presumably child porn involves anyone under the age of 18, so for edge cases where images are not obviously depicting little kids, how could human reviewers possibly know for certain the age of the person depicted? There are plenty of non-porn images on the internet of women over the age of 18 who can look convincingly younger in a photograph, intentionally or not, and vice versa. I just don't think they have an answer for that, and until they do, there is no reason for confidence that it will all "just work" in a draconian implementation like this, or that someone innocent of uploading child porn to iCloud will never have to explain their perfectly legal iCloud images.
 
  • Like
Reactions: VulchR and Ethosik
Actually it's a massive sea change. It's the first time a major device manufacturer is installing software on users' devices that has the sole purpose of searching for blacklisted material and reporting you if it thinks it finds any. It negates all forms of end-to-end encryption since they can simply scan your information on the device before encryption. There is no accountability and no insight into what exactly is included in the hashed database of blacklisted material that is downloaded to your phone. It's designed so that you cannot know when the algorithm has decided that you are a criminal and reports you.

It is not at all comparable to companies scanning content on their servers that users have voluntarily uploaded after agreeing to terms of service. They are now reaching into *your* personal device whether you like it or not.
No, I asked what the privacy change is. There is none. iCloud photos were ALWAYS unencrypted before and after upload; this just scans already-unencrypted data. That's what the fuss is about, right? Privacy? Your data on iCloud is already shared with police if they ask for it. You have to turn OFF iCloud to have privacy.
 
Things are different now.
(not for me, if they really do this, no more iPhones for me after 12 years, simple as that)

Apple devised a way to assign a "Probably not CP" score to all your pics before you upload them to their servers, precisely to avoid looking at them.

It could be abused, but these are the facts.
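That "score before upload" idea corresponds roughly to the safety-voucher scheme in Apple's technical summary: every upload carries a voucher, and nothing is learnable server-side until an account crosses the match threshold. A heavily simplified, runnable sketch; SafetyVoucher and prepare_upload are hypothetical names, and the plain set lookup stands in for the real private-set-intersection step.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class SafetyVoucher:                  # hypothetical name, not Apple's API
    image_id: str
    payload: bytes                    # stands in for the encrypted derivative

def prepare_upload(image_id: str, perceptual_hash: bytes,
                   blinded_db: frozenset) -> SafetyVoucher:
    """Every upload gets a voucher, match or not, so the server cannot
    tell which photos matched until an account crosses the threshold.
    The real design uses private set intersection; the plain set lookup
    below is only a stand-in for that step."""
    matched = perceptual_hash in blinded_db
    # Stand-in for threshold encryption: a match flag plus a digest.
    payload = bytes([matched]) + hashlib.sha256(perceptual_hash).digest()
    return SafetyVoucher(image_id, payload)

db = frozenset({b"\x01\x02", b"\x03\x04"})          # toy "blinded" hash set
print(prepare_upload("IMG_0001", b"\xaa\xbb", db))  # no match, voucher anyway
```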
 
  • Like
Reactions: peanuts_of_pathos
Can I store some super illegal printed pics in a drawer at your place?
Leave the door open to let me in and don’t look at them.
Don’t be pressured by authorities to look at them.
If that is my business model then don’t tell me. I’ll demand a warrant.
 
  • Like
Reactions: peanuts_of_pathos
Good intent but a bad idea. I don’t have any pictures that I wouldn’t want scanned and I’ll continue to use iCloud but I don’t like the idea.
It's not limited to iCloud. It's ON YOUR DEVICE and YOU HAVE NO CHOICE. Absolutely mental.
 
  • Like
Reactions: ratspg
This deserves a re-post:

[attached image]


Things are different now.
(not for me, if they really do this, no more iPhones for me after 12 years, simple as that)
This has always been true if you have iCloud Photos off.
 
It's not limited to iCloud. It's ON YOUR DEVICE and YOU HAVE NO CHOICE. Absolutely mental.
It is limited to iCloud Photos. It's not hard to understand. If you turn off iCloud, this won't occur. Do you want to read that slowly to understand?
 
So another issue: Apple's stance on creating a back door was that they wouldn't do it for anyone or anything. So now that legal argument is dead, isn't it?
 
  • Like
Reactions: VulchR and mr_jomo
This is just Snowden trying to remain relevant. Who cares what he thinks anymore when he is still hiding in Russia? I'd take anything he has to say with very probable Russian influence in mind.
 
  • Disagree
Reactions: gmacgregor
It's well intentioned, but it's just a bad idea. It could/should be used for actual child porn suspects, but not for every single iCloud user writ large, since the potential for images to be falsely identified by the software is all but admitted by the existence of the human review process. And presumably child porn involves anyone under the age of 18, so for edge cases where images are not obviously depicting little kids, how could human reviewers possibly know for certain the age of the person depicted? There are plenty of non-porn images on the internet of women over the age of 18 who can look convincingly younger in a photograph, intentionally or not, and vice versa. I just don't think they have an answer for that, and until they do, there is no reason for confidence that it will all "just work" in a draconian implementation like this, or that someone innocent of uploading child porn to iCloud will never have to explain their perfectly legal iCloud images.

It compares your pics against a “certified“ CP catalogue.
It doesn’t try to figure out if your pics are actually CP on the spot.

The question should be more about how this “certified” catalogue is audited and stuff.
 
Last edited:
  • Love
Reactions: peanuts_of_pathos
It compares your pics against a "certified" CP catalogue.
It doesn’t try to figure out if your pics are actually CP on the spot.

The question should be more about how this “certified” catalogue is audited and stuff.

It's provided by NCMEC, which will be heavily governed and audited.
 
  • Angry
Reactions: peanuts_of_pathos
The point is the accuracy is extremely high, no matter how you slice it. You and others are acting like there will be false positives left and right and on top of that, people will be thrown in jail for innocent images, LOL! Come on, let's get real please.

Again, I understand people's concerns about the bigger picture, but I DON'T understand why people feel the need to twist the facts regarding the details of this "smaller" picture being discussed now. Let's be fair and not act like every detail of what Apple is doing here is somehow suspect and sinister. You can admit that without supporting the general principle.

"Extremely high" isn't good enough, though, when taken in context with what they're doing. Apple is putting one foot into law enforcement by doing this. Their role should be protecting the privacy of the devices, not playing Chris Hansen (To Catch A Predator reference... ).

The way this should work is that law enforcement gets a warrant for Apple to do this hash scan on a user's device first, and then they implement it on a case by case basis.
 
So another issue: Apple's stance on creating a back door was that they wouldn't do it for anyone or anything. So now that legal argument is dead, isn't it?

Again: this is not a backdoor, because iCloud photos, on device or in the cloud, are NOT ENCRYPTED.
 
It's not limited to iCloud. It's ON YOUR DEVICE and YOU HAVE NO CHOICE. Absolutely mental.
Uhm nope?
Just disable iCloud Photos.

Now, if it also applies to the "Photo Stream" album and to the iCloud Backup of the camera roll, I will be pissed.
 
  • Like
Reactions: VulchR
This is absolutely horrible.
Nobody normal likes filthy child molesters, but gosh, this is waaaay too much information for Apple. After this, what's next?
It's nice to frame it in a package that makes them sound like the good guys, but treating every one of us like the bad guys is not good.
I wish the Linux phone had caught on and had more to offer.
 
The point is the accuracy is extremely high, no matter how you slice it. You and others are acting like there will be false positives left and right and on top of that, people will be thrown in jail for innocent images, LOL! Come on, let's get real please.

Again, I understand people's concerns about the bigger picture, but I DON'T understand why people feel the need to twist the facts regarding the details of this "smaller" picture being discussed now. Let's be fair and not act like every detail of what Apple is doing here is somehow suspect and sinister. You can admit that without supporting the general principle.
Everything starts small. Today's protection against child sexual abuse could be tomorrow's political attack and targeted censorship. Even if I give Apple the full benefit of the doubt and assume everything they mention here is correct, the action itself is already alarming regardless of the cause. We have "attempted murder" as an offence in many countries, btw, which focuses on actions, not the end result.

Again, this is an ongoing issue which will evolve as time goes on. The only thing that will not change is change itself.
 
This is absolutely horrible.
Nobody normal likes filthy child molesters, but gosh, this is waaaay too much information for Apple. After this, what's next?
It's nice to frame it in a package that makes them sound like the good guys, but treating every one of us like the bad guys is not good.
I wish the Linux phone had caught on and had more to offer.

There is no "but" when it comes to preventing child sexual abuse.
 
"Privacy is a fundamental human right."

I'd recommend Tim Apple open a dictionary to examine the definition of every single word in this quote.
 
  • Like
Reactions: Shirasaki
That wasn't an answer to my question. But if this is a hill you're willing to die on, then I'd suggest not uploading any photos to iCloud, because there's a less than 1 in 1 trillion chance that a photo will be falsely flagged and even then nothing will happen unless you have even more that are falsely flagged and uploaded to iCloud (they haven't disclosed the threshold number of flagged photos at which point these flagged photos are reviewed). Apple employees aren't going to be perusing your iCloud photo library of innocent photos. While I respect your stance on privacy and I agree with it, you're misrepresenting what's actually happening here. Please get some balance instead of thinking emotionally.
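On the undisclosed threshold: Apple's technical summary describes threshold secret sharing, where the key needed to decrypt any flagged derivative is split so that fewer than t matches reveal nothing. A toy Shamir sketch of that idea; the prime, threshold and share counts here are arbitrary, not Apple's parameters.

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; the field for the toy arithmetic

def make_shares(secret: int, threshold: int, n: int):
    """Toy Shamir secret sharing: the secret (think: the key protecting
    the flagged derivatives) is split so that any `threshold` shares
    reconstruct it, and fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x=0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

key = 123456789
shares = make_shares(key, threshold=3, n=5)
print(reconstruct(shares[:3]) == key)   # True: 3 shares suffice
print(reconstruct(shares[:2]) == key)   # False: 2 shares reveal nothing useful
```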
I understand, now, that they are looking for specific hashes of registered child pornography photos. They need to do a better job of explaining how other personal photos will be protected.
 
This is absolutely horrible.
Nobody normal likes filthy child molesters, but gosh, this is waaaay too much information for Apple. After this, what's next?
It's nice to frame it in a package that makes them sound like the good guys, but treating every one of us like the bad guys is not good.
I wish the Linux phone had caught on and had more to offer.
I'd rather say you can count on Linux implementing similar software packages on at least the popular distros, and integrating them so deeply into the system that even ultimate customisability won't save it.
 
  • Sad
Reactions: peanuts_of_pathos
The intentions from Apple are good, but once this is in place it will inevitably be expanded to scan for other things as well. How long before governments pass laws requiring Apple to look for certain things, or they won't be able to operate in the country? Will China require Apple to look for pictures of Winnie the Pooh? How about countries requiring Apple to scan for pictures of large gatherings during lockdowns?

Governments (especially the US) have found that they can use Big Tech to get around the rights of their citizens. I remember when computers and the internet were supposed to be liberating. Now it seems they are turning into Big Brother.

Edit: Currently Apple has the hooks for this process at the point photos are uploaded to iCloud. Why couldn’t they move the hooks to when pictures are saved on your phone? That way everything is caught in this massive dragnet.

Heck, maybe they can expand it to scan what is displayed on the screen, so that you can't look at "wrongthink". Chrome already does this to prevent phishing.


Maybe this is why Apple isn’t forcing people to iOS 15? Maybe they are being forced by the government to put this in or be broken up via antitrust? Otherwise why wouldn’t they have announced this late Friday evening when fewer people would notice?


Edit 2: I've seen others point out that Apple is actually late to the game here. A Microsoft technology called "PhotoDNA" (developed in 2009) has been adopted by tech platforms including OneDrive, Gmail, Google Photos, Facebook, Twitter, Discord, and Reddit. The difference is that those all run on the server. Apple is actively snooping on the device.
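The difference described in that edit is purely architectural, i.e. where the lookup runs. A schematic sketch; perceptual_hash here is a truncated SHA-256 used only to make the toy runnable, where real systems use PhotoDNA or NeuralHash.

```python
import hashlib

def perceptual_hash(data: bytes) -> bytes:
    """Stand-in only: a real system computes a perceptual hash
    (PhotoDNA, NeuralHash); a truncated digest keeps this runnable."""
    return hashlib.sha256(data).digest()[:8]

# PhotoDNA-style: the provider scans content already on its servers.
def server_side_scan(uploaded: bytes, known_hashes: set) -> bool:
    return perceptual_hash(uploaded) in known_hashes

# Apple's announced design: the same check runs on the phone, against a
# database shipped to the device, before anything leaves it.
def on_device_check(local_photo: bytes, on_device_db: set) -> bool:
    return perceptual_hash(local_photo) in on_device_db

db = {perceptual_hash(b"known-bad-image-bytes")}
print(server_side_scan(b"holiday.jpg bytes", db))     # False
print(on_device_check(b"known-bad-image-bytes", db))  # True
```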
 
Last edited:
  • Like
Reactions: peanuts_of_pathos
"Privacy is a fundamental human right."

I'd recommend Tim Apple open a dictionary to examine the definition of every single word in this quote.
Your encrypted data on iPhone is still yours. When you check the iCloud box, YOU choose to give up that privacy on those photos.
It
Is
Not
Hard
To
Understand
 