At the moment the hash is produced from low-res versions of the abusive images in the CSAM database.

It’s possible to produce a longer hash from a higher-res image, and that would make a hash collision basically impossible.
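To put some rough numbers on that (my own back-of-envelope, assuming idealized uniformly random hash outputs, which a real perceptual hash is not, so treat this as intuition only):

```swift
import Foundation

// Birthday-style estimate: expected number of accidental colliding pairs
// among n images for a b-bit hash, assuming hash outputs are uniformly random.
// Real perceptual hashes are not uniform, so these figures are only indicative.
func expectedCollidingPairs(images n: Double, hashBits b: Double) -> Double {
    return (n * n) / (2.0 * pow(2.0, b))
}

let photos = 1e12  // an illustrative trillion photos
for bits in [96.0, 128.0, 256.0] {
    let pairs = expectedCollidingPairs(images: photos, hashBits: bits)
    print("\(Int(bits))-bit hash: ~\(pairs) expected accidental collisions")
}
```

Run as-is this prints vanishingly small expected-collision counts even at 96 bits, and they shrink by many more orders of magnitude as the hash gets longer.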
 
I have to ask, how does it feel to be on the other side of the fence now, with a social justice program that you don’t support, being told it’s for the greater good and to just shut up?

Not so great huh?

Maybe some people will finally learn that blind obedience (not to be confused with the Apple fanboy thing, though they are related) isn’t the best idea, and that social justice almost always has ulterior motives. It might as well be called social manipulation.

Just like with any power-hungry organization, Apple won’t be backing down. They see you push back and they push forward harder, no matter how unpopular the policy is, because they think they are out to save the world from itself, as narcissistic as that sounds.

I would love to see this entire “feature” scrapped but I won’t hold my breath.
 
I’ve had more time to simmer on this whole CSAM/Apple situation since the story broke 10 days ago and I just can’t bring myself to feel like this is ok. I usually give Apple the benefit of the doubt on these types of things but I just can’t get past the hashing being done on-device.
That’s where it becomes uncomfortable for a lot of people. If we could honestly believe that this method would never be used in any way other than how Apple is selling it, and that it is impossible to hack, it would be more private than scanning in the cloud. The problem is that things don’t usually work out that way. I’d personally rather have them scan the cloud in a less private way and know my phone is not scanning me.
 
Yeah, this is increasingly sounding like it's not meant for children. Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones. Normal people are left stuck with the bill. Apple are going to have to pull this whole thing.

Most pedos are Android users to begin with lol
 
Not forgetting the fact that actual pedos, who are likely very small in number, will either just turn off iCloud Photos or get Android phones
I'm pretty sure that this is exactly Apple's intention. They have to protect iCloud from hosting CSAM and probably don't have too much invested in actually catching pedophiles. If everyone who posts CSAM stops using iCloud and perhaps all Apple products, this is to Apple's benefit.
 
As this whole scanning thing continues to get more publicity, it’s going to be difficult for customers not to have heard about it by the iPhone 13 release, and all the security researchers calling out potentially dangerous flaws are surely going to make the masses doubt it. This is beginning to look really bad for Apple.
Let's hope.
 
Wait - I’m confused by this. Does this mean that everyone’s iCloud is going to be scanned without the user’s authorization in the name of child welfare??

No.

1. Your iCloud is not getting scanned; the photos on your device will be scanned prior to being uploaded to iCloud Photo Library.

2. User authorization is given (if in the United States) when a) you agree to the terms and conditions while updating to iOS 15, and b) you enable iCloud Photos. A rough sketch of that gating is below.
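To sketch what that gating looks like, entirely my own simplification and not Apple's code (Photo, iCloudPhotosEnabled, neuralHash(for:) and the set lookup are made-up placeholders; the real design uses blinded hashes plus private set intersection, so the device never learns whether a photo matched):

```swift
import Foundation

// Simplified, hypothetical flow: nothing is matched unless the photo is
// headed for iCloud Photo Library, and the hash is computed on-device first.
struct Photo {
    let id: UUID
    let pixelData: Data
}

var iCloudPhotosEnabled = true            // user opt-in; with this off, nothing below runs
let blindedHashDatabase: Set<Data> = []   // stand-in for the on-device blinded CSAM hash table

func neuralHash(for photo: Photo) -> Data {
    // Placeholder for the on-device perceptual hash computation.
    Data(photo.pixelData.prefix(12))
}

func queueForICloudUpload(_ photo: Photo) {
    guard iCloudPhotosEnabled else { return }        // no iCloud Photos, no scan, no upload
    let voucher = neuralHash(for: photo)             // hash computed on device, before upload
    // Crude stand-in: in the real design the voucher is encrypted against the
    // blinded database and only becomes readable server-side past a match threshold.
    let locallyVisibleMatch = blindedHashDatabase.contains(voucher)
    _ = locallyVisibleMatch
    // ...attach voucher metadata and upload the photo...
}
```

Photos that never head to iCloud Photo Library never enter this path at all.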
 
I'm glad I'm not the only one that was bothered by the fact it's been in there since 14.3.
It's called testing internally to make sure they get the code right before they release it to the public. There are lots of things in the code that don't get activated for several versions or even at all. AirTags were in the iOS code for nearly 2 years before that feature became relevant and active.
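Shipping dormant code behind a switch is a pretty standard pattern. A toy illustration (hypothetical flag name and config, not Apple's actual mechanism):

```swift
import Foundation

// The general "code present, behavior off until flipped" pattern:
// a capability can ship inside an OS build yet stay inert until a
// config flag or a later release turns it on.
let remoteConfig: [String: Bool] = ["csamMatchingEnabled": false]

func isActive(_ flag: String) -> Bool {
    remoteConfig[flag] ?? false   // default to off: the code path exists but never runs
}

if isActive("csamMatchingEnabled") {
    print("matching pipeline would run here")
} else {
    print("shipped but inactive, like AirTags support before the product launched")
}
```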
 
One more time for the people in the back: I KNOW how it works. I fundamentally and viscerally disagree with it being done ON MY DEVICE.
Well, Apple doesn't want your illegal child porn on their servers, so if you choose to use their iCloud Photo Library, they will make sure what you are sending them doesn't contain that kind of material.

Don't like it? Don't use iCloud.
 
As Rene Ritchie says on MacBreak Weekly, Apple keeps talking down to us as if we don't understand, and our response is, "You don't understand; we understand and do not like this."
So many elitists on here insist that if you don't like this you must not understand it, must be ignorant, must not have read up on it. They are the ones who don't understand that people can know exactly how this works, maybe more than they do, and still disagree with it.

This makes it even more concerning, because it must mean that they are not self-critical enough to see the flaws in their system or its potential for abuse. They refuse to allow even the possibility that they have blind spots, because anyone who disagrees is just confused and ignorant.
 
Does this mean that everyone’s iCloud is going to be scanned without the user’s authorization in the name of child welfare??

What makes you think you didn't give Apple authorization? At least if you are in the US, this is what you agreed to:
Apple's iCloud Terms

E. Access to Your Account and Content

Apple reserves the right to take steps Apple believes are reasonably necessary or appropriate to enforce and/or verify compliance with any part of this Agreement. You acknowledge and agree that Apple may, without liability to you, access, use, preserve and/or disclose your Account information and Content to law enforcement authorities, government officials, and/or a third party, as Apple believes is reasonably necessary or appropriate, if legally required to do so or if Apple has a good faith belief that such access, use, disclosure, or preservation is reasonably necessary to: (a) comply with legal process or request; (b) enforce this Agreement, including investigation of any potential violation thereof; (c) detect, prevent or otherwise address security, fraud or technical issues; or (d) protect the rights, property or safety of Apple, its users, a third party, or the public as required or permitted by law.
 
Well, Apple doesn't want your illegal child porn on their servers, so if you choose to use their iCloud Photo Library, they will make sure what you are sending them doesn't contain that kind of material.

Don't like it? Don't use iCloud.
I am sure he understood this too. We know our options. That post was saying that Apple and others keep insisting that we don't like it because we don't understand how this works. It is possible to understand it and still not like it. Yes, we can turn off iCloud, but that is not the point of that post.
 