Or, you know, just don’t have iCloud Photos turned on.
Or be like 99.999% of people, and don’t worry about features that will never apply to you.
“Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”

- Edward Snowden
 
I understand the privacy implications; I'm just bored of all the stupidity I read ... The only thing Apple has to explain is how they ensure the hash list isn't modified by a third party between the CSAM database and our iPhones; the rest is just speculation and misunderstanding.
Because I'm pretty sure none of you have even read how the Neural Engine works or how the blinded hashes even work.
But yeah, you know more ... I've had my dose of stupidity for this week. Good week to you.
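(To make that one legitimate question concrete, here is a hypothetical sketch of how tamper-detection could work if Apple published a digest of the shipped hash database. The published value and function names are illustrative assumptions, not anything Apple has announced.)

```python
# Hypothetical: verify that the on-device hash list matches a publicly
# published digest, so third-party modification would be detectable.
import hashlib

PUBLISHED_DIGEST = "0" * 64  # placeholder for a digest Apple could publish

def database_is_untampered(db_bytes: bytes) -> bool:
    # Recompute the digest locally and compare; any modification of the
    # list, even a single byte, changes the SHA-256 digest.
    return hashlib.sha256(db_bytes).hexdigest() == PUBLISHED_DIGEST
```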
I believe many people posting on this site are avoiding the central issue, as it's been deliberately couched in the pa@do category, when in fact the concern is not about that (even though I despise all who prey on children). I have two grandchildren I'd do anything to protect, but protecting them also includes protecting their freedom.

This is not about CSAM; this is just the start of the slippery slope. This is about Apple engaging in surveillance, however they try to spin it.

This is how autocratic governments and dictatorships sell their actions to the masses: start with an emotive subject, and I would imagine most people despise those who prey on children... but it's a ruse, because it's really about opening a door to something much more malign. That is what this action from Apple amounts to; however they like to try and sell it, it is SURVEILLANCE.

Surveillance is inextricably linked to privacy, something Apple has previously SOLD as a flagship policy.

You have to wonder whether Apple has been leaned on by a government or governments. Of course, the first move is to use an emotive subject that is hard to disagree with, i.e. child safety, when in fact it's just to open a door. Whatever words we use, however we seek to justify these actions by Apple, it is still SURVEILLANCE, inextricably linked to PRIVACY.

Privacy, once taken, never returns. What is so serious is the hypocrisy of Apple, who built a massive platform on protecting us from the surveillance society and protecting privacy, a platform they have now holed below the waterline; their credibility is shot to pieces.

I've been Apple through and through: I've bought Apple devices from the Apple II right the way down the line and spoken to many of Apple's founders, so no one could accuse me of Apple bashing.

But this is a major mistake. I suspect it was Mr Cook's idea, and if not, it must at least have passed his oversight; he should really stick to marketing if that is the case. It was probably born out of genuine concern, as he was giving talks on protecting children way back.

However, the road to hell is paved with good intentions, and this 'good intention' is a green light for more of the surveillance society that Apple has refused to comply with in the past.

If that credibility goes, so do many of the customers. It's not about CSAM; it's about opening the gate and destroying Apple's credibility on privacy and surveillance.
 
It is not scanning, if by "scanning" you mean "evaluating the content of your photo." Apple is comparing the digital fingerprint of the photo to a list of digital fingerprints from the CSAM database. The digital fingerprint is a hash, which means that even if Apple or a third party gets its hands on it, it's useless for recreating the actual photo. And it's done on the phone, not on Apple's servers, so they don't even have it to begin with.

Unless you think they would get them from your phone behind your back, in which case none of this is new anyway. :)
Of course it is scanning. How do you think the “digital fingerprint” of a photo is computed? Answer: by scanning the content of the photo, running it through an algorithm, and representing the “essence” of the photo as a fairly large number (a hash).

There are many different ways of computing a hash code for an array of bits. For example, I operate a system containing several hundred million photos, and each picture has a simple 64-bit hash code so that I can figure out if any given one is unique. The algorithm I use is a simple one and does not evaluate the “essence” of a photo, so it would be no good at matching manipulated photos, but it suits my purposes. Apple’s algorithm will be substantially more involved.
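A minimal sketch of what such a 64-bit uniqueness hash might look like (illustrative only; not the poster's actual system, Python just for readability):

```python
# Exact-match hashing: any change to the bytes produces a different hash,
# so this works for de-duplication but is useless against edited photos.
import hashlib

def hash64(data: bytes) -> int:
    # Truncate SHA-256 to its first 8 bytes -> a 64-bit fingerprint.
    return int.from_bytes(hashlib.sha256(data).digest()[:8], "big")

seen: set[int] = set()

def is_unique(photo_bytes: bytes) -> bool:
    h = hash64(photo_bytes)
    if h in seen:
        return False  # an identical copy was hashed before
    seen.add(h)
    return True
```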

The human brain is quite good at evaluating the essence of a photo. Take a picture of Albert Einstein, remove all colour, chop a bit off the top, make it smaller and rotate it a bit. You can still see that it’s a picture of Einstein. In a sense, you have a hash representation of Albert Einstein already in your brain, and the brain is good at applying pattern matching to any photo it sees. This is the kind of hashing that Apple will be doing.
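To illustrate the difference, here is the classic average-hash ("aHash") technique, a crude perceptual hash. It assumes the third-party Pillow library and is only a sketch of the general idea, not Apple's NeuralHash:

```python
# Average hash: shrink the image to 8x8 grayscale, then record one bit per
# pixel (brighter than the mean or not). Recolouring, resizing and small
# edits barely change the bits, so similar photos get similar hashes.
from PIL import Image  # third-party Pillow library

def average_hash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Small Hamming distance => perceptually similar images.
    return bin(a ^ b).count("1")
```

Your cropped, rotated, grayscale Einstein would land only a few bits away from the original, while an unrelated photo would differ in roughly half of the 64 bits.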

Leaving aside the rights and wrongs of doing that, they’ve slipped up on the implementation by storing known “bad” hashes on the device. It’s not sustainable because that list will get larger with every iOS update. Ultimately, the hash list will be larger than iOS itself. I’m at a loss to explain why they don’t just pass the computed hash to a web service to get the yay/nay from Apple.
 

This is the original web page I got my idea from. Much better than my two-second skimming (I did not read deeply enough for comprehension).
That is a serious cryptographic topic :) - but it's more about secretly sharing information (like the Diffie-Hellman key exchange). A hash is (or should be) a perfect information destroyer, i.e. an irreversible function, so it is a little bit the opposite of what the article describes.
 
I suspect they don’t really know the details of Apple’s actions either, and it makes me wonder what is going on behind the scenes with such privacy advocacy groups.

It’s also ironic that WhatsApp immediately went after Apple, in a move I can only describe as sheer and utter hypocrisy.

But I guess it’s simply been way too long since the last “Apple vs the world” debacle, hasn’t it?
Yeah, that's Facebook being a Facebook about it. I trust Apple more than them. This is still very uncharacteristic and unsettling for Apple to be doing, and I don't see how anyone can justify it.
 
Apple doesn't care about children. There's something sketchier.
Eh, I don't think it's fair to say they don't care about children in general. They've long had parental control features that are totally reasonable, and it's because their customers include families. Just not this one. This is mass surveillance.
 
Yeah, that's Facebook being a Facebook about it. I trust Apple more than them. This is still very uncharacteristic and unsettling for Apple to be doing, and I don't see how anyone can justify it.

Believe.

Believe that Apple is still very much committed to the security and the privacy of its users, and that in time, more information regarding all this will be made known and that many of the negative hot takes surrounding this will age poorly.

These few days have certainly helped me learn a lot more about this initiative by Apple; I continue to have faith in what they are doing, and I continue to see myself using Apple products for a good long time to come.

Believe.
 
Believe.

Believe that Apple is still very much committed to the security and the privacy of its users, and that in time, more information regarding all this will be made known and that many of the negative hot takes surrounding this will age poorly.

These few days have certainly helped me learn a lot more about this initiative by Apple; I continue to have faith in what they are doing, and I continue to see myself using Apple products for a good long time to come.

Believe.
It's right from the horse's mouth at https://www.apple.com/child-safety/
I don't know what else to think. They say exactly what they're doing, and what they're doing is rotten. You can keep faith, whatever that means (I guess keeping your Apple devices or maybe stock), but I'm done regarding them as the privacy-focused alternative unless they want to undo this and apologize.
 
Eh, I don't think it's fair to say they don't care about children in general. They've long had parental control features that are totally reasonable, and it's because their customers include families. Just not this one. This is mass surveillance.
They have contracts with companies that use child labour and they know about it.
 
It's right from the horse's mouth at https://www.apple.com/child-safety/
I don't know what else to think. They say exactly what they're doing, and what they're doing is rotten. You can keep faith, whatever that means (I guess keeping your Apple devices or maybe stock), but I'm done regarding them as the privacy-focused alternative unless they want to undo this and apologize.

Totally reasonable and understandable.

I will say to keep an open mind. There’s still another 1-2 months before iOS 15 is released. I am sure that Apple will continue to release more information about this, as well as go on media and publicity tours to improve our understanding and acceptance.

If by then anyone is still not convinced, they are free to switch to Samsung or some other brand. It’s only been 3-4 days since the news broke, and we are all still busy digesting and trying to make sense of what is actually going on.

Still early days.
 
Totally reasonable and understandable.

I will say to keep an open mind. There’s still another 1-2 months before iOS 15 is released. I am sure that Apple will continue to release more information about this, as well as go on media and publicity tours to improve our understanding and acceptance.

If by then anyone is still not convinced, they are free to switch to Samsung or some other brand. It’s only been 3-4 days since the news broke, and we are all still busy digesting and trying to make sense of what is actually going on.

Still early days.

Thing is, though, for anybody who leaves for Android the risk is still there, as it is on any other device that uses Google services within the OS.
 
Thing is, though, for anybody who leaves for Android the risk is still there, as it is on any other device that uses Google services within the OS.
Android is riskier to privacy in every way. The only reason to leave for Android is to spite Apple for their decision, and if someone wants to do that, I get it.
 
Android is riskier to privacy in every way. The only reason to leave for Android is to spite Apple for their decision, and if someone wants to do that, I get it.

I do get it, yeah, but it doesn't seem to make things better. Will we see more people stop using iCloud?
 
1. Apple would have to obtain an original (illegal) image, identify that it is illegal, and store the hash
Apple say they don’t obtain the original image. The organisation has the images and can legally process them.
Even a single bit or byte in the wrong place will result in the copy’s hash not matching the original.
Hashing can be done by features rather than by pixel-perfect matching. Google Image Search would be far less useful if it required pixel-perfect matches. A neural network can be trained to detect patterns, which can usually be composed into features of the original image.
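A tiny demo of why exact hashing fails here and feature hashing is needed: changing a single bit of the input completely changes a cryptographic digest.

```python
# Avalanche effect: one flipped bit -> a totally different SHA-256 digest.
import hashlib

data = bytearray(b"pretend these are image bytes")
before = hashlib.sha256(data).hexdigest()
data[0] ^= 0x01  # flip a single bit
after = hashlib.sha256(data).hexdigest()
print(before)
print(after)  # shares essentially nothing with `before`
```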
It’s truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.
While I mostly agree with what that author says, I do want to mention one thing: with software bound so tightly to hardware nowadays, ownership of said hardware is practically tied, permanently, to whatever company designs, manufactures and sells it. Think about an iPhone that doesn't run iOS; I don't know how useful it could be. The same goes for an Android phone running iOS. Hardware ownership died with the birth of the internet; technically, only PC hardware is still owned by whoever buys it, since its software is not hard-tied to the hardware.
 
That is a serious cryptographic topic :) - but it's more about secretly sharing information (like the Diffie-Hellman key exchange). A hash is (or should be) a perfect information destroyer, i.e. an irreversible function, so it is a little bit the opposite of what the article describes.
That’s the term Wikipedia redirects to from its Private Set Intersection page. Apple claims to use PSI to securely exchange hash information between Apple and the user’s iPhone, without either Apple or the user learning the content of the hashed images unless the matching threshold is reached.
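For flavour, here is a toy sketch of the commutative-blinding trick behind many PSI protocols. This is not Apple's protocol (theirs also hides the matches themselves below a threshold); it only shows how two parties can find common elements without revealing the rest:

```python
# Toy DH-style private set intersection. Each side blinds its hashed
# elements with a secret exponent; double-blinded values are equal only
# for elements both sides hold.
import hashlib
import secrets

P = 2**521 - 1  # a Mersenne prime, used here as a toy modulus

def h(x: str) -> int:
    # Hash an element into the multiplicative group mod P.
    return int.from_bytes(hashlib.sha256(x.encode()).digest(), "big") % P

a = secrets.randbelow(P - 2) + 1  # Alice's secret exponent
b = secrets.randbelow(P - 2) + 1  # Bob's secret exponent

alice_set = {"img1", "img2", "img3"}
bob_set = {"img2", "img4"}

alice_blinded = {pow(h(x), a, P) for x in alice_set}
bob_blinded = {pow(h(x), b, P) for x in bob_set}

# Each side raises the other's blinded values to its own secret, giving
# H(x)^(a*b) on both sides; equal values mark the intersection.
alice_view = {pow(y, a, P) for y in bob_blinded}
bob_view = {pow(y, b, P) for y in alice_blinded}

print(len(alice_view & bob_view))  # 1 (only "img2" is shared)
```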
 
Leaving aside the rights and wrongs of doing that, they’ve slipped up on the implementation by storing known “bad” hashes on the device. It’s not sustainable because that list will get larger with every iOS update. Ultimately, the hash list will be larger than iOS itself. I’m at a loss to explain why they don’t just pass the computed hash to a web service to get the yay/nay from Apple.
Despite what virtually everyone is saying, the system is supposed to be as private as it could possibly be, and it’s very likely a precursor to end-to-end encryption for photos. Encryption and hashing will happen on your device, and then the only way anyone, including Apple, can learn anything at all about your photos is if your device matches multiple instances of child porn. That would be completely undone if hashes were sent one at a time to a matching service. The service would know immediately if there was a single match, and if the service knows, other parties could know.

Read the official documentation. The design ensures that neither Apple nor anyone else can know whether there have been any hash matches until a threshold number has been matched. That protects people against having their private business probed because of innocent hash collisions, or even the accidental or malicious uploading of a small number of genuine CSAM images.
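Apple's technical summary describes threshold secret sharing for exactly this property. Here is a generic toy Shamir sketch (not Apple's code) showing how a secret stays unrecoverable until t shares exist:

```python
# Toy Shamir secret sharing over a prime field: the secret is the constant
# term of a random degree-(t-1) polynomial; any t shares reconstruct it,
# and fewer than t shares reveal nothing about it.
import secrets

P = 2**127 - 1  # a Mersenne prime field modulus

def split(secret: int, n: int, t: int):
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(t - 1)]
    return [
        (x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
        for x in range(1, n + 1)
    ]

def reconstruct(shares):
    # Lagrange interpolation evaluated at x = 0.
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

key = 123456789
shares = split(key, n=10, t=3)
print(reconstruct(shares[:3]) == key)  # True: 3 shares suffice
print(reconstruct(shares[:2]) == key)  # False: 2 shares are useless
```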
 
When I was a kid, no one snuck into my house daily and scanned my family's photo albums, and for me, that's the issue.

These companies designed a product (and services) that the entire globe now thinks it can't live without (it totally can) and then trojan-horsed surveillance methods into them, little by little, inch by inch, so as not to rock the boat or draw too much attention. But now, with everyone believing that life without a smartphone is an impossibility, they're starting to go full-bore and flat out announcing it.

It's like, how far do these people need to go? Is there even a breaking point anymore? I'm not so sure there is. And to me, that's frightening.

Thank you for your contribution, account created today.

Welcome to the MacRumors family 🙏🏻🙏🏻
 
Thing is, though, for anybody who leaves for Android the risk is still there, as it is on any other device that uses Google services within the OS.
If iOS is bundled with this software, and security flaws can only be fixed by running or updating it, then I'd expect Apple to end up on the receiving end of some hefty legal action.

Ironically, just as they push iCloud+ with what I consider reasonable pricing for 50GB, I now wouldn't touch iCloud with a very long bargepole, and I doubt others will either.

Not only will this cost Apple money, it hands a battering ram to Epic, to Facebook and to every other Apple critic, who can rightfully suggest Apple is not really interested in privacy if it engages in surveillance. Whatever technicality some may wish to raise to mitigate Apple's intentions, IT IS SURVEILLANCE, and it is a threat to PRIVACY.

Privacy has been a massive PR coup for Apple, and, with what I suspect are the best of intentions, they have inadvertently put a torpedo into their own hull below the waterline.

Urgent action by Apple to backtrack is necessary. Perhaps, as they've done with web tracking, if they remain committed to this surveillance, they should immediately make it provably switchable off or removable BY USER CHOICE.
 
Despite what virtually everyone is saying, the system is supposed to be as private as it could possibly be, and it’s very likely a precursor to end-to-end encryption for photos. Encryption and hashing will happen on your device, and then the only way anyone, including Apple, can learn anything at all about your photos is if your device matches multiple instances of child porn. That would be completely undone if hashes were sent one at a time to a matching service. The service would know immediately if there was a single match, and if the service knows, other parties could know.

Read the official documentation. The design ensures that neither Apple nor anyone else can know whether there have been any hash matches until a threshold number has been matched. That protects people against having their private business probed because of innocent hash collisions, or even the accidental or malicious uploading of a small number of genuine CSAM images.
What it is designed to do and what it does can be worlds apart. I'm sure it was done with the best intentions, as Tim Cook has long been an advocate of child safety, but this one has done interminable damage to Apple.

Whatever the blurb says, it is a form of SURVEILLANCE and an intrusion on PRIVACY, two of the most important things Apple has sold itself on safeguarding.
 
I do get it, yeah, but it doesn't seem to make things better. Will we see more people stop using iCloud?
Here's the thing: can you believe it's just iCloud? If they have this scanning technology, they'll end up baking it into the core OS to scan everything on the phone. It doesn't matter whether it's Photos, iMessage or WhatsApp; if it's on the filesystem, the core OS will check that file, and if the file is flagged, it'll get reported, iCloud or not.

If you believe it's just iCloud, you're a fool.
 
Foot-in-the-door images would be enough for probable cause. I'd imagine those are already in the database.

macqael is a nasty little contributor. I wonder how long she/he has worked at Apple?
Imagine a Snowden-type case, where an image traced back to the phone containing it could compromise WikiLeaks and the whistleblower operation. This has nothing to do with kids; the image would have been maliciously put into the database by state-sponsored hackers.
 
Despite what virtually everyone is saying, the system is supposed to be as private as it could possibly be, and it’s very likely a precursor to e2e encryption for photos. Encryption and hashing will happen on your device and then the only way anyone, including Apple, can learn anything at all about your photos is if your device matches multiple instances of child porn. That would be completely undone, if hashes were sent, one at a time to a matching service. The service would know immediately if there was a single match, and if the service knows, other parties could know.

Read the official documentation. The design shows that Apple, nor anyone, can know if there have been any hash matches until a threshold number has been matched. That protects people against getting their private business probed because of innocent hash collisions or even accidental or malicious uploading of a small number of genuine CSAM images.
Then Apple and the FBI would know how many photos are on your phone, when each photo was taken, when you had a cellular connection, and when you were awake.
 