Of course they won't, this would put their enterprise business in danger.
Windows doesn't scan your photos on device like Apple devices will. Yet, anyway, and I don't think they will.
“Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”
Or you know, just don't have iCloud photos turned on.
Or be like 99.999% of people, and don't be worried about features that will never apply to you.
I believe many people posting on this site are avoiding the central issue, as it's been deliberately couched into the pa@do category, when in fact the concern is not about that (even though I despise all who prey on children). I have 2 grandchildren I'd do anything to protect, but that also includes protecting their freedom.
I understand the privacy implication, I'm just bored of all the stupidity I read ... the only thing Apple has to explain is how they ensure the hash list isn't modified by a third party between the CSAM database and our iPhones; the rest is just speculation and misunderstanding.
Because I'm pretty sure none of you have even read how the neural engine works or how the blinded hashes even work.
But yeah, you know more ... I've had my dose of stupidity for this week. Good week to you.
Of course it is scanning. How do you think the "digital fingerprint" of a photo is computed? Answer: by scanning the content of the photo, running it through an algorithm, and representing the "essence" of the photo as a fairly large number (a hash).
It is not scanning, if by "scanning" you mean "evaluating the content of your photo." Apple is comparing the digital fingerprint of the photo to a list of digital fingerprints from the CSAM database. The digital fingerprint is a hash, which means that even if Apple or a third party gets its hands on it, it's useless for recreating the actual photo. And it's done on the phone and not on Apple's servers, so they don't even have it to begin with.
Unless you think they would get them from your phone behind your back, in which case none of this is new anyway.
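A rough sketch of the "fingerprint and compare" idea described above may help. This is a toy illustration only: Apple's actual matching uses NeuralHash, a perceptual hash produced by a neural network, plus blinding and threshold machinery, not a plain cryptographic digest checked against a cleartext list. The database entry and function names below are made up for illustration.

```python
# Toy sketch of "compute a fingerprint, compare it against known fingerprints".
# NOT Apple's system: the real design uses a perceptual hash (NeuralHash) and a
# blinded database, so neither the device nor Apple sees raw match results.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Reduce a photo to a fixed-size digest; the digest cannot be turned back
    into the photo."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical list of known-bad fingerprints (placeholder value, not real data).
known_bad_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def is_flagged(image_bytes: bytes) -> bool:
    """True if this photo's fingerprint appears in the known-bad list."""
    return fingerprint(image_bytes) in known_bad_hashes

sample = b"not really a photo, just example bytes"
print(fingerprint(sample))   # a 64-character hex digest
print(is_flagged(sample))    # False: its digest is not in the placeholder list
```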
that is a serious cryptographic topic
Secure multi-party computation - Wikipedia (en.m.wikipedia.org)
This is the original web page I got my idea from. Much better than my 2 second skimming (did not read deep enough for comprehension).
Yeah, that's Facebook being a Facebook about it. I trust Apple more than them. This is still very uncharacteristic and unsettling for Apple to be doing, and I don't see how anyone can justify it.
I suspect they don't really know the details of Apple's actions either, and it makes me wonder what is going on behind the scenes with such privacy advocacy groups.
It’s also ironic that WhatsApp immediately went after Apple, in a move I can only describe as sheer and utter hypocrisy.
But I guess it’s simply been way too long since the last “Apple vs the world” debacle, hasn’t it?
Eh, I don't think it's fair to say they don't care about children in general. They've long had parental control features that are totally reasonable, and it's because their customers include families. Just not this one. This is mass surveillance.
Apple doesn't care about children. There's something sketchier.
It's right from the horse's mouth at https://www.apple.com/child-safety/
Believe.
Believe that Apple is still very much committed to the security and the privacy of its users, and that in time, more information regarding all this will be made known and that many of the negative hot takes surrounding this will age poorly.
These few days have certainly helped me learn a lot more about this initiative by Apple; I continue to have faith in what they are doing, and I continue to see myself using Apple products for a good long time to come.
Believe.
They have contracts with companies that use child labour and they know about it.
Eh, I don't think it's fair to say they don't care about children in general. They've long had parental control features that are totally reasonable, and it's because their customers include families. Just not this one. This is mass surveillance.
It's right from the horse's mouth at https://www.apple.com/child-safety/
I don't know what else to think. They say exactly what they're doing, and what they're doing is rotten. You can keep faith, whatever that means (I guess keeping your Apple devices or maybe stock), but I'm done regarding them as the privacy-focused alternative unless they want to undo this and apologize.
Well they care about children whose parents have money, at least.
They have contracts with companies that use child labour and they know about it.
Totally reasonable and understandable.
I will say to keep an open mind. There’s still another 1-2 months before iOS 15 is released. I am sure that Apple will continue to release more information about this, as well as go on media and publicity tours to improve our understanding and acceptance.
If by then anyone is still not convinced, they are free to switch to Samsung or some other brand. It's only been 3-4 days since the news broke and we are all still busy digesting and trying to make sense of what is actually going on.
Still early days.
Android is riskier to privacy in every way. The only reason to leave for Android is to spite Apple for their decision, and if someone wants to do that, I get it.
Thing is, though, anybody who leaves for Android, the risk is still there, like it is on any other device that uses Google services within the OS.
Apple says they don't obtain the original image. The organisation has the images and can legally process them.
1. Apple would have to obtain an original (illegal) image, identify that it is illegal, and store the hash
Hashing can be done on features, rather than requiring a pixel-perfect match. Google Image search would be far less useful if it relied on pixel-perfect matches. A neural network can be trained to detect patterns, which can usually be composed into features of the original image.
Even a single bit or byte in the wrong place will result in the copy's hash not matching the original.
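To make that distinction concrete, here is a minimal sketch contrasting an exact (cryptographic) hash with a crude feature-based "average hash". Apple's NeuralHash is a neural-network perceptual hash and is far more sophisticated than this toy; the only point is that nudging a single pixel flips a cryptographic digest completely while leaving a feature-based hash essentially untouched.

```python
# Exact hash vs. a toy perceptual ("average") hash on a tiny fake grayscale image.
import hashlib

def exact_hash(pixels):
    """Cryptographic digest: changes completely if any byte changes."""
    return hashlib.sha256(bytes(pixels)).hexdigest()

def average_hash(pixels):
    """One bit per pixel: is the pixel brighter than the image's mean?"""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two perceptual hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 180, 25, 190, 40, 210]  # 8 "pixels" of a fake image
tweaked  = [11, 200, 30, 180, 25, 190, 40, 210]  # one pixel nudged by one level

print(exact_hash(original) == exact_hash(tweaked))             # False: digests differ entirely
print(hamming(average_hash(original), average_hash(tweaked)))  # 0: feature hash is unchanged
```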
While I mostly agree with what that author says, I do want to mention one thing: with software bound to hardware so tightly nowadays, ownership of said hardware is practically permanently tied to whichever company designs, manufactures, and sells it. Think about an iPhone that doesn't run iOS: I don't know how useful it could be. Same for an Android phone trying to run iOS. Hardware ownership died with the birth of the internet, and technically only PC hardware is still owned by whoever buys it, since the software is not hard-tied to the hardware.
It's truly disappointing that Apple got so hung up on its particular vision of privacy that it ended up betraying the fulcrum of user control: being able to trust that your device is truly yours.
That's the term Wikipedia redirects to from the Private Set Intersection page; PSI is what Apple claims to use to securely exchange hash information between Apple and a user's iPhone without either Apple or the user learning the content of the hashed images unless the matching threshold is reached.
that is a serious cryptographic topic - but more on secretly sharing information (like the Diffie-Hellman key exchange). A hash is (should be) a perfect information destroyer (an irreversible function), so it is a little bit the opposite of what the article describes.
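For readers curious what private set intersection buys you, here is a toy Diffie-Hellman-flavoured PSI sketch: because exponentiation with two secret keys commutes, double-blinded values match exactly when the underlying items match, without either side handing over its items in the clear. This is a classroom illustration with deliberately tiny, insecure parameters and made-up item names; the PSI construction Apple describes in its technical materials is considerably more involved (blinded hash database, safety vouchers, threshold rules).

```python
# Toy Diffie-Hellman-style private set intersection (PSI). Insecure, tiny
# parameters; only meant to show why commutative blinding lets two parties
# find common items without exchanging the items themselves.
import hashlib
import secrets

P = 2_147_483_647  # small Mersenne prime; real protocols use proper large groups

def to_group(item: str) -> int:
    """Map an item to a nonzero value mod P (a stand-in for hashing to a group)."""
    digest = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(digest, "big") % (P - 1) + 1

def blind(value: int, key: int) -> int:
    """Blind a value with a secret exponent: value^key mod P."""
    return pow(value, key, P)

a = secrets.randbelow(P - 3) + 2  # phone's secret exponent
b = secrets.randbelow(P - 3) + 2  # server's secret exponent

phone_items  = ["photo-1", "photo-2", "photo-3"]   # hypothetical identifiers
server_items = ["photo-3", "photo-9"]

# Each side blinds its own items once; the other side blinds them a second time.
# Because (x^a)^b == (x^b)^a, double-blinded values are equal exactly when the
# underlying items are equal, yet a singly-blinded value reveals nothing useful.
phone_once   = [blind(to_group(x), a) for x in phone_items]
phone_twice  = [blind(v, b) for v in phone_once]     # done by the server
server_once  = [blind(to_group(y), b) for y in server_items]
server_twice = [blind(v, a) for v in server_once]    # done by the phone

# In this toy the phone side learns the intersection size; which party learns
# what (and only after a threshold) is exactly what the real protocol controls.
print(len(set(phone_twice) & set(server_twice)))  # 1: one item in common
```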
Despite what virtually everyone is saying, the system is supposed to be as private as it could possibly be, and it's very likely a precursor to e2e encryption for photos. Encryption and hashing will happen on your device, and then the only way anyone, including Apple, can learn anything at all about your photos is if your device matches multiple instances of child porn. That would be completely undone if hashes were sent one at a time to a matching service. The service would know immediately if there was a single match, and if the service knows, other parties could know.
Leaving aside the rights and wrongs of doing that, they've slipped up on the implementation by storing known "bad" hashes on the device. It's not sustainable because that list will get larger with every iOS update. Ultimately, the hash list will be larger than iOS itself. I'm at a loss to explain why they don't just pass the computed hash to a web service to get the yay/nay from Apple.
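A small thought experiment, in code, of the trade-off described above: a per-photo lookup service learns something about every photo the moment it is checked, while a (heavily simplified) on-device threshold design reports nothing until enough matches accumulate. The hash values and threshold below are invented, and the real system enforces the threshold cryptographically rather than with an honor-system counter; a separate sketch further down shows how a threshold can be enforced with secret sharing.

```python
# Comparing what a remote matching service learns vs. a threshold design.
# Everything here is hypothetical and simplified.
known_bad = {"hashA", "hashB"}   # placeholder "bad" fingerprints
THRESHOLD = 3

def lookup_service(photo_hash: str) -> bool:
    """'Send every hash to a web service' design: the server observes each
    hash and learns about a single match the instant it happens."""
    matched = photo_hash in known_bad
    print(f"server saw a hash, match={matched}")
    return matched

def on_device_threshold(photo_hashes) -> bool:
    """Simplified on-device design: matches are tallied locally and nothing is
    reported unless the tally reaches THRESHOLD."""
    matches = sum(1 for h in photo_hashes if h in known_bad)
    return matches >= THRESHOLD

photos = ["hashX", "hashA", "hashY"]
for h in photos:
    lookup_service(h)               # leaks per-photo results (and photo count, timing, ...)
print(on_device_threshold(photos))  # False: a single match stays below the threshold
```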
When I was a kid, no one snuck into my house daily and scanned my family's photo albums, and for me, that's the issue.
These companies designed a product (and services) that the entire globe now thinks it can't live without (it totally can) and then trojan-horsed surveillance methods into them, little by little, inch by inch, so as not to rock the boat or draw too much attention. But now, with everyone believing that life without a smartphone is an impossibility, they're starting to go full-bore and flat out announcing it.
It's like, how far do these people need to go? Is there even a breaking point anymore? I'm not so sure there is. And to me, that's frightening.
If iOS is bundled with software and has security flaw fixes that can only be addressed by running the software or updating it, then I'd expect Apple to be on the receiving end of some hefty legal actions.
Thing is, though, anybody who leaves for Android, the risk is still there, like it is on any other device that uses Google services within the OS.
Foot-in-the-door images would be enough for probable cause. I'd imagine that is already in the database.
All you need is for the NSA to sneak an unrelated but "interesting" photo into the database, and then it will be reported.
What it is designed to do and what it does can be worlds apart. I'm sure it was done with the best intentions, as Tim Cook has long been an advocate of ensuring child safety, but on this one it's done interminable damage to Apple.
Despite what virtually everyone is saying, the system is supposed to be as private as it could possibly be, and it's very likely a precursor to e2e encryption for photos. Encryption and hashing will happen on your device, and then the only way anyone, including Apple, can learn anything at all about your photos is if your device matches multiple instances of child porn. That would be completely undone if hashes were sent one at a time to a matching service. The service would know immediately if there was a single match, and if the service knows, other parties could know.
Read the official documentation. The design shows that neither Apple nor anyone else can know whether there have been any hash matches until a threshold number has been matched. That protects people against getting their private business probed because of innocent hash collisions or even the accidental or malicious uploading of a small number of genuine CSAM images.
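The "threshold" idea referenced above can be made concrete with textbook Shamir secret sharing: split a key into shares so that any number of shares below the threshold reveals nothing, while reaching the threshold lets the key be reconstructed. This is a generic classroom sketch with made-up parameters, not Apple's actual threshold scheme, but it shows how "cannot know until a threshold number has been matched" can be a mathematical property rather than a policy promise.

```python
# Toy Shamir secret sharing over a prime field: the secret is recoverable only
# once at least THRESHOLD shares (think: matched photos) are available.
import secrets

PRIME = 2**127 - 1   # field modulus for the toy
THRESHOLD = 3        # shares required to reconstruct

def make_shares(secret: int, n_shares: int, t: int = THRESHOLD):
    """Evaluate a random degree-(t-1) polynomial with constant term = secret."""
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 to recover the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = secrets.randbelow(PRIME)          # stands in for the key protecting flagged content
shares = make_shares(key, n_shares=10)  # e.g. one share released per matching photo

print(reconstruct(shares[:2]) == key)   # False (overwhelmingly likely): below the threshold, the key stays hidden
print(reconstruct(shares[:3]) == key)   # True: at the threshold, the key is recovered
```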
Here's the thing: can you believe it's just iCloud? If they have this scanning technology, they'll end up just baking it into the core OS to scan everything on the phone. Doesn't matter if it's iPhotos, iMessage, or WhatsApp; if it's on the filesystem, the core OS will do the check on that file, and if the file is flagged, it'll get reported. iCloud or not.
I do get it, yeah, but it doesn't seem to make things better. Will we see more people stop using iCloud?
Imagine a Snowden case, where an image being traced back to a phone containing it can compromise Wikileaks and the whistleblower operation. This has nothing to do with kids, but with the image being maliciously put into the database by state-sponsored hackers.
Foot-in-the-door images would be enough for probable cause. I'd imagine that is already in the database.
macqael is a nasty little contributor. I wonder how long she/he has worked at Apple?
Then Apple and the FBI would know how many photos are on your phone and when each photo was taken. When you had a cellular connection. When you were awake.
Despite what virtually everyone is saying, the system is supposed to be as private as it could possibly be, and it's very likely a precursor to e2e encryption for photos. Encryption and hashing will happen on your device, and then the only way anyone, including Apple, can learn anything at all about your photos is if your device matches multiple instances of child porn. That would be completely undone if hashes were sent one at a time to a matching service. The service would know immediately if there was a single match, and if the service knows, other parties could know.
Read the official documentation. The design shows that neither Apple nor anyone else can know whether there have been any hash matches until a threshold number has been matched. That protects people against getting their private business probed because of innocent hash collisions or even the accidental or malicious uploading of a small number of genuine CSAM images.