> We don't. That's why I said "probably". But I'd like somebody who knows stats to weigh in and explain how Apple can say one in one trillion with any confidence.

You seem to have suggested that it has never occurred, so how do we know the odds? I am saying, how would we know if it never occurred? It could flag false positives all the time and not be reported.
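For what it's worth, the shape of the math behind a claim like that is threshold-based: even if individual images false-match at some rate, an account is only flagged after many matches, and tail probabilities shrink very fast. Here is a purely illustrative sketch; every number in it is made up, not Apple's, and it assumes matches are independent:

```python
from math import lgamma, log

p = 1e-6   # hypothetical per-image false-match rate (made up)
n = 10_000 # hypothetical photos uploaded per account per year (made up)
t = 30     # hypothetical number of matches required to flag an account

# Log of the Binomial(n, p) probability of exactly t false matches;
# when n*p is tiny this term dominates the whole tail, so it is a
# good stand-in for P(at least t matches).
log_p = (lgamma(n + 1) - lgamma(t + 1) - lgamma(n - t + 1)
         + t * log(p) + (n - t) * log(1 - p))
print(f"per-image rate {p:.0e} -> per-account rate ~10^{log_p / log(10):.0f}")
```

The specific output doesn't matter; the point is that a per-account rate can be driven far below any per-image rate by raising the match threshold, which is presumably how a "one in one trillion per year" figure gets derived.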
> No. It doesn't work this way. It's scanning for known child sexual abuse images, using hashes to identify those photos. It's not going to flag your personal photo.

Did you actually read the article lol?
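To make the matching idea concrete, here is a minimal sketch. To be clear about the assumptions: the real system uses NeuralHash (a perceptual hash) and private set intersection, not SHA-256 and a plain set, and the hash list comes from NCMEC; the names below are invented for illustration.

```python
import hashlib

# Stand-in for the database of fingerprints of already-catalogued images
# (illustrative; the real list is perceptual hashes supplied by NCMEC).
known_hashes = {
    hashlib.sha256(b"bytes of a known, catalogued image").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    # Illustrative fingerprint: a cryptographic hash of the raw bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    # A personal photo isn't in the list, so it can't match;
    # only copies of images already in the database can.
    return fingerprint(image_bytes) in known_hashes

print(matches_known_image(b"your own beach photo"))  # False
```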
Johns Hopkins cryptography researcher Matthew Green said that CSAM scanning is a "really bad idea" because in the future, it could expand to scanning end-to-end encrypted photos rather than just content that's uploaded to iCloud. For children, Apple is implementing a separate scanning feature that looks for sexually explicit content directly in iMessages, which are end-to-end encrypted.
Green also raised concerns over the hashes that Apple plans to use because there could potentially be "collisions," where someone sends a harmless file that shares a hash with CSAM and could result in a false flag.
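A toy illustration of what a "collision" means, shrinking the fingerprint to 16 bits so one can actually be found in a moment. Real fingerprints are vastly larger, but perceptual hashes deliberately trade exactness for robustness to resizing and recompression, which is where Green's worry comes from.

```python
import hashlib
from itertools import count

def tiny_fingerprint(data: bytes) -> str:
    # Truncated to 16 bits on purpose so a collision is easy to find;
    # real fingerprints are far longer. This only illustrates the concept.
    return hashlib.sha256(data).hexdigest()[:4]

target = tiny_fingerprint(b"stand-in for a known, catalogued image")
for i in count():
    candidate = f"perfectly harmless file #{i}".encode()
    if tiny_fingerprint(candidate) == target:
        print(f"collision after {i + 1} tries: {candidate!r} also maps to {target}")
        break
```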
If you're not doing anything wrong, then you have nothing to worry about.
The idea that any trillion-dollar tech company isn't already owned by the intelligence community is laughable at this point. This type of stuff is just marketing to cover how they use what they already know in their self-serving wars.
> If you're not doing anything wrong, then you have nothing to worry about.

Well, tell that to people under oppressive governments. "Wrong" can be very subjective. Speaking ill of your government can get you arrested, e.g. that Olympic athlete from Belarus.
> You seem to have suggested that it has never occurred, so how do we know the odds? I am saying, how would we know if it never occurred? It could flag false positives all the time and not be reported.

It's from the Apple technical document I linked.
> It's from the Apple technical document I linked.

Ok. So we don't know.
> In Europe, it's customary to take family pictures of 10-year-old girls topless on the beach. In the US, this would be considered child porn, and you can go to prison for 20 years if they find a picture like that on your phone.

I've seen that occasionally with European visitors in the US. I think they may be looking for specific images, so this may not be an issue. Still, I don't trust those building the hashes not to include photos that are not cases of abuse, or that someone won't do the equivalent of swatting you by maliciously finding a way to do a photo dump. If the bad guys know about this, they won't use it anyway. There has got to be a better way that doesn't invade privacy.
> This is the real issue, even before privacy.

They are looking for multiple images, which would make it less likely. However, I'm still not comfortable with this. It could be weaponized, the database could contain photos that fall within the societal norms of some places (Europe), it could lead to a slippery slope of including more types of images, etc.
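Mechanically, "looking for multiple images" means a single stray match does nothing by itself; only crossing a threshold escalates anything to review. A sketch, where the threshold value and names are my assumptions (Apple's actual design enforces this cryptographically with threshold secret sharing, so below-threshold matches aren't even readable):

```python
def needs_human_review(upload_fingerprints, known_hashes, threshold=30):
    # threshold=30 is a hypothetical value, not Apple's published number.
    # One accidental match (e.g. a hash collision) stays below the bar;
    # only an accumulation of matches triggers any review at all.
    matches = sum(1 for f in upload_fingerprints if f in known_hashes)
    return matches >= threshold
```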
So not true. If you have never been falsely accused, you have no idea the horror of being falsely accused of something as sinister as child pornography or sexual abuse. Pray it never happens to you.
> This is still very creepy. Not sure what exactly Apple's motive is.

I wonder if it is Apple trying to appease the powers that be as they resist handing over decryption capacity.
I'm just not going to allow Apple to go through my iPhone. Hopefully, that option is given.
> No. It doesn't work this way. It's scanning for known child sexual abuse images, using hashes to identify those photos. It's not going to flag your personal photo.

Can you quote where the technical papers say this?
> I wonder if it is Apple trying to appease the powers that be as they resist handing over decryption capacity.

This is how democracy dies.
> Don't understand the big deal.

False positives are the issue.
This isn't people at Apple scanning your photos; it's on-device AI looking for matches against an existing database.
Your photo library is already being scanned for faces (People feature) and other points of interest.
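For the curious, here is the flavor of a content-based fingerprint, using a classic "difference hash" as a stand-in for NeuralHash (which is a learned perceptual hash, not this): the fingerprint tracks what the image looks like, so a resized or recompressed copy still matches where a plain file hash would not. This assumes Pillow is installed, and the function name is mine.

```python
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    # Shrink to a (size+1) x size grayscale grid, then record whether
    # each pixel is brighter than its right-hand neighbour.
    img = Image.open(path).convert("L").resize((size + 1, size))
    bits = 0
    for y in range(size):
        for x in range(size):
            bits = (bits << 1) | (img.getpixel((x, y)) > img.getpixel((x + 1, y)))
    return bits  # a 64-bit number derived from how the photo LOOKS

# Matching is then just comparing numbers, optionally within a small
# Hamming distance to tolerate recompression artifacts.
```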
> What idea?

This one;
> If you're not doing anything wrong, then you have nothing to worry about.

Your definition of wrong might not match law enforcement's definition.
> No, that's not how it works.

Not the FBI; it's a non-profit advocacy organization outside of any electoral oversight.
Your photo will be reduced to a single numerical fingerprint. That fingerprint will be compared to a list of fingerprints for known images, without any context about your image. This fingerprint isn't a picture of your child bathing, and it's definitely not an image or understanding of genitalia. It's just a number, like 8BA62546-1258-4E90-9096-48EE7365ECAE. Since your photo is not on the list, nothing will happen.
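That "just a number" point is easy to demonstrate: a hash output has a fixed size no matter what you feed it, and there is no way to run it backwards to reconstruct the photo. (SHA-256 here again as a stand-in for the real perceptual hash.)

```python
import hashlib

for data in (b"a multi-megabyte family beach photo", b"x"):
    digest = hashlib.sha256(data).hexdigest()
    # Always 64 hex characters, regardless of input, and nothing about
    # the original bytes can be viewed or reconstructed from it.
    print(len(digest), digest)
```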
On the other hand (and of course you wouldn't do this, I'm just trying to explain the mechanics here), if you sold that image online to a lot of people and it became a well-known image of child pornography, the FBI would eventually add it to their database. Apple would end up building a fingerprint from the FBI's copy of that image. If that fingerprint still matched your image, it would be flagged. Apple/LEO would be able to look at their copy of the image matching that fingerprint because they acquired the image through another mechanism; in this case, though, some other mechanism got the photo from your computer or phone into the cloud.