Great news. Wonder where the mass surveillance defender clowns on this forum are now.
> Apple was planning on scanning your phone.

Not really. If by "scan" you mean create a hash, then sure, but that isn't "looking at your photos' content." That is running all the pixels through a formula (say, SHA-256) and matching the result against the hashes of known shared images. This would not have detected new photos (as in, you filmed your pedi-porn using your iPhone); it would only have flagged a pornographic photo you got from the web that was already known to law enforcement.

If this truly ran on-device, I'm not sure how it would create the backdoor for law enforcement mentioned in the article (in fact, I'd be shocked if the Photos app doesn't already hash photos just for its duplicate-detection mechanism). Storage-wise, it would have a negligible effect (photos are huge, hashes are tiny).

The one thing that seems a tad sketchy: if Apple can hand-verify that a flagged photo is in fact CSAM, that would imply the encryption at rest uses a mechanism Apple can decrypt? I thought they always bragged that not even they can decrypt your iCloud stuff... I thought it needed the data in your T2 chip?
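For the curious, here's a minimal sketch of the hash-and-match idea that post describes, in Swift with CryptoKit. The `knownHashes` set and the `isKnownImage` helper are made up for illustration; note that Apple's actual proposal used a perceptual hash (NeuralHash) matched through a private set intersection protocol, not a plain SHA-256 lookup like this:

```swift
import CryptoKit
import Foundation

// Hypothetical list of hex-encoded SHA-256 digests of known images.
// In Apple's actual proposal, this role was played by a database of
// NeuralHash values supplied by child-safety organizations.
let knownHashes: Set<String> = [] // placeholder; would hold known digests

// Hash a photo's raw bytes and check membership in the known list.
func isKnownImage(at fileURL: URL) throws -> Bool {
    let data = try Data(contentsOf: fileURL)
    let digest = SHA256.hash(data: data)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}
```

A SHA-256 digest is 32 bytes per photo, which is why the storage overhead really would be negligible.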
> Yes, and there are also "secret" orders that can't be disclosed in any manner. So, while Apple can craft public relations on "their" behalf, it does not include the rest -- I believe every major tech entity must have orders in place for national security and other purposes. But that's another rabbit hole...

I try not to go down conspiracy rabbit-holes.
> Everybody should be happy CSAM is DOA

Unfortunately, CSAM is still alive and well. This method of CSAM detection is dead before arrival.
Thank you, Apple. CSAM was a joke.
Everybody should be happy CSAM is DOA
> If Apple went ahead and implemented CSAM, ...

People keep referring to CSAM as if it were software written by Apple. It is not. It's Child Sexual Abuse Material, which is supremely awful. What Apple had was a proposal, design, and prototype software to scan for CSAM in a particular way. Stop conflating the two! Unless you're trying to imply that actual CSAM isn't really all that bad.
> People keep referring to CSAM as if it were software written by Apple. It is not. It's Child Sexual Abuse Material, which is supremely awful. What Apple had was a proposal, design, and prototype software to scan for CSAM in a particular way. Stop conflating the two! Unless you're trying to imply that actual CSAM isn't really all that bad.

I don't think any of us you referenced portrayed it as software written by Apple. It was alleged earlier that Apple was planning on implementing CSAM surveillance, which was heavily criticized.
> The same big pressure keeping Epstein's client list behind closed doors... Govt doesn't care about child abuse.

I'm certain the red mop head made sure they were all destroyed first… 😉
For example, a parent's pictures of children in a bubble bath could open the door for their entire iCloud to be shared with law enforcement.
> Thanks for pointing that out.
>
> However, I feel my concern remains, as things like this have a way of becoming the monster they set out to kill. Once we allow CSAM detection, what is, or is not, considered CSAM is subject to change, with or without public input.

Unless you're distributing pictures of your kids in a bubble bath such that they could become part of a law enforcement database, they can't become part of the hash list, with or without public input. If your concern is that the method of detection changes and starts looking for potential CSAM rather than hashing against known CSAM, then the rules can still change now just as they could have if the original method had gone forward.
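As a toy illustration of why a scheme like this can only flag exact copies of images already in the database: with a cryptographic digest such as SHA-256, changing even a single bit of a file produces a completely different hash (which is also why Apple's proposal used a perceptual hash, NeuralHash, that tolerates resizing and re-encoding). A quick Swift/CryptoKit sketch with made-up stand-in bytes:

```swift
import CryptoKit
import Foundation

// Stand-in "image" bytes; any real photo's data behaves the same way.
let original = Data((0..<1_000).map { UInt8($0 % 256) })
var tweaked = original
tweaked[0] ^= 0x01 // flip one bit

// The two digests bear no resemblance, so a novel or edited photo
// cannot collide with an entry in a known-image hash list.
let h1 = SHA256.hash(data: original)
let h2 = SHA256.hash(data: tweaked)
print(h1 == h2) // false
```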
> After extensive consultation with experts

After extensive public pressure...
They never should have succumbed to the public feedback! Perhaps it was a botched introduction, but no one else would have done it 'right' like Apple, because of the scrutiny they are under.
I understand that there are criteria for CSAM, and I understand that the criteria would be substantial. However, if the technology is not implemented, then there are no criteria to be changed.
> Matches confirmed manually before notifying law enforcement.

I've still yet to hear how exactly this step could be done without violating state and/or federal laws in the United States regarding the possession and distribution of CSAM.
> Apple initially said CSAM detection would be implemented in an update to iOS 15 and iPadOS 15...

Did they never plan to implement it for iOS 15?
> If Apple implemented this technology, what would stop China or Russia or Iran or Saudi Arabia or any of the many other countries who suppress freedom from insisting that Apple use the technology to assist their immoral behavior?

What's to stop them from insisting on it even if Apple doesn't deploy it? What's to stop them from insisting it be implemented in a less narrow, less transparent, and less secure way?