> Apple specifically stated in their white paper that "visually similar images produce the same hash"

Then why, when I search that quote in the white paper, do I get 0 returns?
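For anyone wondering what "visually similar images produce the same hash" means in practice, here's a minimal sketch of the general perceptual-hashing idea, using a simple average hash built on the Pillow library. This is not Apple's NeuralHash, and the filenames are hypothetical:

```python
# A toy perceptual hash (an "average hash"), illustrating the general
# idea that visually similar images can produce the same hash value.
# This is NOT Apple's NeuralHash; it's a common textbook technique,
# and the filenames below are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to an 8x8 grayscale thumbnail, then set one bit per pixel
    depending on whether it is brighter than the mean. Small edits
    (recompression, slight resizing) usually leave the bits unchanged."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical files: the same photo before and after light editing.
# Both will typically map to the same (or a nearly identical) 64-bit hash.
print(hex(average_hash("photo.jpg")))
print(hex(average_hash("photo_recompressed.jpg")))
```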
That’s a decent explanation, but I’m baffled that they didn’t think of these things before announcing the feature originally.
> Warranty is hardware too, quality/health requirements are hardware too, spare parts too?

The things you mentioned are apples and oranges. What I'm talking about specifically is hardware. Mandating tech hardware standards like USB-C will inevitably slow down progress. If you disagree, you are myopic. Imagine if the EU had done the same thing 20 years ago with USB-B connectors. We'd be stuck on inferior tech due to government mandate. Someday (probably not too far from now) a tech company or consortium will come up with a connector that is vastly superior to USB-C, but that will get stifled when governments compel things to be static.
> Where I am, knives of any size or method of operation (including switchblades) are totally legal.

Same here.
> Which could just as easily be known photos of terrorist leaders, posters with terrorist slogans, known photos of drug paraphernalia, scans of subversive pamphlets...

Why would they need to know this when anyone can just download these photos for legitimate purposes (for example, journalism)?
> Then why, when I search that quote in the white paper, do I get 0 returns?

Look at the paragraph above the three images on page 5.
> ...and how often does it have to be repeated that the proposed system was designed to match similar-looking images, because requiring an exact match would be too easy to fool? When you're scanning the images from a billion iPhone users, even a million-to-one chance of a false match will be too much to properly investigate.

It doesn't have to be repeated at all, because there's a document that describes why your concern doesn't hold.
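To put that scale concern in concrete terms, here's the back-of-the-envelope arithmetic; the library size and false-match rate below are illustrative assumptions, not Apple's published figures:

```python
# Even a tiny per-image false-match rate yields a large absolute number
# of false flags at iPhone scale. All numbers are illustrative.
users = 1_000_000_000        # roughly a billion iPhone users
photos_per_user = 1_000      # assumed average library size
false_match_rate = 1e-6      # the "million-to-one" chance per image

expected_false_flags = users * photos_per_user * false_match_rate
print(f"Expected false flags: {expected_false_flags:,.0f}")  # 1,000,000
```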
> Which could just as easily be known photos of terrorist leaders, posters with terrorist slogans, known photos of drug paraphernalia, scans of subversive pamphlets... No, it shouldn't be able to detect your own photos (unless it's a false positive), but there's no reason that the on-device algorithm needs to be modified in order to generate a hash from any photo which would match the hash of whatever images, on whatever subject, were on the list.

Why would this system be used to find photos of terrorists? This system is designed to detect specific photos, files with specific fingerprints... not the subjects of photos. It's starting to feel like everyone is arguing past each other, and you're arguing against surveillance in general, not this particular method.
In any case, since when was this algorithm going to be open source and available for inspection?
> There should be no expectation of privacy for illegal activities.

Cool. I maintain my expectation of privacy.
> Exactly. We don't know, but the white paper basically states that Photoshop manipulation will not fool it. Do you know how much you can do in Photoshop?

I think you're reading too much into that Photoshop line. They're talking about fairly basic photo editing functions... crop/rotate, color profiles, watermarks... not the entirety of the feature set. If you content-aware fill a seaside village over all the CSAM content described by the hash, or change every pixel to 0x000000, or do anything that substantively alters the subjective content of the image, then of course it will "fool it," because it's not the same photo anymore.
Whew! I would be embarrassed for anyone to know how many adorable cat photos I keep in iCloud. 🤭
> I'm not yet confident that the on-device Neural Engine ML algos are quite as discerning as one might expect *shrugs*

The system Apple proposed was not at all related to ML-based image recognition.
> Good on Apple!
>
> First they came for the socialists, and I did not speak out—
> Because I was not a socialist.
> Then they came for the trade unionists, and I did not speak out—
> Because I was not a trade unionist.
> Then they came for the Jews, and I did not speak out—
> Because I was not a Jew.
> Then they came for me—and there was no one left to speak for me.
>
> With the rise of populist-driven one-trick-pony political movements, it is truly great to see Apple's stance. Privacy is vital, as is the right to free speech.

I find the use/co-opting of this 1946 post-WWII confessional about the Nazis, in the way it is used here, to be almost irreverent and unquestionably offensive, especially to those who had family murdered and persecuted during WWII.
> What worried me about all this is the times when you shoot a photo of your toddlers in the bath (or something else similarly innocuous) and ten minutes later your pad is being raided by cops. AI isn't smart enough yet to know the difference.

That is not how it works AT ALL. Read up on Cloudflare's "fuzzy hashes" used to check for CSAM and there you go.
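For the curious, here's a rough sketch of how fuzzy-hash matching differs from AI-style image recognition: a photo's hash is compared, by Hamming distance, against hashes of already-catalogued files, so a novel photo of your own kids has nothing in the database to match. The hash values and threshold below are made up for illustration:

```python
# Fuzzy-hash matching: flag a photo only if its perceptual hash lies
# within a small Hamming distance of a hash already in the known
# database. This recognizes near-copies of specific catalogued files,
# not the subjects of photos. Hash values and threshold are illustrative.
KNOWN_HASHES = {0xA5F3C2D1E4B09876, 0x1234ABCD5678EF00}
MAX_DISTANCE = 4  # maximum number of differing bits to count as a match

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def matches_known(candidate: int) -> bool:
    return any(hamming(candidate, h) <= MAX_DISTANCE for h in KNOWN_HASHES)

print(matches_known(0xA5F3C2D1E4B09877))  # True: one bit off a known hash
print(matches_known(0x0123456789ABCDEF))  # False: no near-duplicate exists
```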
> I find the use/co-opting of this 1946 post-WWII confessional about the Nazis, in the way it is used here, to be almost irreverent and unquestionably offensive, especially to those who had family murdered and persecuted during WWII.

Yeah. On multiple levels... not just the co-opting of the victims' suffering for relatively trivial purposes, but also totally missing the point of the quote: the socialists didn't deserve to be put into camps and/or murdered. The trade unionists didn't deserve it. The Jews didn't deserve it. The point is that people should make an effort to empathize with innocent people whom they don't inherently see as their own, and to stand up for what is right even if it doesn't directly affect them personally.
> There were users on here who kept saying "You are not an Apple engineer" and "Apple knows better" and "Protect the children" and "What do you have to hide." Well, apparently everybody knew better than Apple, including the inventor of this technology, who called it "dangerous." And the present Apple knows better than the past Apple. Apple is not always right. Sometimes even ordinary, commonsense users know better.

The Apple engineers did not invent this technology; it was, AFAIK, created elsewhere and has been freely available to all Cloudflare clients since long before Apple made any announcements - even longer if you're just looking at the concept of hashes.
> I don't think they can scan your images when you enable Advanced Data Protection. That would only work if they scanned the images on-device before they are uploaded, which is precisely what they said they won't do.

They wouldn't be "scanning" the images at all unless they were overcomplicating the hash computation that is already used to verify successful uploads to services like iCloud.
I think the article you quoted is outdated.
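For reference, the kind of hash computation used to verify a successful upload is just an ordinary integrity check; here's a minimal sketch (standard-library SHA-256, hypothetical filename) showing that it says nothing about what the photo depicts:

```python
# Integrity hashing: a cryptographic digest confirms the uploaded bytes
# match the original bytes. Unlike a perceptual hash, flipping a single
# byte produces a completely different digest, and the digest reveals
# nothing about the image's content.
import hashlib

def file_sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: client and server each hash the file and compare.
print(file_sha256("photo.jpg"))
```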
> Yes. That was my concern as well. Along with adults having fun by sharing content. There are definitely young-looking adults.

Except THAT IS NOT HOW IT WORKS. Unless those adults' idea of "fun" is sharing known CSAM content which has already been catalogued and had its hashes stored as such, in which case they deserve what they get. Again, it's hashes, which are already calculated and used as part of the data-transfer process, but people would rather listen to FUD than read the articles I have linked countless times before on Cloudflare's existing and FREE implementation of this.
> The Apple engineers did not invent this technology; it was, AFAIK, created elsewhere and has been freely available to all Cloudflare clients since long before Apple made any announcements - even longer if you're just looking at the concept of hashes.

No one said Apple invented this technology. As I said, the inventors -- not within Apple -- warned Apple against using this technology because it is "dangerous." Many security and privacy experts said the same. But some users on here said Apple engineers knew better. And now Apple admits that they did not know better.
Obviously they knew this perfectly well beforehand, but they didn't expect such a backlash.
And now they realise it's far easier (and more important) to promote privacy in their ecosystem than to protect children.
> Bravo to Apple for making what was clearly the right decision in the face of enormous pressure from the "but think of the children" crowd.

As presented, a single flag would be insufficient to trigger follow-up. Additionally, there is a TON of automatable forensics they could utilize before anyone is notified about any of those flags (sharing frequency, sharing with IP addresses known to trade CSAM, exit-node traffic on darknet providers, etc.).
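As a minimal sketch of that gating idea: Apple's published design only surfaced an account for human review after roughly 30 matches, enforced cryptographically with threshold secret sharing. The plain counter below is a simplification of that design, not the actual protocol:

```python
# Threshold gating, greatly simplified: no single matched image triggers
# review; an account surfaces only after crossing a match threshold.
# Apple's published design enforced this cryptographically (threshold
# secret sharing); a plain counter is used here purely for illustration.
MATCH_THRESHOLD = 30  # the figure Apple cited publicly

def needs_human_review(match_count: int) -> bool:
    return match_count >= MATCH_THRESHOLD

print(needs_human_review(1))   # False: a single flag is insufficient
print(needs_human_review(30))  # True: only now would review even begin
```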
CSAM is disgusting, and everything within reason should be done to stop it. But that shouldn't include the possibility of inaccurately detecting something, labeling it as CSAM, and bringing down the full force of government on somebody who may be innocent. Nor should it include giving the government a back door to encrypted data, opening up the potential for use of the same back door by bad actors, or even for future government abuse. The law of unintended consequences must be considered here and balanced carefully against the potential benefits. In this case, the potential for abuse and/or unintended consequences far outweighs the benefits.
Why not just say to heck with the 4th Amendment and let the government search any property it wishes to search to find CSAM? The founding fathers knew better than to entrust that kind of power to the government.
Standing up to the powers that be in the face of the kind of pressure being exerted here took guts that some companies simply do not have.