That’s a decent explanation, but I’m baffled that they didn’t think of these things before announcing the feature originally.

They did think of these things, and they made efforts to mitigate them. If you read about the approach they were taking and don't fall into the false "it's AI looking for nudies" assumptions, they had a pretty sophisticated set of checks and balances. I think most of the people who think this outcome was an obvious conclusion all along didn't really study the proposal.

The summary is really that those risks were very small, but people still cared about them a lot more than the child protections they offered.

One day there's going to be a headline grabbing cache of abuse images found in iCloud. Before they would have been accused of not even caring. Now they've put a significant amount of corporate resource into it and they've put together what looks to me like an incredibly well thought through proposal and consulted with advocacy groups on all sides. The consensus appears to be against doing anything and I don't blame Apple for dropping it.

I give them credit for trying and given how massive the backlash has been even here where people should be relatively well informed, I understand why they stopped.
 
The things you mentioned are apples and oranges. What I’m talking about specifically is hardware. Compelling hardware standards like USB-C will inevitably slow down progress. If you disagree, you are myopic. Imagine if the EU had done the same thing 20 years ago with USB-B connectors. We’d be stuck on inferior tech due to a government mandate. Some day (probably not too far from now) a tech company or consortium will come up with a connector that is vastly superior to USB-C. But that will get stifled when governments compel things to stay static.
Warranty is hardware too, quality/health requirements are hardware too, spare parts too?
Warranty makes companies repair your hardware for free if it’s their fault.
QA/health requirements make your hardware work as expected and keep it from causing any health issues.
Spare parts make hardware repairs available and cheaper.

If someday there is a new connector, they will just need to show it to the EU and request that it be made the new standard. And that’s better, because the next standard will need to be a real positive change to get approval, so if we ever have to change our cables again, it will be for a great improvement.
 
Which could just as easily be known photos of terrorist leaders, posters with terrorist slogans, known photos of drug paraphernalia, scans of subversive pamphlets...
Why would they need to know this when anyone can just download these photos for legitimate purposes (example: journalism)?

You do know they already scan iCloud email which in many ways is more sensitive than just detecting known photos, right? And you have no idea if they changed the algorithm because this is done server side.
 
...and how often does it have to be repeated that the proposed system was designed to match similar-looking images, because requiring an exact match would be too easy to fool? When you're scanning the images from a billion iPhone users, even a million-to-one chance of a false match will be too much to properly investigate.
It doesn't have to be repeated at all because there's a document that describes why your concern doesn't hold.

Apple's analysis suggested one in a trillion accounts would trigger a false positive. So with a billion iCloud accounts, that means a 1 in 1,000 chance of ever needing a manual check. If that estimate is off by 5 or 6 orders of magnitude, it means one person double-checking the derivative data of 30 false positives a day. That doesn't sound like too much.

If you think Apple's analysis is wrong, then explain why. But "similar looking" isn't a very precise way of discussing a topic that already has a lot of deeply technical implementation details associated with it.
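The arithmetic above can be sanity-checked directly. A minimal sketch, using the figures claimed in the post (not independently verified numbers); the per-day figure additionally assumes the rate is per account per year, which the post leaves unspecified:

```python
# Back-of-envelope check of the false-positive argument. Inputs are the
# post's claimed figures: one-in-a-trillion per account, a billion accounts.

def expected_false_flags(accounts: float, per_account_rate: float) -> float:
    """Expected number of accounts falsely flagged for manual review."""
    return accounts * per_account_rate

# At the claimed rate: ~0.001 expected flags, i.e. roughly a 1-in-1,000
# chance of ever needing a manual check across the whole user base.
print(expected_false_flags(1e9, 1e-12))

# Sensitivity if the estimate were wrong by several orders of magnitude
# (treating the rate as per account per year for the per-day column):
for error in (1e5, 1e6, 1e7):
    flags = expected_false_flags(1e9, 1e-12 * error)
    print(f"off by {error:.0e}: {flags:,.0f} flags/year, ~{flags / 365:.1f}/day")
```

Even at the pessimistic end of that range, the review workload stays within what a small team could handle, which is the post's point.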
 
Which could just as easily be known photos of terrorist leaders, posters with terrorist slogans, known photos of drug paraphernalia, scans of subversive pamphlets...

Would those be included in the set by child protection services operating under at least two different governments?

If you don't take the time to understand what they were proposing to do it will feel a lot more worrisome than if you educate yourself. Read Apple's documentation on how this was actually meant to work and then point to the holes you find-- your comments will be more meaningful then.
 
Look at the paragraph above the three images on page 5.

I mean, if someone's phone is found to have 30 or more pictures of their kids that are indistinguishable from CSAM simply run through a black and white filter, then maybe a manual check isn't misplaced?
 
Which could just as easily be known photos of terrorist leaders, posters with terrorist slogans, known photos of drug paraphernalia, scans of subversive pamphlets... No, it shouldn't be able to detect your own photos (unless it's a false positive), but there's no reason the on-device algorithm would need to be modified in order to generate a hash from any photo that would match the hash of whatever images, on whatever subject, were on the list.

In any case, since when was this algorithm going to be open source and available for inspection?
Why would this system be used to find photos of terrorists? This system is designed to detect specific photos, files with specific fingerprints... not the subjects of photos. It's starting to feel like everyone is arguing past each other, and you're arguing against surveillance in general, not this particular method.
 
Exactly. We don’t know, but the white paper basically stated that Photoshop manipulation will not fool it. You know how much you can do in Photoshop?
I think you're reading too much into that Photoshop line. They're talking about fairly basic photo editing functions... crop/rotate, color profiles, watermarks... not the entirety of the feature set. If you content-aware fill a seaside village over all the CSAM content described by the hash, or change every pixel to 0x000000, or do anything that substantively alters the subjective content of the image, then of course it will "fool it," because it's not the same photo anymore.

I also think it's easy to overestimate the probability of very, very unlikely events, like the possibility that your innocent pool party pics will be mistaken for illegal porn. I once caught my teenage nephew trying to enter random Bitcoin recovery phrases using a BIP-39 word list; after telling him that it's still stealing if it works, I asked why he even thought it would work... because it seemed possible and he was bored. And yeah, it is possible. But it's so vanishingly unlikely that the scheme is deemed perfectly secure.
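The nephew's odds can actually be put in numbers. A rough sketch, assuming a 12-word phrase (the shortest BIP-39 form): the word list has 2048 entries, and 4 of the 132 encoded bits are a checksum, leaving 2^128 valid phrases.

```python
# Odds of guessing a BIP-39 recovery phrase by entering random words.
# Assumption for illustration: a 12-word phrase. The BIP-39 list has
# 2048 words; 4 of the 132 encoded bits are a checksum, so 2^128
# distinct phrases are valid.

WORDS = 2048
PHRASE_LEN = 12

raw_combinations = WORDS ** PHRASE_LEN   # 2048^12 == 2^132
valid_phrases = 2 ** 128                 # after the 4-bit checksum

print(f"Raw word combinations: {raw_combinations:.3e}")
print(f"Valid phrases:         {valid_phrases:.3e}")
print(f"Chance per guess:      {1 / valid_phrases:.3e}")
```

The same intuition applies to the false-match worry: "possible" and "worth losing sleep over" are separated by dozens of orders of magnitude.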
 
Whew! I would be embarrassed for anyone to know how many adorable cat photos I keep in iCloud. 🤭

The inadvertent snapshots of the poses my two get into might even make Hugh Heff blush ;)

On a side-note...

One of my co-workers shared a funny with me, a few years ago:

Photos often gives suggestions for reminiscing, and (it seems) there is a category for 'Animals'.

On a work-trip to coastal GA, I took it upon myself to wear a Burger King crown on my head one day.

We were planting a property line in Sea Island with 3-5m shrubbery, and he took a photo of me walking amongst them.

A year later, Photos somehow ingested that shot, and offered it to him in 'Animals' :)

I'm not yet confident that the on-device Neural Engine ML algos are quite as discerning as one might expect *shrugs*
 
Good on Apple!

First they came for the socialists, and I did not speak out—
Because I was not a socialist.

Then they came for the trade unionists, and I did not speak out—
Because I was not a trade unionist.

Then they came for the Jews, and I did not speak out—
Because I was not a Jew.

Then they came for me—and there was no one left to speak for me.



With the rise of populist driven one-trick-pony political movements, it is truly great to see Apple's stance. Privacy is vital as is the right to free speech.
I find the use/co-opting of this 1946 post-WWII confessional about the Nazis, in the way it is used here, to be almost irreverent and unquestionably offensive, especially to those who had family murdered and persecuted during WWII.
 
OK, but “fuzzy hashes” are still used by Cloudflare and freely available to all Cloudflare users, AND hashes cannot be reverse-engineered to reveal the original content, AND those raw hashes are calculated and stored on upload to ensure the file(s) transferred without corruption; so saying it introduced a vector for compromise makes zero sense, but I am sure it will appease the people taking up arms over a non-issue.
What worried me about all this was the case where you shoot a photo of your toddlers in the bath (or something else just as innocuous) and ten minutes later your pad is being raided by cops. AI isn't smart enough yet to know the difference.
That is not how it works AT ALL. Read up on Cloudflare’s “fuzzy hashes” used to check for CSAM and there you go.
This would have compared hashes, which cannot be reverse-engineered, against KNOWN CSAM hashes (and whatever version of “fuzzy hashes” Apple had developed) and triggered an alert/flag that an image might be known CSAM. From there, there are some pretty easily automated digital forensics that could determine if further reporting were needed.
The amount of trumped up FUD around this feature - that other sites are already doing anyway (Cloudflare’s version is FREE to all their users) - is so ridiculous.
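For anyone wondering what a "fuzzy hash" actually buys over an ordinary one, here is a toy sketch. The average-hash below is purely illustrative, not Cloudflare's or Apple's actual algorithm (Apple's NeuralHash used a trained network); it only shows the general idea that a perceptual hash survives trivial edits that completely change a cryptographic digest.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image mean. Small global edits leave bits alone."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Number of differing bits between two fuzzy hashes."""
    return sum(x != y for x, y in zip(a, b))

image = [[10, 200], [220, 30]]                       # tiny 2x2 "photo"
brighter = [[p + 20 for p in row] for row in image]  # global brightness edit

# Cryptographic hashes of the raw bytes differ completely...
h1 = hashlib.sha256(bytes(p for r in image for p in r)).hexdigest()
h2 = hashlib.sha256(bytes(p for r in brighter for p in r)).hexdigest()
print("sha256 equal:", h1 == h2)        # False

# ...but the perceptual hashes still match (Hamming distance 0), which is
# what lets a matcher catch trivially re-encoded copies of a known image.
print("fuzzy distance:", hamming(average_hash(image), average_hash(brighter)))
```

Note that neither hash reveals anything about the image content; a match only says "this looks like a file already on the list," which is the whole point of the design.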
 
I find the use/co-opting of this 1946 post-WWII confessional about the Nazis, in the way it is used here, to be almost irreverent and unquestionably offensive, especially to those who had family murdered and persecuted during WWII.
Yeah. On multiple levels... not just the co-opting of the victims' suffering for relatively trivial purposes, but also totally missing the point of the quote: The socialists didn't deserve to be put into camps and/or murdered. The trade unionists didn't deserve it. The Jews didn't deserve it. The point is that people should make an effort to empathize with innocent people which they don't inherently see as their own. To stand up for what is right, even if it doesn't directly affect you personally.

First they came for the child predators, and I was like, cool.
 
There were users on here who kept saying "You are not an Apple engineer" and "Apple knows better" and "Protect the children" and "What do you have to hide." Well, apparently everybody knew better than Apple, including the inventor of this technology, who called it "dangerous." And the present Apple knows better than the past Apple. Apple is not always right. Sometimes even the ordinary commonsense users know better.
The Apple engineer did not invent this technology, which was, AFAIK, created by Cloudflare and freely available to all Cloudflare clients long before Apple made any announcements - even longer back if you are just looking at the concept of hashes.
 
I don't think they can scan your images when you enable Advanced Data Protection. That would only work if they scanned the images on-device before they are uploaded, which is precisely what they said they won't do.

I think the article you quoted is outdated.
They wouldn’t “scan” the images at all unless they were overcomplicating hash computation, which is used to verify successful uploads to services like iCloud.
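As a concrete illustration of the point above, here is a minimal sketch of an upload-integrity check using a hash the client already computes; the function and variable names are illustrative, not any real iCloud API.

```python
import hashlib

# Upload-integrity sketch: the client hashes the file before upload, the
# server hashes what it received, and the transfer is accepted only if
# the digests match. This is the kind of hash computation services
# already perform; no image "scanning" is involved.

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"holiday-photo-raw-bytes"
received = original                  # a clean transfer
corrupted = original[:-1] + b"X"     # a single altered byte

print(digest(original) == digest(received))    # True  -> upload verified
print(digest(original) == digest(corrupted))   # False -> retransmit
```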
 
Bravo to Apple for making what was clearly the right decision in the face of enormous pressure from the “but think of the children” crowd.

CSAM is disgusting, and everything within reason should be done to stop it. But that shouldn’t include the possibility of inaccurately detecting something, labeling it as CSAM, and bringing down the full force of government on somebody who may be innocent; nor should it include giving the government a back door to encrypted data, opening up the potential for the same back door to be used by bad actors or for future government abuse. The law of unintended consequences must be considered here and balanced carefully against the potential benefits. In this case, the potential for abuse and/or unintended consequences far outweighs the benefits.

Why not just say to heck with the 4th Amendment and let the government search any property it wishes to search to find CSAM? The founding fathers knew better than to entrust that kind of power to the government.

Standing up to the powers that be in the face of the kind of pressure being exerted here took guts that some companies simply do not have.
 
Yes. That was my concern as well. Along with adults having fun by sharing content. There are definitely young looking adults.
Except THAT IS NOT HOW IT WORKS. Unless those adults' idea of “fun” is sharing known CSAM content which has already been cataloged and had its hashes stored as such, in which case they deserve what they get. Again, it’s hashes, which are already calculated and used as part of the data transfer process, but people would rather listen to FUD than read the articles I have linked countless times before on Cloudflare’s existing and FREE implementation of this.
Basically, it is like your hosts file, but instead of IPs it’s a list of “fuzzy hashes” against which your image is compared as part of the upload.
That’s it - unique hashes and AI-derived “fuzzy hashes” to account for attempts at circumventing a regular hash check (e.g., cropping). Hashes are a one-way function, so they give zero access to your data whatsoever.
 
The Apple engineer did not invent this technology, which was, AFAIK, created by Cloudflare and freely available to all Cloudflare clients long before Apple made any announcements - even longer back if you are just looking at the concept of hashes.
No one said Apple invented this technology. As I said, the inventors -- not within Apple -- warned Apple against using this technology because it is "dangerous." Many security and privacy experts said the same. But some users on here said Apple engineers knew better. And now Apple admits that they did not know better.
 
Obviously they knew all this perfectly well beforehand, but they didn’t expect such a backlash.

And now they realise it’s far easier (and more important) to promote privacy in their ecosystem than to protect children.

If macOS scans your files, they will be no different from Google; at that point you might as well get an Android.
 
Bravo to Apple for making what was clearly the right decision in the face of enormous pressure from the “but think of the children” crowd.

CSAM is disgusting, and everything within reason should be done to stop it. But that shouldn’t include the possibility of inaccurately detecting something, labeling it as CSAM, and bringing down the full force of government on somebody who may be innocent; nor should it include giving the government a back door to encrypted data, opening up the potential for the same back door to be used by bad actors or for future government abuse. The law of unintended consequences must be considered here and balanced carefully against the potential benefits. In this case, the potential for abuse and/or unintended consequences far outweighs the benefits.

Why not just say to heck with the 4th Amendment and let the government search any property it wishes to search to find CSAM? The founding fathers knew better than to entrust that kind of power to the government.

Standing up to the powers that be in the face of the kind of pressure being exerted here took guts that some companies simply do not have.
As presented, a single flag would be insufficient to trigger follow-up. Additionally, there is a TON of automatable forensics they could utilize before anyone is notified about any of those flags (sharing frequency, sharing with IP addresses known to trade CSAM, exit-node traffic on darknet providers, etc.).
There are far more nefarious slippery slopes out there than this non-issue for people to get so worked up and distracted over.
I would only be on board with backing some of these claims if I were intentionally ignorant of the fact that hash checks are already happening with every file-sharing service.
 