With respect, people focusing on the specific case of scanning for CSAM are missing what the story really is.

What's really happening here is that Apple is putting a tool onto users' phones whose purpose is to compare users' private data against black-box third-party databases.

What it's used for is simply subject to Apple policy and to whatever local jurisdictions might force them to use it for.

Installing a tool like this on user devices is a huge mistake.
 
It's not about that - you're totally missing the bigger picture on this.
No offense intended at all. Please take none.
People in this thread are taking our concerns way out of context when the bigger picture is right there in front of all of us.

This just tells the authorities, 'look at what we have convinced our users to accept on their physical devices.'
 
Definitely, this is a bridge too far.

I don't want to be in a position in five years time where Apple detects an "illegal" CD-rip on my iPad because they compare a hash to a database in my home country, or where I'm fined because Apple finds a meme my government doesn't like.

Obviously this is sold as a CSAM detection measure at first, but Apple's incentives are aligned in such a way that they have a much greater interest in using it to prevent copyright infringement (particularly of their own IP).
 
No. Not in the slightest. https://www.theverge.com/2021/8/10/...safety-features-privacy-controversy-explained:
"When iCloud Photos is enabled on a device, the device uses a tool called NeuralHash to break these pictures into hashes — basically strings of numbers that identify the unique characteristics of an image but can’t be reconstructed to reveal the image itself. Then, it compares these hashes against a stored list of hashes from NCMEC, which compiles millions of hashes corresponding to known CSAM content. (Again, as mentioned above, there are no actual pictures or videos.)"

I'm fine with this. Apple has historically, and still today, put customer privacy first compared with almost every other brand. So for now, I'm fine with it.
 
It's not a no from me yet, but it has definitely given me pause about upgrading.

It's not so much the functionality as they describe it now but, as others have alluded to, the potential for what it becomes later, and I'm not sure I want to support that.

I generally swap between iOS and Android every few years anyway, maybe this is a Pixel 6 year.
 
With respect, people focusing on the specific case of scanning for CSAM are missing what the story really is.

What's really happening here is that Apple is putting a tool onto users' phones whose purpose is to compare users' private data against black-box third-party databases.

What it's used for is simply subject to Apple policy and to whatever local jurisdictions might force them to use it for.

Installing a tool like this on user devices is a huge mistake.
I think there is room for some gray area here.

How much of a black box is the data source? Apple says that the CSAM hash list will be coded right onto the phone. But that database surely grows over time: what has Apple built to enable updates to that list? Does it query NCMEC directly, hard-coded? Does it query Apple? Or is there no mechanism for live updates at all, so an updated list has to ship with an iOS update?

Querying Apple, as a third party, would be the worst option. Querying NCMEC, as the first party (and a government-overseen one, if you believe that to be a good thing), is better. Hard-coding the list into iOS with no live-update mechanism would be best, I think: that makes the list a static object that watchdog groups can analyze and confirm is what it says it is.
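To make the "static object" point concrete: if the list ships as a fixed file inside iOS, anyone can fingerprint it and check it against what independent auditors publish. A rough sketch, where the file path and the published digest are both hypothetical:

Code:
import hashlib

BUNDLED_LIST = "/System/Library/csam_hashes.db"  # hypothetical path
AUDITED_SHA256 = "d4c7e1..."  # digest a watchdog group would publish

def list_matches_audit(path: str = BUNDLED_LIST) -> bool:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Hash the file in 1 MB chunks to fingerprint the whole artifact.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest() == AUDITED_SHA256

A live-update mechanism breaks exactly that property: whatever the phone queries at runtime can't be pinned to a single audited artifact.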

I believe there is a balance Apple is trying to achieve here: put in some kind of mechanism to look for child pornography, but do it in as customer-private a way as possible. Last year Facebook reported some 20 million cases of CSAM. Google did several hundred thousand, I think. Apple did 265. There is a reasonable way to look at this and say, "the cost of not reporting CSAM is worth it against protecting consumer privacy." I believe it is also reasonable to look at this and say, "Apple not reporting CSAM is unconscionable: Apple is protecting those engaging in child exploitation." I think the way they've described this implementation is as privacy-preserving as possible.

This is a tool that has awful potential uses. It turns this realm of consumer privacy from a technological hurdle ("You can't do this because the phone's design makes it impossible") to a policy hurdle ("You can't do this because we say you can't"). That's a step back. But if there is a company I trust to at least value and prioritize consumer privacy, it's Apple. They may not succeed at it, but they are incentivized to defend that line.

In exchange we get the good of Apple--one of the richest, most powerful companies in the world, with a user base of many millions--joining the fight against child exploitation.
 
Yes. Like others, I've lost my enthusiasm. Computers used to be fun, before and in the early days of the internet. But like a lot of things, they've lost their whimsy and almost all the joy and innocence are gone. Maybe that's what happens when too many people get involved. But this might be the last straw. In the future I'll just use whatever's cheapest, begrudgingly at that, and treat it as a work tool that doesn't actually belong to me.
 
Yes. Like others, I've lost my enthusiasm. Computers used to be fun, before and in the early days of the internet. But like a lot of things, they've lost their whimsy and almost all the joy and innocence are gone. Maybe that's what happens when too many people get involved. But this might be the last straw. In the future I'll just use whatever's cheapest, begrudgingly at that, and treat it as a work tool that doesn't actually belong to me.

I've been thinking a lot about this in the last few years... sort of wondering how much is due to my advancing age vs. things that have actually changed.

I do think it's a combo, but I also really believe a lot changed due to technology, and particularly networked technology, advancing into the mainstream.

I find almost nothing about technology much "fun" anymore.

It's everywhere, and in various stages of "poorly designed" or "bug-ridden" or "not updated"... and it adds an unpleasant amount of low-level friction in so many areas of life now.

And now we also have to be worried about what is listening to us or watching us and who or what might be collecting that data and doing god knows what with it and to what end.

There's a reason surveillance states are seen as dystopian: they add a low-level paranoia and fear to everything at the societal level.
 
Not sure yet. I don't use iCloud to store photos, so I don't know if I'm affected. If it scans the pictures on the phone locally (I've heard some reputable people saying it can create hashes of your photos locally and compare them to the CSAM database), then it does make me think twice.

Really don’t buy the argument that if your photos are ok you will be too. I guess I just don’t trust the algorithms that hash the images.

And I do worry that it would be easy for a junk app to add something to your phone to get you flagged!
 
Right now, yes it is. Any breaking of iMessage and iCloud is unacceptable. One of the whole points is privacy and security. Once a back door is there, it's there forever.
Any reason for compromising security will be expanded in the future. It's just a matter of when. Looks like I'm going to have to stay on 14.7.
 
Don't forget they're breaking iMessage too.
And while this sounds fine as far as the pictures go, at some point someone will have them looking for memes or jokes or videos that don't meet certain criteria. It always happens, if you read a history book. What is legal one day is a capital offense the next, and this is a tool that can be used to sniff it out.
 
Don't forget they're breaking iMessage too.
And while this sounds fine as far as the pictures go, at some point someone will have them looking for memes or jokes or videos that don't meet certain criteria. It always happens, if you read a history book. What is legal one day is a capital offense the next, and this is a tool that can be used to sniff it out.

Or even something more benign in the near term...
Things like “ensuring clips you share are licensed by their commercial and corporate partners.”

Perhaps refusing to share the text of a news article without a subscription...

The really awful examples here are endless.
 
The absurdity of saying Apple can do no wrong is mind-boggling.

How many times has Apple Maps screwed you over when just trying to enter an address, or taken you to the wrong location? How many years did it take to improve Maps? Is it working perfectly even now?
When Apple Maps makes a mistake, the driver can make a choice not to turn his/her car into the river.

When Apple CSAM makes a mistake, you will find out by having the cops standing outside your door and Chris Hansen leering in the background, because Apple wanted to make an example of how well the CSAM hunt is going.

The CSAM algorithm won't ask you when it flags something. The CSAM algorithm has no stated way of rolling your cumulative guilt score back. The CSAM algorithm won't tell you that you've been deemed guilty without hearing or trial. Apple CSAM presumes that you are guilty; you just haven't been caught yet.
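From what Apple has described, the flagging logic is a cumulative threshold, something like the sketch below. This is a hedged guess at the logic, not Apple's code, and the threshold value is made up.

Code:
THRESHOLD = 30  # hypothetical; Apple has not committed to a public number

class MatchCounter:
    def __init__(self):
        self.matches = 0

    def record_match(self):
        self.matches += 1  # increments on every hash hit

    def flagged(self) -> bool:
        return self.matches >= THRESHOLD

    # Note what's missing: no documented way to decrement after a
    # false positive, and no notification to the user at any point.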

Will Apple even allow CSAM staff or OS programmers to be deposed or cross-examined during your trial? How about the quality and (in)accuracy of the non-governmental NCMEC database?

That is even beyond the thousand other ways you can get screwed, accidentally or deliberately. WhatsApp images are automatically added to your camera roll. You can take photos with a locked iPhone. There are probably dozens of other entry points for adding potentially bad/incriminating photos that you never took.

The fact that Apple was a laggard in even investigating, let alone reporting, CSAM in comparison to Google or Facebook shows Apple HAS a CSAM problem, yet all those other companies did not need to resort to putting a monitoring agent locally in the device/OS to fight CSAM. They reported the problems once the content hit their servers.

But let's throw the big switch on the big-brother Appley CSAM machine without real transparency, proper vetting, or oversight. They didn't even do a mass beta test on their own employees. But we should all trust Apple, because they know we're all holding it wrong.
 
Right now, yes it is. Any breaking of iMessage and iCloud is unacceptable. One of the whole points is privacy and security. Once a back door is there, it's there forever.
Any reason for compromising security will be expanded in the future. It's just a matter of when. Looks like I'm going to have to stay on 14.7.
The ugly reality, which a lot of people outside this nerd forum don't realize, is that Apple has had a back door into iCloud from the beginning. If a person uses iCloud Backup, or Messages in iCloud, Apple holds the decryption keys, giving them access to all your iMessages, photos, and email.

People around here should turn off iCloud Backup, Messages in iCloud, iCloud email, and definitely iCloud Photos, if they want to keep Apple from having access.
 
I'm out. I was leaning that way after seeing what I could get for the money I paid for my iMac that would have a much greater lifespan and be upgradeable. I'm fed up with a red dot on System Preferences in macOS because I don't use iCloud, and I don't want a nanny phone that opens the way for government spying down the road. A custom-ROM Android phone and a Windows computer that will last a decade are on the horizon. Apple is too far behind, playing leak-a-few-updates and making lots of money. At least Samsung is trying something new: we get possibly a 120Hz screen and a new deformed notch. Yeah, time to move on. As for cameras, the competition is making Apple look lost too. I'm saddened it's come to this, but that's what happens when a company is run by bean counters rather than forward-thinking, dynamic engineers with a CEO to back them up.
 
No. Not in the slightest. https://www.theverge.com/2021/8/10/...safety-features-privacy-controversy-explained:
"When iCloud Photos is enabled on a device, the device uses a tool called NeuralHash to break these pictures into hashes — basically strings of numbers that identify the unique characteristics of an image but can’t be reconstructed to reveal the image itself. Then, it compares these hashes against a stored list of hashes from NCMEC, which compiles millions of hashes corresponding to known CSAM content. (Again, as mentioned above, there are no actual pictures or videos.)"

I'm fine with this. Apple has historically, and still today, put customer privacy first compared with almost every other brand. So for now, I'm fine with it.

Summarized from https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html

NCMEC uses a perceptual hash algorithm provided by Microsoft called PhotoDNA. NCMEC claims that they share this technology with service providers.
...

Perhaps there is a reason that they don't want really technical people looking at PhotoDNA. Microsoft says that the "PhotoDNA hash is not reversible". That's not true. PhotoDNA hashes can be projected into a 26x26 grayscale image that is only a little blurry. 26x26 is larger than most desktop icons; it's enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26x26 Sudoku puzzle; a task well-suited for computers.
...

If someone were to release code that reverses NCMEC hashes into pictures, then everyone in possession of NCMEC's PhotoDNA hashes would be in possession of child pornography.
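To illustrate what "projected into a 26x26 grayscale image" could mean, here's a toy sketch. It assumes the hash is a 144-byte vector of local intensity features on a 12x12 grid; the real PhotoDNA layout differs, and the actual reversal is a constraint-solving problem, not a simple resize.

Code:
import numpy as np
from PIL import Image

def project_hash(hash_bytes: bytes) -> Image.Image:
    # Assumption: 144 bytes of local intensity features on a 12x12 grid.
    assert len(hash_bytes) == 144
    grid = np.frombuffer(hash_bytes, dtype=np.uint8).reshape(12, 12)
    # Upscale the feature grid to 26x26; bilinear smoothing gives the
    # "only a little blurry" picture the blog post describes.
    return Image.fromarray(grid).resize((26, 26), Image.BILINEAR)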
 
Summarized from https://www.hackerfactor.com/blog/index.php?/archives/929-One-Bad-Apple.html

NCMEC uses a perceptual hash algorithm provided by Microsoft called PhotoDNA. NCMEC claims that they share this technology with service providers.
...

Perhaps there is a reason that they don't want really technical people looking at PhotoDNA. Microsoft says that the "PhotoDNA hash is not reversible". That's not true. PhotoDNA hashes can be projected into a 26x26 grayscale image that is only a little blurry. 26x26 is larger than most desktop icons; it's enough detail to recognize people and objects. Reversing a PhotoDNA hash is no more complicated than solving a 26x26 Sudoku puzzle; a task well-suited for computers.
...

If someone were to release code that reverses NCMEC hashes into pictures, then everyone in possession of NCMEC's PhotoDNA hashes would be in possession of child pornography.
PhotoDNA can be compromised. At the moment, I don't recall if it was Matthew Green or another cryptographer I saw talking about how he was easily able to manipulate PhotoDNA; he sent the white paper to Microsoft and others.
 
It didn't at first, because on the surface of it, I don't have that kind of content and don't use iCloud anyway. However, Apple's willingness to go down this road at all, and the elementary fallacy in yesterday's statement ("if you're not committing crimes then you have no reason to worry about us trampling your Fourth Amendment right to privacy"), tells me they are not actually serious about, or even familiar with, the subject of privacy at all.

This marks a turning point in my relationship with technology. Rather than it being a passive creative tool, I now have to think about whether my device (and however many unknown, faceless people are on the other end of it) will approve of what I'm doing on it at any time, and whether it's reporting me if it doesn't. Am I breaking some copyright law by listening to my music library or remixing some clip from a DVD into a mashup? And if so, can I trust that my own computer will not prosecute me for that next? Not anymore. This Apple computer used to be a digital paintbox; now it is a digital booby trap of tripwires and federal agents in my home. With this change, Apple's willingness to use our devices to actively monitor us for illegal activity and then turn our devices against us means there's no way I'm upgrading or purchasing anything beyond this point.
 
Doesn't affect my decisions. Sorry if this offends anyone, but I honestly couldn't care less. I am not saying I live a squeaky-clean life, but I sure as **** don't do anything that would land me in jail. I'm so disgusted by the type of thing they are trying to hunt that I'm more than willing to lose a bit of my privacy if it means just one of those sick a-holes is caught.
 