Because they find that CSAM whilst looking for other things as part of a request from police for instance?

So what? They reported it, didn't they? So they're not breaking the law by failing to report CSAM that they found. And if you're trying to imply they never were scanning for it and only looked when requested, then you'll need to provide some solid evidence for that claim.

Looking at the NCMEC figures, that's not the case.

I've already replied to this assertion. Simply repeating it doesn't make it more valid.

Why are you so eager to protect CSAM collectors who stashed their collection on iCloud?

W...T...F... How in the name of all that is rational did you get THAT out of anything I said? I can't even begin to fathom how you thought that made any sense as you were typing it out.
 
That's stating the obvious. I'm simply saying I've seen no rational reasons for "choosing not to." They're all based on paranoid slippery-slope fallacies or conspiracy theories.
Right, because we've never seen things start relatively innocuous and turn incredibly nefarious.
 
Kind of ironic coming from Germany which just implemented upload filters and wants access to encrypted messaging
Well, an upload filter is not like sniffing your underwear out of your drawer, as Apple is doing; I suggest Tim buy used underwear if sniffing is his passion.
Nobody has an issue with Apple scanning what's on their own servers, but this automatic sniffing on the device, while at the same time enforcing upload-all with no way to exclude photos from being uploaded to iCloud once iCloud is enabled, that's pure surveillance.
 
So what? They reported it, didn't they? So they're not breaking the law by failing to report CSAM that they found. And if you're trying to imply they never were scanning for it and only looked when requested, then you'll need to provide some solid evidence for that claim.



I've already replied to this assertion. Simply repeating it doesn't make it more valid.



W...T...F... How in the name of all that is rational did you get THAT out of anything I said? I can't even begin to fathom how you thought that made any sense as you were typing it out.
You're changing your story from one post to the next. I'm going to end this discussion (with you).
 
I don't like that people view child porn. And as a conservative Christian pastor who works full time at a church, I don't want anyone viewing porn. Furthermore, I intentionally don't watch material that has risky scenes or language that offends me.

However, the same technology that Apple wants us all to accept this fall could one day be the same technology that tells a government that I am a conservative Christian pastor. Therefore, the right thing in this situation is not to try to catch people who simply won't use the feature; the right thing is to not implement a feature that is largely useless against the people it is actually aimed at... because the day might come when others get caught in a web that was never intended for them.
This is the best post on this matter I have seen on this site. Thank you.
 
First part: the technical document released by Apple explains the high-level concept.
Second part: Apple has every right to have your iPhone scan it the moment you have iCloud Photos turned on. Turn iCloud photos off and you turn CSAM detection off. There is your control.
So I have to turn off a feature that I pay for in order to have Apple stop spying on me?

Seems legit.
 
My point is that it isn't so much about the technology, but a lot more about people's trust in the US government, since clearly many people have trusted Apple significantly over the years with a closed iOS system along with their health data, fingerprints, and facial recognition data (the latter two stored in the secure enclaves of their devices) without worrying about the potential vulnerabilities surrounding them.

Apple's CSAM device scanner can be disabled simply by not using iCloud Photos. People complain that the hashes could be reverse-engineered and new images faked to use those hashes. If that's the case, we'd have seen issues with existing systems that already implement CSAM detection, surely? Besides which, as I understand it, the hashes are made up of two different sources.
No, it is not about the trust in the U.S. government. It is about how much Apple is to be trusted. And this oversteps that boundary of trust. As a user of iCloud photos I will be discontinuing my usage of it as a result.
I do not trust Apple to run a process on my phone that even potentially reports back to Apple about even the metadata of the personal information stored on my devices. This is surveillance of the population at large, even if it is only hashes. The potential for abuse is far too high.
The founders of the U.S. told us never to give up freedom for security. This is exactly that. It is giving up, even if only in a small part, our right to privacy for a bit of security. This is not the way. Security can and should be achieved through other means.
 
I can see from your multiple posts that you are a strong supporter of the CSAM implementation.

No. I don't *want* CSAM detection on my phone, but the fact that Apple is doing this to comply with laws, and that this method of CSAM detection is FAR MORE PRIVATE than what Facebook/Google/Amazon are doing, tells me Apple's method is superior in terms of privacy. This should be the preferred method if you care about privacy.

However you are simplifying things. It is not true that you have control.

Sure you do. Turn off iCloud and CSAM detection is never activated.

Your images are scanned either way, as the hash library is built into iOS.

So? iOS has been scanning your photo library using machine learning since 2019 to generate smart search tags. This is done whether iCloud is on or off.


The thing is, not many of us oppose Apple scanning the iCloud library for CSAM on their own servers. We oppose the code on our iPhones, and that we find a privacy invasion.

CSAM detection on the device is not less private than server-side scanning if you know you're going to be using iCloud Photo Library. If you're not using iCloud Photo Library, on-device CSAM detection is not performed.
 
Apple's CSAM device scanner can be disabled simply by not using iCloud Photos. People complain that the hashes could be reverse-engineered and new images faked to use those hashes. If that's the case, we'd have seen issues with existing systems that already implement CSAM detection, surely? Besides which, as I understand it, the hashes are made up of two different sources.

So, your suggestion is to disable a feature that I pay for in order to not have my privacy violated by Apple? Is that truly what you're suggesting? What other features that I pay for do I need to turn off to secure my privacy?
 
No, it is not about the trust in the U.S. government. It is about how much Apple is to be trusted. And this oversteps that boundary of trust. As a user of iCloud photos I will be discontinuing my usage of it as a result.
I do not trust Apple to run a process on my phone that even potentially reports back to Apple about even the metadata of the personal information stored on my devices. This is surveillance of the population at large, even if it is only hashes. The potential for abuse is far too high.
The founders of the U.S. told us never to give up freedom for security. This is exactly that. It is giving up, even if only in a small part, our right to privacy for a bit of security. This is not the way. Security can and should be achieved through other means.
For me this isn't about whether or not I trust Apple. I trust Apple, otherwise I wouldn't own my iPad or iPhone. This is about believing that the information on my device that I paid for is MY information and Apple has no right to scan it, regardless of the privacy protections they believe they've put in place.

Give me a system and I'll find a way to corrupt it. Just tell me the rules.

And that is my biggest fear here.
 
Yes, the part you’re ignoring is the part where Apple has to comply with a change in the law. That’s what the fella in the German parliament was concerned about.

What these governments know is that Apple has a file scanner built into the OS, and it can be set up to work with multiple databases in different countries. If they change the law, then Apple will comply, especially when their supply chain and sales are threatened.

Oh, and I’ve been a software developer for over 25 years. Code is a lot more modular than you think, these days. I reckon it would take less time to implement this change than to move all Chinese iCloud accounts to Chinese servers and hand over control to a Chinese cloud service company – which Apple did at China’s request.

The way the system is designed, they would need to have separate versions of iOS for each country (which they do not) and the different databases would have different root hashes. They would get caught right away. Again: it's much easier to comply with a government request from the server, where they can implement any kind of procedure without raising red flags.

That is the point of this whole system. Apple implemented this in such a way that any change in how it operates needs to be reflected in iOS, which security researchers can check. This will never be possible on a server. This system helps prevent government overreach by making part of the scan procedure auditable.

If you don't want to play with it, disable iCloud. Again, security researchers can check Apple's claim that the hashes are only produced when photos are being uploaded to their servers.
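To make the root-hash point concrete, here is a toy sketch (my own illustration in Swift, not Apple's actual code or database format) of how a researcher could check that the database shipped on a device matches the single root hash Apple publishes; a quietly swapped per-country database would change the root and stand out immediately:

[CODE]
// Toy sketch only: my own illustration of the "root hash" idea, not Apple's code.
// Assumes the on-device database is an ordered list of entry hashes and that
// Apple publishes one root hash that researchers can compare across devices.
import CryptoKit
import Foundation

func rootHash(of entryHashes: [Data]) -> String {
    // Hash the concatenation of every entry; changing, adding, or swapping
    // any single entry changes the root.
    var hasher = SHA256()
    for entry in entryHashes {
        hasher.update(data: entry)
    }
    return hasher.finalize().map { String(format: "%02x", $0) }.joined()
}

// A researcher-style check: the locally shipped database must match the published root.
func databaseMatchesPublishedRoot(entryHashes: [Data], publishedRoot: String) -> Bool {
    rootHash(of: entryHashes) == publishedRoot
}
[/CODE]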
 
People who keep posting this simply prove they don't even understand the most basic thing about the CSAM detection on iOS 15. Let me try to simplify this for you:

iPhone <-- this is your iPhone
iCloud <-- this is not your iPhone (it's Apple's servers)

So if you have 1,000,000 CSAM images on your iPhone and don't enable iCloud for photos, then everything that happens on your iPhone does indeed stay on your iPhone. However, if you enable iCloud for photos, you are now moving things OFF your iPhone and ONTO Apple's servers (remember, Apple's servers are NOT your iPhone).
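Here's a rough Swift sketch of the gating I'm describing, as I understand it from Apple's technical summary. Every name in it is made up for illustration; this is not Apple's API:

[CODE]
// Rough sketch of the gating as I understand it; all names here are invented.
import Foundation

struct SafetyVoucher {
    let payload: Data
}

func makeSafetyVoucher(for photo: Data) -> SafetyVoucher {
    // Placeholder: in the real design this would carry the encrypted match result.
    return SafetyVoucher(payload: photo)
}

func upload(_ photo: Data, attaching voucher: SafetyVoucher) {
    // Placeholder for the actual iCloud upload.
    print("uploading \(photo.count) bytes with a \(voucher.payload.count)-byte voucher")
}

func uploadLibraryToICloud(photos: [Data], iCloudPhotosEnabled: Bool) {
    // If iCloud Photos is off, we never get past this guard, so no voucher is
    // ever produced: the matching is part of the upload path, not a standalone scan.
    guard iCloudPhotosEnabled else { return }
    for photo in photos {
        let voucher = makeSafetyVoucher(for: photo)
        upload(photo, attaching: voucher)
    }
}
[/CODE]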
Agreed, the violation is moving part of the spyware on to my device, remove it and we no longer have an issue
 
I listened to Craig F explain to the WSJ how CSAM detection “actually” works. The whole time, I kept thinking, “It seems so benign when discussed in the context of child porn (after all, who ISN’T against child porn?), but what happens when a government somewhere in the world wants to use it to find something other than porn?” As Craig described it, the CSAM tech starts with a “known” image that produces the “fingerprint” used to generate the alerts, and that image comes from law enforcement. Once in place, who among us believes a government won’t feel completely justified in seeding the CSAM database with images in order to look for something other than child porn? Every photo library, complete with GPS coordinates of where each picture was taken, will be available to be searched for a Person of Interest. Every iPhone in the world becomes a digital spy.
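To be clear about what worries me, here is a toy sketch of the lookup Craig described (my own invention in Swift, using an ordinary cryptographic hash rather than the perceptual NeuralHash Apple actually uses): the matching step only knows fingerprints, not what kind of image produced them.

[CODE]
// Toy stand-in for the "fingerprint of a known image" idea. The real system uses
// a perceptual hash (NeuralHash), not SHA-256; this only shows the lookup shape,
// and every name here is my own invention.
import CryptoKit
import Foundation

func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

func matchesKnownDatabase(_ imageData: Data, knownFingerprints: Set<String>) -> Bool {
    // The check only asks "is this fingerprint in the set?" - it has no way of
    // knowing whether a fingerprint was derived from CSAM or from any other
    // image someone chose to add to the database.
    return knownFingerprints.contains(fingerprint(of: imageData))
}
[/CODE]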
 
So many people are against Apple's CSAM detection... This is an opportunity for them to send Apple a message by not upgrading their existing devices to iOS 15 and voting with their wallets by not buying any new Apple device that has it.

But we all know that will not happen.
This is what I and the 4 other people on my plan will be doing in 12 months, when our phones are paid off. We'll switch to whatever the latest and greatest Android phone is. I'm not happy about it. I'm not an Android fan at all. I had a Google Nexus 7" tablet and really didn't like the OS. I realize it's changed a lot since then.
 
Right, because we've never seen things start relatively innocuous and turn incredibly nefarious.

As I literally said:
They're all based on paranoid slippery-slope fallacies or conspiracy theories.

The argument that we shouldn't do something because it COULD be abused is fallacious.
 
Agreed, the violation is moving part of the spyware on to my device, remove it and we no longer have an issue

Please explain how a publicly announced iOS update (including disclosure of the CSAM detection features) is "spyware". Last time I checked, spyware is malicious software installed without your knowledge or consent. None of that applies here. If you choose to upgrade to iOS 15 and be paranoid about that feature (for no rational reason that I've seen), that's on you.
 