CSAM scanning is a massively bad idea, but I doubt Apple will reverse course. They rarely do. I suspect this is being driven by fears of somehow being prosecuted for "storing" illegal material on their servers, even if they have no control over it and no access to the content.
Well, they at least "gave in" (sort of) to the Safari address bar complaints on the iOS 15 Beta, so one can only hope...
 
I bet they will have an announcement next week. They don't want to make the situation worse. They can't just cancel, because half the users will be upset "for the kids." So they have to come up with a coherent plan to get themselves out of this problem.

One possibility: integrate it into the Photos app itself rather than into the OS, make Photos uninstallable, and make iCloud sync of your picture and video library dependent on having Photos installed. If their goal is only the narrow scope they are claiming, then Apple should have no issue with this.

Then people have a choice. Keep the Photos app with iCloud sync and accept on-device scanning at backup time, or uninstall Photos and the hash checker/generator/database is uninstalled along with it. Then if you want to view photos, you'll have to use a third-party app; if you want to back them up or view them on other devices, you'll have to use a third-party service or do so locally.

While it would stink to lose the Photos app and the iCloud conveniences, you'd at least have the option of uninstalling the CSAM checker and the potential backdoor it entails, while Apple would still accomplish their stated goal of preventing CSAM uploads to iCloud.

They could even give users a choice of two variants:
- Photos iCloud: has the built-in CSAM scanner and whatever further intrusions that entails.

- Photos Local: no built-in scanner, no iCloud functions such as iCloud sync, iCloud backup, or shared albums, and so no backdoor. But it does allow local viewing and management on the device, along with backup and sync over a customer's private LAN to their own server and to other local devices with the local variant installed.

Personally, I'd love a local server option to replace iCloud entirely, allowing me to do all the iCloud-style sync, backup, and optimize functions locally, on my own server, between my devices. But that's a different discussion.
 
All it will take to frame someone is one malicious app with some encrypted CSAM hidden in it calling UIImageWriteToSavedPhotosAlbum() to put 30 incriminating images into someone's photo library. If they have iCloud Photos enabled, the on-device scanning will then pick those images up and report the device's owner to Apple, despite that person being innocent.
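For anyone unfamiliar with that API, here's a rough Swift sketch of what such an app could do. This is hypothetical code, not from any real app, and note that since iOS 11 the app would still need the user to grant the one-time "Add to Photos" permission (via an NSPhotoLibraryAddUsageDescription prompt), so it isn't entirely silent:

```swift
import UIKit

// Hypothetical sketch only: how a malicious app could plant images in the
// user's library. Assumes the app has already obtained add-only photo
// library permission.
final class ImagePlanter: NSObject {
    func plant(_ images: [UIImage]) {
        for image in images {
            // Saves directly into the Photos library; if iCloud Photos is on,
            // the on-device CSAM matcher would later evaluate these images.
            UIImageWriteToSavedPhotosAlbum(
                image,
                self,
                #selector(image(_:didFinishSavingWithError:contextInfo:)),
                nil
            )
        }
    }

    @objc private func image(_ image: UIImage,
                             didFinishSavingWithError error: Error?,
                             contextInfo: UnsafeRawPointer) {
        // A real attacker would silently ignore errors to stay quiet.
    }
}
```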

Worse still, if Apple is declared a monopoly and forced to open up the walled garden, it will be much easier to get apps like this onto devices from cowboy app stores, as there would be none of the review checks Apple performs for its own App Store.
THIS ^^^^^^^^^

If ANYONE ever wants to get you in trouble, all they would have to do is text you some bad images (unsolicited, of course). You could delete the messages, but are they still on your phone? Who knows.

When (not if) the technology expands to include unfavored speech, memes, etc., offending material can be introduced to your device through simple communication (text, iMessage, email, etc.) and you are now done.
 
This is all so depressing. The problem here is that such a huge number of people are just awful.

So awful that we can’t even create tools to protect children from some of the awful people without other awful people using those tools to be even more awful, and on a grand scale that could further victimize billions.

Humans, man. We really suck.
 
Cancelled my family iCloud account yesterday… waiting to see if I will have to leave Apple completely… just figured they are watching the numbers, so I went ahead and canceled before implementation.
My iCloud sub was up for renewal two days ago. I cancelled. Apple Music was dumped a few months ago for various reasons. Obviously the number of subs fluctuates daily, but I hope there is a clearly visible and abnormal decrease this week.
 
Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't trust Apple, I suspect a very tiny number will step up and follow through. That requires courage.

.
Could if they have a known image close enough to the one they are targeting. Cloudflare does a great job of explaining "fuzzy hashing" when they announced CSAM scanning for all their customers nearly 2 years ago: https://blog.cloudflare.com/the-csam-scanning-tool/
For example: the "Celebgate" photos and videos that were leaked and shared. Those could be tracked if their hashes were fed into the CSAM database, but any photos you take and share would need to first be intercepted or otherwise obtained and make their way into the CSAM database (or another database, if one were being used). Apple would not be able to just scan for "gay erotica" or anything of the like.
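To make the "close enough" idea concrete, here is a toy sketch of the general technique. This is a simple difference hash, purely illustrative; Apple's NeuralHash and the tooling Cloudflare describes are far more sophisticated:

```swift
import Foundation

// Toy perceptual "fuzzy hash" sketch for illustration only: a difference hash
// (dHash), not NeuralHash or PhotoDNA. Input is assumed to be an image
// already downscaled to a 9x8 grayscale thumbnail.
func dHash(thumbnail: [[UInt8]]) -> UInt64 {
    var hash: UInt64 = 0
    for row in 0..<8 {
        for col in 0..<8 {
            hash <<= 1
            // Bit is 1 when a pixel is brighter than its right-hand neighbour,
            // so the hash captures coarse structure and tends to survive
            // re-encoding, resizing, and small edits.
            if thumbnail[row][col] > thumbnail[row][col + 1] {
                hash |= 1
            }
        }
    }
    return hash
}

// "Close enough" means the two hashes differ in only a few bits.
func isLikelyMatch(_ a: UInt64, _ b: UInt64, threshold: Int = 5) -> Bool {
    (a ^ b).nonzeroBitCount <= threshold
}
```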
It's amusing to me that so many people freely let Gmail scan their emails and gave away their photos to FaceApp without even reading or thinking about their terms of use, but are worried about a feature that is literally available to any Cloudflare customer and is already done at scale by other services.
I think a lot of the confusion comes from the other feature they announced which has several caveats that make it inapplicable to most (if not all) forum posters. "It's my phone!" doesn't generally apply to kids under 13 who are using an iPhone - it's almost certainly (with a few rare exceptions) their parent's phone and iCloud family account. Even if it is their phone, they are not old enough to enter into the legal agreements to use any services on the phone on their own.
 
This is all so depressing. The problem here is that such a huge number of people are just awful.

So awful that we can’t even create tools to protect children from some of the awful people without other awful people using those tools to be even more awful, and on a grand scale that could further victimize billions.

Humans, man. We really suck.
Nah - most humans are good. It's just that a few bad people can cause a huge amount of misery ...
 
Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't trust Apple, I suspect a very tiny number will step up and follow through. That requires courage.

.
Apple will not lose markets like China, period, end of subject. They are beholden to the shareholders...
 
I think, given the recent revelations about the abuse of the NSO Pegasus system, Apple may end up temporarily shelving the idea of that CSAM scanning system. Reason: some "state actor" hacker could bypass Apple's limits and start scanning for way more image types than CSAM.
 
Could. And in the end that's what it boils down to. Could.

and...

"Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

Some will trust Apple's assertion. Others won't.

Those who don't should VOTE WITH THEIR WALLETS. Of those who don't trust Apple, I suspect a very tiny number will step up and follow through. That requires courage.

.
It actually goes beyond trusting Apple though and ventures into what governments could coerce Apple to do.

I read an interesting legal explanation of this elsewhere. The argument goes back to when the FBI wanted Apple to build a backdoor around iOS encryption. You can't compel speech and therefore the government couldn't compel Apple to write new software to do that.

The problem is Apple went ahead and did that on their own.

So that original legal protection is moot. The software now exists and the government has a different legal argument to make: they now need to be able to compel Apple to use the software instead of needing to compel Apple to write the software.
 
It actually goes beyond trusting Apple though and ventures into what governments could coerce Apple to do.

I read an interesting legal explanation of this elsewhere. The argument goes back to when the FBI wanted Apple to build a backdoor around iOS encryption. You can't compel speech and therefore the government couldn't compel Apple to write new software to do that.

The problem is Apple went ahead and did that on their own.

So that original legal protection is moot. The software now exists and the government has a different legal argument to make: they now need to be able to compel Apple to use the software instead of needing to compel Apple to write the software.

There's that "could" word again. Which was my point, previously.

Still boils down to trusting Apple and its assertion: "Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

If you trust Apple, fine. If you don't, vote with your wallet. I suspect no more than a tiny number of the "don'ts" actually will.
 
One possibility: integrate it into the Photos app itself rather than into the OS, make Photos uninstallable, and make iCloud sync of your picture and video library dependent on having Photos installed. If their goal is only the narrow scope they are claiming, then Apple should have no issue with this.

Then people have a choice. Keep the Photos app with iCloud sync and accept on-device scanning at backup time, or uninstall Photos and the hash checker/generator/database is uninstalled along with it. Then if you want to view photos, you'll have to use a third-party app; if you want to back them up or view them on other devices, you'll have to use a third-party service or do so locally.

While it would stink to lose the Photos app and the iCloud conveniences, you'd at least have the option of uninstalling the CSAM checker and the potential backdoor it entails, while Apple would still accomplish their stated goal of preventing CSAM uploads to iCloud.

They could even give users a choice of two variants:
- Photos iCloud: has the built-in CSAM scanner and whatever further intrusions that entails.

- Photos Local: no built-in scanner, no iCloud functions such as iCloud sync, iCloud backup, or shared albums, and so no backdoor. But it does allow local viewing and management on the device, along with backup and sync over a customer's private LAN to their own server and to other local devices with the local variant installed.

Personally, I'd love a local server option to replace iCloud entirely, allowing me to do all the iCloud-style sync, backup, and optimize functions locally, on my own server, between my devices. But that's a different discussion.
That would definitely be acceptable, but Photos is such an integral part of the OS that it would be hard to do.
 
"A foreign government could, for example, compel a service to out people sharing disfavored political speech. That's no hypothetical: WeChat, the popular Chinese messaging app, already uses content matching to identify dissident material. India enacted rules this year that could require pre-screening content critical of government policy. Russia recently fined Google, Facebook and Twitter for not removing pro-democracy protest materials."

Without taking a side in this, none of those things in the above warning could be forced on Apple with what Apple says they're doing. Apple has a list of hashes, and they're checking images uploaded to iCloud to see if they match those hashes. WeChat's content matching would require text content analysis, which is totally different. The India example appears to be the same kind of thing (or perhaps a requirement for human pre-screening, which is more different still). And the Russia example is one where Russia identified posts or pictures and demanded they be removed, which is absolutely not the same thing.

So... this article seems to be people urging Apple not to proceed with its plans, based on warnings that have little to do with what Apple is actually doing.
They get the list of hashes from somewhere, and that list can be generated from any set of images. Apple could be forced by a government to use that government's list of hashes.
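As a rough sketch of why that matters (hypothetical code, and simplified: Apple's described system blinds the hashes and uses private set intersection so the device can't read the list, but the trust question is the same), the matching step is indifferent to what the hashes actually represent:

```swift
import Foundation

// Illustrative sketch, not Apple's code: the on-device matcher has no idea
// what the supplied hashes depict. It only tests images against whatever
// list it was handed, so the list's curator decides what gets flagged.
struct HashBlocklist {
    // Hypothetical: hashes shipped with a database update. They could come
    // from NCMEC's CSAM set or from any other collection of images.
    let knownHashes: Set<UInt64>

    func isFlagged(_ imageHash: UInt64) -> Bool {
        knownHashes.contains(imageHash)
    }
}

let blocklist = HashBlocklist(knownHashes: [0x0123_4567_89AB_CDEF]) // placeholder value
print(blocklist.isFlagged(0x0123_4567_89AB_CDEF)) // true, regardless of what the image was
```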
 
Apple, just cancel this on-device surveillance software.
My understanding is that this only runs on Apple's servers via your iCloud Photo Library, so if you choose not to use iCloud Photos, nothing would get scanned. Is this in fact correct? As others have mentioned, all of us think protecting the kids is a great thing to do; this is just the wrong way to go about it.
 
What I still have never had answered is WHAT exactly Apple is doing for the manual review process. It's NOT legal for ANYONE other than NCMEC to have these images, so Apple cannot compare flagged items against anything, correct? So that means they will need to make judgement calls on people between roughly 15 and 25 who might look older or younger?

How do Google and Microsoft handle this manual review process?
 
Apple has been under congressional pressure to do something about this. It absolutely has something to do with their well-founded concern of having this material on their servers. They’ve been reporting hundreds of cases a year. Facebook: millions.
The absolute numbers are impossible to interpret. You would need to calculate detections per image stored, and I doubt Apple and Facebook would release the number of images on their systems.

And remember, Apple is only scanning e-mails ATM. Hands up if you think pedophiles are dumb enough to e-mail illegal content around. They might think they are more anonymous on FB, WhatsApp, etc.
 
Could if they have a known image close enough to the one they are targeting. Cloudflare does a great job of explaining "fuzzy hashing" when they announced CSAM scanning for all their customers nearly 2 years ago: https://blog.cloudflare.com/the-csam-scanning-tool/
For example: the "Celebgate" photos and videos that were leaked and shared. Those could be tracked if their hashes were fed into the CSAM database, but any photos you take and share would need to first be intercepted or otherwise obtained and make their way into the CSAM database (or another database, if one were being used). Apple would not be able to just scan for "gay erotica" or anything of the like.
It's amusing to me that so many people freely let Gmail scan their emails and gave away their photos to FaceApp without even reading or thinking about their terms of use, but are worried about a feature that is literally available to any Cloudflare customer and is already done at scale by other services.
I think a lot of the confusion comes from the other feature they announced which has several caveats that make it inapplicable to most (if not all) forum posters. "It's my phone!" doesn't generally apply to kids under 13 who are using an iPhone - it's almost certainly (with a few rare exceptions) their parent's phone and iCloud family account. Even if it is their phone, they are not old enough to enter into the legal agreements to use any services on the phone on their own.

You misinterpreted how the word "could" was used in my post, meaning that Apple could acquiesce to oppressive foreign government demands. Or you could believe their assertion: "Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."

"Since Apple's announcement of the feature, the company has been bombarded with concerns that the system behind detecting CSAM could be used to detect other forms of photos at the request of oppressive governments. Apple has strongly pushed back against such a possibility, saying it will refuse any such request from governments."
 
From what I've read here: they don't. The problem with their service was that it used an external server to scan for content. That's not the case in Apple's implementation at all. All communication is completely end-to-end encrypted. Malicious users can still send offensive material to whomever they want, and no one except the receiving user will know about it. However, with the new service, if parents choose to enable it, children's accounts will scan received images after decrypting them but before displaying them, and present the minor with a content warning. If the kid is below the age of thirteen, the parents can choose to get a warning that improper material was sent to their child. None of this is enabled by default. No external sources are alerted; neither the service (iMessage) nor its provider (Apple) gets a notification at all. So the E2E messaging is still safe, but children get an optional layer of protection from creeps. Also, older minors can avoid unsolicited dick pics without their parents knowing about it (just in case some moronic parents try to blame their kids just for receiving that kind of harassment; sadly, victim blaming is not unheard of).
You are addressing the iMessage screening protections. The researchers are addressing the iCloud Photos on-device CSAM detection.
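For readers mixing up the two features, here is a rough sketch of the iMessage flow as described in the quoted post. This is illustrative Swift, not Apple's implementation, and all the names are made up:

```swift
import Foundation

// Illustrative sketch of the iMessage child-safety flow described above,
// not Apple's code. Everything happens on-device after normal end-to-end
// decryption, and Apple is never notified.
struct ChildAccount {
    let age: Int
    let safetyCheckEnabled: Bool        // opt-in by parents, off by default
    let parentNotificationOptedIn: Bool // only meaningful for under-13 accounts
}

enum IncomingImageAction {
    case showNormally
    case blurWithWarning(notifyParents: Bool)
}

func handleIncomingImage(flaggedAsSensitive: Bool, account: ChildAccount) -> IncomingImageAction {
    // If the feature is off, or the on-device classifier found nothing, display as usual.
    guard account.safetyCheckEnabled, flaggedAsSensitive else {
        return .showNormally
    }
    // Parents can only be warned for children under thirteen, and only if they chose to be.
    let notify = account.age < 13 && account.parentNotificationOptedIn
    return .blurWithWarning(notifyParents: notify)
}
```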
 