How would they even use the CSAM Detection System to root out terrorism?
It would only work if terrorists shared meme photos, and they would soon learn not to do that.

The system can't be used to detect a category of photos, only a specific set of photos and their close derivatives.

Indeed. But the set of photos is not disclosed, only their hashes. Therefore, it could contain any arbitrary photo, including photos matching known terrorists.

If the terrorist doesn't have a photo of himself on his phone, which would be reasonable to assume, then maybe he happens to be outside and someone happens to capture him while taking a selfie, photographing a friend, or doing any of the other things people usually do outside. That is not at all unlikely.

And while the software as it is right now would be incapable of matching the terrorist's face at any angle, actual photo recognition could well be part of the next upgrade. And if we're adding photo recognition, might as well add GPS tracking too, so we know where the terrorist was when his face was captured. None of which would be unlikely, if they want to expand the idea of CSAM detection to detect terrorists in the first place.

It's not necessarily about the software in its current state (even though I could instantly see ways to circumvent and misuse it when it was announced, and I'm not even a security specialist), but the infinite possibilities for expansion.
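Since the exchange above turns on the claim that the client only ever sees opaque hashes, never the photos behind them, here is a minimal sketch of that idea. Everything here is made up for illustration: Apple's actual system uses a perceptual NeuralHash inside a cryptographic private-set-intersection protocol, not a plain hash lookup like this.

```python
import hashlib

# Stand-in for a perceptual hash. The real system uses a neural hash
# that also matches close derivatives; a cryptographic hash is used
# here only to keep the sketch self-contained.
def opaque_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# The provider ships only this set of hashes. The client cannot tell
# which photos produced them, so the set could in principle contain
# any arbitrary photo, not just CSAM.
blinded_database = {opaque_hash(b"undisclosed-photo")}

def matches_database(image_bytes: bytes) -> bool:
    return opaque_hash(image_bytes) in blinded_database

print(matches_database(b"undisclosed-photo"))  # True
print(matches_database(b"holiday-photo"))      # False
```

The point made above falls straight out of the sketch: nothing in the client-side code constrains what the undisclosed set contains.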
 
Since this first made news, I’ve switched to a Google Pixel running GrapheneOS, a Linux laptop and a Remarkable 2 e-ink pad.

I have not missed the Apple ecosystem one bit, and it's refreshing to get back to hardware/software companies rather than a former computing company that now sees itself as a media conglomerate and has shunned its former principles.
I found this funny, but it made the news this morning (Tom's Hardware, if interested): Pine64 just announced the successor to their PinePhone Linux privacy phone, the "PinePhone Pro" (and no, I'm not making this up). So if Apple's new policies are driving you away, try the Pro, with upgrades like a 2 MP to 5 MP front camera and 5 MP to 13 MP rear camera, a 4-core to 6-core SoC, 32 GB to 128 GB internal storage, 2/3 GB to 4 GB RAM, and Gorilla Glass 4. Available sometime next spring. PinePhone means privacy… heheheheehe…
 
They’re talking about scanning the photos that are about to be uploaded to iCloud Photos (and apparently it’s disabled if you disable the iCloud Photo Library). Would you be happier if exactly the same photos were scanned, but the scanning was happening on Apple’s servers after the photos were uploaded?
On servers is fine, as that would also catch and clean up items already on iCloud, something this proposed process does not. Besides, that is Apple's bailiwick.
The impression I get is that every service is already scanning, or will soon be required to scan, photos that end up on their servers for CSAM, and Apple figured it improved users' privacy if that required scanning happened client-side before the photos were uploaded rather than server-side after they're uploaded.
Not true. Look at what is reported: Google scans on share, Amazon scans very little, Facebook scans pretty much everything, Dropbox scans on share, and the same goes for MS. I do not know of any entity that scans everything.
Baking it into the OS makes it harder for a government to come along and ask for additions - it’d mean changing iOS rather than just changing a script that runs on the server. Seems like Google and others that do it all server-side would be easier for a government to arm-twist into scanning more, since they can just change one script that customers never see - and security researchers / privacy advocates can’t analyze - rather than having to add code to the OS that then needs to be updated on every user device.
The government may already ask for scanning server-side. The one area they cannot currently scan is on-device, and that is an area many state entities want. If you are looking for how, try some of the more in-depth articles on this; posts #102 and #77 have some links.
 
Also, for those of you switching to Android… really? Google sells every little thing about you. Responding to Apple getting less private by going to Android rings hollow.
I switched my main line from an iPhone 11 Pro Max to a Samsung Android phone and I couldn't be happier with it. I switched not for "privacy", but because Apple wants my phone to be the cops, only without the rules normal cops have to abide by. It's pure and simple big brother stuff.

I don't really care if Google sells my marketing data; as long as there's nothing on my phone to turn me in to the cops, everything's okay.

I'm also not asking Apple to go end-to-end encrypted to iCloud; it makes no difference to me, so that's not a selling point.
 
All my devices were locked from auto-updating after this announcement was made. This will be abused; I'm not taking part in it.
I did the same, as well as disabling iCloud Photos sync.

Guess what, though: the update to 14.8, which I did initiate, turned iCloud Photos on again. And people think turning off iCloud Photos would make any difference in whether they scan your phone or not... :(
 
And people think turning off iCloud photos would make any difference in whether they scan your phone or not
One of the arguments I've heard many times is "turn off iCloud to stop the scanning." But one thing I keep thinking: even if that works at first, who is to say that turning off iCloud will still work five years from now?
 
We know already; these researchers know, everybody in the world knows the score. They also know Apple postponed it without stating anything further, due to the already more-than-enough backlash!

Yet still none of them care to say anything about how and why Amazon, Microsoft, Facebook, and Google have done the very same thing, or about those companies' lack of communication on this topic.

That makes them look like paid agents singling out Apple for an excuse as they shift their focus. Maybe researchers should act like scientists: follow the scientific method and not exclude observation of the other existing parties.
The reason the other companies were not as heavily scrutinised is that they did it server-side, not on the device. Which has annoyed many people.

Apple has also spent the last 5 years telling us they care about our privacy so naturally people are going to be annoyed when they announce something like this.
 
They’re talking about scanning the photos that are about to be uploaded to iCloud Photos (and apparently it’s disabled if you disable the iCloud Photo Library). Would you be happier if exactly the same photos were scanned, but the scanning was happening on Apple’s servers after the photos were uploaded?

The impression I get is that every service is already scanning, or will soon be required to scan, photos that end up on their servers, for CSAM, and Apple figured it improved user‘s privacy if that required scanning happened client-side before the photos were uploaded rather than server side after they’re uploaded.

Baking it into the OS makes it harder for a government to come along and ask for additions - it’d mean changing iOS rather than just changing a script that runs on the server. Seems like Google and others that do it all server-side would be easier for a government to arm-twist into scanning more, since they can just change one script that customers never see - and security researchers / privacy advocates can’t analyze - rather than having to add code to the OS that then needs to be updated on every user device.
Oh right. Not sure I see that much of a distinction.
I use iCloud for email only but that said, the conspiracy theorist in me says that it'll be baked into the OS in a later update/upgrade and the wording that mentions it will be as obfuscated as F.
 
We don't know how much editing is needed to avoid detection. Probably converting to black and white, changing colours, hue, or contrast, or doing some minor cropping won't avoid detection. Maybe even mirroring is handled.

But yes, adding enough new elements to the image is a way to avoid detection.

It's not as big a problem as you might think when it comes to reducing the spread of CP through iCloud, but it clearly illustrates why it's such a stupid system for governments to use to catch "unwanted elements".
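As a rough illustration of why global edits (brightness, contrast, grayscale) tend not to defeat perceptual hashing while pasting in new elements can, here is a toy average hash (aHash) on a 2×2 "image". This is not NeuralHash, just the simplest perceptual hash there is, shown purely as a sketch.

```python
# Toy average hash: each bit records whether a pixel is above the
# image's mean brightness. Uniform edits shift every pixel and the
# mean together, so the bits barely change.
def ahash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 30]]
brighter = [[40, 230], [250, 60]]    # same image, brightness +30
overlaid = [[10, 200], [255, 255]]   # new element pasted over a corner

print(hamming(ahash(original), ahash(brighter)))  # 0: hash survives the edit
print(hamming(ahash(original), ahash(overlaid)))  # 1: pasted content flips bits
```

A real matcher accepts candidates whose Hamming distance falls under some threshold, which is why adding enough new content to the image is the edit that actually breaks the match.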
I dunno. I can imagine a government or two that would scan for the Tank Man picture. In any case I was fairly shocked that the EU was considering something like Apple's system.
 
Not only the NYT and MacRumors: most articles reporting on this fail to link the actual paper.
It is a little weird that they rely so heavily on just their own works; the authors are also a main part of the works cited, lol. No conflict of interest at all there.

It's also based on a lot of suppositions.
 
The degree of passive aggression between civil rights advocates and the surveillance state continues to astound me.

The fact that this conversation revolves around whether or not CSAM detection will work is the result of one community twisting itself in knots to avoid fallacious accusations of supporting child abuse, while the other side is busy using the standard child exploitation/drugs/money laundering schtick to keep rug-humping for more surveillance. That other side could've solved money laundering and drug trafficking decades ago with a flat tax and free markets.

And the child abuse issue? Maybe don’t hire FBI agents to investigate pedophiles who turn out to be pedophiles themselves.

Anyone conscious of open source platforms and the darknet already knows CSAM will only catch low hanging (stupid) fruit and do absolutely nothing to stem the tide of organized crime and human trafficking, but we’re supposed to pretend that’s what the discussion is about when Apple just embedded a keyword OCR scanning tool into iOS for screenshots and photos? How about the fact that iOS now remote loads your emails and webpages through a “trusted partner” egress proxy by default? Maybe that ‘trusted partner’ got tired of trying to break your VPN and wanted the lazyboy option for collating metadata to your identity.

But I digress – to the point: “the writing is on the wall” should be Apple’s marketing slogan for the way CSAM was juxtaposed to this new OCR function in iOS photos. Governments couldn’t care less about children or they wouldn’t let their schools decay, their air and water get poisoned, and their food intake loaded with microplastics and forever chemicals. CSAM is just the surveillance state forcing for profit companies to keyword search remotely stored user metadata and if you think it’s to protect kids I have a bridge to sell you.
 
It is a little weird they rely so heavily on just their own works, authors are also main part of the works cited lol. No conflict of interest at all there.

When some of the leading research figures in a specific field collaborate in a paper it's pretty natural that a lot of previous research will be their own. Some of the authors are quite well known and prolific in the field.

It's also based on a lot of suppositions.

Which is pretty obvious since the matter discussed is the hypothetical implementation of client-side scanning on electronic devices for law-enforcement purposes and its risks. When treating something hypothetical it's pretty inevitable to have to make assumptions and a perfectly fine practice, as long as the assumptions are well reasoned and supported by compelling arguments.
 
I still like Apple and will probably buy the new laptops too. But Apple has completely ruined its reputation at this point. From now on, every time I hear the word Privacy from them, I will not trust them and will just laugh. I couldn't believe it when they said Privacy in the recent iPhone 13 announcement. Which, BTW, I still got my iPhone 13 Pro, but their reputation is still ruined, IMO. I use Windows, so it's not like I am brand new to having things not be private: Windows 10 does a lot of spying, but all they will know about me is that I like to play games and use Visual Studio.
 
Good. Wait until governments pressure for that feature to be mandatory, just like they enforce speed limits. Except way worse. If you still fail to understand the extent of how dangerous it is, there is no hope.

Enjoy reading things at face value, because that's what governments always want us to do, no matter what.

The government can do that right now. If you think the NSA can’t (or won’t) break into Apple’s servers to look at your data, you’re living in a dream world. Spies don’t need to ask Apple for help or permission.
 
It's quite amusing how many people Apple has managed to unite against them with all this. That probably says something in itself.
 
The government can do that right now. If you think the NSA can’t (or won’t) break into Apple’s servers to look at your data, you’re living in a dream world. Spies don’t need to ask Apple for help or permission.
Heh. I know the NSA now has full access to everyone's electronic devices that are connected to the internet. That's not the point. The point is that governments are openly against their own citizens and abuse power much more than before. Imagine a police officer's new job being banging on random residents' doors, arresting or killing people with no warrant, just because.
 
Scanning what is already uploaded to Apple servers is one thing because cloud account data does not belong to the account holder. Such data is about the account holder, but it ultimately belongs to the company.

However, if Apple plans to have access to offline content stored physically on one's Apple device, then it is a whole new level of privacy invasion.

This is why something like Pixel phones running fully de-Googled versions of Android, such as GrapheneOS, is preferred by security and privacy researchers.
 
... CSAM will only catch low hanging (stupid) fruit and do absolutely nothing to stem the tide of organized crime and human trafficking...

That is not how it works at all. Countless major busts involve catching "low-hanging (stupid) fruit" and turning it into an informant. Think of CSAM detection as a "foot in the door": a lawful search/scan for illegal child abuse photos can show absolutely no illegal child abuse photos, but can show photos of some other suspected criminal activity. That is enough to begin an investigation.
 
Considering that Apple already has access to everything on iCloud, do they want to save processing costs and waste users' CPU cycles by doing the scanning on-device instead? Or do they want to pretend that they can't access data on iCloud and appear more trustworthy this way?
 
I think it's likely that Apple may just shelve the idea (for now). The possibility of a state actor using the system to look for political content is why Apple may not implement it, unless the technology improves.
You don't need a "hack." A country just needs to set up a specific law to utilize the system, and Apple will have to comply.

Apple already allows pre-installed third-party local apps on Russian iPhones. Apple talks high and mighty when they're virtue signalling, but in the end, it's all about money.

That's why it's dangerous to have such a mass scanning system in place, no matter how noble the intentions are.
 