It is not a technicality, it is scanning, and the answer is yes. My device, my data; Apple's servers, their data. Since the goal is to keep CSAM off of Apple's servers, it's best to scan there.

That's just one goal.

Other (side) goals might be:
  • Reduce the number of people using Apple products to store and spread CP
  • Show lawmakers and others that Apple is doing something to combat this
  • Reduce the economic incentives for selling and buying CP by making it more difficult to perform such transactions (if you buy CP you have to store it somewhere)
  • Being able to provide end-to-end encryption for more data in iCloud in the future
As more and more people have a phone and maybe a tablet as their only computer, it becomes more likely that such people would need to store those images on their phones. This might scare some of them away from buying Apple devices, since they would have no place to store the material, or at least push them to a PC or Android.
 
Reduce the number of people using Apple products to store and spread CP
Not a normal corporate goal! (Not profit conscious! :)
Show lawmakers and others that Apple is doing something to combat this
As others have shown, Apple's way isn't the only way.

Being able to provide end-to-end encryption for more data in iCloud in the future
Nothing has been announced, and I seriously doubt it will happen. (Not that it's one of my issues anyway)
 
Sure, and then it'll be discovered in audit and be a huge scandal for Apple. Do you think Apple wants that? I bet this doesn't happen. Besides, it's US only for the time being, so let's just wait and see.
The image fingerprint database is US and CSAM only for the time being. However, other territories would soon have entirely different sets of fingerprints sourced by the relevant authorities.
 
Instead, they give us this tool that’s WIDE OPEN for abuse by intelligence services.

So how would it be abused? Give specific examples of what kind of images they would force Apple to hash and how that would help them achieve their goals.

I believe the opposite. The CSAM Detection system is an inefficient way to get what intelligence services are looking for; forcing Apple to use other technologies already on the iPhone would be much more effective.
 
So how would it be abused? Give specific examples of what kind of images they would force Apple to hash and how that would help them achieve their goals.

I believe the opposite. The CSAM Detection system is an inefficient way to get what intelligence services are looking for; forcing Apple to use other technologies already on the iPhone would be much more effective.

How about text messages containing certain words? How about any kind of pornography? How about homosexual content (a photo of two males kissing, for example)? How about photos of women in Islamic countries not covered up? Any type of scanning is bad, and you don’t know what countries like China and Russia, or theocracies like Saudi Arabia or Iran, might request.
 
Don't forget, Apple follows local laws, which means if you go to the Far East, China, or Taiwan, Apple will report scan results to those governments the second you cross the border. If you go to the Middle East, then Iran (via Lebanon and Syria) and Saudi Arabia will get the results. If you go to California, they get the results.
The local governments will provide the hashes to search on, and they are not going to be child porn.
 
Apple is creating a powerful AI scanning tool to look for images that are illegal to possess under US law, and is supposedly doing this to conform to US law about not having those images on cloud storage owned by Apple. They drive this tool with a database of image fingerprints provided to them by what is, effectively, an agency of the US government. And when illegal images are found and validated, Apple will file the legally required report to that US government agency, preserving all evidence of the crime as required by US law.

This tool and database are installed on every Apple device worldwide that can upload to any cloud storage, including devices not in the US. I would expect, first, that every other country would be appalled at the expectation that their people follow US law, and second, that each of them would expect the tool to be used with a database they provide, and a validation and reporting process they define, to enforce their own law on their own people.

Apple has assured everyone that they will only use this tool on people under US jurisdiction who are uploading to US cloud storage. Apple has stated they won't follow the demands of other countries to modify the tool to conform to their laws, and will only enforce US law with the tool. Still, the tool, with the built-in knowledge to look for violations of US law, will be on every device in every country, even if it is disabled outside the US.

I wonder if Apple has really thought this through. It would be a lot simpler to just do the scan on those cloud servers that require it, rather than install a module on all of their devices worldwide to do the scan when an upload is pushed.
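To make the "database of image fingerprints" idea concrete, here is a minimal sketch of the matching step. It is an illustration under stated assumptions, not Apple's implementation: the real system uses a perceptual hash (NeuralHash) against a blinded database via private set intersection, while this sketch substitutes SHA-256 just to stay self-contained, and the names fingerprint(of:) and knownFingerprints are invented for the example.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only. The real system uses a perceptual hash and a
// blinded database; SHA-256 stands in here so the example runs on its own.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// knownFingerprints stands in for the fingerprint database supplied by the
// outside agency; in this toy model, flagging is plain set membership.
func shouldFlag(_ imageData: Data, against knownFingerprints: Set<String>) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```

Note that an exact-hash sketch like this misses the point of a perceptual hash, which is designed to survive resizing and re-encoding; it is only meant to show where the agency-supplied database enters the pipeline.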
 
Let's not be naive here. Regulations revolve around the simplest model of human behavior - People WILL be greedy, abusive, thieving snake bastards whenever they believe they can get away with it. Apologies to snakes. Goes double for corporations with profitability and share-price pressures. Industrial pollution. Tax Loopholes. Dark Money in Campaign Funding. Labor Management. Right to Repair. Medical Billing. And pretty much everyfreakingthing else in the world.

Apple is no longer a visionary, humanitarian technologist, if they ever were; Apple is now Dennis the stocktaker. For example, there's staggering liability exposure if section 230 is weakened. There's staggering revenue to claim in countries without civil liberty enshrined in a working constitution. Divorcing business practice/assets from end users' debatable conduct enriches/protects Apple, not kids, from regulatory and judicial proceedings.

Now, consider that ANY stocktaker with a flying saucer full of lawyers would know what a crap-storm the CSAM announcement would trigger. Apple obviously felt compelled to either announce and take the heat - or announce specifically so society would notice and DEMAND that the system be dismantled. Either way, the question becomes: which other IT companies HAVE GONE AHEAD AND DONE IT quietly?

Perhaps Apple is merely leaking compliance with some black-on-black program that's already in the can. It's not like Microgoogazon were going to argue with the FBI, CIA, DHS or DOD, with trillions in future cloud computing contracts at stake.
 
So in your opinion a cryptographically air-gapped device is less private than servers that are potentially accessible (like, physically accessible) to thousands of employees and government officials.

Interesting theory.
Yes. There's no such thing as a “cryptographically air-gapped device” in computer science. If the tracking is done online, you can disable your network and know you're safe. If it's done on-device, there's nothing you can do except never go online.
 
Exactly. I hope this initiative sets a precedent for corporations to be more and more accountable for these technologies... however, if governments and institutions truly cared about this issue, they would invest heavily in addressing the root causes of child abuse instead of policing and punishing after the abuse has already happened. It doesn't make any sense whatsoever. This is not about protecting children, it is about control. I am glad the public and organizations are speaking up and I hope the momentum is not lost.
Ignoring the slippery slope fallacy, the flaw in this argument is that some governments and institutions don't care. Look at the child labor that occurs in China, or the child brides all over this planet, as cases in point. The scandals of the Catholic Church and the Boy Scouts are other examples.

Even when governments and institutions do care, what are the "root causes" of child abuse? Never mind that there is a huge difference between child abuse and sexual exploitation of children.
 
This is no logical fallacy. This is truth to logic.

EX.
-If we let all criminals in jail out -> crime incidents will increase
-If we lower interest rate -> more people will take loans
-If we make a discount -> our sales will rise
These are all excluded middle fallacies used to support an argument from adverse consequences.

Heck, two of them are total non sequiturs.

-If the reason interest rates are lower is that 1/5 of people are out of work, how does it follow that more people will take loans? It sure as bloody well didn't work that way during the Long Depression or the Great Depression.

-If a discount is offered and nobody wants or knows about the product, how does it follow that sales will rise?
 
Sure, and then it'll be discovered in audit and be a huge scandal for Apple. Do you think Apple wants that? I bet this doesn't happen. Besides, it's US only for the time being, so let's just wait and see.
If they couldn't predict that they'd get court orders, national security letters, and so on, they need their heads examined. After all, they've had plenty of warnings and prior examples.

Server-side searches should utterly scare any privacy-minded individual just as much, if not more.
And Apple’s scan results are collected ONLY AFTER you surrender your pics to iCloud anyway.
Server-side approaches make it easier to detect whether they're scanning material you didn't intend to upload. The tokens are much smaller than the original files.
Why would they scan for CSAM twice?
Because not everyone will update their software, iCloud Photos has a web interface, and because the first rule of security is not to trust the client - on macOS one could use a patched version of Photos that attaches the token for random noise, or cat photos, or whatever. (That won't work on iOS without jailbreaking, but you can use unsigned and self-signed Mac apps.)
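As an aside on the "don't trust the client" point, here is a hedged sketch of what server-side re-verification could look like. It is not Apple's protocol: the Upload type, the clientToken field, and the use of SHA-256 in place of a perceptual hash are all assumptions invented for the example. The point is only that the server re-derives the fingerprint from the uploaded bytes rather than believing whatever token a patched client attached.

```swift
import Foundation
import CryptoKit

// Hypothetical types for the sketch; not Apple's data structures.
struct Upload {
    let imageData: Data
    let clientToken: String   // fingerprint claimed by the (untrusted) client
}

// The server ignores clientToken and recomputes the fingerprint itself, so a
// patched client that attaches a token for random noise or a cat photo gains
// nothing.
func serverSideCheck(_ upload: Upload, knownFingerprints: Set<String>) -> Bool {
    let recomputed = SHA256.hash(data: upload.imageData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownFingerprints.contains(recomputed)
}
```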
This raises the question: why do we still live under a government we view as repressive?
Because it is sufficiently repressive that we can't meaningfully vote against said repression, and can't emigrate to anywhere with a meaningful democracy.
Why did we put trust in a private entity (who is always beholden to the governments of the markets they operate in) to protect us from our own country?
Because their efforts to protect our privacy were their principal selling point.

So just because something isn't required by law, companies that are proactive about not wanting CSAM on their servers are automatically suspect in their motives? What kind of twisted logic are you employing to arrive at such a conclusion? It's not required by law - so what? It's not prohibited by law.
Aside from the question of whether they're wasting shareholders' money, they're abandoning a long-held pro-privacy position in favour of doing an end-run around encryption and making promises they know they can't keep.
Apple had to implement CSAM detection. It’s a US Federal initiative.
No, they didn't:
(f) Protection of Privacy.—Nothing in this section shall be construed to require a provider to—
(1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in paragraph (1); or
(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).
 
Ignoring the slippery slope fallacy, the flaw in this argument is that some governments and institutions don't care. Look at the child labor that occurs in China, or the child brides all over this planet, as cases in point. The scandals of the Catholic Church and the Boy Scouts are other examples.

Even when governments and institutions do care, what are the "root causes" of child abuse? Never mind that there is a huge difference between child abuse and sexual exploitation of children.
I think that's his point: they don't really care. If they did, there would at the very least be effective enforcement against the Catholic Church, the Boy Scouts, and so on, plus legislation to punch through the many corporate veils the Catholic Church hides behind to ensure that no organisation is ever liable for anything if it has any assets.

If they were a bit more enthusiastic, they'd push for a WTO rule allowing (or even requiring) countries to require importers to prove that they don't use children, slaves, etc.
 
Not a normal corporate goal! (Not profit conscious! :)

As others have shown, Apple's way isn't the only way.


Nothing has been announced, and I seriously doubt it will happen. (Not that it's one of my issues anyway)

Having your products and services not be used by criminals might help you avoid a bad reputation with customers, which might be good for business. It also helps with politicians like Lindsey Graham, who "threatened" Apple with new laws if they didn't do more.

Yes, there are alternative methods, but it doesn't change the fact that they can use their method to show they are doing something.
 
The German government / Bundestag just asked Apple to drop this feature!
I am sure more will follow from other countries.

Apple should just put the mess in the cloud; then it would not scan private data on user devices (which is prohibited in many countries).

Currently it's just spyware / a backdoor completely controlled by a company – this is just illegal (and dangerous if it gets hacked - think Pegasus).
Well done, Germany!
 
Seriously? So just because something isn't required by law, companies that are proactive about not wanting CSAM on their servers are automatically suspect in their motives? What kind of twisted logic are you employing to arrive at such a conclusion? It's not required by law - so what? It's not prohibited by law.
Seriously?

This was in response to many stating that Apple MUST do this as it was required by law. Simply pointing out that was not the case.

Seriously.
 
Nobody ever actually challenges me on the fact that the actual verification of the scan results happens on-server, and only after you’ve already surrendered the pics to Apple’s server anyway. That makes it, for all intents and purposes, a server-side scan.

They seem to fall back on just not wanting any CPU cycles of their local device devoted to the preliminary air-gapped scanning. Unreasonable, weird, petty, unfair, dramatic, etc. - or, in one word: specious.
Well, ignoring the fact that coiffed-mane Craig clearly stated it is "on device", the issue I have, or one of many issues, is that Apple has absolutely NO idea what is in the database(s) it receives from law enforcement (yes, initially only in the USA). So, matching hashes of something, whatever the something is, for now CSAM, is matching hashes.

However, in the future, who knows what will be in the database. Remember, NCMEC has been deemed a de facto law enforcement agency. And I could repeat myself: FISA, NSLs, China, ad nauseam.
 
Seriously?

This was in response to many stating that Apple MUST do this as it was required by law. Simply pointing out that was not the case.

Seriously.

No, it was a response to me (you quote my post, not someone else's), and I never said such a thing. And on top of that, you most certainly were not "simply" pointing that out. You then implied there must be some mysterious (likely nefarious) motive ("So, the question is why?" - cue the dramatic music).
 
Aside from the question of whether they're wasting shareholders' money, they're abandoning a long-held pro-privacy position in favour of doing an end-run around encryption and making promises they know they can't keep.

That is incorrect. As has been explained countless times, on-device scanning is MORE private, because Apple has no access to the scanning data. No scanning happens at all unless you use iCloud Photos, and if you do, Apple never sees any scanning data unless you upload a collection of known CSAM to iCloud.
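As a loose illustration of that last clause, here is a deliberately simplified sketch of the threshold idea. It is an assumption, not Apple's design: the real system uses private set intersection and threshold secret sharing, so the server cannot see which individual vouchers matched, whereas this toy version simply gates review on a count. The Voucher type and the default threshold value are invented for the example.

```swift
import Foundation

// Toy model only: in the real design the server cannot read individual match
// results; they are hidden by private set intersection and threshold secret
// sharing.
struct Voucher {
    let matchesKnownFingerprint: Bool   // visible here only because this is a toy model
    let encryptedPayload: Data
}

// Nothing is surfaced for human review until the number of matching vouchers
// for an account reaches the threshold.
func payloadsForReview(_ vouchers: [Voucher], threshold: Int = 30) -> [Data] {
    let matches = vouchers.filter { $0.matchesKnownFingerprint }
    guard matches.count >= threshold else { return [] }
    return matches.map { $0.encryptedPayload }
}
```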
 
These are all excluded middle fallacies used to support an argument from adverse consequences.

@MacBH928 rattled off a couple of quick examples with fairly loose wording that superficially invoked "excluded middle" - but, heck, this is an internet forum, not a prestigious academic journal where authors might be expected to provide a reference for every last assertion made or implied.

In your "debunking" you have:

(a) Taken "->" to mean "it is inevitable that" rather than "it is likely that". Almost any discussion about social/economic issues involves likelihood rather than deterministic cause and effect.

(b) Assumed that the "missing link" is unlikely or contentious, and needs spelling out - it really isn't an extraordinary claim that lower interest rates might encourage more people to take out loans, or that released criminals are more likely to re-offend than the average person.

(c) Chosen a couple of special-case counter examples that are unlikely or implausible. The Great Depression was an extreme event in which many well established economic principles broke down. It's highly unlikely that a seller would offer a discount without advertising it. We're not discussing a mathematical theorem where a single counter-example disproves the theorem. Even in science, many "laws" only apply within certain limits.
 
Does Corellium hate Apple, or do they just slip up more often with this? Then again, it's fueled by media claims. Craig already explained his "stuff up". This is what happens when you announce something at an event but confuse everyone.

If Apple had made it clear from the start, it might have been very different, but no doubt 'anything-privacy' would find its way to the courts anyway.
 
No, it was a response to me (you quote my post, not someone else's), and I never said such a thing. And on top of that, you most certainly were not "simply" pointing that out. You then implied there must be some mysterious (likely nefarious) motive ("So, the question is why?" - cue the dramatic music).
At this point, I can't remember. However, I must assume there was something you wrote that made me think there was a legal question involved...Or I mistakenly clicked on your message.

No, not nefarious, just questioning why. In my opinion, Apple has needlessly burned privacy capital, Apple has "worked hard" to create a system that could be used for state security, and Apple has fumbled the launch and the PR. Why all of this when it was NOT required by law? Some have pointed out a greater and better present, such as expanded E2E. However, no one is in possession of a crystal ball.
 