I don't want to sound like I'm being defensive about CSAM, because I think those people deserve the death penalty. However, you're wrong about the fines: Apple will only get fined if the company knows of CP content and doesn't report it. How will Apple know of it if those users never use iCloud Photos? Therefore, no fines.

By embedding the hash scan within iOS, it can be assumed that Apple treats every user as a CSAM suspect. The next thing Apple might scan for is what you posted: terrorism, and possibly drugs. If hashes can catch CP, then they can certainly catch terrorist material. Every Apple user in America will then be a suspect of terrorism.
Well yes, the next chapter title in any narrative that begins with "For the Children" is always "Because Terrorism!" I strongly agree with you: every customer of Apple just became a suspect of terrorism and potential thought-crimes, because a monolithic tech company is colluding with government to enable warrantless searches, which the government itself could never do because we used to have a system of checks and balances (within the US, anyway).

But Apple's walled garden is keeping the world safe from threats like Pegasus, which couldn't possibly work on SpyPhone. Oh, no, wait... never mind. Strange that they haven't mentioned it in any security updates; I guess they're too busy r00ting their own OS to get around to patching exploits anymore.

Happy Apocalypse!

[Attached image: "Apple iOS Lack of Security - snitchOS.jpg"]
 
I say again....

Many people have stated that Apple is required to do this, as a service provider.

Someone also linked to the actual law: 18 U.S.C. § 2258A.

Here's an interesting part of it, § 2258A(f):

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-

(1) monitor any user, subscriber, or customer of that provider;

(2) monitor the content of any communication of any person described in paragraph (1); or

(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

Now... read that again carefully. NOTHING in this section shall be construed to *require* a provider to...
... MONITOR ANY USER, SUBSCRIBER, OR CUSTOMER
... MONITOR THE CONTENT OF ANY COMMUNICATION...
... AFFIRMATIVELY SEARCH, SCREEN OR SCAN FOR FACTS OR CIRCUMSTANCES.


That being said, this is a CHOICE by Apple... and NOT A REQUIREMENT. In fact, the law specifically says that they are NOT REQUIRED to scan, monitor or search for CSAM. Just to report it if it is discovered.
They never said they were actively looking.

They are putting the means to identify it in THEIR software system as is their right (to be argued of course, but nothing to do with this law). They are not monitoring as they have no idea what is on your phone at any time.

YOU upload illegal images to iCloud.

They "discover" it and report it to NCEMC as the law states they have to. Thanks for clarifying what everyone has been saying.
 
They never said they were actively looking.

They are putting the means to identify it in THEIR software system as is their right (to be argued of course, but nothing to do with this law). They are not monitoring as they have no idea what is on your phone at any time.

YOU upload illegal images to iCloud.

They "discover" it and report it to NCEMC as the law states they have to. Thanks for clarifying what everyone has been saying.
They never said they were actively looking?
WTF do you think "HASHING YOUR PHOTOS AND CHECKING THEM AGAINST A DATABASE" means?
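In concrete terms, "checking against a database" is just a set-membership lookup on a fingerprint of the image. Here's a minimal sketch of that idea only (a plain SHA-256 hash stands in for Apple's perceptual NeuralHash, and the real blinded-database protocol is far more involved than a local Set):

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in: a plain SHA-256 hash and a Set of known hashes.
// (Apple's system uses a perceptual NeuralHash and a blinded database, which
// behaves very differently, but the basic "fingerprint it and look it up"
// idea is the same.)
let knownHashes: Set<String> = [
    // SHA-256 of the string "hello world", used here as a dummy "known image".
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9"
]

func isKnownImage(_ photoData: Data) -> Bool {
    let hex = SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
    return knownHashes.contains(hex)
}

print(isKnownImage(Data("hello world".utf8)))          // true
print(isKnownImage(Data("vacation photo bytes".utf8))) // false
```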

I've said it before, I'll say it again. You, sir, are a troll.
 
And that's kind of my point (see my previous post). This is a DECISION by Apple. Not a requirement. In fact, the law that they are using to justify this move specifically states... (and I quote)

(f) Protection of Privacy.-Nothing in this section shall be construed to require a provider to-
(1) monitor any user, subscriber, or customer of that provider;
(2) monitor the content of any communication of any person described in paragraph (1); or
(3) affirmatively search, screen, or scan for facts or circumstances described in sections (a) and (b).

... This law specifically and very clearly states that it does NOT require a provider to do what Apple is doing.
Apple seems to have ignored this fact....

(Source: https://uscode.house.gov/view.xhtml...lim-title18-section2258A&num=0&edition=prelim)
I agree this is something Apple decided to do and was not a requirement.
I understand there are concerns that in the future other decisions could be made that I would not like.
I still agree with this decision.
I'll protest with the rest of you when and if someone tries to censor political opinions and other free speech with similar tech.
That I agree with catching CSAM in this way does not imply any consent to extend this feature to other content, nor does it enable it.

You wrote: Agreeing to surveillance of a billion users "Just in case" = bad idea.

I agree that this kind of technology can be abused. But I don't think it can be stopped, which means it is going to happen for all sorts of things; and since that's inevitable, this should be one of those things. We already agree to mass surveillance in certain situations for particular purposes: we don't get bent out of shape when a patrol car cruises down our street, we have planes in the sky measuring our speed on the highway, etc. That doesn't mean we agree to mass surveillance for all purposes. At least with this tech it can be controlled: whereas people will recognize things they aren't supposed to be looking for if they see them, the machine can be programmed not to. That makes this kind of automation better.

The difference is one system is employed for the end-user's benefit.
Well here we just need to re-frame the end-user as all iPhone users. Perhaps it isn't to the criminal's benefit, obviously. However, identifying and reporting CSAM images is to my benefit, as I don't want that to be a part of the society I am living in. So I will allow it on my device and encourage it to happen in other places for the greater good. I might not like it when the cops catch me breaking the rules (that isn't necessarily for my benefit), but I still want them out there trying to catch criminals in general (I benefit from there being less crime).
 
They never said they were actively looking?
WTF do you think "HASHING YOUR PHOTOS AND CHECKING THEM AGAINST A DATABASE" means?

I've said it before, I'll say it again. You, sir, are a troll.
Are you intentionally misrepresenting what it means to be actively looking?

They only look when you try to send the image to them. This is a user-initiated operation involving transferring CSAM to Apple servers. It is distinctly separate from the scanning that happens to every picture you take for other general classification purposes - the "active" scanning is the stuff you are already okay with.
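To make that distinction concrete, here's a rough sketch (invented names and structure, not Apple's actual code) of the gating being described: nothing is evaluated for matching unless a photo is actually queued for iCloud Photos upload.

```swift
import Foundation

// Hypothetical model of the claimed gating: the matcher (whatever it is)
// only runs against photos the user has queued for iCloud Photos upload;
// photos that stay local are never evaluated. Names are illustrative only.
struct Photo {
    let id: UUID
    let data: Data
    let queuedForICloudUpload: Bool
}

func matchedPhotoIDs(in photos: [Photo], matcher: (Data) -> Bool) -> [UUID] {
    photos
        .filter { $0.queuedForICloudUpload }  // the upload path is the trigger
        .filter { matcher($0.data) }          // perceptual-hash match, stubbed out
        .map { $0.id }
}
```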
 
Are you intentionally misrepresenting what it means to be actively looking?

They only look when you try to send the image to them. This is a user-initiated operation involving transferring CSAM to Apple servers. It is distinctly separate from the scanning that happens to every picture you take for other general classification purposes - the "active" scanning is the stuff you are already okay with.

Exactly… he calls me a troll, but consistently conflates separate actions and flat-out lies about or misrepresents what is actually happening with this process.

You may not like the way Apple is marking the pics, but the fact that even they do not know if it ever happens unless you turn on iCloud Photos and upload it to their servers means you are in total control of your actions.
 
Well here we just need to re-frame the end-user as all iPhone users.
This rests on two logical fallacies. The first is a straw man: attempting to re-frame the argument. Spyware on my mobile devices does not benefit me; photo categorization software does. The second is begging the question. To wit: what you find to be for the greater good is assumed to be agreed to be for the greater good.

Nice try, but no cookie for you.

Now, to the argument you made: I agree eliminating CSAM is for the greater good. What I will argue is that I have no right to force my moral imperatives on others. So, no, I will not suggest others compromise their privacy in support of my beliefs.

You, OTOH, seem just fine with bending others to your beliefs by whatever means necessary. People like you scare me.

Are you intentionally misrepresenting what it means to be actively looking?

They only look when you try to send the image to them.
Wow.

The cops aren't "actively looking" when they install traffic light cameras because they only trigger if you blow the light? (I don't agree with those, either.) The TSA isn't "actively looking" because they'll only search you if you try to use air travel? Somebody wouldn't be "actively looking" if they aimed a surveillance camera at your house because it only sees you if you leave your house?
 
Exactly. They ACTIVELY CHECK what is being uploaded to them. It cannot be turned off. It's happening on my device. That is an active, intrusive search and a "scan for facts or circumstances," which the law clearly states they are NOT required to do.
You are just flat out wrong...nothing that Apple or anyone else can see happens on your device. YOU must upload illegal images to THEIR servers.

You have no RIGHT to use iCloud Photos. It is an offered feature. Simply turn it off and nothing happens (on your phone or in iCloud).

Upload illegal pics to Apple's servers, all bets are off. It's their server, they can scan for ANYTHING in all of your pics/documents....whether they are required to do so or not. If you are upset that they "choose" to help stop child pornography, don't use Apple products. But, if they choose to search your photos for illegal content and find it in iCloud, they are indeed required to report it.
 
You are just flat out wrong...nothing that Apple or anyone else can see happens on your device. YOU must upload illegal images to THEIR servers.

You have no RIGHT to use iCloud Photos. It is an offered feature. Simply turn it off and nothing happens (on your phone or in iCloud).

Upload illegal pics to Apple's servers, all bets are off. It's their server, they can scan for ANYTHING in all of your pics/documents....whether they are required to do so or not. If you are upset that they "choose" to help stop child pornography, don't use Apple products. But, if they choose to search your photos for illegal content and find it in iCloud, they are indeed required to report it.
And I'm done.

Troll.
 
The tech is the same, the opportunity is the same… and if you are really worried that the hashed images, or the data associated with those hashes that are hard-coded into the system by Apple, could be compromised in any way, you've been watching too many spy movies.
In unrelated news... didn't the government just say that Apple has to allow third-party apps, side-loading, and third-party app stores?

I'm sure there's nothing that a rogue coder could do to those "hard coded" hashes....
 
In unrelated news... didn't the government just say that Apple has to allow third-party apps, side-loading, and third-party app stores?

I'm sure there's nothing that a rogue coder could do to those "hard coded" hashes....
I don't think it actually matters. Apple did a lot of innovative and brilliant things in the past, but it's possible that their entire "walled garden" narrative is just a narrative... It seems to have had zero effect on malware like Pegasus, which is quite probably still embedded within Apple's tangled-up mass of duct tape & superglue with a shiny exterior. I mean, objectively... all their security doesn't seem to have helped much.

So pretend you're an engineer at Apple. Off the top of my head, your list of moving targets is:

- Transition from Intel to Apple Surveillance.
- All the exciting new features in Monterey because it'll have a changeable color mouse pointer!!!
- Pegasus ... it seems to 0wn iOS, kinda a problem they haven't addressed publicly, since they've been busy with their own backdoor program blowing up in their face lately.
- Installing all the new DBs that replace CSAM that other governments want to access.
- Complying with list of 101 countries who will get in line with special requests.

I think there are 1001 additional entries to that list, and Apple has an awful lot of work to do. Over in that mythical place called "the real world," Catalina was a dumpster fire of an OS release, and they weren't even dealing with all of the above back then. Big Sur has been pretty much okay, so that's good, since it gives me someplace to idle out for a few years and figure out where I'm jumping ship to, or whether I'm content with just jailbreaking my own property and kicking Apple's crap out of the process table.
 
I'm out, no iCloud for me. The reason is that if Apple is wrong and you get wrongly identified, and the prosecutor repeats the line about "1 in a trillion" false positives, you are headed to the big house, because, like DNA matching, the judge and/or jury is gonna eat that up: "he has to be guilty, since the odds are 1 in a trillion!"

Can anybody point me to a really simple and cheap cloud backup solution? :)
 
I'm out, no iCloud for me. The reason is that if Apple is wrong and you get wrongly identified, and the prosecutor repeats the line about "1 in a trillion" false positives, you are headed to the big house, because, like DNA matching ...
It doesn't work that way. The hash matching is only used to determine the possibility there's CSAM. Eventually somebody reviews the actual image(s) that triggered the CSAM-scanning system.
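A rough sketch of that pipeline, using the 30-photo figure cited elsewhere in this thread (the names and structure are illustrative assumptions, not Apple's implementation): a match only adds to a per-account count, and only past the threshold does anything get handed to a human reviewer, who looks at the actual images before any report is made.

```swift
import Foundation

// Illustrative only: a hash match doesn't accuse anyone by itself. Matches
// accumulate per account, and only past a threshold (30 here, per the thread)
// does anything get escalated to human review.
let reviewThreshold = 30

enum Outcome {
    case nothing
    case sendToHumanReview(photoIDs: [UUID])
}

func evaluateAccount(matchedPhotoIDs: [UUID]) -> Outcome {
    matchedPhotoIDs.count >= reviewThreshold
        ? .sendToHumanReview(photoIDs: matchedPhotoIDs)
        : .nothing
}
```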
 
the "possibility" is plenty enough for me :(
I don't disagree with you, in principle. I'm one of those who will be abandoning the Apple ecosystem if they go through with this (and maybe even if they don't, because now my trust in them is compromised), but I believe it important to argue the merits of their solution on facts. And the fact is: Nobody will be arrested, tried, and convicted based purely on a hash match. Well... not yet, anyway.
 
I don't disagree with you, in principle. I'm one of those who will be abandoning the Apple ecosystem if they go through with this (and maybe even if they don't, because now my trust in them is compromised), but I believe it important to argue the merits of their solution on facts. And the fact is: Nobody will be arrested, tried, and convicted based purely on a hash match. Well... not yet, anyway.
Right, I was arguing for the system in another thread. We have no idea what kind of political pressure Apple is under, and there is also the possibility that they may be rolling out E2EE, in which case this system is the bone they throw to the authorities.

The design looks and sounds OK, and I understand how it is supposed to work, but like a lot of people this has touched a nerve for me. Being accused, tried and convicted of being a "possible" pedophile is just too much.

Apple is now Google for me. Microsoft, Apple, Google... all the same.

The prime directive now is to assume that ALL tech companies are hostile to their users' best interests. Trust no one and proceed accordingly.
 
Putting aside the on-device vs cloud scanning argument for a moment...

How many people have been wrongfully accused of possessing CSAM on Google, Facebook, OneDrive, and others? There are a billion photos uploaded to those services every day.

Surely we would have heard stories like "I was arrested for child porn... but it was simply a picture of my breakfast!"

:p
 
Putting aside the on-device vs cloud scanning argument for a moment...

How many people have been wrongfully accused of possessing CSAM on Google, Facebook, OneDrive, and others? There are a billion photos uploaded to those services every day.

Surely we would have heard stories like "I was arrested for child porn... but it was simply a picture of my breakfast!"

:p
Nope! You would not, because there are plenty of people who would not believe the wrongly accused. If someone gets accused of child porn or rape, something sticks to their reputation even if the court decides they were innocent.
 
iPhone - $699
iPhone without privacy - $399
iPhone without privacy, without iCloud - $99

Why buy an iPhone? Goodbye Apple, welcome Android, save money instantly!
 
New iPhone, new MacBook, new iPad with more spyware, and paid iCloud to get spied on.

Definition: Spyware is the term given to a category of software which aims to steal personal or organisational information. ... Once spyware gets successfully installed, it starts sending data from that computer in the background to some other place.

My phone, my rules!!! I won't pay more for less privacy.

spyware (spī′wâr″)

  • n. Software that secretly gathers information about a person or organization.
  • n. Any malicious software that is designed to take partial or full control of a computer's operation without the knowledge of its user.
  • n. Programs that surreptitiously monitor and report the actions of a computer user.

 
Nope! You would not, because there are plenty of people who would not believe the wrongly accused. If someone gets accused of child porn or rape, something sticks to their reputation even if the court decides they were innocent.

But it wouldn't get to the courts.

Once the Apple employee verifies that it's *not* a known NCMEC CSAM photo... and instead it's a picture of a mermaid at Disneyland... it's done.

Besides... there have to be 30 matching photos before this manual review even gets triggered. Do you think the hashes will create false positives 30 times in a row for a single user?

And let's be clear... this sort of scanning is ALREADY happening across billions of photos on Facebook, Google Drive, OneDrive, etc. This didn't start with Apple and their recent announcement.

Is the court system clogged with phony accusations?

No.

Look... I know a guy who is currently serving a 7-year sentence for possessing and distributing child porn.

He did have this objectionable material on his Android phone and in his Google Drive and GMail. And he was investigated for 6 months by a child-safety task-force before local law enforcement was even called. The court case took two years. He was first arrested in April 2019 and he went to jail in May 2021.

The system works at catching the *real* bad guys.

It's not gonna send you to jail accidentally.
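For a back-of-the-envelope sense of how unlikely 30 accidental matches are, here's a sketch with made-up numbers (Apple has not published a per-image false-positive rate): even assuming a generous 1-in-a-million per-image rate over a 100,000-photo library, the odds of hitting a 30-match review threshold by pure accident come out around 10^-62.

```swift
import Foundation

// Back-of-the-envelope only; both numbers below are assumptions, not Apple's.
let p = 1e-6          // assumed per-image false-positive rate
let n = 100_000.0     // assumed photo-library size
let lambda = n * p    // expected number of false matches (0.1 here)

// log(30!) computed directly so we don't depend on a gamma function.
let logFactorial30 = (1...30).reduce(0.0) { $0 + log(Double($1)) }

// P(X >= 30) for Poisson(lambda) is dominated by its first term when lambda
// is small: e^(-lambda) * lambda^30 / 30!
let logP = 30.0 * log(lambda) - logFactorial30 - lambda
print("expected false matches:", lambda)
print("log10 P(>= 30 false matches) ≈", logP / log(10.0))   // ≈ -62
```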
 
But it wouldn't get to the courts.

Once the Apple employee verifies that it's *not* a known NCMEC CSAM photo... and instead it's a picture of a mermaid at Disneyland... it's done.

Besides... there have to be 30 matching photos before this manual review even gets triggered. Do you think the hashes will create false positives 30 times in a row for a single user?

And let's be clear... this sort of scanning is ALREADY happening across billions of photos on Facebook, Google Drive, OneDrive, etc. This didn't start with Apple and their recent announcement.

Is the court system clogged with phony accusations?

No.

Look... I know a guy who is currently serving a 7-year sentence for possessing and distributing child porn.

He did have this objectionable material on his Android phone and in his Google Drive and GMail. And he was investigated for 6 months by a child-safety task-force before local law enforcement was even called. The court case took two years. He was first arrested in April 2019 and he went to jail in May 2021.

The system works at catching the *real* bad guys.

It's not gonna send you to jail accidentally.
You guys are missing the point.

It's not about scanning for CSAM.

It's about scanning AT ALL.

The opportunity for abuse is HUGE. All it takes is changing the hash database, and now you can scan for political messages, terrorism, BLM, Antifa... you name it.
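To put that concern in code terms (a sketch with invented names, not anyone's real implementation): the matching machinery has no idea what a hash list represents, so swapping the database changes what gets flagged without touching a line of code.

```swift
import Foundation

// The worry in a nutshell: the matcher is completely content-agnostic. It
// cannot tell whether a hash list describes CSAM, protest flyers, or memes;
// hand the same machinery a different list and it flags different content.
typealias PerceptualHash = String

func flaggedPhotos(_ photoHashes: [PerceptualHash],
                   against database: Set<PerceptualHash>) -> [PerceptualHash] {
    photoHashes.filter { database.contains($0) }
}

let csamDatabase: Set<PerceptualHash> = []          // hypothetical NCMEC-derived list
let dissidentContentDatabase: Set<PerceptualHash> = [] // hypothetical list from elsewhere

// Same call, different database, different targets:
// flaggedPhotos(hashes, against: csamDatabase)
// flaggedPhotos(hashes, against: dissidentContentDatabase)
```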
 