client side *anything* is more private than server side everything. This premise is especially evident in privacy circles. If it’s done on your device, it’s inherently private.

LOL!! Thank you, I needed a good chuckle today 😂 😂 😂

Server side scanning = the system is ONLY ever capable of scanning content that you upload to them

Client side scanning = the system is CAPABLE of scanning ALL content on YOUR property, regardless of whether or not you employ their cloud services. Client side scanning sets the precedent that scanning your property is okay, regardless of what BS Apple put out in their whitepapers and peer-reviewed papers about only running the scan on content destined for iCloud. All of that becomes irrelevant the moment a Senator publicly announces that Apple have demonstrated the government's goal of 'backdooring encryption to catch terrorists' is now achievable via Apple's strategy without 'breaking' encryption, which is the excuse Apple and others have relied upon for years to counter the 'backdoor' request. The EU are already exploring a similar solution to what Apple are voluntarily rolling out.

Also "muh slippery slope" doesn't apply if there is legitimate evidence that can be cited to demonstrate what implementing client side scanning WILL lead to. Did you also think activists and academics were fallacious in their critique of the Patriot Act potentially being used to subjugate innocent Americans? Because that's what ended up happening. If only there was an entire library's worth of US history demonstrating how willing the government are to abuse their own citizens in the name of protecting their god of capital.
 
You missed the analogy that the items considered fair game to report is not within public purview. Anyone can add things to the list, or come up with entirely new lists. Like China that wants dissident materials kept off of iPhone engravings - not hard to imagine them quietly informing Apple “either also scan for these images (tank man; self-immolating man) or you can‘t sell in China”.
You missed the part where no one can add things to the list. Or come up with entirely new lists.
Apple combines the lists of multiple child protection agencies, and only if an image appears in all of the lists is it included in the DB. Apple doesn't even know what's in the lists. The DB is also part of the OS, not a list that can be updated on the fly.
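As a toy sketch of that intersection rule (the hash values here are invented for illustration; the real pipeline and hash format are Apple-internal and not public):

```python
# Hypothetical illustration: only hashes present in EVERY agency's
# list make it into the shipped on-device DB.
agency_lists = [
    {"a1f3", "b2c4", "d5e6"},  # child-protection agency 1 (made-up hashes)
    {"b2c4", "d5e6", "f7a8"},  # child-protection agency 2
    {"b2c4", "d5e6", "09bc"},  # child-protection agency 3
]

# Set intersection across all lists: a hash missing from any single
# list is excluded from the combined database.
shared_db = set.intersection(*agency_lists)
print(sorted(shared_db))  # ['b2c4', 'd5e6']
```

The point of intersecting is that no single agency (or government leaning on one) can unilaterally insert a hash; it would have to appear in every contributing list.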

And again, what if China somehow does manage to compromise the DB. What then? An iCloud account will be flagged, Apple will verify, and they will immediately notice their system has been tampered with.

Hundreds of posts speculating how the system will be misused, without having a clue or understanding what system Apple built…
 
Ran into this video this morning.
Check out the time stamp 11:40
A complete and utter "about face".

Apple has created a system that allows a database created by law enforcement (yes, NCMEC has been found in a US court of law to be a de facto law enforcement agency) to be loaded onto users' devices to search for material. For now, it is CSAM. Tomorrow, what might be in the database?
 
My post for today, and why it's ridiculous to have our user hardware used rather than iCloud:
It's now likely the database has in the region of 4,200,000 hashes, as organisations share data, so it's not just the USA, and it is rising at a rate of 8,000 per day. Contrary to what some have suggested, these organisations are NOT crime-fighting agencies in law (some may consider it rather convenient that they are not), and this database is bound to rise exponentially as organisations share data.

Even though hashes may contain limited data, it is still a backdoor. I am not knocking child abuse organisations, often charities NOT deemed to be authorised agencies, some without even security clearance, but it is a backdoor with the potential for wholesale surveillance of our private devices, and it will NOT be restricted to the USA, nor restricted to iPhones.

Now consider the extent so far and the rate of change: even if a hash size is 16 bits, this database will represent quite a chunk, and require user processing power commensurate with an ever-growing hash list. The results are still likely to have hash-collision problems; I believe the 1-in-a-trillion figure from Apple is way too optimistic, and hashes can be compromised along with the CSAM application that Apple use, which according to some already has been.

So the science is not safe to start with, and certainly not via our hardware, because that is where we separate child safety from surveillance: via our hardware it is surveillance. Even if it were kept to child safety, which is very unlikely in my opinion, it still represents a backdoor. That is why those arguing solely about child abuse, or about the minutiae of its workings, have the situation so wrong: it is still a backdoor waiting to be exploited by governments, hackers, despots, and even companies that have grown so much they not only consider themselves policemen of the world but even decide what the laws are, leaving it open for unscrupulous use of this technology.


Hardware-based scanning illustrates another problem: the never-ending need to update and grow the hash database on at least a weekly basis, with an ever-increasing file size, processing time, etc. Processing time we have paid for, in some cases to do specific jobs, on devices that might not even use iCloud at all, but where the software is still there.

Now to why this has come about in this fashion.
In the USA, where it's being implemented, one word describes why it's coming in: AVOIDANCE.

In common with many countries, the USA has its own privacy laws, and in particular the 4th Amendment safeguards privacy, or was meant to.

However, the 4th Amendment is now interpreted rather differently from how it was envisaged when it was created, and the USA, and no doubt other countries, have found a way to circumvent privacy laws, so they can shout from the rooftops about safeguarding privacy whilst driving a coach and horses through it, FOR ANY PURPOSE. So it's not about child abuse.

With regards to the USA, they are doing this because Court rulings on the 4th Amendment decided that the privacy it enshrines can be circumvented: it is deemed ONLY to APPLY TO GOVERNMENT ACTION and DOES NOT CONSTRAIN PRIVATE PARTIES "not acting as an agent of the Government or with the participation or knowledge of any government official". Jacobsen, 466 U.S. at 113 (quoting Walter v. United States, 447 U.S. 649, 662 (1980) (Blackmun, J., dissenting)). I must correct my statement elsewhere that the parties checking the data have been ruled to be government agencies; as the above shows, the ruling was that they are not, which is what allows Government players to utilise these hash collection agencies: the government can hold that this does not breach the 4th Amendment by virtue of these collection organisations NOT being government agencies. A convenient way to circumvent the privacy rules in the 4th.

Similar laws have been interpreted similarly elsewhere with new laws brought in to allow it on a similar basis to above.

There is no doubt in my mind now that Government (along with agencies within it who have openly called for a backdoor) have sought to bend the rules, knowing that the precedent that a private party can effectively invade privacy is a green light to conduct wholesale surveillance, and what better way for companies to bring that in than under the guise of child safety.

Sad for me that if Apple do this via my hardware and yours, and it will NOT be just iPhones, it will have gone from being part of the solution to part of the problem, sacrificing two decades of user-focused features plus the emphasis on privacy and security. The slogan 'what happens on the iPhone stays on the iPhone', and the same for all the other Apple devices this is likely to occur on in my opinion: R.I.P.

It still makes no sense logistically to have these functions anywhere near our hardware. The fact that it is being done that way, where it will no doubt require constant updates, etc., illustrates, to me at least, that this is definitely the start of a slippery slope of mass surveillance of our own equipment, and with the rulings above it's the green light that can be used to implement it. That is probably why others have not gone the route of user hardware having a backdoor, which is what this is, and what will no doubt result in surveillance of much more than a simple hash; even hash checking is surveillance, let alone what comes after.

There is no doubt, based on the information available, that agencies and governments calling for backdoors have been successful via the interpretation of laws, while still being able to shout about protecting privacy, which is basically stretching the truth now. There is also no doubt that some companies' association with China has meant bespoke arrangements for China which are anything but protecting privacy.

I doubt whether Apple or any organisation will comment about 'duress', where many of the above possess great leverage, as China has demonstrated. But some democratic governments should be hanging their heads in shame, espousing privacy whilst actively coercing private companies to drive a coach and horses through it.

However, it still does not alter the fact that Apple has no legal obligation to do what it's suggesting. My gripe is, and always has been, that it is unnecessary and unwise to use customers' hardware to do it, and it makes no sense logistically either. If Apple are going to do this, make it completely server-based. The fact that they are not gives rise to the real suspicion, probably factual, that the backdoor will be open for a lot more than the excuse being used.


meteor edited: 16 bits was supposed to read 160, although that is academic, as it was to indicate that even limited data multiplied still represents significant data, and with hash lists growing it will only increase. Hence, if Apple want this, make it solely via iCloud, not our hardware.
 
If ever there was a better reason to use nothing but Signal Private Messenger, which is true E2E encryption, I don't know what it is.
You really think Signal is secure when it's running on an OS that has a built-in backdoor? Don't think for a minute that your messages in Signal can't be accessed. I think if you're in a mostly free country and you're not involved in politics then it's okay. No one important cares about the private or maybe dirty conversations you might have with your significant other. But if, for example, you're in a Middle Eastern country and happen to be messaging your friends about how your government is restricting women's rights, so you want to set up a protest march, then maybe it's not a good idea to use an iPhone.
 
can't wait...

1. Apple has billions in cash
2. false positive causes me to lose my current job
3. professional reputation ruined
4. my attorney(Jackie) files $10 million lawsuit on my behalf(30 times my annual income)
5. Apple settles for [undisclosed] amount. Attorney Jackie takes his usual cut.
6. move into my new 150-foot yacht and circle the globe for the next 15 years.
7. Thank you, Apple.

Basically, you are living in fantasy land. Apple has limited immunity, which makes winning cases against providers like Apple very hard.
 
You have far more knowledge of ‘what’s happening with device code’ if you choose to look, than you will ever ever have on the code that is running on a remote server. This is a clear fact. It’s not an opinion, misguided or otherwise.
There is a thing called an escrow agreement.
This has nothing to do with CSAM. Apple finally caved to government pressure to put a backdoor on the iPhone. Now Apple has access to your photos and messages without your knowledge. We're just supposed to take their word for it that they're only using the access to compare hashes to CSAM.

They always had access to your photos and messages … you gave them permission to do so.
 
So why do they need another backdoor then? Making my private phone a Swiss cheese.

My snail mail letter box has never contained anything sinister. I still have a lock and key to keep my personal mail private.
 
Since your phone includes more information about yourself than iCloud does, any scanner has more access than if it just scanned server side. And since it's done on your device, iCloud is irrelevant now, and it's oh so easy to change what is scanned without most users even knowing about it.

You can store up to 4 TB on iCloud, while many users only have 64 GB of storage on their phones.

I have more data in iCloud than on my iPhone, iPad and Macs.

If scanning is to occur no matter what, it's better for me that it only scans on device.

The best solution for me would be end-to-end encryption on all iCloud data and scanning occurring on as little data as possible on the phone.
 
Last edited:
I wonder how those Linux phones are coming along.

They will be worse for almost everyone if they aren't even going to use Google services and apps.

To use most Google services and apps, including their app store, you need Google Play Services, which contains scanning software that regularly scans at least part of your file system.
 
They will be worse for almost everyone if they aren't even going to use Google services and apps.

To use most Google services and apps, including their app store, you need Google Play Services, which contains scanning software that regularly scans at least part of your file system.
So it's down to picking the lesser of two evils.
 
Whatever the governments "ask" them to do with it. This just makes it easier to comply, but yes, I would be surprised if they didn't fork over info given what they want to do now, it all follows.

What would be much easier is for Apple simply to turn on iCloud backup secretly on your phone. Then your iPhone would silently back up almost all of your data to iCloud, which Apple have access to.

The CSAM Detection system is extremely inefficient for surveying your whole device, and it isn't capable of looking at non-image data at all.
 
LOL!! Thank you, I needed a good chuckle today 😂 😂 😂

Server side scanning = the system is ONLY ever capable of scanning content that you upload to them

Client side scanning = the system is CAPABLE of scanning ALL content on YOUR property, regardless of whether or not you employ their cloud services. Client side scanning sets the precedent that scanning your property is okay, regardless of what BS Apple put out in their whitepapers and peer-reviewed papers about only running the scan on content destined for iCloud. All of that becomes irrelevant the moment a Senator publicly announces that Apple have demonstrated the government's goal of 'backdooring encryption to catch terrorists' is now achievable via Apple's strategy without 'breaking' encryption, which is the excuse Apple and others have relied upon for years to counter the 'backdoor' request. The EU are already exploring a similar solution to what Apple are voluntarily rolling out.

Also "muh slippery slope" doesn't apply if there is legitimate evidence that can be cited to demonstrate what implementing client side scanning WILL lead to. Did you also think activists and academics were fallacious in their critique of the Patriot Act potentially being used to subjugate innocent Americans? Because that's what ended up happening. If only there was an entire library's worth of US history demonstrating how willing the government are to abuse their own citizens in the name of protecting their god of capital.
But they're gonna scan for CSAM filth regardless. It's more private to do it on device than in the cloud, where it's wholly out of your control.
Besides, I strongly suspect that they're going to implement E2E encryption across iCloud (it's the way they're headed, especially in light of the Tor/VPN-style browsing in iOS 15).
Apple have been in a position to secretly scan your phone ever since they invented it. Introducing a public, controversial feature like this really isn't the best way to start! It's pretty obvious, but conspiracy blah blah.
 
That is one thing that makes little sense to me. Messages is heavily used in the USA; in the rest of the world, not so much.
So if this is a globally affecting change, either they plan to expand this or they have another, very different reason for putting this functionality into play.

The CSAM Detection system is US only. My opinion is they do this because of political pressure and want to avoid laws being passed which would make scanning iCloud obligatory. By doing it on device, they also don't have to break encryption.

As for the new features in Messages, I think it has to do with the huge problem of adults contacting teenagers and even children and sending them nude pictures. Also, a lot of parents, especially in the US, want very strict parental controls when it comes to pornography and nudity in general. It's basically what a lot of their customers want.
 
I fully support more parental controls, like expanded administrator rights over their kids' phones. Apple does the right thing here. But this is quite different from a company scanning private storage anytime and independently notifying authorities.
 
I think you're forgetting that photos can come to be on iCloud in more ways than from your Apple devices, and that ability to get there via other means, like web access, or Windows, or ..., means they'll probably be scanned on the server side as well, probably on some schedule. At least I assume so, and I can't see how Apple could explain away bad stuff on their servers because they only scan once, on an Apple device!

iCloud for Windows is a program and they could do the scanning there. They could also stop uploads from the web. Or they could only scan in the cloud if you upload from a non-Apple device.

But this system will allow Apple in the future to provide end-to-end encryption in iCloud for users who accept no uploads from Windows and the web.

With server-side scanning, there is no way that could be implemented.
 
I don't think it's fanciful at all. Surveillance states like the U.S. are always looking for more advanced ways to conduct mass surveillance more easily and efficiently. Apple is serving it to them on a golden platter. Of course they will use and abuse this as quickly as possible.

On the contrary, with the courts ruling that law enforcement can compel your biometric password (i.e. force your finger onto the TouchID sensor), it is absolutely a reality that your finger (or face) can be used involuntarily to access your phone.

But not the chopping off of fingers, which some people thought would increase a lot.
 
LOL!! Thank you, I needed a good chuckle today 😂 😂 😂

Server side scanning = the system is ONLY ever capable of scanning content that you upload to them

Client side scanning = the system is CAPABLE of scanning ALL content on YOUR property, regardless of whether or not you employ their cloud services.

What about those people who have more data in the cloud than on devices?
It is something which is going to be more and more common.

Would device scanning then result in less being scanned? Wouldn't it be better?

What if you want the cloud to have a copy of everything you have, even your most secret things, but the device only contains a small subset of your data and you don't have your most private stuff on it?

Wouldn't then on device scanning be better?

You seem to assume that your device contains all, and the most private, data you have.
 
It's now likely the database has in region of 4,200,000 hashes as organisations share so its not just USA, rising at the rate of 8,000 per day,[...] where this database is bound to rise exponentially with organisations sharing data.

[...] where even if a hash size is 16bits this database will represent quite a chunk, [...]

I highly doubt it will be 16-bit. More likely it will be 256-bit or larger.

Where does your information about 4.2 million images come from?
And where do you get that Apple will have so many hashes? I have heard there will be about 200,000 hashes.

Even if true, it is only about 135 MB of storage if each hash is 256-bit.
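The back-of-the-envelope arithmetic behind those figures, assuming a flat list of fixed-size hashes with no index or metadata overhead:

```python
def db_size_mb(num_hashes: int, hash_bits: int) -> float:
    """Raw storage for a flat list of fixed-size hashes, in megabytes."""
    return num_hashes * hash_bits / 8 / 1_000_000

# 4.2 million 256-bit hashes: the ~135 MB figure mentioned above
print(db_size_mb(4_200_000, 256))  # 134.4

# A 200,000-hash list at the same size is far smaller
print(db_size_mb(200_000, 256))    # 6.4
```

Either way, the raw hash list is small relative to the size of an iOS update, so storage alone is not the binding constraint; the update cadence is the more interesting logistical question.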

Also, Apple doesn't need hashes of all the images in the databases. They can select a subset containing those most widely circulated, thus creating a worst-of-the-worst list.
 
[...] but contrary to what some have suggested these organisations are NOT crime fighting agencies in law and some may consider it rather a convenient reason they are not [..] and not knocking child abuse organisations often charities NOT deemed to be authorised agencies and some without even security clearance,

NCMEC is authorised by Congress to do this job, and has been doing it for roughly two decades if not more. It's also mostly funded by the federal budget, and is now considered a government agent in some legal cases.

It's a good thing they're not law enforcement agencies and only provide evidence to law enforcement agencies. It's better for citizens that investigations are handled by the normal law enforcement agencies, whether that's the local sheriff or the FBI.
 
Now given the extent even so far and the rate of change, [..] with an ever growing has list and where the results are still likely to have hash problems where I believe the 1 in a trillion example by Apple is way too optimistic and where hashes can be compromised along with the CSAM application that Apple use, and according to some already has.

Apple assumes NeuralHash will have a false positive rate of 1 per 1 million images. That is about 33 times worse (i.e. more conservative) than the rate a real-world test has shown.

Apple has also stated they will increase the threshold value if their very conservative estimate, based on real-world test data, turns out to be wrong.

The possibility of crafting images which falsely match is handled by a second, secret hashing algorithm run on the server side against the derivative image in the safety voucher sent to Apple, which Apple can only read once the threshold has been reached.
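To get a feel for how a per-image false-positive rate combines with a match threshold, here is a sketch under the simplifying assumption that errors are independent. The threshold of 30 and library size of 100,000 photos are illustrative numbers, not Apple's actual (unpublished) parameters:

```python
from math import comb

def p_at_least(n: int, p: float, t: int, terms: int = 30) -> float:
    """Tail probability P(X >= t) for X ~ Binomial(n, p).

    When n*p << t the terms shrink extremely fast, so summing a short
    prefix of the tail (t .. t+terms) is numerically sufficient.
    """
    top = min(n, t + terms)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t, top + 1))

n = 100_000   # photos in a library (illustrative)
p = 1e-6      # assumed per-image false-positive rate (1 in 1 million)
t = 30        # illustrative match threshold before human review

tail = p_at_least(n, p, t)
print(tail)   # astronomically small -- far below 1 in a trillion
```

This is the basic reason a threshold plus a low per-image rate can yield a per-account error rate many orders of magnitude smaller than either number alone, provided the independence assumption roughly holds.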
 
NCMEC is authorised by Congress to do this job, and has been doing it for roughly two decades if not more. It's also mostly funded by the federal budget, and is now considered a government agent in some legal cases.

It's a good thing they're not law enforcement agencies and only provide evidence to law enforcement agencies. It's better for citizens that investigations are handled by the normal law enforcement agencies, whether that's the local sheriff or the FBI.
Wasn't intending to post again, but legal cases demonstrate they are not considered a law enforcement agency. I posted information earlier, and court transcripts from the US are available.

If they were then it would not be admissible.

That is what allows government agencies, etc., to bypass the 4th Amendment: if they were considered government agencies, the 4th Amendment would not allow their actions. It is because they are considered private entities conducting a PRIVATE SEARCH.

This effectively allows governments plausible deniability: even if they fund what legally may be non-governmental organisations, in reality they are governmental, which drives a coach and horses through the 4th Amendment and other privacy rules elsewhere.

I have been involved in investigating child abuse along with other security matters, both in the UK and internationally, so there is no way I'd want to safeguard abusers. But it has to be proportional, with due process, and mindful of the rights to privacy of the billions of people who abhor child abuse but whose privacy may be taken as the first step towards absolute surveillance, after which the freedom some of us hold dear will no longer exist.

"II. Analysis
The Fourth Amendment provides in relevant part that "[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated." U.S. Const. amend. IV. Fourth Amendment protections attach when a "search" occurs. A "search" occurs when the government infringes on an expectation of privacy that society is prepared to consider reasonable, see United States v. Jacobsen, 466 U.S. 109, 115 (1984), or where the government physically intrudes on a constitutionally protected area for the purpose of obtaining information, see United States v. Jones, 565 U.S. 400, 407-08 (2012). Fourth Amendment protections DO NOT APPLY TO A PRIVATE SEARCH. Jacobsen, 466 U.S. at 113. Nor do they apply if the government merely replicates a prior PRIVATE SEARCH. Id. at 115."

 