My post for today, and why it's ridiculous to have our user hardware used rather than iCloud:
It's now likely the database holds in the region of 4,200,000 hashes, since organisations share data, so it's not just the USA, and it is rising at a rate of around 8,000 per day. Contrary to what some have suggested, these organisations are NOT crime-fighting agencies in law (some may consider that rather convenient), and this database is bound to grow exponentially as organisations share data.
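To put the quoted growth rate in perspective, here is a back-of-envelope projection in Python, using only the figures above (roughly 4,200,000 hashes today, roughly 8,000 added per day) and assuming purely linear growth; sharing between organisations could make the real curve much steeper.

```python
# Back-of-envelope projection of the hash database entry count,
# assuming the figures quoted above: ~4,200,000 hashes today,
# growing by ~8,000 per day. Linear growth only; organisations
# sharing lists could make the real curve steeper.
START = 4_200_000      # approximate current number of hashes
PER_DAY = 8_000        # approximate daily growth

for years in (1, 2, 5):
    total = START + PER_DAY * 365 * years
    print(f"after {years} year(s): ~{total:,} hashes")
```

On those assumptions the list more than quadruples within five years, which is the point: even "limited" per-hash data multiplies quickly.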
Even though hashes may contain limited data, this is still a backdoor. I am not knocking child-abuse organisations, often charities, which are NOT deemed to be authorised agencies and in some cases lack even security clearance; but it is a backdoor with the potential for wholesale surveillance of our private devices, and it will NOT be restricted to the USA, nor restricted to iPhones.
Now, given the extent even so far and the rate of change, even at a hash size of 160 bits this database will represent quite a chunk of storage, and will require user processing power commensurate with an ever-growing hash list. The results are still likely to suffer hash collisions; I believe the 1-in-a-trillion figure given by Apple is far too optimistic, and hashes can be compromised, along with the CSAM application that Apple uses, which according to some already has been.
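One way to sanity-check small per-image error rates is to see how they compound across a large photo library. The sketch below is a hypothetical illustration, not Apple's actual model: Apple's published "1 in a trillion" figure is a per-account-per-year claim, whereas the `p` here is an assumed per-image false-match probability, and the matches are treated as independent, which is a simplification.

```python
# How a tiny per-image false-match probability compounds over many images.
# p is a HYPOTHETICAL per-image rate chosen for illustration only;
# independence between images is assumed, which is a simplification.
p = 1e-12

for n_images in (10_000, 1_000_000, 1_000_000_000):
    # Probability of at least one false match among n independent images.
    at_least_one = 1 - (1 - p) ** n_images
    print(f"{n_images:>13,} images -> P(at least one false match) ~ {at_least_one:.1e}")
```

The point is only that "one in a trillion" per image and "one in a trillion" per user population are very different claims once billions of photos are scanned.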
So the science is not safe to start with, and certainly not via our hardware, because that is where we separate child safety from surveillance: via our hardware it is surveillance. Even if it were kept to child safety, which is very unlikely in my opinion, it would still represent a backdoor. That is why those arguing solely about child abuse, or about the minutiae of how it works, have the situation so wrong: it is still a backdoor waiting to be exploited by governments, hackers and despots, and even by companies that have grown so much they now consider themselves policemen of the world, and even decide what the laws are, leaving it open for unscrupulous use of this technology.
Learn how to use image hashing or image fingerprinting to find visually similar images or duplicate images via hashes and the Hamming distance metric.
practicaldatascience.co.uk
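For readers unfamiliar with the technique the linked article covers, here is a minimal Python sketch of perceptual "average hashing" and the Hamming distance metric. The two 8x8 pixel grids are hypothetical stand-ins for the greyscale downscaling that a real implementation (such as the `imagehash` library) performs on actual images.

```python
# Minimal sketch of perceptual "average hashing" plus the Hamming
# distance metric. Real pipelines first resize an image to an 8x8
# greyscale grid; the grids below are stand-ins for that step.

def average_hash(pixels):
    """Return a 64-bit hash: bit i is 1 if pixel i is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (0 = identical)."""
    return bin(a ^ b).count("1")

# A simple gradient "image", and the same gradient uniformly brightened.
img_a = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
img_b = [[min(255, (r * 8 + c) * 4 + 3) for c in range(8)] for r in range(8)]

d = hamming(average_hash(img_a), average_hash(img_b))
print(d)  # 0 here: uniform brightening leaves the above-mean pattern unchanged
```

A small Hamming distance means "visually similar", which is exactly why perceptual hashes both survive re-encoding and admit collisions in a way cryptographic hashes do not.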
Hardware-based scanning illustrates another problem: the never-ending need to update and expand the hash database on at least a weekly basis, with ever-increasing file size, processing time, etc.; processing time we have paid for, in some cases to do specific jobs, and where we might not even use iCloud at all, yet the software is still there.
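A rough sketch of what that on-device list costs in storage and update traffic, assuming 160-bit (20-byte) raw hashes and the figures quoted earlier; index and container overhead are ignored, so real on-device numbers would be higher.

```python
# Rough footprint of an on-device hash list, assuming 160-bit
# (20-byte) hashes, ~4.2 million entries, and ~8,000 new hashes
# per day. Raw hash bytes only; index/container overhead ignored.
HASH_BYTES = 160 // 8            # 20 bytes per hash
ENTRIES = 4_200_000
PER_DAY = 8_000

size_mb = ENTRIES * HASH_BYTES / 1e6
weekly_delta_kb = PER_DAY * 7 * HASH_BYTES / 1e3
print(f"current list: ~{size_mb:.0f} MB of raw hashes")    # ~84 MB
print(f"weekly growth: ~{weekly_delta_kb:.0f} kB of new hashes")  # ~1120 kB
```

Small per device, perhaps, but it is storage, bandwidth and battery on the customer's hardware, forever, and growing.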
Now to why this has come about in this fashion.
In the USA, where it is being implemented, one word describes why it is coming in: AVOIDANCE.
In common with many countries the USA has its own privacy laws, and in particular the 4th Amendment safeguards privacy, or was meant to.
However, the 4th Amendment is now interpreted rather differently from how it was envisaged when it was created, and the USA, and no doubt other countries, have found a way to circumvent privacy laws, so they can shout from the rooftops about safeguarding privacy whilst driving a coach and horses through it, FOR ANY PURPOSE. So it's not about child abuse.
With regard to the USA, they are doing this because Court rulings decided that the privacy enshrined in the 4th Amendment can be circumvented: it is deemed to APPLY ONLY TO GOVERNMENT ACTION AND NOT TO CONSTRAIN PRIVATE PARTIES "not acting as an agent of the Government or with the participation or knowledge of any government official" (Jacobsen, 466 U.S. at 113, quoting Walter v. United States, 447 U.S. 649, 662 (1980) (Blackmun, J., dissenting)). I must correct what I said elsewhere, that the parties checking the data have been ruled to be government agencies; as the above shows, the ruling was that they are not. That is what allows government players to utilise these hash-collection organisations: government can rule that this is not against the 4th Amendment precisely because these collection organisations are NOT government agencies. A convenient way to circumvent the privacy rules in the 4th.
Similar laws have been interpreted similarly elsewhere, with new laws brought in to allow it on much the same basis as above.
There is no doubt in my mind now that governments (along with agencies within them that have openly called for a backdoor) have sought to bend the rules, knowing that the precedent that a private party can effectively invade privacy is a green light to conduct wholesale surveillance; and what better way for companies to bring that in than under the guise of child safety.
It is sad for me that if Apple does this via my hardware and yours, and it will NOT be just iPhones, it will have gone from being part of the solution to part of the problem, sacrificing two decades of user-targeted features plus the emphasis on privacy and security. The slogan 'what happens on the iPhone stays on the iPhone', and the same for all the other Apple devices this is likely to occur on, in my opinion: R.I.P.
It still makes no sense logistically to have these functions anywhere near our hardware. The fact that it is being done that way, where it will no doubt require constant updates, etc., illustrates, to me at least, that this is definitely the start of a slippery slope of mass surveillance of our own equipment; with the rulings above, it is the green light that can be used to implement it, and probably why others have not gone the route of user hardware having a backdoor, which is what this is. It will no doubt result in surveillance of much more than a simple hash, but even hash checking is surveillance, let alone what comes after.
There is no doubt, based on the information available, that agencies and governments calling for backdoors have succeeded through the interpretation of laws, while still being able to shout about protecting privacy, which is basically stretching the truth now; and there is no doubt that some companies' association with China has meant bespoke arrangements for China which are anything but privacy-protecting.
I doubt whether Apple or any organisation will comment about 'duress', where many of the above possess great leverage, as China has demonstrated; but some democratic governments should be holding their heads in shame, espousing privacy whilst actively coercing private companies to run a coach and horses through it.
However, it still does not alter the fact that Apple has no legal obligation to do what it is suggesting, and my gripe is, and always has been, that it is unnecessary and unwise to use customers' hardware to do it, and it makes no sense logistically either. If Apple are going to do this, make it completely server-based. The fact that they are not will feed the real suspicion, probably factual, that the backdoor will be open for a lot more than the excuse being used.
Apple has long made privacy central to its marketing pitch for iPhones and iPads, but with the looming installation of a backdoor to scan…
macdailynews.com
meteor edited: 16 bits was supposed to read 160, although that is academic, as the point was that even limited data, multiplied, still represents significant data, and with hash lists growing it will only increase; hence if Apple want this, make it solely via iCloud, not our hardware.