All three are false or grand assumptions. But you can't deny that it is creating backdoors, compromising devices, and leaving us wondering what's next.
But that is the one point that is so clear in the white paper that I was having difficulty with you understanding… the hash method is so precise that the odds of simply taking a pic of a baby in a bath and having it match a database image, much less more than one, are so astronomically low it shouldn't be an area of concern.

This helps. Thanks! This is all I wanted clarification on, and too bad it took so long. And the idea might be preposterous, but as a developer these details help me understand how something works.
The scenario and Apple's claims seemed contradictory: claiming that any modification would still result in a match but a similar picture wouldn't is a bit contradictory. Like a bathtub picture, for example: a legit vs. an offending pic, but with the same angle and style.
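For what it's worth, here's why that isn't actually contradictory. Below is a toy Python sketch using a simple average hash, not Apple's NeuralHash, and the file names are made up, but it shows the idea: a re-saved or resized copy of the same photo keeps (nearly) the same hash, while a different photo of a similar scene generally doesn't.

```python
# Toy perceptual hash (average hash) -- NOT Apple's NeuralHash, just an
# illustration of why "modified copy matches, similar photo doesn't".
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to 8x8 grayscale, then set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Hypothetical files: the same photo re-saved/resized vs. a different photo
# of a similar scene (e.g. another bathtub picture with the same framing).
h_orig = average_hash("original.jpg")
h_copy = average_hash("original_recompressed.jpg")
h_similar = average_hash("similar_scene.jpg")

print(hamming(h_orig, h_copy))     # typically small: the copy still "matches"
print(hamming(h_orig, h_similar))  # typically large: no match
```

The hash tracks the pixel structure of one specific image, not its subject matter, which is why a recompressed or lightly edited copy can still match while a different bathtub photo shouldn't.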
But that means Apple is “on my phone”…oooooooooo 🤣

That's why they implemented CLIENT-SIDE analysis, so no data leaves your phone and Apple doesn't access your photos.
You prefer server-side analysis where all your encrypted iCloud photos need to be decrypted first???
They're the first company to implement a privacy-oriented solution to ban illegal CSAM content from their servers, and everybody complains. Go figure.
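For anyone wondering what client-side vs. server-side actually changes, here's a deliberately simplified Python sketch. Everything in it is made up for illustration (Fernet as a stand-in cipher, SHA-256 as a stand-in for the perceptual hash, a plain dict as the "voucher"); it is not Apple's actual iCloud design. The only point is who has to see the plaintext photo.

```python
# Loose sketch of server-side vs. client-side scanning -- hypothetical, not
# Apple's real protocol.  pip install cryptography
from hashlib import sha256
from cryptography.fernet import Fernet

user_key = Fernet.generate_key()   # key held on the user's device
cipher = Fernet(user_key)
photo = b"...raw photo bytes..."   # placeholder image data

# Server-side scanning: the provider must get at the plaintext.
uploaded = cipher.encrypt(photo)
plaintext_needed = cipher.decrypt(uploaded)          # server needs the key...
server_hash = sha256(plaintext_needed).hexdigest()   # ...to hash/scan the photo

# Client-side scanning: the hash is computed on-device before upload.
device_hash = sha256(photo).hexdigest()   # stand-in for the perceptual hash
voucher = {"hash": device_hash}           # only derived data goes up alongside
uploaded = cipher.encrypt(photo)          # the still-encrypted photo
print(voucher)
```

In the first flow the server ends up handling your decrypted photos; in the second it only ever sees derived data plus ciphertext, which is the trade-off being argued about here.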
Those with more awareness and experience know that it is not "what if" it gets used for something different in the future. Those with more awareness and experience know that the only reason they're doing this in the first place is to set the stage and pave the way for something else down the line.

Think it's too late. The mob has already made up its mind. For what it's worth, I think the idea behind the implementation is pretty sound and is being done for the most respectable of reasons, and ultimately, if you don't want it, you don't use iCloud to store your photos. I understand the outrage, though, because of the scope for abuse, and the fact that today it's just looking for CSAM images, but who knows what it could be used for in the future? But with all this power, surely they have some duty of responsibility to look out for those who are incapable of defending themselves.
But that is the one point that is so clear in the white paper that I was having difficulty with you understanding… the hash method is so precise that the odds of simply taking a pic of a baby in a bath and having it match a database image, much less more than one, are so astronomically low it shouldn't be an area of concern.
The reason people on this forum aren't accepting this is either that they don't understand what one in one trillion really means, or that they simply don't believe what Apple is saying about the odds. I can't really help with either of those… much…
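To put a number on what "one in one trillion per account per year" would mean at scale: the account count below is a round-number assumption of mine, not an Apple figure, but the arithmetic is straightforward.

```python
# Apple's stated design target: < 1-in-1-trillion chance per year of
# incorrectly flagging a given account.  The account count is an assumption
# for illustration only.
p_false_flag_per_account_year = 1e-12
accounts = 1_000_000_000   # assume ~1 billion accounts using iCloud Photos

expected_false_flags_per_year = accounts * p_false_flag_per_account_year
print(expected_false_flags_per_year)      # 0.001 accounts per year
print(1 / expected_false_flags_per_year)  # ~1,000 years per expected false flag
```

Whether you believe the one-in-a-trillion figure is a separate question, but that's what the claim amounts to.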
I'd LIKE to say that I ate something exotic on the deck of a yacht while hanging out with Kanye in the Azores living my best life, instagram, selfie, besties, like whatever, etc. BUT... I had a boring ole turkey sandwich with a side of chips and an apple while watching the local news on TV. Oh well.

Look, in the end Apple had the misfortune of introducing this right after the revelations about the abuse of the NSO Pegasus spyware. No wonder people are skeptical. By the way, what are you eating for lunch? 😏
There is an answer to that, and it is quite complicated. The short version is: there is indeed a public and a private version of reality.

Under that assumption, they could do whatever they want behind closed doors already. Hell, why even publish this stuff about CSAM if they could just do it without telling people, as you say?
Well, Apple is different. They are the ones pretending to be concerned with privacy at all costs, and they have millions upon millions of followers who actually believe it, for no other reason than that they say so.

It's about bashing Apple. No one yells at Google for their Gmail scanning for child sex abuse imagery. But Apple is for some reason different.
In the US, what if Congress and the FBI force Apple to include "other" hashes?
It is indeed a well-meaning but very slippery slope.
In my opinion, they already have, and already do.

How long before the USA and/or China demand that Apple start scanning for content other than CSAM on all iPhones (not just the ones with iCloud Photo Library enabled), or else it won't be allowed to sell devices in those markets? These two countries represent two-thirds of Apple's revenue and therefore have a lot of leverage over the company.
Just because they claim to and make it part of their marketing does not mean they actually do.

So? One of the reasons we use Apple is it had a modicum of respect for privacy. Those companies don't respect our privacy.
You are hopeless.

All three are false or grand assumptions.
I went back and read the previous comment more carefully; as you can understand, the first and most urgent part to answer was the one in which I was called out for being factually inaccurate.
I’ve read the analogy (yet to read the linked blog post) and it doesn’t make much sense honestly.
Apple is already inside your sacred space with both feet; they're already sleeping on your sofa. Since the OS is not open source, you don't know what's really happening under the hood.
They will occasionally ask you to participate in initiatives that are not in your direct interest, like submitting error telemetry (you can opt out of this with no loss of functionality).
They will for sure update the terms and conditions to ask your consent about this too. This time, opting out will cost you access to their iCloud Photos service (you will still be able to back up your precious photos to iCloud Backup, from what I've gathered), but what are you gonna do; this time the stakes are pretty high: hosting CSAM on their servers or not. Of course it can't be opted out of by people willing to store stuff on their (Apple's) property. But as far as what happens within the (not so) sacred space of the local device, they will just stick an unreadable post-it on the fridge, nothing more. That post-it note will sit there doing nothing and being unreadable. It's just a technicality like a thousand others, nothing much to get triggered about or to equate to unrelated examples.
As for ”why are companies and banks asked by the State to sometimes police and report their customers”, that's how it works. You can't bring 1M€ in cash to a bank and you can't upload CP to cloud storage hosts. The fact that Apple is using a workaround that makes finding CP even more privacy-minded is frankly just a technicality. (And you can live a perfectly normal digital life by uploading those pics to another service, which is a bit different from the “you can't leave your home with the stolen items, period” example.)
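If it helps, the "unreadable post-it" point can be loosely illustrated with threshold secret sharing: each share is individually useless, and only once a threshold number of them accumulate can the secret be reconstructed. Below is a generic textbook Shamir sketch in Python, not Apple's actual safety-voucher cryptography, just the flavour of "nothing is learnable until the threshold is crossed".

```python
# Minimal Shamir threshold secret sharing -- a generic textbook construction,
# NOT Apple's voucher scheme, used only to illustrate the threshold property.
import random

PRIME = 2**61 - 1   # prime field for the arithmetic

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789                              # stand-in for a decryption key
shares = make_shares(key, threshold=30, count=100)

print(recover(shares[:30]) == key)   # True: at the threshold, the key appears
print(recover(shares[:29]) == key)   # False: one share short, nothing useful
```

Individually the shares tell the holder nothing; that's the property the post-it analogy is gesturing at.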
Apple does care about their users' privacy in democracies. It's just that they shot themselves in the foot by introducing CSAM detection in the US.

Just because they claim to and make it part of their marketing does not mean they actually do.
Google does not install spyware on your device that sifts through your files. Everybody understands that if you choose to upload data to the cloud (i.e. someone else's computer) without first encrypting it, they can access it and have the right to keep illegal material off of their servers. What Apple has implemented is fundamentally different and new. Your own device is now treating you like a suspected criminal and may report you if Apple's secret algorithm thinks it sees something "suspicious". And yes, if they expand (or are forced to expand) the use of the infrastructure they built, it can absolutely undermine end-to-end encryption.

It's about bashing Apple. No one yells at Google for their Gmail scanning for child sex abuse imagery. But Apple is for some reason different.
You want to see two pictures that do match regarding a possible hash? Here you go.
[Attachments: images 1816592 and 1816591]
Hey Tim, we got a match!
Yes, I am aware that both are NOT in that database, but they obviously match.
Pictures originate from:
If publishing the pictures themselves here is a problem, please remove this post.
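If anyone wants to try this kind of check themselves, here's one way using the third-party imagehash package. It's a simple perceptual hash, not NeuralHash, so it won't reproduce NeuralHash collisions exactly; the file names are just placeholders for the two attached pictures.

```python
# Compare two images under a simple perceptual hash (pip install imagehash pillow).
# NOT NeuralHash -- this only demonstrates the general method.
from PIL import Image
import imagehash

h1 = imagehash.phash(Image.open("attachment_1816591.png"))  # placeholder file names
h2 = imagehash.phash(Image.open("attachment_1816592.png"))

print(h1, h2)
print("Hamming distance:", h1 - h2)  # 0 (or very small) would count as a "match"
```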
You're right. But I can't half run!

You are hopeless.
Proponents of this system should think about the missing question in this FAQ: “Is the CSAM system effective?”.
The answer is “No”. It doesn’t prevent children from being abused. It doesn’t prevent the creation of new CSAM. It doesn’t even prevent CSAM from being shared.
Someone in possession of CSAM would simply disable iCloud Photos which also disables the detection system. They can still share it using websites and apps and they can even spread it through iCloud Drive by encrypting it before uploading.
In short, Apple introduces a system which is known beforehand to be completely useless for its intended purpose. So what is Apple’s real intention?
I have to say, I find it really disconcerting that this is the issue where so many people suddenly think Apple is overstepping. A million things you let go, but Apple decides to start doing what every other major provider does, and not allowing iMessage to continue to be a safe haven for child porn... and that generates outrage?
The reaction alone tells me this is probably something Apple really does need to do.
I do wonder about this. That doesn't mean we shouldn't do anything possible to slow down / hinder / catch child predators. But if it can be turned off anyway (unlike server-side scanning), how much does it truly help? Doesn't that essentially make it somewhat of an opt-out feature from the criminal's perspective?