Totally agree with the EFF that Apple should abandon this whole CSAM thing. For decades, we've used Apple devices because of their solid privacy and confidentiality measures. If this is implemented, it really would make us rethink.
I have run a software company for more than 20 years, and I understand the tech perfectly well. Countless experts have already given explanations and criticism of this backdoor; go outside the echo chamber and do your research, I will not hand it to you for free. Repeating Apple's PR mantra of "you're holding it wrong, understand the tech" makes you look stupid and uneducated. If you want to present a technical argument, please do. But at this point even Apple has understood that the "tech" is easy to fool with adversarial networks; that is the reason for delaying it. And obviously the iPhone 13 is coming out soon, so they need a PR move.
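Fine, one freebie. A toy Python illustration of why perceptual hashes are easy to fool, using a made-up average-hash that has nothing to do with Apple's actual NeuralHash:

# Toy average-hash: one bit per pixel, set where the pixel is above the
# image mean. A stand-in for illustration only, NOT Apple's NeuralHash.
def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

# Two visibly different "images" (flattened 2x2 grayscale values)...
image_a = [10, 200, 30, 220]
image_b = [90, 200, 100, 250]

# ...that hash identically, because only the above/below-mean pattern
# survives the hashing step.
assert average_hash(image_a) == average_hash(image_b)  # both (0, 1, 0, 1)

Researchers produced collisions against the real NeuralHash shortly after it was extracted from iOS 14.3; the mechanics are more involved, but it is the same class of weakness.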
“No. They are matching your hashed image to a stored hash of a known child abuse image. The scan only takes place if your image is going to be uploaded to the cloud anyway.”
Looks like you know nothing about what people are discussing here. They are discussing scanning your local photo album and eventually uploading your private photos to Apple for human review.
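To spell out the flow that was actually announced, here is a toy Python sketch with made-up names; the real design uses NeuralHash (a perceptual hash), private set intersection, and a match threshold, not plain SHA-256 and a set lookup:

import hashlib

# Hypothetical, opaque database of known-CSAM digests shipped to the device.
KNOWN_HASHES = {"hypothetical-opaque-digest"}

def device_hash(photo: bytes) -> str:
    return hashlib.sha256(photo).hexdigest()  # stand-in for NeuralHash

def prepare_icloud_upload(photo: bytes, queued_for_icloud: bool):
    if not queued_for_icloud:
        return photo, None                        # local-only photo: no check runs
    matched = device_hash(photo) in KNOWN_HASHES  # matching happens ON the phone
    return photo, matched                         # result travels with the upload

Both halves of the argument above describe real parts of the design: the check is gated on an iCloud upload, but the matching runs on your phone against your local library, and past a match threshold the flagged photos go to Apple for human review.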
“I've been a dev for 23 years and I disagree with your assessment. Running a software company isn't the same as having the tech knowledge.”
I am also a programmer, of more than 28 years. You can be a developer and still lack system design thinking and skills. That is one of the reasons why I pay salaries to devs and not the other way around. The design failure in Apple's approach is that they are introducing on-device processing with third-party, non-publicly-auditable hashes. Period. The industry has used server-side processing with PhotoDNA for years, and that is enough.
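For contrast, the server-side model the industry already runs looks more like this (again a sketch with made-up names; PhotoDNA's real API is licensed and not public, and it is a robust hash rather than SHA-256):

import hashlib

KNOWN_HASHES = {"hypothetical-opaque-digest"}

def on_upload_received(photo: bytes) -> bool:
    # Runs in the provider's data center, on content the user has already
    # handed over; nothing executes on the handset, and the hash database
    # never leaves the server.
    return hashlib.sha256(photo).hexdigest() in KNOWN_HASHES

Same match, different trust boundary: the device stays a plain client, and the hash list never has to be shipped to, or hidden inside, the phone.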
“Is it my contractor's job to come back and check on my house periodically to see if I store bodies under it?”
Yes, it is Apple's job. Apple isn't legally allowed to store child abuse images; they are responsible for ensuring this. That means it's Apple's job to scan for abuse photos before they are uploaded to the cloud.
Literally in the article you responded to:
“for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
You honestly believe private companies should start monitoring their users? You want an Apple Police, Microsoft Police, Google Police, etc., gathering info for some other private organisation to use for its own purposes? Shouldn't we leave hunting down criminals to governments and law enforcement agencies, and not to shady groups that are not governed by the same laws as law enforcement?
“Is it my contractor's job to come back and check on my house periodically to see if I store bodies under it?”
Does the contractor own your land where you're storing the bodies?
“Does the contractor own your land where you're storing the bodies?”
Does Apple own MY phone? Last time I checked, it's MY phone and in MY house. I didn't rent it.
“The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
I understand the other hypothetical risks listed, but why or how would Apple's original CSAM plan “have disastrous consequences for many children”? 🤔

That part of the objection comes from the SMS/iMessage live scanning, I believe.
“Does Apple own MY phone? Last time I checked, it's MY phone and in MY house. I didn't rent it.”
No, but you get a license to use the software.
In one of the articles MR has published on the matter (I'm not able to find it at present), I remember reading that Apple said images are encrypted, and as a result it would take a lot of computing power and programming to proactively scan images on iCloud servers. It is much easier, simpler, and quicker to scan for image hash values on a user's device, where there are only a few image files to check rather than millions. Having to scan the servers on a daily basis would slow them down.
As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.
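If that reporting is right, the trade-off being described is roughly this (a toy Python sketch with a stand-in XOR cipher and made-up names, not Apple's actual iCloud cryptography):

import hashlib

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in XOR cipher (it is its own inverse); NOT Apple's real crypto.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def device_side_upload(photo: bytes, key: bytes):
    digest = hashlib.sha256(photo).hexdigest()  # hash the plaintext on-device...
    return toy_encrypt(photo, key), digest      # ...then encrypt it for iCloud

def server_side_scan(ciphertext: bytes, key: bytes) -> str:
    # The alternative: the server must decrypt every stored photo, across
    # millions of accounts, before it can hash anything at all.
    return hashlib.sha256(toy_encrypt(ciphertext, key)).hexdigest()

The device touches each photo once, at upload time; a server-side sweep has to churn through the whole stored library over and over.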
“…and expanded CSAM guidance in Siri and Search.”
I hadn't heard that one before. That must be the rumored third part. Can't say I like it much, but if it doesn't tattle, ehhh.
“Yes, it is Apple's job. Apple isn't legally allowed to store child abuse images. … That means it's Apple's job to scan for abuse photos before they are uploaded to the cloud.”
For the millionth time: it is not Apple's job to do this!
Want to protect our children? Either donate funds to the FBI team dealing with this issue, or quietly lobby Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!
I'm a paying customer. I don't like having a finger pointed at me: "Hey! Let me check you. You COULD be a criminal." That is no way to treat a loyal customer.
This debate will get nowhere, because it is clear there are those who will vehemently defend their right to privacy over anything else, in this case the protection of children.
The thing is, don't children deserve our protection? Is it not up to us, as adults, to find ways to protect children from harm? It's a damning question, but given the way people are defending their right to privacy, they need to ask themselves: is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they answer "Yes" to that question.
“You honestly believe private companies should start monitoring their users? …”
You seem to have a misguided notion that private companies are obligated to protect your constitutional rights. They are not. The government is obligated to do that. If you don't like what Apple is doing, you are free to use another platform.

“You seem to have a misguided notion that private companies are obligated to protect your constitutional rights. …”
Oh no, I simply go by what APPLE has been telling me year after year. Silly me for believing them, eh?
“Ah, the slippery slope fallacy.”
Said this before; will say it again for those who missed the first go-around. The reality is that if they planned to "censor protected speech" or "threaten the privacy and security of people around the world", they could do that anyway, with or without this technology.
“Exactly, so it's not their job.”
They obviously thought otherwise.
“They obviously thought otherwise.”
Not the first of their mistakes.