The EFF has lost all credibility by taking a completely uninformed stance on this issue. Unbelievable!
> Inform the public about signs of child abuse? Encourage people to report it?

This is the role of governments. How is that going?
> I think many people are truly unaware of the staggering prevalence of child abuse in society. If people knew how common and widely distributed the material is, they might throw some support behind this.

How prevalent is it, and how do we know that information?
> The EFF don’t seem to be proposing any alternative solution.

They do, though. The proposed solution is the total abandonment of this privacy-destroying project.
> This is the role of governments. How is that going?

Varies from country to country.
> The only logical move for Apple is to scrap any on-device scanning plans and go back to building “more secure” mobile devices. People have paid a premium for privacy, and that’s what they are expecting.

Apple should do the opposite of what they’re trying to do now: tell anyone who wants them to do this that they have to MAKE IT INTO A LAW for them to do it, because they can’t simply do this to their customers.
> The thing is, opponents of the system are seeing it two ways: one being a privacy issue, and two being an open-to-abuse issue. Personally, I do not have a problem with the first, the privacy issue, if it means children are protected. For those who question the privacy aspect, I would have to ask why: do they not want children to be protected? Is a person’s privacy more important than the protection of children? That concept is appalling to me. A child’s protection comes before my privacy.
>
> As for the second issue, open to abuse: yes, it’s a very valid argument, because checking systems do get abused.

But Apple can also scan the iCloud archive; there is no need to install spyware on your phone that could be abused.
> For the millionth time: it’s not Apple’s job to do this!
>
> Want to protect our children? Either donate funds to the FBI team that’s dealing with this issue, or talk to Congress behind closed doors to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!
>
> I’m a paying customer. I don’t like having a finger pointed at me: “Hey! Let me check you. You COULD be a criminal.” This is not the way to treat your loyal customers.

Why isn’t it Apple’s job? If a criminal is using an Apple device to take indecent pictures of children and then upload them to Apple servers so other criminals can access the images, why can’t Apple put in controls to prevent this?
> In one of the articles that MR has reported on the matter (I am not able to find it at present), I remember reading that Apple said images are encrypted, and as a result it would take a lot of computing power and programming to proactively scan images on iCloud servers. It is much easier, simpler, and quicker to scan for image hash values on a user's device, where there are only a few image files to scan rather than millions. Having to scan the servers on a daily basis would slow them down.
>
> As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.

Apple has been scanning images in the cloud for several years already.
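The on-device approach the quoted comment describes boils down to computing a hash for each local photo and checking it against a downloaded table of known-image hashes, rather than decrypting and rescanning millions of files server-side. A minimal sketch of that idea, using an ordinary SHA-256 file hash and an invented blocklist (Apple's actual system uses a perceptual NeuralHash and a blinded matching protocol, which this does not reproduce):

```python
import hashlib

# Hypothetical blocklist of known-image SHA-256 digests (hex strings).
# A real deployment would ship an opaque table to the device; the digest
# below is just the well-known hash of b"hello\n", used as a stand-in.
KNOWN_HASHES = {
    "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03",
}

def matches_blocklist(data: bytes) -> bool:
    """Hash the file contents once, then do an O(1) set lookup."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES
```

The per-device cost is one hash per photo plus a set lookup, which is the efficiency argument the quoted comment is making about scanning on the device instead of across encrypted server archives.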
The numbers look weak for a petition. Just as I suspected — this seems to only be an issue for a loud minority, conspiracy theorists, and people who don’t quite understand the tech.
In comparison, the California recall petition received over 2M signatures.
> I don’t know how Apple are seen as the bad guy for trying to improve reporting and protection here.

I’ll tell you how. People buy Apple products for their quality, security, and PRIVACY! Photo albums are very personal and private; they only get shared with whomever people choose to share them. You can imagine people not being happy having an AI scan their photo album, which holds some of the most personal, private data on the phone. What happened to the police and other authorities doing their job without having to look at every single one of billions of devices?
Literally in the article you responded to:
”for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
Originally this was supposed to just compare the hashes of photos to the hashes of existing known child porn as they went into or out of encryption. I don’t really have a problem with that. Active AI scanning of actual images seems like a great idea, but its encryption back-door requirements and near-instant potential for misuse by governments worldwide are shocking.

If you don’t understand this, just ask yourself whether you think somewhere inside Apple there will be a server full of child porn waiting to be compared to images on your phone… Or will those images and services be provided by governments, who would then need direct, unencrypted, instant access to your device? Scary.
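One technical detail underneath this exchange: an ordinary cryptographic hash only matches bit-identical files, so re-encoding or slightly editing an image yields a completely different digest. That is why systems like this rely on perceptual hashes rather than plain file hashes, and perceptual hashes are in turn what the adversarial-misuse criticisms target. A standard-library sketch of the avalanche effect that makes exact hashes unsuitable for image matching (the inputs are placeholder byte strings, not real photos):

```python
import hashlib

# Two inputs differing by a single character.
a = hashlib.sha256(b"photo bytes v1").hexdigest()
b = hashlib.sha256(b"photo bytes v2").hexdigest()

# The digests are completely different, so a re-encoded or slightly
# edited image would slip past a naive exact-match check.
print(a == b)  # False
```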
> Once they can compare file hashes for this, they can do it for anything and compare against all kinds of files. That will allow ultimate tracking across all your activities.
>
> Not to mention the possibility of a malicious app casually writing a bunch of these types of photos to a person’s library; instant sync to iCloud, boom, life ruined.

Which is why the hash database was due to be auditable, so we could check that it is only for CSAM. Any other hashes would be noticed immediately.
This system is idiotic. The fact that Apple even thought it was a good idea is idiotic and people who think it’s a good idea need to use their brains a little harder to see beyond this ridiculous “save the children” narrative. It’s not Apple’s job.
> The numbers look weak for a petition. Just as I suspected — this seems to only be an issue for a loud minority, conspiracy theorists, and people who don’t quite understand the tech.
>
> In comparison, the California recall petition received over 2M signatures.

I have run a software company for more than 20 years, and I perfectly understand the tech. Countless experts have already given explanations and criticism of this backdoor; go outside the echo chamber and do your research, I will not give it to you for free. Repeating Apple’s PR mantra of “you’re holding it wrong, understand the tech” makes you look stupid and uneducated. If you want to present a technical argument, please do. But at this point even Apple understands that the “tech” is easy to fool with adversarial networks; that is the reason for delaying it. And obviously the iPhone 13 is coming out soon, so they need a PR move.
> For the millionth time: it’s not Apple’s job to do this!
>
> Want to protect our children? Either donate funds to the FBI team that’s dealing with this issue, or talk to Congress behind closed doors to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!
>
> I’m a paying customer. I don’t like having a finger pointed at me: “Hey! Let me check you. You COULD be a criminal.” This is not the way to treat your loyal customers.

Yes, it is Apple’s job. Apple isn’t legally allowed to store child abuse images. They are responsible for ensuring this.
> Obviously you can have your opinion on whether this feature is a good idea or not, but your description would suggest you don’t understand how it works. Apple doesn’t need to have any images to compare to for this system to work as they describe it. No third party needs access to your phone or personal data. For people to have a sensible debate on these features, it would be best if people tried to understand what is actually being proposed first.
>
> I believe that Dropbox and Google Drive have used this type of system for years (including for other things, like copyright issues). Maybe Apple is being held to a higher standard because they use privacy in marketing a lot. It does seem odd to me that this is getting so much attention when clearly other privacy-related issues are not.

Looks like you know nothing about what people are discussing here. They are discussing scanning your local photo album and eventually uploading your private photos to Apple for human review.