More like the EFF pressures clickbait-victims for donations.
> More like the EFF pressures clickbait-victims for donations.

It does make a catchy headline.
I understand the objections here: the system as planned relies on us trusting Apple. Personally I believe that Apple have, at least more than any competitors, earned that trust: rather than claim that they're throwing it away, I think they're spending it wisely.
If we can assume that Apple will act in good faith, then this seems like a well-balanced proposal. If we can't assume that Apple will act in good faith, then this proposal isn't the problem.
The last 5 minutes of this really are a great take and a great point, made by reasonable people who are not "extreme" in any way. Everyone should watch it.
Yes, the last five minutes … The battle isn't with Android; it's with apps and services that allow folk to switch away from Apple without losing their social media connections, their pictures, their documents, their music …
It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse.
> Maybe the defenders of privacy need to talk with the families, relatives and even the children themselves who have been abused and explain to them why your privacy is more important than their protection.

Those who would trade freedom for protection deserve neither.
The problem is that Apple may act in good faith, but when they are compelled by a court order, they will no longer be acting in good faith, but under orders.
Do you (not you specifically, you in general) honestly think Apple will pull out of China if China wants them to scan for something? That's ignoring, of course, that China might already be doing it, since Apple already caved and is housing Chinese iCloud servers inside China.
If the fact that Apple already caved to China isn't evidence enough, just wait until Russia does it. Or the US. Or Australia or Canada or the EU.
Apple won't withstand the pressure and it will have to cave.
What exactly do you think those countries could ask Apple to scan for?
Well, if it's the law, then they'll have no choice. Then there's the problem of state agencies getting around their own privacy laws by having agencies from other states spy on their citizens for them.
ProtonMail provides Swiss authorities with user data (proprivacy.com): ProtonMail was forced to provide account-creation-date data to Swiss authorities after a user sent multiple death threats to Anthony Fauci.
> Tanks in Tiananmen Square for one thing.
> Hong Kong protests.
> The Taiwanese flag.
> Rubber duck sculptures. (Used to relate to tanks)
> Candle icons. (Used to mourn dead protesters)
> Those are some of the things that are censored.

Ok, Tiananmen Square. That's one, maybe two pictures. Not enough to do anything.
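For what it's worth, "not enough pictures" isn't really the obstacle: a hash database can hold millions of entries, and nothing on the device reveals what any entry targets. Here is a toy sketch of that opacity, using plain SHA-256 set membership (not Apple's NeuralHash or blinded-database protocol; all names and byte strings are hypothetical):

```python
import hashlib

# The provider ships the device an opaque list of digests. Nothing about
# a digest reveals whether it targets abuse imagery or a protest photo.
banned_digests = {
    hashlib.sha256(b"known-abuse-image-bytes").hexdigest(),
    hashlib.sha256(b"tank-man-photo-bytes").hexdigest(),  # indistinguishable entry
}

def matches_banned_list(photo_bytes: bytes) -> bool:
    """Device-side check: a membership test against opaque digests."""
    return hashlib.sha256(photo_bytes).hexdigest() in banned_digests

# The device owner can only ever learn THAT a photo matched,
# never WHY that entry is in the list.
print(matches_banned_list(b"tank-man-photo-bytes"))  # True
print(matches_banned_list(b"holiday-photo-bytes"))   # False
```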
> I understand the objections here: the system as planned relies on us trusting Apple. Personally I believe that Apple have, at least more than any competitors, earned that trust: rather than claim that they're throwing it away, I think they're spending it wisely.

It's NOT about trusting Apple!
> It's interesting to note that Apple solves problems. That's at its very core.
> How do they know that children are being exposed to explicit images and that they've got lots of CSAM images on their servers?
> They've already looked, and for them to propose this action I suspect the issue is HUGE. Otherwise they'd do nothing!

Says you. You have no idea if this is true. What, are you just assuming now?
> I understand a lot of forum members have concerns about privacy issues, but rather than bleating about privacy, how about suggesting better alternatives to keep our children safe!

There you go, starting out with "I understand", but turning it right around into an insult against your own fellow forum members with the "bleating" comment, and then expecting people to trust your judgment and expecting people to even want to have a conversation with you.
> Personally I think Apple are going to do this regardless. They may change how they represent it and how much they tell the user, but they clearly want this to happen.
> The only reason I think they've put it on pause is iPhone 13 sales. Once iPhone 13 is out and iOS 15 is on enough devices, they'll just push it out. It's not like anyone at that point, once on iOS 15, is going to say no to software updates from then on out.

I don't care why Apple decided to do the right thing. I just care that they do the right thing. If they were worried about having a bad launch or not making revenue targets, then so be it.
> So you don't want Apple to use CSAM scanning? But is it OK with you that you can never delete emails, posts, tweets and more from Apple, Google, Twitter and others?

Okay, you confused me with your statements here. Can you plainly state what you mean? Are you for or against Apple's CSAM scanning?
> It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse. People need to get it into their heads that if they are online in any way there is no such thing as complete privacy. Someone will be able to get into whatever you think you are protecting. Your location, everything... it's all tracked. You're fighting for an illusion.

It's mind-blowing how some people have no trouble with false accusations. It's not about "doing EVERYTHING possible to stop abuse". It's about NOT accusing innocent people of such a crime. And this CSAM-scanning scheme leaves way too many openings for that to happen.
> They are only scanning on device for photos being uploaded to their server anyway.

For now. But anyway, that alone is not acceptable to me.
> If they keep 100% server-side scanning then we have to accept we can't have true end-to-end encryption, thus meaning all photos are viewable, not just abuse photos.

First off, Apple has never announced end-to-end encryption of everything. Second, E2EE is not real E2EE if there's a back door sitting in front of it. Lastly, that's fine by me; I never asked for E2EE, and I think the server is where any scanning should take place.
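To make the quoted trade-off concrete: once photos are encrypted on the device before upload, the server has nothing left to scan. A toy sketch, using an XOR one-time pad as a stand-in cipher (purely illustrative, not real cryptography):

```python
import secrets

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # Toy one-time pad standing in for real E2EE -- NOT production crypto.
    return bytes(k ^ d for k, d in zip(key, data))

photo = b"family beach photo pixels"
key = secrets.token_bytes(len(photo))  # the key never leaves the client

ciphertext = xor_cipher(key, photo)    # this is all the server ever receives
print(ciphertext)                      # random-looking bytes; nothing to match against

# The fork in the road: either the server can read photos (no true E2EE,
# so server-side scanning is possible), or the scan has to happen on the
# device before encryption -- which is exactly the disputed design.
print(xor_cipher(key, ciphertext) == photo)  # True: only the key holder recovers it
```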
> If you have no abuse photos, this feature does not affect you one little bit. If you do have abuse photos, you're sick and should be caught.

Yes, it does affect me: it's on my device, scanning my photos if I upload them to iCloud. Big Brother.
> “The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”
> I understand the other hypothetical risks listed, but why or how would Apple's original CSAM plan “have disastrous consequences for many children”? 🤔

What, over 90 organizations (not just 90 people) come out against this, and you say you don't get why? Why not do a little research into why?
But here, I'll help you see how this could have disastrous consequences for many children:
1. False accusation against a minor who receives or takes a photo that "looks enough like" a flagged photo to produce a hash match (see the toy sketch below). Now we get into a court case where the government is going after a minor (or the minor's parents) over a photo?
2. A parent gets falsely accused of taking or receiving a photo that matches a hash. Now what? We go to court, or, based on what some people are saying here, maybe we just skip the court phase, put that parent directly in jail, and send all his or her kids off to state care.
There are probably a thousand other ways where this hurts children, but that's just the first two that I could think of.
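On the false-positive mechanics: perceptual hashes, unlike cryptographic ones, are built to match "similar-looking" images, and that is what makes near-miss collisions possible in the first place. A toy average-hash over an 8-pixel grayscale strip (nothing like NeuralHash's neural-network embedding, but the same matching idea):

```python
def average_hash(pixels: list[int]) -> list[int]:
    """Toy perceptual hash: one bit per pixel, set if brighter than the mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a: list[int], b: list[int]) -> int:
    """Number of differing bits; a small distance is treated as a match."""
    return sum(x != y for x, y in zip(a, b))

flagged  = [200, 190, 40, 35, 210, 50, 45, 220]  # image in the database
innocent = [180, 170, 60, 20, 205, 30, 55, 240]  # a different, unrelated image

# Different pixel data, identical perceptual hash -> the "match" fires anyway.
h1, h2 = average_hash(flagged), average_hash(innocent)
print(h1)                            # [1, 1, 0, 0, 1, 0, 0, 1]
print(h2)                            # [1, 1, 0, 0, 1, 0, 0, 1]
print("distance:", hamming(h1, h2))  # distance: 0
```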
> Please explain to me this line: 'Just because someone is against this doesn't mean we're against finding and getting rid of this filth.' If a person is against having CSAM detection on their device then they are against finding and getting rid of this filth. There is no middle ground here.
> You cannot say 'we are against it but we are not going to allow you to check our device'. It does not work like that, no matter how you try to word it or spin it.

Let's end all wars by eradicating mankind! What, you're against eradicating all mankind? So you like wars! There is no middle ground there.
> That's not what is happening.
> To use your analogy: Apple's proposal is to scan as the person is "actually leaving the house with said image".
> No scans take place if that photo is destined to remain on device and is not being uploaded to iCloud Photos.
> How hard is this to understand?
> Apple is not documenting the CSAM match unless the photo is leaving the house (phone) to go outside (upload to iCloud). Only then do the scanning and documenting occur.

So it's okay if they search you every time you leave the house for illicit photos? It's not okay to me!
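The claim in the quoted post, which matches Apple's published description, is that matching is gated on the iCloud Photos upload path. A minimal sketch of that control flow (hypothetical names; the real pipeline involves safety vouchers and a server-side match threshold, none of which is modeled here):

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes
    queued_for_icloud: bool  # iCloud Photos is on and this photo will sync

def scan_if_uploading(photo: Photo, banned_hashes: set[str]) -> bool:
    """Hash matching runs only for photos leaving the device for iCloud."""
    if not photo.queued_for_icloud:
        return False  # stays "in the house": never checked at all
    return hashlib.sha256(photo.data).hexdigest() in banned_hashes

local_only = Photo(b"stays on device", queued_for_icloud=False)
uploading  = Photo(b"syncs to iCloud", queued_for_icloud=True)
db: set[str] = set()

print(scan_if_uploading(local_only, db))  # False: the check is skipped entirely
print(scan_if_uploading(uploading, db))   # False too (empty db), but it WAS checked
```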
> You seem to have a misguided notion that private companies are obligated to protect your constitutional rights. They are not. The government is obligated to do that. If you don't like what Apple is doing, you are free to use another platform.

You're right, private companies are not obligated to protect my constitutional rights.
> In all honesty, the on-device CSAM scanning makes no financial sense for Apple. Why on earth are they even considering doing it? In the long run Apple will lose some sales and market share because of this, but more importantly Apple will have to answer to those who want to expand the scanning from CSAM to other types of material. The only logical move for Apple is to scrap any on-device scanning plans and go back to building "more secure" mobile devices. People have paid a premium for privacy and that's what they are expecting. Literally having a device which spies on its owner and reports them to a third party, which might or might not report the user to law enforcement, isn't a good selling point.

This is the only remaining mystery, since we by now understand the technology and have seen feedback from interest groups and academia.
There is one positive thing for me in this debacle: Apple's transgression is so blatant, so offensive, that it has demolished the "think-of-the-children" excuse. Proponents are flailing in desperate disbelief as "think-of-the-children" no longer silences the room and lets them have their way.
> I don't know how Apple are seen as the bad guy for trying to improve reporting and protection here.
> The EFF don't seem to be proposing any alternative solution.

The solution is NOT to implement it.
I don't believe iCloud files are end-to-end encrypted.
> You are gonna die someday anyway; I will just give you a glass of poisoned wine to help you with the process now.

I have no idea what point you are trying to make.
Ah, the slippery slope of self-preservation.
Tanks in Tiananmen Square for one thing.
Hong Kong protests.
The Taiwanese flag.
Rubber duck sculptures. (Used to relate to tanks)
Candle icons. (Used to mourn dead protesters)
Those are some of the things that are censored.
> Maybe the defenders of privacy need to talk with the families, relatives and even the children themselves who have been abused and explain to them why your privacy is more important than their protection.

No, we don't. You need to explain better how people won't be put in jail under false accusations. It really wasn't THAT long ago that people were burned at the stake just for being ACCUSED of being a witch.