EFF Pressures Apple to Completely Abandon Controversial Child Safety Features

What, so criminals can be tipped off and hide/destroy the evidence?
I'm not a criminal. Not to mention that we're supposed to be innocent until proven guilty.

Possible and likely are two different matters. Apple has stated the chances of an account being falsely flagged are less than 1 in 1 trillion per year. Even if you reduce that 1000x to 1 in 1 billion, it's still incredibly unlikely.
I've come to loathe statistics like that, as they're basically meaningless. (Like most statistics, they can say whatever you want them to say.)
 
I know this is a personal choice and I can respect folks' different opinions. However, Apple has been incredibly clear on how the process works. It's in your control. It compares hashes, not images. It gets human review to make sure that it is matching hashes of known CSAM and that the CSAM is actually child pornography and not a "vote for X" image.
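
For what it's worth, here is a stripped-down sketch of what "compares hashes, not images" means in code. Every name and number below is made up for illustration; Apple's actual design (NeuralHash, blinded hash databases, safety vouchers, threshold secret sharing) is far more involved and is not modeled here.

```python
import hashlib

MATCH_THRESHOLD = 30  # hypothetical: matches required before anything is surfaced

def image_hash(image_bytes: bytes) -> str:
    # Stand-in only: a real system would use a perceptual hash so that resized
    # or re-encoded copies of the same picture still produce a matching digest.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database containing only digests of known CSAM, never the images.
known_hashes = {image_hash(b"placeholder-for-a-known-image")}

def matches_in_library(photo_library: list[bytes]) -> int:
    # Only digests are compared; the photo content itself is never inspected here.
    return sum(image_hash(photo) in known_hashes for photo in photo_library)

def needs_human_review(photo_library: list[bytes]) -> bool:
    # Below the threshold nothing happens; above it, humans verify the matches
    # before anything is reported to anyone.
    return matches_in_library(photo_library) >= MATCH_THRESHOLD
```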

It's near impossible to compromise the CSAM database without huge resources, and even if it were compromised, everything still gets human review before it goes to the agencies that prosecute those folks.

It's not susceptible to someone spamming your inbox or messaging apps with known child porn, because even if a third party did that, those photos do not get added to the photo library automatically. There is no mechanism for it; the user has to add them manually.

Currently the API is not available to 3rd parties. I understand Apple has stated they would be open to that, and if it were opened up as-is then maybe it could be susceptible to an attack, but I'm willing to put money down that by that time the system will be much more secure and additional requirements will be imposed on the 3rd-party apps. Speculation, yes, but that's exactly what everyone who is against it is doing, and they don't address what it actually does. They just make up scenarios and then don't run their scenario through the CSAM system to see that it falls apart and doesn't or can't work.

Looking for CSAM thankfully will not be for my personal benefit, as I don't have kids or know anyone who is a victim. However, as a decent human being (and I'm not saying you're not one), I'd gladly allow my photos to be scanned for CSAM to benefit those victims and prevent more.

The issue is that folks aren't taking enough time to understand the system, what protections it has, and how it won't/can't affect them unless they are doing something illegal with child porn. Apple has been clear that they will not bow to governments that want to use it maliciously. If you don't trust them on that, then why do you trust Apple not to search for 'grass' on your phone and turn you in to the government? You have no more of a way to know they are using that system as intended than you do the CSAM system.
Quit trolling people, it's over for now. Apple backed down because lots of people don't share your views.
 
I know this is a personal choice and I can respect folks' different opinions. However, Apple has been incredibly clear on how the process works. It's in your control. It compares hashes, not images. It gets human review to make sure that it is matching hashes of known CSAM and that the CSAM is actually child pornography and not a "vote for X" image.

It's near impossible to compromise the CSAM database without huge resources, and even if it were compromised, everything still gets human review before it goes to the agencies that prosecute those folks.

It's not susceptible to someone spamming your inbox or messaging apps with known child porn, because even if a third party did that, those photos do not get added to the photo library automatically. There is no mechanism for it; the user has to add them manually.

Currently the API is not available to 3rd parties. I understand Apple has stated they would be open to that, and if it were opened up as-is then maybe it could be susceptible to an attack, but I'm willing to put money down that by that time the system will be much more secure and additional requirements will be imposed on the 3rd-party apps. Speculation, yes, but that's exactly what everyone who is against it is doing, and they don't address what it actually does. They just make up scenarios and then don't run their scenario through the CSAM system to see that it falls apart and doesn't or can't work.

Looking for CSAM thankfully will not be for my personal benefit, as I don't have kids or know anyone who is a victim. However, as a decent human being (and I'm not saying you're not one), I'd gladly allow my photos to be scanned for CSAM to benefit those victims and prevent more.

The issue is that folks aren't taking enough time to understand the system, what protections it has, and how it won't/can't affect them unless they are doing something illegal with child porn. Apple has been clear that they will not bow to governments that want to use it maliciously. If you don't trust them on that, then why do you trust Apple not to search for 'grass' on your phone and turn you in to the government? You have no more of a way to know they are using that system as intended than you do the CSAM system.
For now. Still not acceptable even if one believes Apple, and I don't anymore.

I don't mind my photos being scanned either, just not on my device. If Apple cancels this and goes to server-side scanning, I'll turn on iCloud Photos because I like the backup. And yes, I've taken way more time than I needed to understand the system, and my first gut reaction, Big Brother, remains.
 
For now. Still not acceptable even if one believes Apple, and I don't anymore.

I don't mind my photos being scanned either, just not on my device. If Apple cancels this and goes to server-side scanning, I'll turn on iCloud Photos because I like the backup. And yes, I've taken way more time than I needed to understand the system, and my first gut reaction, Big Brother, remains.
I may not turn iCloud back on now that I've turned it off. It didn't cost much, but I pay for lots of services and it adds up. The benefit seems to be mostly making things easier when I upgrade phones. We'll see how it goes, I guess.
 
What is the agenda here? Classic diversion tactics in play. Apple bots are spamming these topics like crazy.
It's simple, people: just ignore us, we are a minority. We don't matter. Or is somebody paying you?
What is the corporate task? To keep writing for the next 6 days and bury all the logical and technical evidence?
It will not work. Yes, we are a minority and our voices will vanish into the void of ignorance.
The bottom line: Apple will never again be able to associate itself with privacy. Where are the billboards now?



[Attached: Apple iPhone privacy billboard photos]
 

I'm not a criminal. Not to mention that we're supposed to be innocent until proven guilty.

I never said you were. But what you're proposing would allow exactly what I said. Innocent until proven guilty is a principle that applies to a court of law, not to Apple. All Apple is doing is reporting illegal images.

I've come to loathe statistics like that, as they're basically meaningless. (Like most statistics, they can say whatever you want them to say.)

If you have hard evidence that Apple is lying, then present that.
 
I never said you were. But what you're proposing would allow exactly what I said. Innocent until proven guilty is a principle that applies to a court of law, not to Apple. All Apple is doing is reporting illegal images.
Let's just agree to disagree here. I'd want to know, you don't -- fine.

If you have hard evidence that Apple is lying, then present that.
battery-gate, privacy, among many others. They're a corporation and will say whatever they want just to make a buck.
 
Honestly though, this could happen with iCloud, or any other cloud service, as well.

But your overall point makes sense. The slippery slope scenario, while not likely, is also not impossible, and shouldn't be completely dismissed.

Maybe, but likely not.
As I have dug into this, Google, MS, DB, and others are scanning cloud content on share, not on download or upload.
Makes sense. Can you imagine the resources needed on the Amazon servers alone if they scanned all uploads?
 
Um, don't you already know what images are on your phone?

Nice sidestep.
What I would not know about this feature is whether it is on, working, not working, can be turned off, can be turned on, etc.

I take a lot of photos.
I do not currently use iCloud Photo Backup.
Do I know what is on my phone? Generally as I take the photos.
Most get moved to a cloud and local SSD.
 
battery-gate, privacy, among many others. They're a corporation and will say whatever they want just to make a buck.

No, I'm asking you to present evidence that Apple is lying about the less than 1 in 1 trillion figure. Also, I don't see how they're making any money off CSAM detection.
 
What, so criminals can be tipped off and hide/destroy the evidence?



Possible and likely are two different matters. Apple has stated the chances of an account being falsely flagged are less than 1 in 1 trillion per year. Even if you reduce that 1000x to 1 in 1 billion, it's still incredibly unlikely.

Talking in circles.
Apple already told “criminals” how to get around this check.
Go talk to professionals (LEO) - these folks generally have libraries of this crap.

Apple's "1 in a …" is a guesstimate. That is, from my understanding, based on 30 false positives for an account, not a single false positive.

Let's just say this solution is not "risk averse" for the lawful device user. I'd love to give this to a master black belt and let them run an analysis.
 
For now. Still not acceptable even if one believes Apple, and I don't anymore.

I don't mind my photos being scanned either, just not on my device. If Apple cancels this and goes to server-side scanning, I'll turn on iCloud Photos because I like the backup. And yes, I've taken way more time than I needed to understand the system, and my first gut reaction, Big Brother, remains.

I don’t have it on either account and wouldn’t turn it on.
I’ve had my photo library trashed twice during an upgrade. Thankful for local backups.
 
What is the agenda here? Classic diversion tactics in play. Apple bots are spamming these topics like crazy.
It's simple, people: just ignore us, we are a minority. We don't matter. Or is somebody paying you?
What is the corporate task? To keep writing for the next 6 days and bury all the logical and technical evidence?
It will not work. Yes, we are a minority and our voices will vanish into the void of ignorance.
The bottom line: Apple will never again be able to associate itself with privacy. Where are the billboards now?

[Attached: Apple privacy billboard photos]

That bottom photo is apparently not.
 
No, I'm asking you to present evidence that Apple is lying about the less than 1 in 1 trillion figure.
Saying it's statistics-based is enough for that. It's based on assumptions. Anyway, that's not what I was saying is a lie.

Also, I don't see how they're making any money off CSAM detection.
Never said they did, and in fact, I think it'll be a net loss. I know they'll get a few thousand less from me this year. We haven't heard the reason as to why they're doing it this way now, and it could well be to make sure they keep making bucks, or to save bucks on the back-end.
 
Talking in circles.

Hardly.

Apple already told “criminals” how to get around this check.

Turning off iCloud for photos isn't "getting around" anything, because now they're not able to use the service at all. The whole point of what Apple is doing here is NOT to detect CSAM that someone plans to keep on their phone, but rather to detect CSAM that someone intends to upload to iCloud.

Go talk to professionals (LEO) - these folks generally have libraries of this crap.

Of course. Not sure how this is relevant to my point, though.

Apple's "1 in a …" is a guesstimate.

If you say so. And like I said, even if they were 1000 times off, it would STILL be a huge number (1 in 1 billion).

That is, from my understanding, based on 30 false positives for an account, not a single false positive.

Correct - but single false positives are irrelevant because they trigger nothing.

Let's just say this solution is not "risk averse" for the lawful device user.

Nonsense. There's practically no chance that a lawful device user's account would be flagged, and even if it were, nothing would come of it because of the manual review process.
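
To put rough numbers on why that 30-match threshold matters, here is a back-of-the-envelope Poisson estimate. The per-image false-match rate and library size below are pure assumptions (Apple hasn't published those parameters), so treat it as a sketch of the shape of the argument, not a verification of Apple's 1-in-1-trillion figure.

```python
from math import exp, factorial

# All values below are assumptions for illustration, not Apple's parameters.
per_image_false_match = 1e-6   # assumed chance one innocent photo falsely matches
photos_per_year = 10_000       # assumed photos uploaded per account per year
threshold = 30                 # matches required before any human review

lam = photos_per_year * per_image_false_match  # expected false matches per year = 0.01

# Poisson tail: probability of at least `threshold` false matches in a year.
# The sum is dominated by its first term, so a short range is plenty.
p_account_flagged = sum(
    exp(-lam) * lam ** k / factorial(k) for k in range(threshold, threshold + 20)
)
print(f"about {p_account_flagged:.1e} per account per year under these assumptions")
```

Even with a per-image false-match rate as high as one in a million, requiring 30 independent matches pushes the per-account odds to something astronomically small, which is why a single false positive triggers nothing.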
 
I don’t have it on either account and wouldn’t turn it on.
I’ve had my photo library trashed twice during an upgrade. Thankful for local backups.
Understandable. But I'm an IT guy and we like many redundant backups. I don't have many pics, but those I do have are on 2 NASes and 2 computers. :)
 
What is the agenda here? Classic diversion tactics in play. Apple bots are spamming these topics like crazy.
It's simple, people: just ignore us, we are a minority. We don't matter. Or is somebody paying you?
What is the corporate task? To keep writing for the next 6 days and bury all the logical and technical evidence?
It will not work. Yes, we are a minority and our voices will vanish into the void of ignorance.
The bottom line: Apple will never again be able to associate itself with privacy. Where are the billboards now?

[Attached: Apple privacy billboard photos]

iCloud ≠ iPhone. Such a basic misunderstanding that you and many others have. iCloud has NEVER been completely private. Apple obviously holds the encryption keys for their servers.
 
Quit trolling people, it's over for now. Apple backed down because lots of people don't share your views.
Trolling and pointing out failures in folks' logic are not the same. Your logic failed and you couldn't supply sound reasoning. You feel trolled. I think you just want to fear-monger without research. Difference of viewpoints.
 
Saying it's statistics-based is enough for that. It's based on assumptions. Anyway, that's not what I was saying is a lie.

No, it's based on numbers.

Never said they did

battery-gate, privacy, among many others. They're a corporation and will say whatever they want just to make a buck.

You were clearly drawing a parallel between these other situations (or at least your evaluation of them) and CSAM detection.

, and in fact, I think it'll be a net loss. I know they'll get a few thousand less from me this year. We haven't heard the reason as to why they're doing it this way now, and it could well be to make sure they keep making bucks, or to save bucks on the back-end.

Believe it or not, sometimes businesses do the right thing because it's the right thing, not to make a buck.
 
For now. Still not acceptable even given one believes apple, and I don't anymore.

I don't mind my photos being scanned either, just not on my device. If apple cancels this and goes to server side scanning, I'll turn on iCoud photos because I like the backup. And yes, I've taken way more time than I needed to understand the system and my first gut reaction, big brother, remains.
Now I'm curious why you'd be okay with scanning on the server side and not on your phone. Why does that seem better to you? Seriously, I genuinely want to know so I can temper my view. For me, something happening locally on my phone that I control would seem safer than something that happens server-side that I don't control. I get that Apple implementing the feature in iOS is out of my control, but its operation is mine to control. Just like the A.I. algorithm implemented in iOS is out of my control, but what I share is in my control.

I respect that you don’t trust them. Maybe Apple, and other companies, can never gain your trust.
 
No, it's based on numbers.
No, definitely not; it can't be, since it hasn't been running on users' phones yet.

You were clearly drawing a parallel between these other situations (or at least your evaluation of them) and CSAM detection.
No, only the ability to lie.

Believe it or not, sometimes businesses do the right thing because it's the right thing, not to make a buck.
ROFL!
 
So if we disagree with you, we're stupid? Not exactly the way to promote rational conversation.

You say it's not a child protection feature, but then you immediately say that child protection is a great use of this feature 🤔

You say you can't have child protection without "many horrible uses" as well. That's simply false. iOS isn't open-source - Apple would have to sanction any "horrible" use of this technology, which, while obviously possible, is something I have absolutely no fear they will do.

And again, as so many people seem to forget, this feature would only be active IF you use iCloud for photos - so if you're paranoid about it, you still have a choice (and if you're paranoid about your photos, you shouldn't be uploading them to the cloud anyway, unless you own and control the server).
If giving up your right to privacy makes you feel safer.. well.. yes, my position stands.

It's the same argument over and over. In order to protect the children, we must eliminate encryption. How can we protect children if criminals have the right to privacy.. blah blah blah. The fundamental issue here is that you can't grant privacy selectively, not to only specific people nor for specific reasons. Either everyone has privacy or no one does. But good people want to protect children, and so they buy into the idea that it's ok to give up their privacy because it makes children safer. But there is no going back from that. Once you give it up, you find out all the bad things that a lack of privacy leads to (assuming you lacked the forethought to see it coming).

Our privacy protects us from criminals. Our privacy protects our free speech. Our privacy protects our history. These things are all in jeopardy without privacy. There are other ways to protect children. We should put more energy into those.
 
Now I'm curious why you'd be okay with scanning on the server side and not on your phone.
It's their property and responsibility; they can do what they want with it. If I want to use iCloud, I have to accept that they'll scan it, and I do. My phone is my private phone, only letting out what info I want to let out.

Why does that seem better to you?
Scanning on their servers can't get to everything I have on my phone, and like I said, it's their hardware and maintenance, not mine.

For me, something happening locally on my phone that I control would seem safer than something that happens server-side that I don't control.
That's just it: I don't think I'll always be able to control it -- once there's a foot in the door. The server side is never under my control and can't be thought of that way. It's public; even if it takes a password to access it, I always assume Apple has that password.
I respect that you don’t trust them. Maybe Apple, and other companies, can never gain your trust.
I did trust them, and it was my mistake.
 
If giving up your right to privacy makes you feel safer.. well.. yes, my position stands.

It's the same argument over and over. In order to protect the children, we must eliminate encryption. How can we protect children if criminals have the right to privacy.. blah blah blah. The fundamental issue here is that you can't grant privacy selectively, not to only specific people nor for specific reasons. Either everyone has privacy or no one does. But good people want to protect children, and so they buy into the idea that it's ok to give up their privacy because it makes children safer. But there is no going back from that. Once you give it up, you find out all the bad things that a lack of privacy leads to (assuming you lacked the forethought to see it coming).

Our privacy protects us from criminals. Our privacy protects our free speech. Our privacy protects our history. These things are all in jeopardy without privacy. There are other ways to protect children. We should put more energy into those.

Again, we're talking about a scan that only happens if you upload to Apple's cloud service. iCloud has NEVER been private like local storage on your iPhone. You can wax melodramatic about your "privacy" all you want, but literally nothing has changed.
 
No, definitely not; it can't be, since it hasn't been running on users' phones yet.

It's called running models. Do you really think Apple just made up a number out of thin air? LOL!

No, only the ability to lie.

You said lie JUST TO MAKE A BUCK. I even quoted your exact words. Also, we all have the ability to lie, so does that mean everything we say should be called a lie?


Laugh all you want. Doesn't change facts.
 