No. They are matching your hashed image to a stored hash of a known child abuse image. The scan only takes place if your image is going to be uploaded to the cloud anyway.
And only after a threshold of successful hash matches, rumored to be around 30, would those matched photos be sent for review. None of your other photos would be manually reviewed.
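For anyone curious what threshold-gated hash matching looks like mechanically, here is a minimal Swift sketch. It is purely illustrative: the 64-bit hash type, the bit tolerance, and the exact threshold are assumptions, and Apple's actual design (NeuralHash plus a private set intersection protocol) never even tells the device whether an individual image matched.

```swift
import Foundation

// Illustrative stand-in for a perceptual hash; Apple's real NeuralHash is a
// different format, and the real protocol hides single-match results entirely.
typealias PerceptualHash = UInt64

/// Number of differing bits between two hashes (similar images -> small distance).
func hammingDistance(_ a: PerceptualHash, _ b: PerceptualHash) -> Int {
    (a ^ b).nonzeroBitCount
}

struct MatchGate {
    let knownHashes: [PerceptualHash]   // hashes of known abuse images (hypothetical)
    let tolerance = 2                   // assumed near-duplicate tolerance, in bits
    let reviewThreshold = 30            // the rumored ~30-match threshold
    private(set) var matches = 0

    /// Records one uploaded image's hash; returns true only once the account
    /// crosses the threshold, i.e. nothing at all is flagged for a single match.
    mutating func record(_ h: PerceptualHash) -> Bool {
        if knownHashes.contains(where: { hammingDistance($0, h) <= tolerance }) {
            matches += 1
        }
        return matches >= reviewThreshold
    }
}
```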

Most cloud services do cloud scans; how is this any different, as the photos are going to the cloud anyway?
This method improves encryption for non-abuse images, as Apple can throw away the key and law enforcement can’t view them at all. Currently they can…
Please stop spreading disinformation.
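To make the encryption point concrete, here is a minimal CryptoKit sketch of why a server cannot scan content it cannot decrypt, which is the whole argument for doing any check on-device before upload. Illustrative only; this is not Apple's actual iCloud pipeline.

```swift
import CryptoKit
import Foundation

// The key is generated and kept on the device; the server never receives it.
let deviceKey = SymmetricKey(size: .bits256)
let photo = Data("raw photo bytes".utf8)

// Under end-to-end encryption, this opaque blob is all the server ever stores.
let uploadedBlob = try! ChaChaPoly.seal(photo, using: deviceKey).combined

// Without deviceKey, the server cannot inspect uploadedBlob, so any content
// check has to happen on-device, before this encryption step.
```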
 
The reason is Apple wanted to still be able to employ E2EE for iCloud, which they have kept postponing.
Just a reminder, Apple never said that was the reason for this plan, nor have they announced full E2EE for iCloud.

I would actually understand their motives if they did announce E2EE was the reason, not that it would get me to buy any more Apple hardware or upgrade OSes, but I wouldn't think they were quite so evil/stupid.
 
I have never seen such mountains made out of molehills in my life! What a knee-jerk society we live in. I think this is one of those situations where the amount of opposition indicates you must be doing something right to ruffle so many feathers.
 
This right here has always bothered me about this argument against CSAM scanning. My iPhone knows about landscapes, dogs, pools, etc.… Yet people never once batted an eye about it. Literally, there could already be a back-door plan handing over your information to covert governments. I personally don’t think that’s happening, but some of the tools are already there.

I’m not against discussion for or against CSAM scanning, but I really wish I could be in the room where people are crying foul over this CSAM business and say, “Dude, take out your phone and type in ‘grass’. How many photos popped up with grass in them? Why are you now complaining, and do you understand the difference?”
You haven't read much of the threads, have you.

The reason typing “grass” and having the phone show your pictures of grass is fine is that it’s for our personal benefit, and most importantly, it doesn’t turn you in. It’s not the scanning per se, it’s the scanning for the government that’s the problem!
 
Just a reminder, Apple never said that was the reason for this plan, nor have they announced full E2EE for iCloud.

I would actually understand their motives if they did announce E2EE was the reason, not that it would get me to buy any more Apple hardware or upgrade OSes, but I wouldn't think they were quite so evil/stupid.
I don’t need the reminder, but thanks. I see no other valid reason why they would try and thread this needle. If they don’t plan to implement E2EE I see little reason not to handle this server side.
 
Re-read the original post: ”The planned features include scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search.”

The topic is for both features.
I’m saying it’s two different programs, and discussing both in the same thread will just cause confusion… which can be witnessed in the multitude of threads all week. The only thing they have in common is that Apple announced both on the same day.
 
I don’t need the reminder, but thanks. I see no other valid reason why they would try and thread this needle. If they don’t plan to implement E2EE I see little reason not to handle this server side.
Good. But extrapolating what you think Apple is going to do is not very productive. I would be very surprised if they did try to implement E2EE; it's a can of worms that they probably don't want to pay for.
 
This right here has always bothered me about this argument against CSAM scanning. My iPhone knows about landscapes, dogs, pools, etc.… Yet people never once batted an eye about it. Literally, there could already be a back-door plan handing over your information to covert governments. I personally don’t think that’s happening, but some of the tools are already there.

I’m not against discussion for or against CSAM scanning, but I really wish I could be in the room where people are crying foul over this CSAM business and say, “Dude, take out your phone and type in ‘grass’. How many photos popped up with grass in them? Why are you now complaining, and do you understand the difference?”
Pretty huge difference: one is an AI algorithm designed to help you find stuff on your device, the other is spyware designed to look for illegal content and report you to law enforcement… You’re just not thinking this through. The AI algorithm is a sandboxed, on-device indexing tool; unless you opted in to help Apple improve it, it stays on your device.
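For contrast, the “grass” search mentioned above really is backed by this kind of sandboxed, on-device classification. A minimal sketch using Apple's public Vision framework (the 0.5 confidence cutoff and the example labels are illustrative); the resulting labels feed a local index and are never reported anywhere:

```swift
import Foundation
import Vision

/// Returns human-readable labels ("grass", "dog", "pool", ...) for one image,
/// computed entirely on-device by Vision's built-in classifier.
func labels(for imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL)
    try handler.perform([request])
    return (request.results ?? [])
        .filter { $0.confidence > 0.5 }   // keep only confident labels
        .map { $0.identifier }
}
```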
 
I have never seen such mountains made out of molehills in my life! What a knee-jerk society we live in. I think this is one of those situations where the amount of opposition indicates you must be doing something right to ruffle so many feathers.
You should probably be thankful so many care about privacy, because yours would be gone without them. The cost of freedom is eternal vigilance; there really are bad actors out there that will pounce on any opportunity to take both your freedom and privacy away. You have to stomp out small fires quickly before they spread.
 
You should probably be thankful so many care about privacy, because yours would be gone without them. The cost of freedom is eternal vigilance; there really are bad actors out there that will pounce on any opportunity to take both your freedom and privacy away. You have to stomp out small fires quickly before they spread.

Except I don't believe this compromises privacy in any way. That's the whole reason the scan is happening on the device (away from Apple's view, since they don't have your passcode or biometrics and there's no backdoor) vs. on their servers. And of course use of cloud services has never been for people who are paranoid about privacy. Apple can decrypt and view your data at any time (read the iCloud legal agreement) and I'm sure it's the same with most other cloud services.
 
I don’t need the reminder, but thanks. I see no other valid reason why they would try and thread this needle. If they don’t plan to implement E2EE I see little reason not to handle this server side.
If that’s the plan they can still move forward, but they will have to develop an intermediate server to process the hashes. You probably can’t call it end-to-end with that hole, but I’m sure Apple can figure it out… if they’ve accepted and understood the concern better than the people in this forum, anyway, lol.
 
Except I don't believe this compromises privacy in any way. That's the whole reason the scan is happening on the device (away from Apple's view, since they don't have your passcode or biometrics and there's no backdoor) vs. on their servers. And of course use of cloud services has never been for people who are paranoid about privacy. Apple can decrypt and view your data at any time (read the iCloud legal agreement) and I'm sure it's the same with most other cloud services.
Well, that just shows you still don’t understand the issue… everyone is fine with anything they want to do in the cloud; they simply cannot use our property in the chain to process it. They can ask you, and you can give permission if it does not bother you, just like they currently ask permission to upload your data to improve services. I imagine some say yes to that prompt… I don’t.
 
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

I think many people are truly unaware of the staggering prevalence of child abuse in society; if people knew how common and widely distributed the material is, they might throw some support behind this.

Meanwhile, your government is actively tracking your location everywhere you go, and QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner and every traffic light, monitoring your movement patterns through facial recognition and number plates. Every time you tap and buy something you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying “encryption!”
Seriously, how are you that naive and so willing to give up the very last piece of privacy you have? Do you look forward to living in an Orwellian nightmare? Just because all those BAD surveillance systems have gained footing doesn't mean we should just throw in the towel on personal privacy.

It is not a child-protection feature. That's merely one use of an encryption-bypassing, privacy-violating feature. Sure, protecting children is a great use of this feature, but there are many horrible uses of this feature too. And you can't have the one without the others. It is not worth it. You don't get safer by giving up all your freedoms. Every intelligent person knows this.
 
Well, that just shows you still don’t understand the issue… everyone is fine with anything they want to do in the cloud; they simply cannot use our property in the chain to process it. They can ask you, and you can give permission if it does not bother you, just like they currently ask permission to upload your data to improve services. I imagine some say yes to that prompt… I don’t.

You don't own iOS. The scan is connected to your use of a particular iCloud service (photos). If you don't want your photos scanned, simply disable iCloud for photos.
 
You don't own iOS. The scan is connected to your use of a particular iCloud service (photos). If you don't want your photos scanned, simply disable iCloud for photos.

We've had this discussion elsewhere. You're right. But we own the device. Apple has historically done a good job of respecting that dynamic and toeing that line. Most people are struggling with the fact that they feel like Apple is overstepping within that dynamic now.
 
It is not a child-protection feature. That's merely one use of an encryption-bypassing, privacy-violating feature. Sure, protecting children is a great use of this feature, but there are many horrible uses of this feature too. And you can't have the one without the others. It is not worth it. You don't get safer by giving up all your freedoms. Every intelligent person knows this.

So if we disagree with you, we're stupid? Not exactly the way to promote rational conversation.

You say it's not a child protection feature, but then you immediately say that child protection is a great use of this feature 🤔

You say you can't have child protection without "many horrible uses" as well. That's simply false. iOS isn't open-source; Apple would have to sanction any "horrible" use of this technology, which, while obviously possible, is something I have absolutely no fear they will do.

And again, as so many people seem to forget, this feature would only be active IF you use iCloud for photos - so if you're paranoid about it, you still have a choice (and if you're paranoid about your photos, you shouldn't be uploading them to the cloud anyway, unless you own and control the server).
 
You don't own iOS. The scan is connected to your use of a particular iCloud service (photos). If you don't want your photos scanned, simply disable iCloud for photos.
As I’ve stated before, people like you who are OK with it don’t matter; it’s only the people who are not OK with it that caused Apple to reverse course. There were enough of us, we won the first round, and now we’re just waiting for Apple’s next move.
 
So if we disagree with you, we're stupid? Not exactly the way to promote rational conversation.

You say it's not a child protection feature, but then you immediately say that child protection is a great use of this feature 🤔

You say you can't have child protection without "many horrible uses" as well. That's simply false. iOS isn't open-source; Apple would have to sanction any "horrible" use of this technology, which, while obviously possible, is something I have absolutely no fear they will do.

And again, as so many people seem to forget, this feature would only be active IF you use iCloud for photos - so if you're paranoid about it, you still have a choice (and if you're paranoid about your photos, you shouldn't be uploading them to the cloud anyway, unless you own and control the server).
Perhaps you should read the article… you’re defending something after Apple said, “Oops, we may need to rethink this”… you’re kinda late trying to convince people Apple was right when they just backtracked.
 
But we own the device.

And? Apple isn't dictating what you can do with your physical device. You can sell it, throw it out the window, put it on display, etc. It's sort of like complaining that mandatory seat belts in vehicles are violating your privacy because you own the vehicle and should be able to do whatever you want with it and in it. But the government has control over that particular aspect. Just like Apple has control over the operating system that runs on your device.
 
Looks like you know nothing about what people are discussing here. They are discussing scanning your local photo album and eventually uploading your private photos to Apple for human review.

I feel like I understand the discussion to a decent level to be honest. Your description of what Apple are suggesting they will implement just simply isn’t accurate. Any argument around this subject should really start with people looking at what is presented rather than their own interpretation.
 
Perhaps you should read the article… you’re defending something after Apple said, “Oops, we may need to rethink this”… you’re kinda late trying to convince people Apple was right when they just backtracked.

You mean the article that says this:

The suite of Child Safety Features were originally set to debut in the United States with an update to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. It's not clear when Apple plans to roll out the "critically important" features or how it intends to "improve" them in light of so much criticism, but the company still appears determined to roll them out in some form.

There's a difference between scrapping one's plans and taking time to be sure you're getting them right.
 
As I’ve stated before, people like you who are OK with it don’t matter; it’s only the people who are not OK with it that caused Apple to reverse course. There were enough of us, we won the first round, and now we’re just waiting for Apple’s next move.

Except they haven't "reversed course". I know that's what you want, but that's not what's happened.
 
And? Apple isn't dictating what you can do with your physical device. You can sell it, throw it out the window, put it on display, etc. It's sort of like complaining that mandatory seat belts in vehicles are violating your privacy because you own the vehicle and should be able to do whatever you want with it and in it. But the government has control over that particular aspect. Just like Apple has control over the operating system that runs on your device.

Your quote left out 90% of my comment which brought additional context and affirmed that I agree with your general premise… 🤷‍♂️

Edit: you basically missed my entire point... or ignored it and replied to something I wasn't arguing about
 
Your quote left out 90% of my comment which brought additional context and affirmed that I agree with your general premise… 🤷‍♂️

Edit: you basically missed my entire point... or ignored it and replied to something I wasn't arguing about

I'm simply quoting what I felt was the crux of your comment, which I also believe is an irrelevant point. You said I was right "BUT" . . . "we own the device." And I'm saying, how does that change anything?

I'm not sure how you feel Apple isn't respecting any "dynamic" here. If you don't use iCloud for photos, then no scan will take place. And if you DO use iCloud for photos, you're already voluntarily allowing your photos to be stored on Apple's servers (and thus accessible to them, per the terms of service), so why would you care if they're being scanned, especially when the scan is happening outside of Apple's access?
 
I have never seen such mountains made out of molehills in my life! What a knee-jerk society we live in. I think this is one of those situations where the amount of opposition indicates you must be doing something right to ruffle so many feathers.
As a colleague of mine used to say, 'it's not f(x) but d(x)'. It's not where we are that is the concern; it's where systems like this could go. And when you get objections to a proposal from all quarters, from people who know enough about the topic to understand its potential advantages and disadvantages, like the EFF, you're doing something wrong. Very wrong.
 