Enough with this already, please. Apple, STOP trying to brainwash us every day now. Apple, STOP feeding us the bull. I have a compromise for you. Since you did not even bother to ask what consumers think about CSAM scanning and you are bringing the feature this fall...

At this point, I am willing to pay $99 per year to keep my privacy to myself. It's a win/win. Go ahead and start charging consumers to keep their privacy to themselves. We are talking billions of dollars in PROFIT.

$99 Per Year Subscription

"No iCloud Photo/Messages Scanning, Keep Your Data to Yourself and 100% Privacy"

That's Apple. That's iPhone.



I will wait for your response...
So, you want child pornographers to be able to opt out for $99?
 
Read it again. If there were a single CSAM match, nothing would happen; no one would be notified.

And secondly, the system requires the threshold of matching images to be exceeded, so trying to seek out even a single image from a person's device, or a set of people's devices, won't work, because the system simply does not provide Apple with any knowledge of single photos stored in the service.
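For anyone who wants the threshold idea made concrete, here's a toy sketch in Python (my own simplification, not Apple's actual protocol; the real design reportedly uses private set intersection and threshold secret sharing so the server cryptographically cannot learn anything below the threshold, and the names and the threshold value here are assumptions):

```python
# Toy model of the match-threshold gate (illustration only; in the real
# system the server literally cannot decrypt match data below the threshold).

KNOWN_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the CSAM hash database
THRESHOLD = 30  # reportedly around Apple's announced initial threshold

def voucher_matches(photo_hash: str) -> bool:
    # Every upload gets a "safety voucher"; individual matches stay opaque.
    return photo_hash in KNOWN_HASHES

def account_reviewable(vouchers: list[bool]) -> bool:
    # Below the threshold nothing is learnable, so a single match
    # (or a planted photo or two) never surfaces anywhere.
    return sum(vouchers) >= THRESHOLD
```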
I suppose this "threshold" is there to prevent flagging anything accidentally or unintentionally downloaded?
 
Has this been posted yet...


[embedded video thumbnail]



Apple, you took a wrong turn last week....
 
Maybe someone could explain to me what kind of photos you are afraid to show Apple? I know, I know, it will heavily rain red thumbs-down scores. But maybe you can share your concerns with REAL examples.

In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, or parents who ever has, nor do I wish to have contact with anyone who does.

Also: what if Apple's approach cut child porn worldwide by about 50%? Wouldn't it be worth it?
If you did take a naked selfie, it likely wouldn't be among the known photos on the list.
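In other words, the check is a fingerprint lookup against already-catalogued images, not a nudity classifier. Here's a toy perceptual hash in Python (the classic aHash, not Apple's NeuralHash, which is far more robust; but the principle of fingerprint matching rather than content analysis is the same):

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> str:
    """Toy perceptual hash (aHash): downscale, grayscale, threshold on the mean.
    Similar images give similar bit patterns; the hash says nothing about
    WHAT the image depicts."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = "".join("1" if p > mean else "0" for p in pixels)
    return f"{int(bits, 2):0{size * size // 4}x}"
```

A brand-new personal photo yields a hash that simply isn't in the known set, so it can never match.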
 
I don't understand your thinking here at all. Apple isn't trying to take your physical iPhone/iPad away from you. This has nothing to do with ownership of your physical device.

Ownership does not just equal possession. It also equals control. Yes, we give up some control by allowing Apple to put iOS on the device and agreeing to their terms. Historically they have respected the ability for users who own their devices to have some control over them. Again, this is taking a step that disrespects that dynamic.

This appears to be a philosophical difference in what you and I (and others) think of as ownership. So we'll likely have to disagree on this... even though I'm right, historically speaking anyway. ;)


So now iCloud is going to be more private in this regard. The article you linked (which I assume is true, though they don't cite their source for the claim "Apple has confirmed that photos already uploaded to iCloud will also be checked against CSAM hashes when the feature goes live this fall") indicates that the in-the-cloud scan is for existing iCloud photos, not new ones. That of course makes sense. It's not saying your photos are going to be double-scanned: once on the device and again in the cloud. The whole POINT of this new method is to move scanning of any new photos to the device where it's much more secure.

I agree that the article I linked needed to source its info, but historically they've been pretty good about accurate reporting. I'll look for a better source when I have more time later.

"...iCloud is going to be more private in this regard." I guess that's technically correct, but you're making the assumption that they're going to do a single, one time blanket CSAM scan of all existing iCloud photos and then never hash scan them again... even when the CSAM database is updated with new hashes? Multiple scans seem inevitable to make this whole thing worthwhile in helping cut down on CSAM.

It's a technical workaround for a small privacy benefit that gives them privacy bragging rights, while forcing me to allow them to use my device to accomplish it.

And since it's effectively an opt-out feature for criminals, honestly it's feeling more and more futile anyway, and I'm surprised they're getting such high praise from organizations like the NCMEC.
 
Let me preface where I stand before I ask my questions.
1) Anyone who says "I don't do anything illegal so I don't care" doesn't understand history or the dangers of giving up rights.
2) I think it would be honest to say that most people never realized there was such a thing as CSAM scanning in the cloud, me included. It never crossed my mind; I've been blessed never to have been scarred by that. But there is a certain level of shock to people that their perception of current privacy wasn't reality.
3) CP is a HUGE problem, bigger than people think, and our privacy and rights are always in balance with our security.

From a technical point of view, I don't have a problem with what Apple is doing. I don't see it as being a slippery slope, only because Apple (and others) could always get into our phones. We have trusted them for over a decade that they wouldn't do it. Remember Apple getting in trouble for dropping the U2 song on people's phones? I see it much the same way, there is a creepy feeling about it, but logically there is no difference. I think both can be true and we are ok with living with some grey here.

So my questions are these:
1) Would you be OK with a toggle in settings that turned off CSAM scanning, but it would be required to be turned on for iCloud Photos? For example, you go to turn on iCloud Photos and it tells you that you have to turn on CSAM scanning?
2) If Apple came out soon saying "Now that we are scanning client side and don't need to do it server side, we are excited to announce end-to-end encryption on iCloud" would that be worth it?

I'm hoping to have a real dialogue with someone, thanks
 
Ownership does not just equal possession. It also equals control. Yes, we give up some control by allowing Apple to put iOS on the device and agreeing to their terms. Historically they have respected the ability for users who own their devices to have some control over them. Again, this is taking a step that disrespects that dynamic.

This appears to be a philosophical difference in what you and I (and others) think of as ownership. So we'll likely have to disagree on this... even though I'm right, historically speaking anyway. ;)




I agree that the article I linked needed to source its info, but historically they've been pretty good about accurate reporting. I'll look for a better source when I have more time later.

"...iCloud is going to be more private in this regard." I guess that's technically correct, but you're making the assumption that they're going to do a single, one time blanket CSAM scan of all existing iCloud photos and then never hash scan them again... even when the CSAM database is updated with new hashes? Multiple scans seem inevitable to make this whole thing worthwhile in helping cut down on CSAM.

It's a technical workaround for a small privacy benefit that gives them privacy bragging rights, while forcing me to allow them to use my device to accomplish it.

And since it's effectively an opt-out feature for criminals, honestly it's feeling more and more futile anyway, and I'm surprised they're getting such high praise from organizations like the NCMEC.

Well, I disagree that ownership of a device = control over the software portion, though of course Apple DOES grant users much control, including NOT using iCloud for photos, which disables any scanning process.

Yes, if you do find a primary source for that claim about Apple continuing to scan photos in the cloud, please link it. Because I'm sure the primary source explains it better than someone else's interpretation of it, and I'm sure the reality doesn't contradict what Apple has said here, because of course that wouldn't make any sense (and Apple's not stupid).

And again, Apple's goal here is NOT to prevent CSAM from being stored on the device (that WOULD require a gross breach of privacy); it's to prevent it from being uploaded and distributed on their servers. But since they obviously don't know who is going to do that, everyone's photos get scanned. So this makes things much more private for innocent people. And criminals "opting out" (i.e. disabling iCloud photo storage) means they won't be able to use iCloud to distribute their filth. It's not like they get to opt out and continue to use the service. Obviously they have other ways of distributing CSAM which of course Apple has no control over, but iCloud won't be one of them.
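Worth making the gating explicit: as described, the matching step only runs on the iCloud Photos upload path, so turning iCloud Photos off removes both the scan and the distribution channel. A two-branch sketch (hypothetical function name, my simplification):

```python
def handle_new_photo(photo_hash: str, icloud_photos_enabled: bool) -> str:
    # Matching is tied to the upload path: no upload, no voucher, no scan.
    if not icloud_photos_enabled:
        return "kept local: never checked against the database, never uploaded"
    # Voucher generation happens only as part of the upload to iCloud.
    return "voucher generated and attached; photo uploaded to iCloud"
```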
 
All my photos are synced to my local NAS. Sure, it's a little less user-friendly at times, but my privacy is worth it. The iMessage thing is still an issue. No doubt this will expand to other categories to flag someone for, not just sexual content. Whatever an administration feels is a threat at the time?

When are the neuro implants from Apple coming? When you have “wrongthink” it’ll report you.
 
You mean the same vast majority of the public who posts every little detail of their life on Facebook and Instagram?
The vast majority of the public who looks UP everything they ever needed to know, from the mundane to the extremely personal, on Google search?
The vast majority of the public who does the majority of their communications over Facebook Messenger, Twitter direct messages, and SMS?
Do you know what Google, Instagram, Facebook, and Twitter all have in common? They've been doing this exact thing for years. The general public does not and never will care about this kind of stuff until it actually affects them, and at the moment it does not.
Sure, in the hypothetical future when hypothetical governments get their hypothetical hands on this type of technology and hypothetically force tech companies into hypothetically adding all sorts of restrictions, then we can be worried. But as for the current time, 99.999% of people do not care.
They're not gonna get a new phone because of this, they're not gonna turn off iCloud because of this, and they're definitely not gonna give up any of their social media because of this. They just don't care.
Well, I don't Twit and I don't FaceBorg. I don't use Google as a search engine and I don't have a Google login. I block trackers of all kinds. No Instagram, either. And, lest you think I'm some sort of Luddite, I've been on the Internet since before it was *called* the Internet (i.e. back in the CSNet/NSFNet days).

There are already regimes who would eagerly seek to expand and exploit this scanning capability. Some of them are extremely lucrative markets for Apple. This "feature" is a potential danger to dissidents, human rights activists, and journalists whether or not it's only used in very restricted and "benign" ways in the US.

If you wait for a hypothetical regime (here or abroad) to exploit this, it will already be too late.
 
So my questions are these:
1) Would you be OK with a toggle in settings that turned off CSAM scanning, but it would be required to be turned on for iCloud Photos? For example, you go to turn on iCloud Photos and it tells you that you have to turn on CSAM scanning?

To clarify: the toggle would disable CSAM scanning on the device, and scanning would instead be done solely in the cloud. I think that's what you're asking? If so, then 100% yes.

2) If Apple came out soon saying "Now that we are scanning client side and don't need to do it server side, we are excited to announce end-to-end encryption on iCloud" would that be worth it?

This is the ultimate question, right? Because that's what a lot of people think they're ultimately going for here. If I'm being 100% honest, I'm still a little torn on this. I've already accepted the lack of privacy in the "cloud" (any one, not just iCloud). Those aren't my servers, I'm "renting" space on them, so to speak.

I have not yet accepted the idea of my device being utilized for this scanning system. I'm not renting my device. This whole thing blurs the lines between the cloud and local storage.

Because of that, right now I'm leaning towards saying it would not be worth it. At that point they would get to claim full E2EE of iCloud, and yet they've also opened up a methodology that allows E2EE data to be tracked/scanned/reported to authorities, therefore making one of the primary "perks" of E2EE less of a perk. I have nothing to hide, but it's the principle of the thing for me.
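To spell out why the E2EE perk shrinks: the match signal is generated from the plaintext before encryption, so the server can still receive something actionable even though it can't read the photos. A runnable toy in Python (SHA-256 and XOR as stand-ins; purely illustrative, not Apple's design):

```python
import hashlib

def perceptual_hash(photo: bytes) -> str:
    # Stand-in fingerprint; a real system would use NeuralHash-style hashing.
    return hashlib.sha256(photo).hexdigest()

def encrypt(photo: bytes, key: bytes) -> bytes:
    # Stand-in for real E2EE (XOR keystream here, purely illustrative).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(photo))

def upload_photo(photo: bytes, known_hashes: set, key: bytes):
    # 1. The match check runs on-device, on the plaintext, BEFORE encryption.
    matched = perceptual_hash(photo) in known_hashes
    # 2. Only afterwards is the photo end-to-end encrypted for upload.
    blob = encrypt(photo, key)
    # 3. The server stores ciphertext it cannot read, yet still gets a signal
    #    it can act on -- which is what weakens the usual E2EE guarantee.
    return blob, matched
```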

I'm all for a real dialogue! :)
 
Well, I disagree that ownership of a device = control over the software portion, though of course Apple DOES grant users much control, including NOT using iCloud for photos, which disables any scanning process.

Right, I agree with that, but if they want to keep users happy, they have to respect that dynamic.

And again, Apple's goal here is NOT to prevent CSAM from being stored on the device (that WOULD require a gross breach of privacy); it's to prevent it from being uploaded and distributed on their servers. But since they obviously don't know who is going to do that, everyone's photos get scanned. So this makes things much more private for innocent people. And criminals "opting out" (i.e. disabling iCloud photo storage) means they won't be able to use iCloud to distribute their filth. It's not like they get to opt out and continue to use the service. Obviously they have other ways of distributing CSAM which of course Apple has no control over, but iCloud won't be one of them.

See, I'm with you on the general thought here. But it's the part about submitting everybody to the same process to prevent a few from uploading nefarious things to iCloud - that's what I have a problem with. And yes, I know there are examples of this in everyday life that we have to deal with. But none of those examples involve stepping into my personal property, so to speak (I realize that analogy isn't perfect either).

"...it's to prevent it from being uploaded and distributed on their servers." - The CSAM is getting uploaded to their servers no matter what. It's being "caught", so to speak, on the device, but it's still going to end up on their servers and then dealt with. That process can be done server-side without needing my device to do it, and the CSAM will spend the exact same amount of time on their servers before being flagged/reported/etc.
 
Maybe someone could explain to me what kind of photos you are afraid to show Apple? I know, I know, it will heavily rain red thumbs-down scores. But maybe you can share your concerns with REAL examples.

In my world: I have never taken naked pictures of myself, I know no one among my friends, relatives, or parents who ever has, nor do I wish to have contact with anyone who does.

Also: what if Apple's approach cut child porn worldwide by about 50%? Wouldn't it be worth it?
Putting everyone in prison would reduce all crime by 100%. Would that be worth it?
 
I'm glad to have at least one answer: that the hash database is going to be distributed within iOS itself. But how big is it?

Is it already within the iOS 15 public betas?

The reason I ask: the camera on my phone can take hundreds of pictures in no time. I assume the producers of badness can also take lots of pictures, and therefore there will be a huge number of hashes in the database, no?
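For a rough sense of scale, a back-of-the-envelope estimate (assumed figures, not Apple's published specs):

```python
# Back-of-the-envelope: assumed figures, not Apple's published numbers.
num_hashes = 5_000_000   # order of magnitude of known-CSAM hash lists
bits_per_hash = 96       # NeuralHash output is reported to be ~96 bits
size_mb = num_hashes * bits_per_hash / 8 / 1024**2
print(f"~{size_mb:.0f} MB")  # ~57 MB before any encoding or compression overhead
```

And the database only grows with images already catalogued by child-safety organizations, not with how many photos any one camera can take.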
 
But it ALWAYS could have been abused even before this, because they were scanning in the cloud already. So this new change makes no difference in that regard. It simply makes what they were already doing more private. That's a good thing.
As far as I'm aware, Apple has been scanning email in the cloud, not photos. I believe that's probably where the 265 reports Apple filed last year came from.
 
Again, I'm all for trying to do the right thing... but two things still stick out for me.

1) If they are scanning your photos for known hashes, then they are looking for known pictures, not new ones or the people abusing children and producing the media? Not saying any of it is good, but I want those making it and doing the abusing to be the focus. No media, no pictures, etc.

2) Let's say someone is in trouble, regardless of whether they know they have it. Isn't this unconstitutional, since the practice violates the right to be free from unreasonable searches? It's like entering a property without a search warrant.
Under the Constitution, in law or logic, there should be a need to prove intent, or at least that someone has this media, before a search; that standard is unknown here and not sufficient to justify a warrantless search under the law. As well, without a warrant, they are using items you own and paid for to incriminate you, which is a violation of the Fifth Amendment.

Once again, I'm not trying to rationalize anyone having it on their phone or iCloud, but this seems to open the door for more things to go wrong.
It doesn't violate the 4th Amendment because it's a private party (Apple) doing the search. The Bill of Rights, unfortunately, only protects you from the government. Once Apple is in possession of the information, they're perfectly free to give it to the authorities. I'm sure the Terms of Service and Privacy Policy will be amended to incorporate consent to rifle through your data and hand over any interesting tidbits, just to be sure.
 
Geez. "And" is a conjunction, not a subject. You don't start sentences with conjunctions. Wow, what has happened to our spoken language these past 10 years? Are copywriters that dang lazy or ignorant when to detect someone speaking a run-on sentence? Can people no longer form coherent sentences?
 
Enough with this already, please. Apple, STOP trying to brainwash us every day now. Apple, STOP feeding us the bull. I have a compromise for you. Since you did not even bother to ask what consumers think about CSAM scanning and you are bringing the feature this fall...

At this point, I am willing to pay $99 per year to keep my privacy to myself. It's a win/win. Go ahead and start charging consumers to keep their privacy to themselves. We are talking billions of dollars in PROFIT.

$99 Per Year Subscription

"No iCloud Photo/Messages Scanning, Keep Your Data to Yourself and 100% Privacy"

That's Apple. That's iPhone.



I will wait for your response...
Why pay a subscription for that? If you want to be better protected, keep iCloud Photos, iCloud email, iCloud Backup, and iMessage in the cloud turned off. The same goes for Apple Notes and any other third-party app that you feel holds personal information you don't want Apple to access.

Apple made 265 CSAM reports last year. Guess where those came from... iCloud email.
 
Geez. "And" is a conjunction, not a subject. You don't start sentences with conjunctions. Wow, what has happened to our spoken language these past 10 years? Are copywriters that dang lazy or ignorant when to detect someone speaking a run-on sentence? Can people no longer form coherent sentences?
Seriously? Grammar Nazis now? Have we stooped this low?
 
Right, I agree with that, but if they want to keep users happy, they have to respect that dynamic.



See, I'm with you on the general thought here. But it's the part about submitting everybody to the same process to prevent a few from uploading nefarious things to iCloud - that's what I have a problem with. And yes, I know there are examples of this in everyday life that we have to deal with. But none of those examples involve stepping into my personal property, so to speak (I realize that analogy isn't perfect either).

"...it's to prevent it from being uploaded and distributed on their servers." - The CSAM is getting uploaded to their servers no matter what. It's being "caught", so to speak, on the device, but it's still going to end up on their servers and then dealt with. That process can be done server-side without needing my device to do it, and the CSAM will spend the exact same amount of time on their servers before being flagged/reported/etc.

I think they ARE respecting that dynamic. Again, every single user has the choice to disable iCloud for photos. Isn't that control?

Oh, it's not a "few" people attempting to upload CSAM. It's a pandemic and has been for a long time. Believe me, Apple (and other cloud services) would not be spending all these resources and time on CSAM detection if it were some small problem.

Yes, it will be on their servers, but in effect "quarantined" for review and not able to be distributed. As for the "exact same amount of time" - I don't know. Was the server-side scan happening in real time as the photos were uploaded, or just periodically? In any case, the point of this change isn't to reduce time on the server or anything, but rather to continue doing what they've always been doing in a more private manner. That's it! I don't think Apple ever claimed that the method is more effective when done on the device itself - just more private.
 
Geez. "And" is a conjunction, not a subject. You don't start sentences with conjunctions. Wow, what has happened to our spoken language these past 10 years? Are copywriters that dang lazy or ignorant when to detect someone speaking a run-on sentence? Can people no longer form coherent sentences?

Just FYI:


 
Let me preface where I stand before I ask my questions.
1) Anyone who says "I don't do anything illegal so I don't care" doesn't understand history or the dangers of giving up rights.
2) I think it would be honest to say that most people never realized there was such a thing as CSAM scanning in the cloud, me included. It never crossed my mind; I've been blessed never to have been scarred by that. But there is a certain level of shock to people that their perception of current privacy wasn't reality.
3) CP is a HUGE problem, bigger than people think, and our privacy and rights are always in balance with our security.

From a technical point of view, I don't have a problem with what Apple is doing. I don't see it as being a slippery slope, only because Apple (and others) could always get into our phones. We have trusted them for over a decade that they wouldn't do it. Remember Apple getting in trouble for dropping the U2 song on people's phones? I see it much the same way, there is a creepy feeling about it, but logically there is no difference. I think both can be true and we are ok with living with some grey here.

So my questions are these:
1) Would you be OK with a toggle in settings that turned off CSAM scanning, but it would be required to be turned on for iCloud Photos? For example, you go to turn on iCloud Photos and it tells you that you have to turn on CSAM scanning?
2) If Apple came out soon saying "Now that we are scanning client side and don't need to do it server side, we are excited to announce end-to-end encryption on iCloud" would that be worth it?

I'm hoping to have a real dialogue with someone, thanks
Apple can't get into our phones. That was the point of the FBI case with the San Bernardino shooter.

Apple didn't put the U2 album on our phones. They added it server-side to iTunes accounts as a credit/entitlement. Yes, that logic and data ultimately synchronized to the iTunes app on the phones, but that was the same channel as if you had bought an album via iTunes. It was just a $0 purchase.

In this CSAM situation, Apple still isn't doing anything. It's being done client-side, on-device. Apple isn't scanning our photos (headlines have it wrong, but that's what generates the clicks and ad impressions on the news sites; "truth" is not what it once meant to journalists and news wires).
 