
Do you still believe in Apple's privacy values?

  • Yes, CSAM doesn't really affect/bother me (33 votes, 37.1%)
  • Yes, CSAM is bad but Apple still cares about privacy more than the competition (15 votes, 16.9%)
  • No, "privacy" is only a marketing term now (41 votes, 46.1%)

  Total voters: 89

wirtandi
macrumors regular · Original poster · Feb 3, 2021
I'm very interested to see whether people still believe that Apple is a privacy-oriented company. Yes, in this day and age it is impossible to have 100% privacy. But when you think about the competition, the other companies out there, do you still believe that Apple cares about privacy?
 
You should have added a "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying about much. There are far too many other areas of our day to day life that are tracked and monitored that go way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
 
iPhone is probably just the first step.

When Apple Cars come out, perhaps they can have a speed monitor or a red-light violation detector that automatically alerts the authorities and debits your Apple Card for the applicable speeding or red-light fine. (Don't worry, you get 3% back with Apple Card.) I can't see why anyone would object; preventing people from speeding or running red lights has been proven to save lives.
 
You should have added a "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying about much. There are far too many other areas of our day to day life that are tracked and monitored that go way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
This.
 
I admit that Apple still has some pro-privacy practices, e.g. limiting what iOS exposes to third-party apps. So for a regular consumer, until the mass scanning system is switched on, the iPhone is still the better phone if one cares about privacy but wants a functioning smartphone.

However, if one wants a truly private phone, the iPhone can no longer be it. Since the mass scanning system is baked into iOS, you cannot trust the phone anymore. OTOH, with Android you can de-Google or opt for a privacy-focused custom ROM. So for ultimate privacy, Android is better, since you have a way to install your own custom ROM.

Does Apple as a company actually care about privacy? I believe they did. Jobs' stance on privacy was very clear, and he said it explicitly multiple times. However, it seems there are more and more people infiltrating the company who influence decisions in the name of virtue signaling. I can also see why more and more of the senior people from the Jobs era have started distancing themselves from the company. Of course, it's also hard for Apple: they're now so big that they have become a huge target for governments and woke activists. Tim Cook, being more diplomatic, surely had to make some compromises. And since Apple is a publicly traded company whose main focus is to maximize shareholder value, in the end Apple will do what companies do to achieve that, even if it means betraying their own privacy stance.

Yup, people need to remember that the main goal of a publicly traded company is to maximize shareholder value.
 
You should have added a "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying about much. There are far too many other areas of our day to day life that are tracked and monitored that go way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
Yes, very true. There really are bigger problems facing the world. Heck, the water crisis in the southwestern states is going to be huge for California, Arizona, and Nevada, and people here are not even the least bit concerned. Water scarcity is a real threat to life. But la-di-da.
 
You should have added a "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying about much. There are far too many other areas of our day to day life that are tracked and monitored that go way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
This, basically. It's astonishing to me how many people are losing it over this as if our Apple/Google/Facebook overlords don't have 100 other more invasive ways of monitoring us. Reminds me of the COVID contact tracing API and people freaking out about being "tracked," as if voluntarily carrying a smartphone with you weren't the world's easiest way to let yourself be tracked.
 
You should have added a "I couldn't care less" option. I honestly couldn't care less about any of this. I don't do anything in my life or with my phone that would have me worrying about much. There are far too many other areas of our day to day life that are tracked and monitored that go way beyond our phones. I don't think people truly realize just how little privacy most of us actually have.
Isn’t this option 1?
 
Yes. The CSAM feature design shows a huge amount of effort to do an unpleasant job in the most privacy-respecting way possible. There are new encryption techniques to blind virtually everything from everyone, which only make sense if E2EE is the next step. I'm fairly reassured that Apple was only doing this because Photos was about to become vastly more secure and private, and that Apple feels a responsibility to make sure its services don't make it significantly easier to commit and conceal these vile crimes.
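
For the curious: per Apple's technical summary, the blinding relies on private set intersection plus threshold secret sharing. Below is a toy Swift sketch of the threshold idea, a textbook Shamir scheme over a small prime field, not Apple's actual construction, just to show why 29 shares reveal literally nothing about the secret while 30 recover it:

```swift
import Foundation

// Toy Shamir secret sharing over GF(p), p = 2^31 - 1 (a Mersenne prime).
// Illustrates the threshold behind the safety vouchers: below the
// threshold, every candidate secret is equally consistent with the shares.
let p: Int64 = 2_147_483_647

func modMul(_ a: Int64, _ b: Int64) -> Int64 { (a * b) % p }

func modPow(_ base: Int64, _ exp: Int64) -> Int64 {
    var result: Int64 = 1, b = base % p, e = exp
    while e > 0 {
        if e & 1 == 1 { result = modMul(result, b) }
        b = modMul(b, b)
        e >>= 1
    }
    return result
}

func modInv(_ a: Int64) -> Int64 { modPow(a, p - 2) } // Fermat's little theorem

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int64, threshold: Int, count: Int) -> [(x: Int64, y: Int64)] {
    // Random polynomial of degree threshold-1 with the secret as constant term.
    var coeffs: [Int64] = [secret]
    for _ in 1..<threshold { coeffs.append(Int64.random(in: 1..<p)) }
    return (1...count).map { i in
        let x = Int64(i)
        var y: Int64 = 0
        for c in coeffs.reversed() { y = (modMul(y, x) + c) % p } // Horner's rule
        return (x: x, y: y)
    }
}

// Lagrange interpolation at x = 0 recovers the constant term (the secret).
func reconstruct(_ shares: [(x: Int64, y: Int64)]) -> Int64 {
    var secret: Int64 = 0
    for (i, si) in shares.enumerated() {
        var num: Int64 = 1, den: Int64 = 1
        for (j, sj) in shares.enumerated() where j != i {
            num = modMul(num, (p - sj.x) % p)        // (0 - x_j)
            den = modMul(den, (si.x - sj.x + p) % p) // (x_i - x_j)
        }
        secret = (secret + modMul(si.y, modMul(num, modInv(den)))) % p
    }
    return secret
}

let shares = makeShares(secret: 123_456_789, threshold: 30, count: 40)
print(reconstruct(Array(shares.prefix(30)))) // 123456789: threshold met
// Any 29 shares are information-theoretically useless.
```

Apple layers this under private set intersection so the server can't even tell which vouchers matched, but the share-counting intuition is the same.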
 
Yes, but their woke agenda trumps it. Now, CSAM isn't exactly woke, but it shows they're willing to play cops on your device if they think they have the moral high ground. CSAM or not, I'm going to upgrade / buy a new iPhone, but I'll never look at Apple the same way again, and Android is suddenly an alternative I never would have considered before. It's the equivalent of the cops showing up once a week at your house to check if you have something to hide. You're guilty until proven innocent.
 
This, basically. It's astonishing to me how many people are losing it over this as if our Apple/Google/Facebook overlords don't have 100 other more invasive ways of monitoring us. Reminds me of the COVID contact tracing API and people freaking out about being "tracked," as if voluntarily carrying a smartphone with you weren't the world's easiest way to let yourself be tracked.
Why not submit your own personal data voluntarily to Apple? Put all your data into a shared folder on iCloud and share it with Tim Cook. You got nothing to hide, no?
 
Yes, but their woke agenda trumps it. Now, CSAM isn't exactly woke, but it shows they're willing to play cops on your device if they think they have the moral high ground. CSAM or not, I'm going to upgrade / buy a new iPhone, but I'll never look at Apple the same way again, and Android is suddenly an alternative I never would have considered before. It's the equivalent of the cops showing up once a week at your house to check if you have something to hide. You're guilty until proven innocent.
The attitude of their own privacy head already showed what they think, by implying "well, don't do illegal things," which is highly offensive considering that what counts as illegal differs between countries. In some countries, homosexuality is illegal. The implication of an on-device mass scanning system is chilling. Of all tech companies, Apple should have known better.

A publicly traded company can change hands/management in a blink. What assurance do Apple users have that the next management team will keep the pinky promise? Look at Google, and how they dropped their own "don't be evil" motto.
 
Why not submit your own personal data voluntarily to Apple? Put all your data into a shared folder on iCloud and share it with Tim Cook. You got nothing to hide, no?
I'll happily post all my personal data here. Wait, no, just the additional personal data the CSAM system would reveal to Apple. Here it is...

0

A zero indicates a user has not uploaded 30 CSAM images to iCloud. I think anyone not even willing to share that bit (literally 1 bit) of information has a serious privacy fetish incompatible with the real world.
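
To make the "1 bit" point concrete, here is a minimal Swift sketch of the claimed information flow; the type and function names are mine for illustration, not Apple's API:

```swift
// Toy, non-cryptographic model of what the server learns per account.
enum ServerView {
    case nothing              // below threshold: vouchers stay cryptographically opaque
    case review(matches: Int) // threshold crossed: inner vouchers become decryptable
}

func serverView(matchingVouchers: Int, threshold: Int = 30) -> ServerView {
    matchingVouchers >= threshold ? .review(matches: matchingVouchers) : .nothing
}

// For the vast majority of accounts the answer collapses to one bit: "nothing".
print(serverView(matchingVouchers: 0))   // nothing
print(serverView(matchingVouchers: 29))  // nothing
print(serverView(matchingVouchers: 31))  // review(matches: 31)
```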
 
I'll happily post all my personal data here. Wait, no, just the additional personal data the CSAM system would reveal to Apple. Here it is...

0

A zero indicates a user has not uploaded 30 CSAM images to iCloud. I think anyone not even willing to share that bit (literally 1 bit) of information has a serious privacy fetish incompatible with the real world.
Again, people seem to be stuck on CSAM, ignoring the system itself being set up.
The CSAM is a red herring.
 
Yes, I do still believe that Apple is a privacy-oriented company. I understand the possible ramifications, and I believe moving from technological privacy protection (you can't access information because the design of the phone prevents it) to policy privacy (you can't access information because we say so) is a step backwards.

But Apple took two important steps to mitigate that risk:

(1) They left the process on the phone and not in the cloud. I've heard assertions that the reverse would be preferable: that if the process is on Apple's servers, then you can protect yourself by not putting content on their servers. This is a very good point. But if the process were on the servers, then that process would truly become a black box. It would certainly be far easier for Apple to use the feature in other ways without disclosure.

(2) They baked the comparison data (the CSAM hash database) right into iOS. This is a good thing. As far as I can tell from reading about it, there is no mechanism for your phone to get an updated list of hashes without an iOS update. That means there is no background process that could be co-opted to push a different list of photos onto the phone for comparison. It also means that watchdog groups and individuals can examine the list that ships in iOS and confirm that it is still only the CSAM list, and not other data.
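
As a sketch of what that watchdog audit could look like: since the database ships as a static artifact inside iOS, anyone who extracts it (say, from an IPSW) could digest their copy and compare notes with everyone else's. This is purely illustrative; the file path and reference digest below are placeholders I made up, not real Apple artifacts.

```swift
import Foundation
import CryptoKit

// Hypothetical audit: hash your extracted copy of the shipped database and
// compare it with the digest other auditors publish.
let databaseURL = URL(fileURLWithPath: "/path/to/extracted/CSAM.hashdb")
let publishedRootHash = "…"  // whatever digest Apple or auditors publish

do {
    let data = try Data(contentsOf: databaseURL)
    let digest = SHA256.hash(data: data)
        .map { String(format: "%02x", $0) }
        .joined()
    print(digest == publishedRootHash
          ? "Matches the published digest"
          : "Differs from the published digest: investigate")
} catch {
    print("Could not read database:", error)
}
```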

I still see what I believe to be a misunderstanding of the "scanning" that is being done on the phone via this process, but that could be a nomenclature disagreement. If one is worried about an iPhone evaluating every picture you take, determining if it is CSAM, and alerting the authorities, then that's not what is happening. But your phone is already scanning your photos this way, and has for years: that's what enables you to search for, say, "dog" and have Photos present you with a list of your photos with dogs in them. It enables Photos to offer suggestions of other pics with faces that you've already identified elsewhere. But this, too, remains on the device. It would be really convenient if I could identify a face on my Mac and have that face identification automatically transmitted to the phone (or vice versa), but it doesn't work that way.
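
That kind of local "scanning" (what powers the "dog" search) is ordinary on-device ML classification. A minimal sketch using Apple's Vision framework (the image path is a placeholder) shows roughly what Photos has been doing locally for years:

```swift
import Foundation
import Vision

// Classify one image entirely on-device; nothing here touches the network.
let handler = VNImageRequestHandler(url: URL(fileURLWithPath: "/path/to/photo.jpg"))
let request = VNClassifyImageRequest()

do {
    try handler.perform([request])
    if let results = request.results as? [VNClassificationObservation] {
        for observation in results where observation.confidence > 0.5 {
            // Prints labels such as "dog" with a confidence score.
            print(observation.identifier, observation.confidence)
        }
    }
} catch {
    print("Classification failed:", error)
}
```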

I feel that with this proposed feature, Apple has done its very best to keep it as privacy-preserving as possible, while still gaining some benefit for law enforcement that wasn't there before. I understand the point that "you don't get it, this isn't about CSAM, it's a bigger issue" and I appreciate that, but I do believe that the context does matter. In the same way, if Facebook announces some way in which they are going to improve your privacy, that alone doesn't make me feel like suddenly Facebook is a privacy-protecting platform.

(By the way, the argument "I don't care, because I'm not doing anything wrong. Apple can look at it all." is a bad argument that completely misses the point of what people are worried about.)

You might be cynical about why Apple is doing this. You might think it's a public ploy to undercut criticism that iPhones are the phones of pedophiles. Or you might think that Apple is doing this just for a marketing push: hey, look at us, we're against child porn! Or perhaps you take them at their word that they really ARE concerned, and really ARE sensitive to what law enforcement has said about it, and wants to help. No matter what the reason, I don't believe Apple has ceased to be a privacy protecting company. They have been public on this issue, and they have implemented it in a way where they can be monitored and called on the carpet if something seems amiss.

If you believe that they are lying to you, that their description of how this all works is a load of bull and they're tracking you anyway...well, if that's the case, then this isn't new and you wouldn't be swayed by any of these arguments and you shouldn't be.
 
Again, people seem to be stuck on CSAM, ignoring the system itself being set up.
The CSAM is a red herring.
No, I'm not stuck on that: I've read how it works, and I've thought about how it could be expanded to check for other things. But when you look at how it works, it's really guarded against abuse, such that those abuses would be far easier performed entirely outside the CSAM system rather than trying to piggyback on it.

For example, if Apple were coerced into reporting every user with the Chinese Tank Man photo, they couldn't do it with the CSAM system without re-engineering the entire thing and rebuilding every user's safety vouchers, because the algorithm can't possibly detect one single image. If they were so coerced, they would simply ask the Photos app to do it, because the Photos app (unlike the CSAM system) does 'scan' all of your library in the open, does include stuff you don't send to iCloud, and does use machine learning to interpret the contents. The only thing the Photos app doesn't do right now is send its results to iCloud. But transmitting that Tank Man hit, which is just a single flag, would be pretty simple for Apple to do without you ever knowing. The surveillance framework has been there for years, and Apple hasn't been caught using it, and has even taken steps to prevent themselves from using it.
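
A toy Swift sketch of that point, using made-up integer stand-ins for perceptual hashes: even if a single target image were slipped into the shipped database, the user who holds that photo produces one matching voucher, and one match can never reach the 30-match threshold that unlocks anything server-side:

```swift
import Foundation

// All hashes below are made-up UInt64 stand-ins for perceptual hashes.
let knownHashes: Set<UInt64> = [0xAAAA, 0xBBBB, 0xCCCC] // stand-in CSAM hashes
let tankManHash: UInt64 = 0xDEAD                         // hypothetical injected hash
let shippedDatabase = knownHashes.union([tankManHash])

let userLibrary: [UInt64] = [0x1111, 0xDEAD, 0x2222]     // user has the target photo
let matches = userLibrary.filter(shippedDatabase.contains).count

let threshold = 30
print(matches, matches >= threshold ? "account flagged" : "server learns nothing")
// Prints "1 server learns nothing": one coerced image is structurally invisible.
```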
 
I'm going to be charitable and assume Apple came up with their on-device CSAM scanner to take government pressure off of them when they finally let their users encrypt the data they store on iCloud without giving Apple a key.

However, this system has a fatal flaw: Their users' privacy will depend, not on the strength of an encryption algorithm, but on the strength of Apple's executives and board members. They will have to successfully resist all of the governments of the world, who will insist that as long as Apple already has the technology built and installed, that other hash databases should be installed (terrorist images, COVID misinformation, election disinformation, etc.), that the government should be messaged when a child receives something age inappropriate, that Apple should match hashes against photos never uploaded to iCloud, and so on. They will have court orders. They will pass laws. Is Apple willing to break the law for their customers? I don't think so. It's especially doubtful since one reason they're adding CSAM scanners now is that they're caving in to government pressure.
 
No, I'm not stuck on that: I've read how it works, and I've thought about how it could be expanded to check for other things. But when you look at how it works, it's really guarded against abuse, such that those abuses would be far easier performed entirely outside the CSAM system rather than trying to piggyback on it.

For example, if Apple were coerced into reporting every user with the Chinese Tank Man photo, they couldn't do it with the CSAM system without re-engineering the entire thing and rebuilding every user's safety vouchers, because the algorithm can't possibly detect one single image. If they were so coerced, they would simply ask the Photos app to do it, because the Photos app (unlike the CSAM system) does 'scan' all of your library in the open, does include stuff you don't send to iCloud, and does use machine learning to interpret the contents. The only thing the Photos app doesn't do right now is send its results to iCloud. But transmitting that Tank Man hit, which is just a single flag, would be pretty simple for Apple to do without you ever knowing. The surveillance framework has been there for years, and Apple hasn't been caught using it, and has even taken steps to prevent themselves from using it.
This system is set up literally to absolve Apple of any wrongdoing, through plausible deniability, when there are abuses in the future.
If Apple did it straight through the Photos app, per your example, it would obviously be noticeable.

You really have to ask why there is a need to put such a mass scanning system on every iPhone in the world, when experts in the field have already indicated that it won't actually do much in terms of catching abusers/predators.
 
I'm going to be charitable and assume Apple came up with their on-device CSAM scanner to take government pressure off of them when they finally let their users encrypt the data they store on iCloud without giving Apple a key.

However, this system has a fatal flaw: Their users' privacy will depend, not on the strength of an encryption algorithm, but on the strength of Apple's executives and board members. They will have to successfully resist all of the governments of the world, who will insist that as long as Apple already has the technology built and installed, that other hash databases should be installed (terrorist images, COVID misinformation, election disinformation, etc.), that the government should be messaged when a child receives something age inappropriate, and so on. They will have court orders. They will pass laws. Is Apple willing to break the law for their customers? I don't think so. It's especially doubtful since one reason they're adding CSAM scanners now is that they're caving in to government pressure.
The current Apple has already allowed things such as the local Chinese servers for iCloud and the pre-installed apps in Russia. Then there's the third-party payment requirement in South Korea. And Apple itself is a publicly traded company, where the entire management can be changed in a flash by the board of directors if their decisions don't maximize shareholder value. Will Apple oppose anything the Chinese government asks, at the risk of exiting the Chinese market? Probably not. Considering all that, Apple's willingness to cross the line by baking such a system into iOS itself, and then telling people to just not do illegal stuff, is mind-boggling.
 