I love the arguments that if you are against Apple on this you must be uneducated, a conspiracy theorist, someone who hasn't read Apple's document, or just stupid. Stop belittling people. If you think this is the greatest thing since sliced bread, then great, upgrade to iOS 15 and enjoy. I for one have lost all trust in Apple and won't upgrade. I'm even in the process of removing more and more tech from my life. Call me crazy if you want (see above about the belittling), but it's not the 2008-10 tech boom anymore. Tech has become more interested in money and power than in making life better. My eyes are wide open to life outside of big tech and staring at my phone all day.
I agree on this – if Apple proceeds I am done with iOS 15 and thereby with new Apple devices. I will switch to GrapheneOS (when my 12 Pro on iOS 14 needs to be replaced), after buying my first iPhone in 2007 and a new one every year since then ...
 
If governments could compel Apple to change these features to do something awful, they could compel Apple to change any features to do something awful. And there are plenty of features that know far more about our data than a double-blind hash matching algorithm, and they aren't under this intense level of suspicion and analysis.
Name one "awful" thing any government has ever compelled Apple to do to their customers via their products and/or services.

Don't worry. I'll wait.
 
Privacy also does NOT mean giving a private company the ability to invade it.

Again, this seems like another poor effort to hide the fact that it is surveillance.

If people deserve their privacy, then you can't contradict that in the next breath.

There is such a thing as due process, and law enforcement is bound to abide by that process. Yes, sometimes it's a pain in the butt, but it is necessary.

Law enforcement agencies have no problem obtaining permission through the courts if there are perceived to be reasonable grounds for suspicion. But private companies, who ironically guard their own privacy much more than most, and who send out threatening letters to people who even comment about potential new equipment, can hardly be arbiters of what THEY check and on whom.

Now if governments change the law and Apple is forced to engage in these Machiavellian attempts to install a backdoor, then Apple should say so. At least then the electorate (in countries that allow voting) would be able to withhold a mandate, or vote out the governments they had given a mandate to govern.

Allow this in the name of fighting child abuse/child pornography, with software eventually installed on every Apple device, and at some stage you may not even have an election that means anything. As sure as eggs is eggs, the one thing politicians love is power. If the potential exists to do what Apple suggests, then the potential exists to extend it, and history shows that when these things evolve they often end up as tools of power for politicians and extreme governments, rather than serving the crime-fighting excuse made at the outset.

Amazing Iceman, from your post it would appear you may be happier living in China or under a regime that dictates everything. Why not have mandatory chips placed in each of us as well, since free will is obviously not something you particularly like, even though it is fundamental to freedom?

Presumably you like to choose what you wear, what car you drive, what TV channels you watch, where to go on vacation, etc. Guard those freedoms well, as history shows what happens when surveillance is extended, as it inevitably is, even if it starts out in the name of something as emotive as child protection.

For the record I've had both vaccines, still wear a mask to shops, because it is my choice based on the information I have available.

It's my choice whether to turn on iCloud on my hardware, and my choice with most of the features on Apple iPhones, iPads, iMacs and other devices, where settings and other controls let you choose, ironically, even privacy settings. But I have no choice when someone else puts software on my system and usurps whatever choice I ever had. Whether or not it starts off in the name of child safety, it is a slippery slope taking away choice.

We all talk about freedom and cherish it. Presumably you cherish your freedom too, so look after it, because state-sponsored mass surveillance without reasonable cause, or a company acting as a police force, erodes your freedom, and as some have found, it may finally remove whatever freedom you once had.

In an age of behemoth IT companies that wield more power than some governments, are unelected, and apparently even decide what tax to pay and where, it is ironic that these organisations guard THEIR freedom and THEIR privacy very well indeed.
I remember when people used to make a big deal about Microsoft as the evil giant tech company taking control of our lives. Several movies were made about this too. But nothing happened. It was all fluff promoted by conspiracy theorists to keep people entertained, talking about it and worrying for nothing.

Now the evil tech people are Google, Amazon and Apple. Microsoft doesn't even appear in the list anymore.
So no matter what these companies do, people will always find a bad motive behind it.
Let's talk again in 5 years, and laugh about how silly our comments were and how nothing bad really happened.
 
But the same can be said for server-side scans; they could be amended to scan whatever they wish.
Either way you are still at their mercy. So why are we arguing?
At least client-side you can decide not to update your iOS, especially as Apple is providing the opportunity to stay on iOS 14 if you desire.
I'm fine with it. I know other people have problems with server-side scanning, but I'm not one of them. Client-side, no way, no how, *way* too easy for Apple to change something (like turning on iCloud Photos in an update, or just going against what they say, putting in something other than the CSAM DB, and scanning whether I have sync on or not).

By putting that scanner and DB on my phone, I lost all trust in what they say.
 
I'm fine with it. I know other people have problems with server-side scanning, but I'm not one of them. Client-side, no way, no how, *way* too easy for Apple to change something (like turning on iCloud Photos in an update, or just going against what they say, putting in something other than the CSAM DB, and scanning whether I have sync on or not).

By putting that scanner and DB on my phone, I lost all trust in what they say.
But you trusted Apple before with a closed-source, proprietary operating system that controls your microphone, camera, precise location, and access to your communications, health, and financial records?

The truth is you either trust the hardware manufacturer or you don't. I hear that you've lost trust now, but I'm not quite sure what makes this child abuse identification process any more intrusive than the tools Apple already has, and which, presumably, you previously trusted them not to misuse.
 
It's funny watching all the suddenly privacy-focused people Google (😂) the best privacy-focused OSes across mobile and desktop and then name-drop them like they know what they're talking about.

Ooooh, I'm gonna install GrapheneOS! Are you? Because it's not like you think it is. Linux! Yes!! I hope you're planning ahead with the apps you use. By the way, the only way to ensure these distros are doing what they say is to study the code, and better, build them yourself. Probably one person here, at most, has the knowledge to do that, and I doubt they'll bother. Have you heard about the LineageOS prankster devs that insert April Fools "Easter eggs" into your phone as a joke? Sounds cool, hey? But it must be more private because… not Apple.

Privacy is a complicated thing; few of you realise how much. The beauty of Apple is that it offers a strong layer of privacy for the masses. That's most of you here, even if you think otherwise.

The fact of the matter is that not one post here, and very few posts in the other threads, have actually come up with anything factually correct about Apple's implementation. Scarier still, most are pushing the benefits and virtues of server-side over client-side! It shows a shocking lack of knowledge plastered around as if it were obvious fact.

Go do your homework! You'd be surprised what you'd learn.
 
But you trusted Apple before with a closed-source, proprietary operating system that controls your microphone, camera, precise location, and access to your communications, health, and financial records?
Yep, I did. Live and learn.

The truth is you either trust the hardware manufacturer or you don't. I hear that you've lost trust now, but I'm not quite sure what makes this child abuse identification process any more intrusive than the tools Apple already has, and which, presumably, you previously trusted them not to misuse.
Eh, no. Trust can be earned, or trust can be thrown away in an instant. It's not some concrete number that never changes. Why? Because they are including it on my device; nobody else does that (yet), and for good reason.
 
But it’s
Eh, no. Trust can be earned, or trust can be thrown away in an instant. It's not some concrete number that never changes. Why? Because they are including it on my device; nobody else does that (yet), and for good reason.

More verifiable = on device = more private.
 
And what nonsense have you provided? Just numbers of reports. Doing client-side scans helps the integrity of encryption for non-CSAM images…
Looks like I have to spell it out for you: Apple doesn't scan all photos server-side.
 
I'm confused why there's a belief that these measures break "end-to-end encryption". How so when all content remains encrypted along the entire path, from end to end?
 
Regardless of the outcome, I will never feel the same about security with Apple products as I have previously. They can do anything they want in the cloud, but not on my phone.

That makes no sense. This feature _only_ works when server-side uploading is enabled. Without iCloud Photos turned on, this entire feature is disabled on the phone. The phone side and the server side work in tandem; they are not separate pieces. The server-side tech only responds to the phone-side tech.

Besides, you already expect processing of your photos for various reasons: lighting, effects, enhancement. This is just one more fully-automated, private link in that processing chain. Nobody sees your photos.
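
To put the "work in tandem" point another way, here is a minimal sketch of the gating as it has been described. The type and method names are my own invention, not Apple's actual code; the point is simply that the match step only exists inside the iCloud Photos upload path, so with uploading off there is nothing for it to run against.

```swift
import Foundation

// Illustrative only, not Apple's real API. The idea: matching is one step of
// preparing an iCloud Photos upload, so no upload means no match and no voucher.
struct UploadPipeline {
    let iCloudPhotosEnabled: Bool

    func prepareForUpload(_ photo: Data) -> Data? {
        // iCloud Photos off: nothing is uploaded and nothing is matched.
        guard iCloudPhotosEnabled else { return nil }

        // Matching happens only here; the result travels with the photo as an
        // encrypted safety voucher that the phone itself cannot read.
        let voucher = matchAgainstBlindedHashes(photo)
        return attach(voucher: voucher, to: photo)
    }

    // Placeholders standing in for the undisclosed cryptographic primitives.
    private func matchAgainstBlindedHashes(_ photo: Data) -> Data { Data() }
    private func attach(voucher: Data, to photo: Data) -> Data { photo }
}
```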
 
In the long run we’re all dead.

I make informed predictions about the near future based on the track record of the actors involved and the information at hand, not on generic slippery-slope arguments or hot takes full of misplaced buzzwords like "backdoor".
It's not like Apple gave in and put iCloud servers in China, in a government-run data center, for Chinese citizens… and even if they did, they would at a minimum keep the encryption keys safe… right? They don't have a history of giving in to requests like this. 🤷‍♂️
 
I think there is still clarification and analysis to be done on this matter.

- As a parent, I'm very aware of the need of protecting children
- Privacy is one of Apple's main selling points; they wouldn't launch something like this if it really compromised privacy

I'd love for 100% clarity on this. Let's wait.

Apple has explained how the system works very clearly, and made very vocal promises to not let the system get compromised or be abused by governments. All of the uproar is about what _could_ happen (aka. speculation), not what Apple has created.

1) The scanning / comparison of photos on your phone is 100% private. Nobody sees your photos.
2) Scanning only happens IF you have enabled iCloud Photos.
3) Only photos that "match" have a voucher attached. Nobody has seen your photos yet.
4) If you upload more than 30 violating photos, only then can the server-side system decrypt the matching vouchers, and only then is a *derivative* (low-resolution) version of the photo revealed, not the original. The original remains fully encrypted.

We live in a world of "innocent until proven guilty", so Apple is innocent of any wrongdoing until there's proof that their system has been used outside of its primary purpose.

I choose to believe that Apple is not doing anything sinister here. It's a more advanced and more private means of detecting "bad" images. That is good for us, not bad.
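
If it helps to see point 4 as code, here is a rough sketch of the threshold step. All the names here are mine, and the real design uses threshold secret sharing rather than a simple counter; this only models the visible effect: below the threshold the server can reveal nothing, and above it only low-resolution derivatives become available for human review.

```swift
import Foundation

// Illustrative sketch of the threshold behaviour, not Apple's implementation.
struct SafetyVoucher {
    let derivative: Data   // low-res derivative, never the original photo
}

struct ReviewQueue {
    let threshold = 30
    private var vouchers: [SafetyVoucher] = []

    // Returns the derivatives that become reviewable once this voucher arrives.
    mutating func receive(_ voucher: SafetyVoucher) -> [Data] {
        vouchers.append(voucher)
        // Below the threshold, the server-side system can reveal nothing.
        guard vouchers.count > threshold else { return [] }
        // Above it, only the derivatives are exposed for human review.
        return vouchers.map { $0.derivative }
    }
}
```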
 
Exactly what Reuters rightfully points out. Even if Apple's intentions are 100% good, this system does create a backdoor that opens the possibility that, under the law of any given country, Apple could be forced by court order to look for images of protestors, or political symbols, to filter out political protestors for purposes that are not good.

I'm surprised to see Apple doing this because they seem to be the front runners of this whole privacy mantra. It contradicts everything Apple stands for.

I also find it hard to believe that Apple would pull all of their iPhones out of China if the Chinese government ordered Apple to search for the kinds of things mentioned above.
Apple balances this sort of thing out with technically-enforced policy. For instance, the plan is not to have a “China CSAM list” - it is a global list, with multiple nations needing to agree on what to add.

Apple balances out the worst of the requests by saying: if we give the US the ability to decrypt any iPhone in the world, we will be forced to give that to China and Russia too. If we change the scanning mechanism to go beyond iCloud Photos uploads, the world will know quickly, because we publish unsigned builds that get reverse-engineered for changes.
 
It's not like Apple gave in and put iCloud servers in China, in a government-run data center, for Chinese citizens… and even if they did, they would at a minimum keep the encryption keys safe… right? They don't have a history of giving in to requests like this. 🤷‍♂️
In which case, why is this an issue wrt. China? China supposedly already has the capacity to scan all uploaded iCloud Photos images to individually identify citizens who engage in undesired activity.
 
Apple has explained how the system works very clearly, and made very vocal promises to not let the system get compromised or be abused by governments. All of the uproar is about what _could_ happen (aka. speculation), not what Apple has created.

1) The scanning / comparison of photos on your phone is 100% private. Nobody sees your photos.
2) Scanning only happens IF you have enabled iCloud Photos.
3) Only photos that "match" have a voucher attached. Nobody has seen your photos yet.
4) If you upload more than 30 violating photos, only then can the server-side system decrypt the matching vouchers, and only then is a *derivative* (low-resolution) version of the photo revealed, not the original. The original remains fully encrypted.

We live in a world of "innocent until proven guilty", so Apple is innocent of any wrongdoing until there's proof that their system has been used outside of its primary purpose.

I choose to believe that Apple is not doing anything sinister here. It's a more advanced and more private means of detecting "bad" images. That is good for us, not bad.
Innocent until proven guilty? Then don't do warrantless searches on my phone.
 
Looks like I have to spell it out for you: Apple doesn't scan all photos server-side.
Where does it say that in your "evidence"?

From the article: "Higher numbers of reports can be indicative of a variety of things including larger numbers of users on a platform".

Yes, it also mentions that efforts might be low, but that is what this new scanning is doing: making it more effective.

It does not say they aren't scanning all photos. It could just be that, in general, fewer abuse images are being uploaded to iCloud than to other services, or that people are on the free 5 GB tier and their iPhone backup uses all the iCloud storage, so a photo never gets scanned because it can't be uploaded.

The PDF is just raw numbers; there is no analysis in there for you to base a strong opinion on about what the article is conveying.

It's just numbers that include sex trafficking and other abuses which have no relation to images. For example, Facebook is a social network with groups to chat in, hence the high report counts. Apple doesn't have those kinds of services to generate higher counts.
 
In which case, why is this an issue wrt. China? China supposedly already has the capacity to scan all uploaded iPhoto images To individually identify citizens who engage in undesired activity.
My reply was to a post claiming Apple has an excellent track record and doesn't cave to government pressure, and implying such concerns were pure speculation. Whether this would make anything worse for citizens of China beyond what they are already going through is uncertain, but there are governments out there that might be satisfied with access to this feature if modified to their liking.
 