Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, worse-for-privacy cloud-side scanning is already taking place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are checked.

Perhaps the signatories should read the relevant technical documents and FAQs:

There’s no expectation of privacy when you are storing files on someone else’s servers. There is an expectation of privacy for files stored on your own device.
 
Thanks for your "rational" contribution, totally not deranged, and welcome to the MacRumors family, since you joined on Saturday.
You're welcome. My message is not for you; please ignore it entirely.
But your "joined on Saturday" argument is weak. Judging the weight of the information I share by that is clearly subjective and emotional.
 
Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, worse-for-privacy cloud-side scanning is already taking place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are checked.

Perhaps the signatories should read the relevant technical documents and FAQs:

Although Apple is trying to make this as privacy-sensitive and rationally sound as possible, it is a technocratic approach to a sensitive and emotional topic. Apple ignoring the emotional reactions will not help, and I feel the resistance will grow. I understand Apple's approach quite well, but I still have an emotional reaction myself to the idea of on-device scanning for 'illegal' content, no matter how good the safeguards against misuse... I fear the day I go to another country on holiday and have to answer awkward questions from customs about certain content on my iPhone. I know Apple says it has safeguards against this, but still, the concerns linger.
 
This proves my point. Thank you.

Of course it’s easier to be doomsday truthers, just to be on the safe side.

If my caution in jumping to conclusions and generic slippery slopes ends up being right, I don’t get a medal.

If your “some day, something could happen, something about this could change” ends up being right, you’ll be able to say “told ya so!”.

It’s a thankless side of the debate but somebody gotta do it.
 
If CSAM is on Apple's servers and someone downloads it, Apple would be guilty of distributing the CSAM, which is an offence under UK law! Or maybe it would only be applicable in the country where the server is?
 
So do people think Apple has something to gain by implementing this, or that Apple is being forced to do it by some government agency? I'm genuinely puzzled as to why they would go to all this trouble if they didn't think it was the right thing to do.
I think the Siracusa segment on the latest ATP is a good listen. If you read all the docs, I think it's clear Apple does not want the ability to ever see a person's photos. Right now Apple holds the encryption keys to view any photos uploaded to iCloud, and it sounds like they don't want that ability.

Because Apple is so secretive, though, it's hard to know if this is one feature in a set of features that will overall make iCloud more secure.

One thing we do know is that the way this was announced was a complete failure by Apple PR.
 
Regardless of the outcome, I will never feel the same about security with Apple products as I have previously. They can do anything they want in the cloud, but not on my phone.
Couldn’t agree more. Trust takes a long time to earn. I did not buy an iPhone to be policed by Apple. The hypocrisy after their ‘Privacy. That’s iPhone.’ campaign is just unbelievable.
 
After a rational, Occam's-razor-style analysis: Apple is not being directly forced by anyone. They have been preparing this implementation since iOS 14.3 (based on recent findings), and this is clearly being done with Apple management knowing that some governments will apply pressure over the monopolistic nature of the closed Apple ecosystem; the politically convenient solution is to offer a "secure" way of monitoring user behavior and label it with a "for the common good" narrative.

The implementation is designed to give users the impression of "privacy" while at the same time removing Apple from the loop of any eventual legal responsibility. The tipping point is on-device scanning, where your own property reduces the cost of the process for Apple and helps normalize privacy intrusion and surveillance. This move creates a new business opportunity for Apple, made behind closed doors, and it sends a clear signal to third parties interested in user data.

Apple is effectively moving into data-broker territory, taking market share from Google and Facebook in the process.
The added value is "optimization" of Apple's control over its services.

Once users accept the technological solution and are convinced that this is "privacy", applying perceptual hashing and NeuralHash for other purposes is a no-brainer.
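To be clear about the jargon: a "perceptual hash" just means a hash designed to stay similar for visually similar images. Here is a minimal Swift sketch of the classic "average hash" idea, purely to illustrate the concept; NeuralHash itself is a learned neural-network embedding, not this, and the function names below are made up.

// Toy "average hash" over an 8x8 grayscale thumbnail (64 brightness values, 0...255).
// Illustrative only: real perceptual hashes like NeuralHash are far more sophisticated.
func averageHash(_ thumbnail: [UInt8]) -> UInt64 {
    precondition(thumbnail.count == 64, "expects an 8x8 grayscale thumbnail")
    let mean = thumbnail.reduce(0) { $0 + Int($1) } / thumbnail.count
    var hash: UInt64 = 0
    for (bit, pixel) in thumbnail.enumerated() where Int(pixel) >= mean {
        hash |= UInt64(1) << bit   // set the bit for every pixel at or above the mean
    }
    return hash
}

// Visually similar images yield nearby hashes, so matching uses a Hamming distance,
// not exact equality.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount
}

The matcher itself is generic; the list of hashes it compares against is just data that could be swapped for anything.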

So, to summarize: this is the defining moment for the New Apple. Dismissing the rationally thinking part of its user base as the "screeching voices of the minority", implementing a secure "backdoor" for multiple kinds of digital policing, using CSAM as a politically and publicly acceptable justification, entering the big data-broker market with a power move, and making sure no government will be compelled to break the company up for monopolistic practices.

I am done with this company, and I fully understand that the majority of normal users will accept this just because.
I am sharing this in the hope of helping the minority of loyal Apple users (like me, until this stupidity) understand that it's time to tame our collective addiction to "conveniences" and learn an important lesson: never trust your private data to a closed software/hardware company again.
It's not Apple's job to police the world, and it can actually make life harder for those who have to chase child abusers/pornographers.

As for whether Apple has been leaned on, it's quite possible: Apple has been leaned on heavily in the past, with the media even showing officials of various government agencies accusing Apple of aiding terrorists by not providing a backdoor.

Apple is facing fines in various countries over the App Store and other issues, is facing court battles with other companies, and it was only a few weeks ago that Apple criticised Facebook and others for actions that reduce privacy. Apple even made privacy a mainstay of its advertising, so yes, it is possible they have been put in this position.

Judgements could be made in Apple's favour; countries could offer deals on the App Store, which many are investigating with the prospect of multi-million fines. All sorts of pressure may have been brought to bear on Apple, and we know they've faced such pressure before, but with the App Store situations and certain court actions the leverage would be there.

The best way to dispel that concern is for Apple NOT to go ahead with what so many organisations, MPs, IT specialists, media outlets, Apple employees etc. suggest amounts to a backdoor, one where, once it is installed on every user's hardware, it is relatively easy to modify the tools.

It was just reported on the news that over 90 policy and rights groups are now urging Apple to abandon its plans on this matter, let alone MPs, IT specialists and media, so those here condescendingly making out that everyone on here and elsewhere misunderstands are fighting a losing battle, and Apple won't give you a discount!

The irony is that these posters still refer to scans and then try to soften it by saying they are only hashes, as if that isn't data!

We should perhaps remember that Apple has already bent to some governments:
 
My understanding is that SCANNING already occurs in your iDevice library. Otherwise, how would the AI know how to group all the photos with, say, a bridge, cars, waterfalls, boats, etc.?

Of course.
People are treating an additional step in the iCloud Photos uploading pipeline like it’s a brand new “backdoor” or something.
It was always a matter of trusting Apple (or other closed-source software vendors) that they weren't doing anything nefarious under the hood. Or not. But this newly added step doesn't particularly change the reasons to trust them or not.
The funniest people are the “Waiter, there’s some Apple in my iPhone! Remove it at once” crowd…like having un-audited Apple-programmed background processes in your Apple closed-source OS is anything new…
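For what it's worth, that kind of local analysis is even exposed to third-party developers through Apple's Vision framework. A rough sketch of fully on-device classification follows; it is illustrative only, the function name is mine, and the actual pipeline Photos uses for its grouping is Apple-internal.

import Foundation
import Vision

// Classify an image entirely on-device and return reasonably confident labels
// ("bridge", "waterfall", ...). Nothing here leaves the device.
func classifyLocally(imageURL: URL) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.5 }   // keep only fairly confident labels
        .map { $0.identifier }
}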
 
Love these free speech advocates. Where were they when Apple was burning books and banning podcasts they felt had too much misinformation or were "dangerous?"

These groups stand up for child molesters but Alex Jones is beyond the pale?

Ok.
 
Again? Wasn't this article already posted?

Anyway, it's misleading from the start: it's not a backdoor in any technical sense, worse-for-privacy cloud-side scanning is already taking place at other photo library providers, and "scan users' photo libraries" conveniently omits that only pictures being uploaded to the cloud service are checked.

Perhaps the signatories should read the relevant technical documents and FAQs:

It is a backdoor. It doesn't matter if you say "well, the key can only open certain locks" or "it only works in certain keyholes." That can all change without notice and without detection by users.

Apple kept its battery-throttling practice quiet until it got caught. You think any changes to this program would be made public? Okay. Sure.
 
We have to ignore what Apple does and blindly believe what Apple says. ;-)

Curious that people don't want Apple to "overreach" by policing CSAM 'cause "they're not the police," but they'd like them to export democracy to China (sometimes not even 20 years of war are enough for that), antagonize dictators, etc.
 
Clearly these organizations haven’t read all the documentation on how this program works.

However, Apple needs to take some blame for the heat given how poorly they announced/described this initially.

Imagine how many positions would become available if this botched launch had happened on Jobs' watch!?
 
I think it's very obvious that Apple is under pressure from somewhere else.
And it's very obvious that child safety is just an excuse and a cover-up.
It's completely against Apple's long-standing policy and a complete reversal.

Regardless, what makes this especially worrying and frustrating is the fact that Apple is going to do the search ON DEVICE, which is unprecedented.
That's a next-level invasion of privacy.

Other services (Google, Dropbox, etc.) only perform such checks on THEIR cloud servers, not on USERS' PROPERTY (a rough sketch of the difference is at the end of this post).

I will wait to see what happens, but if Apple still proceeds with such an arrogant, anti-privacy mass-surveillance plan, I will switch to Android for good.

I just can't tolerate or support a company that bullies its customers and lies so much about privacy, and I don't want my fully paid-for devices searched all the time.
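For the technically inclined, the distinction I mean is roughly the following. This is an illustrative Swift sketch only: the type and function names are made up, and SHA-256 stands in for the perceptual-hash and private-set-intersection machinery the real systems use.

import Foundation
import CryptoKit

// Google/Dropbox-style model: the raw photo is uploaded, and the provider hashes
// and matches it on its own servers.
func serverSideCheck(uploadedPhoto: Data, knownHashes: Set<Data>) -> Bool {
    let hash = Data(SHA256.hash(data: uploadedPhoto))   // computed by the provider
    return knownHashes.contains(hash)
}

// Apple's announced model: the hash is derived on the user's own hardware before the
// iCloud upload and travels as part of a "safety voucher"; the server can only act on
// matches once a threshold of hits is crossed.
struct SafetyVoucher {
    let photoHash: Data         // derived on-device
    let encryptedPayload: Data  // opaque to the server below the match threshold
}

func makeVoucherOnDevice(photo: Data) -> SafetyVoucher {
    let hash = Data(SHA256.hash(data: photo))            // computed on the user's device
    return SafetyVoucher(photoHash: hash, encryptedPayload: photo) // placeholder payload
}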
 