The number of fanboys here with pseudo-technical understanding is staggering. Someone reads technical documentation without enough knowledge and makes "big assumptions" that Apple is his daddy and will never abuse "the good kids".

I can help you to overcome your cognitive tech bias. Watch this and listen.
 
Regardless of the outcome, I will never feel the same about security with Apple products as I did before. They can do anything they want in the cloud, but not on my phone.

Except we’re talking about pictures you have already surrendered to iCloud, so the distinction is more psychological than anything. People will get over this technicality (an additional step in the iCloud Photos upload pipeline that lets Apple scan for CSAM more efficiently without routinely decrypting photos on-server) once the dust settles, hopefully.
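For anyone unclear on what "an additional step in the upload pipeline" means, here is a deliberately simplified Python sketch of the general idea, not Apple's actual NeuralHash/PSI protocol; the hash function, the record layout, and the `encrypt` callback are all invented for illustration. The device computes a perceptual hash, the photo travels encrypted, and the server matches hashes without ever decrypting the photo:

```python
def toy_perceptual_hash(pixels):
    """Toy stand-in for a perceptual hash such as NeuralHash:
    one bit per pixel, set when the pixel is >= the image mean,
    so near-identical images map to the same string."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p >= mean else "0" for p in pixels)

def build_upload_record(pixels, encrypt):
    # The photo itself is encrypted on-device; only the hash is
    # matchable once the record reaches the server.
    return {"blob": encrypt(pixels), "hash": toy_perceptual_hash(pixels)}

def server_check(record, blocklist):
    # The server compares hashes without touching record["blob"].
    return record["hash"] in blocklist
```

In the real system the device never learns the blocklist and the server learns nothing until a threshold of matches is crossed; that is what the private set intersection layer adds on top of this naive sketch.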
 
Organizations that actually deal with this problem (as opposed to armchair CSAM experts) seem to think otherwise.
Let’s not forget the “screeching voices” quote, while quoted by Apple in their internal memo, comes from NCMEC‘s mouth.
We'll see. For pedos, the logical route is to create new, unknown child porn pictures, simple as that.
Tim fully endorsed the “screeching voices” quote by forwarding it to employees without comment, and to this day he hasn't distanced himself or Apple, even partly, from that customer-attacking quote.
 
The previous poster implied that people would not sign such protest letters if they had read Apple's documentation. However, people have read and understood* the documents and still protest. Experts alone protesting would not prove much, but the objections are easy to understand, and the documents do not invalidate them.

Claiming that protestors just lack understanding plays into Apple's people-are-confused ruse. People know and understand what Apple intends to do; that is why they protest. That is also why adding more documentation without addressing the core objections comes off as a very blatant diversionary tactic: people do not care whether the gun operator has good references or whether the safety only fails 1 out of a trillion times - they simply do not want that gun pointed at them.

*As far as the documents allow - Apple glosses over many critical aspects of the system. The system is objectionable under any interpretation, though. More details might make it even worse.

Are on-server dragnet search techniques used by other companies subjected to this same level of public scrutiny and presumption-of-incompetence/slipperiness?
 
Why though? You should be critical about what's happening both client-side and server-side. I can't understand that.
I am, but I'm trying to see their point, and what they want to accomplish has merit. On the server side, at least I can control what they get: for example, my taxes, financials, and really sensitive personal data only exist in an encrypted container on Dropbox and the server side. The rest, I'm okay if they want to access it for a good cause... in the cloud. If I'm really concerned, I can turn off iCloud, but I don't like the loss of convenience.

But I see your point; it's a good one as it pertains to what comes next with oversight of the cloud. I think we will never be able to truly trust what has left our client devices, though.
 
Understood! I guess we have to get laws that are privacy-oriented. Let's vote, folks!
 
So do people think Apple has something to gain by implementing this, or do they think Apple is being forced to do this by some government agency? Because I'm genuinely puzzled why they would go to all this trouble if they didn't think it was the right thing to do.

In their original plans, they probably stood to gain the right to say they can scan for CSAM effectively (hence improving their horrendous track record of harboring CSAM up until last year, with only 265 photos reported in 2020) without routinely messing with users’ photos once they are already on servers, i.e. decrypting them the way other companies routinely do.

Then a lot of miscommunication got in the way: classless arch-enemies like Epic and WhatsApp/FB jumped at Apple’s neck, that guy protected by Putin posted some dog memes and dramatic takes, the Apple-bashing international of course wanted to claim a piece of the drama, and some tech news outlets covered it daily like a humanitarian crisis because of the slow news summer.

And, among the noise, some valid concerns, questioning, calls for auditing, etc. that Apple is gradually responding to.
 
Again? Wasn't this article already posted?

Anyway, it's pretty much misleading from the start, since it's not a backdoor in any technical sense, worse-for-privacy cloud scanning is already taking place at other photo library providers, and "scan users' photo libraries" conveniently forgets to mention that only pictures being uploaded to the cloud service are scanned.

Perhaps the signatories should read the relevant technical documents and FAQs.

Again? The issue is "scanning on devices" vs "scanning on Apple servers".
 
WeChat was ordered by the government to delete, and prevent the sending of, pictures showing Winnie the Pooh and their leader side by side (as they are both chubby); a section of the government clearly thought the image was not appropriate, so it is banned. Why is Apple any different from WeChat? Apple should also add Winnie the Pooh to the list of banned images on Chinese iPhones, and if 30 pictures are located, report them to the government. It is the same argument for Winnie the Pooh as for child pictures: governments decide what is not appropriate, so Apple should report anyone with 30 images that any government deems inappropriate, right? I just do not know how Apple can decide what is appropriate and what is not; only local governments can. So why would Apple build this if it was not forced to? You're basically opening the door for every government to make a list they want Apple to report back on. I don't know if that is good or bad (governments would not ask if they did not think it would help run their countries more smoothly), but it's a big change to privacy.
 
Again? The issue is "scanning on devices" vs "scanning on Apple servers".

The results of the on-device scanning can’t be read until the pics are uploaded to Apple’s servers.

So it’s actually on-server verification of security vouchers. What happens on-device, let’s call it the “first half of the scan”, is inaccessible to humans, computers and servers alike. It’s like it doesn’t even exist until the photos are uploaded anyway.
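The "inaccessible until uploaded and over the threshold" property comes from threshold cryptography. Here is a minimal sketch of the underlying principle, Shamir secret sharing. This illustrates the threshold idea only; it is not Apple's actual voucher construction (which layers private set intersection on top), and the prime, share counts, and function names are invented:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic below is in the field mod P

def make_shares(secret, threshold, n, seed=0):
    """Split `secret` into n shares; any `threshold` of them reconstruct it."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(threshold - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # modular inverse
    return secret
```

In the Apple scheme, each matching photo effectively contributes one share of a per-account decryption key: below the threshold the server holds shares of a key it cannot compute, so the voucher contents stay unreadable.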
 
The number of fanboys here with pseudo-technical understanding is staggering. Someone reads technical documentation without enough knowledge and makes "big assumptions" that Apple is his daddy and will never abuse "the good kids".

I can help you to overcome your cognitive tech bias. Watch this and listen.
That is exactly the problem I find as well.
Since it's all on-device, it's just a matter of time before someone reverse-engineers the network used for the hashing and can then mount adversarial attacks that make any image get picked up as a CSAM match.

Then they upload a few attacked images of cute dogs or fancy dresses to Reddit or other content-sharing sites, and people who download them get in trouble.

And if Facebook and Google were dealing with 100k+ cases per year, I find it hard to believe that Apple is going to have human review of every single case.
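The worry about adversarial collisions can be illustrated with a toy perceptual hash (a stand-in for illustration, not NeuralHash): any hash built so that visually similar images collide necessarily leaves room for an attacker who perturbs pixels on purpose. In this toy, a uniform brightness shift changes every pixel yet leaves the hash untouched:

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set when >= the image mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p >= mean else "0" for p in pixels)

original = [10, 200, 30, 220, 15, 210, 25, 205]
perturbed = [p + 4 for p in original]  # every pixel value is different...

# ...but the mean shifts by the same amount, so every bit is unchanged:
assert average_hash(original) == average_hash(perturbed)
```

Demonstrated attacks on the real NeuralHash work the other way around, nudging a harmless image with gradient descent until its hash matches a chosen target, but the root cause is the same: perceptual hashes are designed to be insensitive to small pixel changes.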
 

There are ways for Apple to counter this type of attack, and ultimately there’s human review, so I'm not sure what the attacker would accomplish, other than maybe forcing Apple to hire more reviewers to discard cute-dog false positives in a split second.
 
Yes. And the Taliban could turn out to be human rights champions this time. You don't know.

Except the Taliban have a terrible track record and Apple has (mostly in functioning democracies) an industry-leading track record of championing privacy?
 
I think the function within iMessage is a good thing, though it could be abused for other stuff later.
But the CSAM ******** is something that Steve Jobs would NEVER have allowed; he would have fired everyone for bringing up such a crap idea.
Now he can just turn in his grave and look down from his own iCloud at what these morons are doing…
(btw, former Apple employee here, so I know how these idiots think)
 
Integrated Intel graphics chip steals power from the CPU and siphons off memory from system-level RAM. You'd have to buy an extra card to get the graphics performance of Mac Mini [..] - Apple.com
Apple.com is probably not the best source for comparing their own chips with their rivals.
 
I think the function within iMessage is a good thing, though it could be abused for other stuff later.
But the CSAM ******** is something that Steve Jobs would NEVER have allowed; he would have fired everyone for bringing up such a crap idea.
Now he can just turn in his grave and look down from his own iCloud at what these morons are doing…
(btw, former Apple employee here, so I know how these idiots think)
SJ would never have allowed pre-scanning for CSAM in a cryptographically secure, anonymous-until-guilty way that makes competitors look gross because they’re actually mass-sifting through your pics unencrypted once they’re already on-server?

He would have convinced the world it's the best thing since sliced bread, and would have defended it with rage against specious attacks (especially from Epic and Facebook).
 