Apple is demonstrating a proof of concept of a technology that invites very real mission creep and a likely slippery slope.

It’s exactly the golden key that Tim Cook railed against a few years ago as a digital cancer, because extending it to anything on your devices is just an Apple policy change away, or a coercive government push away (overtly via legislation, or covertly via market-access threats or endless court and PR battles).

I don’t know how we got from Tim and Apple vehemently defending against things like this to doing it without a legislative mandate. But I fear this is appeasement of some kind of covert coercion by government officials or politicians in the US or China.

The US Constitution prohibits every government under its authority from conducting extrajudicial searches, for very good and non-academic reasons.

Apple acting as an arm’s-length extrajudicial searcher for any government sets a terrible precedent.

The slippery-slope analogy and its warning exist for a reason and fully apply here.

Doing the wrong thing for the right reason* is still doing the wrong thing.

The only reasonable solution is for Apple to scrap this initiative, double down on encryption, apologize to its stakeholders, and never do anything like it again.

*If Apple were coerced into doing this as appeasement by politicians, officials, or individuals, using threats to market access or of antitrust enforcement, Apple needs to issue a public statement outing the perpetrators.

If Apple implements this or anything like it, it will, sooner or later if not immediately, become an essential arm of Big Brother.

I’ll say it again: the only reasonable solution is for Apple to scrap this initiative and never do anything like it again. (Also, it’s well past due for Apple to encrypt everything going into iCloud and throw away the keys.)

To paraphrase the ultimate exchange between Kirk and the experimental computer M-5:
Kirk: “What must you do to atone for your sins?”
M-5: “This unit* must die.” <shuts down>
*Here, the project.
 
I have run a software company for more than 20 years, and I understand the tech perfectly well. Countless experts have already published explanations and criticism of this backdoor; go outside the echo chamber and do your research, I will not do it for you. Repeating Apple's PR mantra of "you're holding it wrong, understand the tech" makes you look stupid and uneducated. If you want to present a technical argument, please do. But at this point even Apple has understood that the "tech" is easy to fool with adversarial networks; that is the reason for delaying it. And obviously the iPhone 13 is coming out soon, so they need a PR move.
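To make the adversarial point concrete, here is a toy sketch of why perceptual hashes are malleable. It uses a simple average hash with made-up pixel values, not Apple's NeuralHash; adversarial networks exploit exactly this kind of soft decision boundary, just with gradients and at scale.

```python
# Toy average-hash: one bit per pixel, set when the pixel is brighter than
# the patch mean. A stand-in for a perceptual hash, NOT NeuralHash.

def average_hash(pixels):
    avg = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

patch = [10, 200, 30, 180, 90, 91, 120, 60] * 8  # 64 fake grayscale values
tweaked = list(patch)
tweaked[4] += 10  # a barely visible brightness nudge on one pixel

h1, h2 = average_hash(patch), average_hash(tweaked)
print(f"bits flipped by one small pixel change: {bin(h1 ^ h2).count('1')}")
```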

I’ve been a dev for 23 years and I disagree with your assessment. Running a software company isn’t the same as having the tech knowledge.
 
Looks like you know nothing about what people are discussing here. They are discussing scanning your local photo album and eventually uploading your private photos to Apple for human review.
No. They are matching a hash of your image against stored hashes of known child-abuse images. The scan only takes place if your image is going to be uploaded to the cloud anyway.
And only after a threshold of successful hash matches (rumored to be around 30) would those matched photos be sent for review. None of your other photos would be manually reviewed.
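To make the matching-plus-threshold logic concrete, here is a minimal sketch. The hash list, distance cutoff, and 30-match threshold are stand-ins; Apple's actual design uses NeuralHash against a blinded database with private set intersection, which this does not attempt to reproduce.

```python
# Minimal sketch of match-then-threshold: an image counts as a "match" only if
# its perceptual hash is near a known hash, and nothing is surfaced for human
# review until the match count crosses the threshold.

KNOWN_HASHES = {0b1011011011000011, 0b0001111100001010}  # stand-in hash list
MATCH_THRESHOLD = 30  # the rumored review threshold
MAX_DISTANCE = 2      # Hamming-distance tolerance (made up)

def hamming(a, b):
    return bin(a ^ b).count("1")

def is_match(image_hash):
    return any(hamming(image_hash, h) <= MAX_DISTANCE for h in KNOWN_HASHES)

def flag_for_review(image_hashes):
    """Return matched hashes only once the threshold is met, else nothing."""
    matches = [h for h in image_hashes if is_match(h)]
    return matches if len(matches) >= MATCH_THRESHOLD else []
```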

Most cloud services already scan in the cloud, so how is this any different, given the photos are going to the cloud anyway?
This method improves encryption for non-abuse images, as Apple can throw away the key and law enforcement can’t view them at all. Currently they can…
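A minimal sketch of how “throw away the key” can coexist with a review threshold, assuming a Shamir-style t-of-n secret sharing (a textbook toy, not Apple's actual safety-voucher protocol): one key share is revealed per hash match, and the key only becomes reconstructible once the threshold is crossed.

```python
import random

PRIME = 2**127 - 1  # prime field large enough for a toy key

def make_shares(secret, threshold, n):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x=0 recovers the constant term (the secret).
    total = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key, t = 123456789, 30
shares = make_shares(key, threshold=t, n=1000)  # one share per potential match
print(reconstruct(shares[:t]) == key)           # True: 30 matches unlock the key
print(reconstruct(shares[:t - 1]) == key)       # almost surely False below threshold
```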
 
I’ve been a dev for 23 years and I disagree with your assessment. Running a software company isn’t the same as having the tech knowledge.
I have also been a programmer for more than 28 years. You can be a developer and still lack systems-design thinking and skills. That is one of the reasons why I pay salaries to devs and not the other way around. The design flaw in Apple's approach is that they are introducing on-device processing with third-party, non-publicly-auditable hashes. Period. The industry has used server-side processing with PhotoDNA for years, and that is enough.
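For contrast, a minimal sketch of the server-side model described above: the provider scans at upload time on its own infrastructure, and the device is never touched. The hash function here is a SHA-256 stand-in, since real PhotoDNA is a proprietary Microsoft perceptual hash, and a real pipeline would report via NCMEC rather than just refusing the upload.

```python
import hashlib

KNOWN_DB = set()  # provider-held hash database (stand-in for a PhotoDNA list)

def image_hash(image_bytes):
    # Stand-in only: PhotoDNA is a perceptual hash, not a cryptographic one.
    return hashlib.sha256(image_bytes).hexdigest()

def handle_upload(image_bytes):
    """Server-side scan at upload: the check never runs on the user's device."""
    if image_hash(image_bytes) in KNOWN_DB:
        return False  # flag and report instead of storing
    return True       # store as usual
```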
 
This debate will get nowhere because it is clear there are those who will vehemently defend their right to privacy over anything else, in this case the protection of children.

The thing is, don't children deserve our protection? Is it not up to us, as adults, to find ways to protect children from harm? It's a damning question, but given the way people are defending their right to privacy, they need to ask themselves: is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they reply 'Yes' to that question.
 
Literally in the article you responded to:

”for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

I must have missed the explanations for this theory. How will it have “disastrous consequences for many children”? And how will it “censor free speech”?

I would love to hear their explanations for these points.
 
You honestly believe private companies should start monitoring their users? You want an Apple Police, Microsoft Police, Google Police, etc. gathering info for some other private organisation to use for its own purposes? Shouldn’t we leave hunting down criminals to governments and law-enforcement agencies, not to shady groups that are not governed by the same laws as law enforcement?

Private companies are already required BY LAW to monitor their users.
 
Personally, I think the damage to trust has already been done by even proposing something like this. How can Apple be trusted now not to implement this behind closed doors without anyone knowing? The code is closed source, so there is no way to verify. The only way we would find out is if a whistleblower on the inside leaked information.
 
“The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

I understand the other hypothetical risks listed, but why or how would Apple’s original CSAM plan “have disastrous consequences for many children”? 🤔
That part of the objection comes from the SMS/iMessage live scanning, I believe.
 
In one of the articles MR has published on the matter (I am not able to find it at present), I remember reading that Apple said images are encrypted, and as a result it would take a lot of computing power and programming to proactively scan images on iCloud servers. It is much easier, simpler, and quicker to scan for image hash values on a user's device, where there are only a few image files to scan rather than millions. Having to scan the servers on a daily basis would slow them down.

As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.
 
Yes, it is Apple's job. Apple isn't legally allowed to store child-abuse images; they are responsible for ensuring this.
That means it's Apple's job to scan for abuse photos before they are uploaded to the cloud.

Nothing about you in particular, but I am going to use your post to state, for the umpteenth time: corporations are not required to search for this stuff (under US law). In the event they obtain knowledge of specific items, they are required by law to turn it over to NCMEC or law enforcement. Most cloud providers that do search, search only on share, not on upload.

For all the rest out there, take a bit of time and educate yourself on the legalities. There are several threads on this already and the amount of misinformation in the first couple of pages is sad.
 
For the millionth time: it’s not Apple’s job to do this!

Want to protect our children? Either donate funds to the FBI teams dealing with this issue, or quietly lobby Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!

I’m a paying customer. I don’t like having a finger pointed at me: ‘Hey! Let me check you. You COULD be a criminal.’ This is no way to treat your loyal customers.

You seem to have a misguided notion that private companies are obligated to protect your constitutional rights. They are not. The government is obligated to do that. If you don’t like what Apple is doing, you are free to use another platform.
 
This debate will get nowhere because it is clear there are those who will vehemently defend their right to privacy over anything else, in this case the protection of children.

The thing is, don't children deserve our protection? Is it not up to us, as adults, to find ways to protect children from harm? It's a damning question, but given the way people are defending their right to privacy, they need to ask themselves: is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they reply 'Yes' to that question.

Kindly explain to me how this actually protects children.

If you walk through what this does and does not prevent, you may get what I am asking.
There have been several good posts on this in a few threads on this site.
 
You honestly believe private companies should start monitoring their users? You want an Apple Police, Microsoft Police, Google Police, etc. gathering info for some other private organisation to use for its own purposes? Shouldn’t we leave hunting down criminals to governments and law-enforcement agencies, not to shady groups that are not governed by the same laws as law enforcement?

Google and Facebook already report thousands of illegal images to authorities every day.

I don’t recall any huge issue with the topic until now.
 
You seem to have a misguided notion that private companies are obligated to protect your constitutional rights. They are not. The government is obligated to do that. If you don’t like what Apple is doing, you are free to use another platform.
Oh no, I simply go by what APPLE has been telling me year after year. Silly me for believing them, eh?
 
Ah, the slippery slope fallacy.

The reality is that if they planned to "censor protected speech" or "threaten the privacy and security of people around the world", then they could do that anyway, with or without this technology.
Said this before; will say it again for those who missed the first go-around.

This is not a fallacy when there is history, a lot of history mind you, of "Won't anyone think of the children!" being used to take away privacy and freedom, and being used to censor. It works because, for those trying to manipulate the public, it is easy to come back with "What, are you trying to protect pedophiles?" *faux shocked face* when anyone pushes back against what is really happening.

It is a fallacy when there is no evidence of anything bad resulting from the action. I am especially suspicious because Apple has spent decades trying to convince us that they are super-duper concerned for our privacy, and they are now suddenly willing to throw all that hard work away. The powers that be at Apple aren't stupid... so what is really going on? Whatever it is, it is shady and you should be concerned.
 