I have lost all my politeness over this Apple move. I cannot be "politically" correct anymore.

There is a point in this world that we are building with technology where the people with the most intellect and knowledge have to say it loudly to the dopamine-induced crowd: you are easy to manipulate and subvert because of your low education, naive belief system, lack of rationality, and the years of marketing and programming put into your brains.

We have already seen one disaster, where a person used Cambridge Analytica and Facebook to target people's fears in a successful campaign of manipulation and crowd division.

It is OK to have an "opinion", but factual data, when presented by experts, cannot be pushed into oblivion by crowds of "brand followers".

This is not some tribal war for the common good. This is your personal space, your data and your future. Everything that you do has a digital trace, and you have practically little to no control over when and how your data will be used against you. You are "feeling good" because you participate in something "bigger" than yourself when using all of this dopamine-inducing, dark-pattern-designed and legally protected software and hardware, made by soulless corporations with one goal: to use you as fuel to run their greedy machinations.

Governments are using the Covid situation to control public behavior. Powerful entities are working 24/7 to manufacture consent.

The only acceptable solution for Apple is not only to refrain from deploying this breach of trust, government backdoor and false advertising, but also to accept failure and remove everyone in command responsible for this fiasco.

We have heard nothing acceptable about the Pegasus/NSO breach. We have heard nothing acceptable about Apple's internal workforce problems. We have not heard a word from the CEO, who for more than 5 years has been building this "privacy charade" using marketing tricks. We have heard nothing about Apple's monopoly.

What is this madness? Is this company creating a new normal where shareholders and users have no say in the future of their own lives? What is this silence? It is disrespectful on a whole new level.

Screeching voices, my ass.
 
The thing is, opponents of the system see it in two ways: one being a privacy issue, and two being an open-to-abuse issue. Now personally, I do not have a problem with 1, the privacy issue, if it means children are protected. Those who question the privacy aspect of the issue, I would have to ask why: do they not want children to be protected? Is a person's privacy more important than the protection of children? That concept is appalling to me. A child's protection comes before my privacy.

As for the second issue, open to abuse, yes, it's a very, very valid argument, because checking systems do get abused.
 
1, the privacy issue, if it means children are protected. Those who question the privacy aspect of the issue, I would have to ask why: do they not want children to be protected? Is a person's privacy more important than the protection of children? That concept is appalling to me. A child's protection comes before my privacy.
For the millionth time: it’s not Apple’s job to do this!

Want to protect our children? Either donate funds to the FBI team dealing with this issue, or quietly lobby Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!

I’m a paying customer. I don’t like having a finger pointed at me: ‘Hey! Let me check you. You COULD be a criminal.’ This is not the way to treat your loyal customers.
 
“The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

I understand the other hypothetical risks listed, but why or how would Apple’s original CSAM plan “have disastrous consequences for many children”? 🤔
 
I think many people are truly unaware of the staggering prevalence of child abuse in society; if people knew how common and widely distributed the material is, they might throw some support behind this.
How prevalent is it, and how do we know that information?

If it is as common as you suggest, isn't it likely that police forces around the world will be flooded with reports? Will the relevant justice systems be able to handle so much extra load?
 
In all honesty, on-device CSAM scanning makes no financial sense for Apple. Why on earth are they even considering doing it? In the long run Apple will lose some sales and market share because of this, but more importantly Apple will have to answer to those who want to expand the scanning from CSAM to other types of material. The only logical move for Apple is to scrap any on-device scanning plans and go back to building “more secure” mobile devices. People have paid a premium for privacy and that’s what they are expecting. Literally having a device that spies on its owner and reports them to a third party, which might or might not report the user to law enforcement, isn’t a good selling point.
 
The only logical move for Apple is to scrap any on-device scanning plans and go back to building “more secure” mobile devices. People have paid a premium for privacy and that’s what they are expecting.
Apple should do the opposite of what they‘re trying to do now: tell anyone who wants them to do this that they have to MAKE IT INTO A LAW, because Apple can’t just simply do this to their customers on their own.
 
The thing is, opponents of the system see it in two ways: one being a privacy issue, and two being an open-to-abuse issue. Now personally, I do not have a problem with 1, the privacy issue, if it means children are protected. Those who question the privacy aspect of the issue, I would have to ask why: do they not want children to be protected? Is a person's privacy more important than the protection of children? That concept is appalling to me. A child's protection comes before my privacy.

As for the second issue, open to abuse, yes, it's a very, very valid argument, because checking systems do get abused.
But Apple can also scan the iCloud archive - there is no need to install spyware on your phone that could be abused.

And another point - the evil guys will just use another photo app that doesn't store anything within Apple's photo library, or will simply disable iCloud sync. They'll just copy and sync it to Dropbox, ownCloud or whatever else they like.

So why exactly is Apple doing this?
A: Apple is stupid
B: They want to implement a surveillance method
C: The marketing department decided Apple needs to do something about child abuse (which is good) and nobody told them about the issues. Which brings us back to A.
 
For the millionth time: it’s not Apple’s job to do this!

Want to protect our children? Either donate funds to the FBI team dealing with this issue, or quietly lobby Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!

I’m a paying customer. I don’t like having a finger pointed at me: ‘Hey! Let me check you. You COULD be a criminal.’ This is not the way to treat your loyal customers.
Why isn't it Apple's job? If a criminal is using an Apple device to take indecent pictures of children and then upload them to Apple servers so other criminals can access the images, why can't Apple put in controls to prevent this?

This issue comes down to an individual's attitude. matrix07's attitude is that he/she is being treated like a criminal; my attitude is that I am not, hence why I do not have a problem. This is just the two of us, but multiply this by a few thousand or a few million and you can see why there are such divisive views on the matter.
 
In one of the articles that MR has reported on the matter (I am not able to find it at present), I remember reading that Apple said images are encrypted, and as a result it would take a lot of computing power and programming to proactively scan images on iCloud servers; it is much easier, simpler and quicker to scan for image hash values on a user's device, where there are only a few image files to scan rather than millions. Having to scan the servers on a daily basis would slow them down.

As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.
Apple has been scanning images in the cloud for several years already.
 
At the moment, iCloud is only partly end-to-end encrypted. The rest is encrypted with Apple’s own key. Therefore, when law enforcement requests it (with a warrant in the US), Apple decrypts what it can and promptly provides it. The new CSAM solution would make it possible to end-to-end encrypt everything yet still satisfy any requirements for CSAM scans. This would be a massive win for privacy, as even with a warrant Apple wouldn’t be able to provide any data to law enforcement. Well, except for CSAM offenders.
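To make the threshold part concrete, here is a toy sketch. It is not Apple's actual protocol, which used private set intersection and threshold secret sharing so the server learns nothing below the threshold; the hash function, database name and threshold value below are illustrative stand-ins.

```python
# Toy illustration of threshold-gated matching only -- NOT Apple's actual
# cryptography (which used private set intersection and threshold secret
# sharing). KNOWN_HASHES and MATCH_THRESHOLD are illustrative placeholders.
import hashlib

KNOWN_HASHES: set[str] = set()   # would hold hashes supplied by child-safety organisations
MATCH_THRESHOLD = 30             # Apple's published materials cited a figure of roughly 30

def photo_hash(data: bytes) -> str:
    # Stand-in for NeuralHash; a real perceptual hash matches visually
    # similar images, not just byte-identical ones.
    return hashlib.sha256(data).hexdigest()

def account_exceeds_threshold(photos: list[bytes]) -> bool:
    """Nothing is revealed for one or two matches; only when the number of
    matching photos crosses the threshold would human review be possible."""
    matches = sum(photo_hash(p) in KNOWN_HASHES for p in photos)
    return matches >= MATCH_THRESHOLD
```

Under that model, everything else in iCloud could stay end-to-end encrypted, because the server only ever learns about accounts that cross the threshold.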
 
Once they can compare file hashes for this, they can do it for anything and compare against all kinds of files. That will allow ultimate tracking across all your activities.
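A minimal sketch of that point: the matching code never knows what its hash list represents, so swapping in a different list repurposes it instantly. Everything below is hypothetical and is not Apple's implementation.

```python
# Sketch of why hash matching is content-agnostic. Nothing here is Apple's
# code; the generic "target list" and the function names are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def find_matches(library: Path, target_hashes: set[str]) -> list[Path]:
    """Return every file whose hash appears in target_hashes. The loop cannot
    tell whether the list describes CSAM, leaked documents or political
    imagery: whoever controls the list controls what gets flagged."""
    return [p for p in library.rglob("*")
            if p.is_file() and sha256_of(p) in target_hashes]
```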

Not to mention the possibility of a malicious app casually writing a bunch of these types of photos to a person’s library: instant sync to iCloud, boom, life ruined.

This system is idiotic. The fact that Apple even thought it was a good idea is idiotic and people who think it’s a good idea need to use their brains a little harder to see beyond this ridiculous “save the children” narrative. It’s not Apple’s job.
 
The numbers look weak for a petition. Just as I suspected — this seems to only be an issue for a loud minority, conspiracy theorists, and people who don’t quite understand the tech.

In comparison, the California recall petition received over 2M signatures.

If you think this is a good idea, then you’re the one who doesn’t understand the tech.
 
I don’t know how Apple are seen as the bad guy for trying to improve reporting and protection here.
I’ll tell you how. People buy Apple products for their quality, security and PRIVACY! Photo albums are very personal and private; they only get shared with whomever people choose to share them with. You can imagine people not being happy about an AI scanning their photo album (which holds some of the most personal, private data on the phone). Whatever happened to the police and other authorities doing their job without having to look at every single one of a billion devices?
 
Literally in the article you responded to:

”for fear that they [the plans] would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

And the sharing of child abuse images, which is the result of physical abuse, isn’t also disastrous for children?
 
Obviously you can have your opinion on whether this feature is a good idea or not, but your description would suggest you don’t understand how it works. Apple don’t need to have any images to compare against for this system to work as they describe it. No third party needs access to your phone or personal data. For people to have a sensible debate on these features, it would be best if they tried to understand what is actually being proposed first.

I believe that Dropbox and Google Drive have used this type of system for years (including for other things, like copyright issues). Maybe Apple are being held to a higher standard because they use privacy in their marketing a lot. It does seem odd to me that this is getting so much attention when clearly other privacy-related issues are not.


Originally this was supposed to just compare the hashes of photos to the hashes of existing known child porn as they went into or out of encryption. I don’t really have a problem with that. Active AI scanning of actual images seems like a great idea, but its encryption back-door requirements and near-instant potential for misuse by governments worldwide are shocking.

If you don’t understand this, just ask yourself whether you think somewhere inside Apple there will be a server full of child porn waiting to be compared to images on your phone… or whether those images and services will be provided by governments, who would then need direct, unencrypted, instant access to your device. Scary.
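For what it's worth, under the announced design neither Apple nor any government would hold a library of images for comparison; child-safety organisations supply hashes, and devices compare hashes. A toy "average hash" shows how two parties can agree that pictures match without exchanging the pictures themselves; the sketch below is only an illustration (it assumes the Pillow library) and is nothing like the real NeuralHash internals.

```python
# Toy perceptual hash ("average hash"), purely illustrative and assuming the
# Pillow library. Apple's NeuralHash is a learned perceptual hash and far more
# sophisticated, but the principle is the same: only hashes are compared,
# never the images themselves.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to an 8x8 grayscale thumbnail; each bit records whether a pixel
    is brighter than the mean, so visually similar images give similar bits."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def looks_like_same_picture(hash_a: int, hash_b: int, max_distance: int = 5) -> bool:
    """Treat two images as the same picture if their hashes differ in at most
    a few bits (Hamming distance)."""
    return bin(hash_a ^ hash_b).count("1") <= max_distance
```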
 
Once they can compare file hashes for this, they can do it for anything and compare against all kinds of files. That will allow ultimate tracking across all your activities.

Not to mention the possibility of a malicious app casually writing a bunch of these types of photos to a person’s library: instant sync to iCloud, boom, life ruined.

This system is idiotic. The fact that Apple even thought it was a good idea is idiotic and people who think it’s a good idea need to use their brains a little harder to see beyond this ridiculous “save the children” narrative. It’s not Apple’s job.
Which is why the hash database was due to be auditable... so we could check that it is only for CSAM. Any other hashes would be noticed immediately.

Photos stored from apps have this data in their metadata; look at how WhatsApp makes its own album, etc. You can search Pokémon GO for all the photos that app has saved.
If an app was doing this, it should be easily traceable.
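As a sketch of how such an audit could work (the details below are assumptions, not Apple's published mechanism): publish a single digest over the entire sorted hash list, so any entry slipped in later changes the digest and is noticed by anyone recomputing it.

```python
# Sketch of how an auditable hash database could be checked: one digest over
# the whole (sorted) list means any silently added entry changes the digest
# for everyone. The entry format and audit process here are assumptions, not
# Apple's published mechanism.
import hashlib

def database_digest(entries: list[str]) -> str:
    h = hashlib.sha256()
    for entry in sorted(entries):   # canonical order so all parties agree
        h.update(entry.encode("utf-8"))
    return h.hexdigest()

# Apple, the child-safety organisations and outside researchers could each
# compute this digest independently; if one copy of the database quietly
# gained extra hashes, its digest would no longer match the others.
```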
 

The numbers look weak for a petition. Just as I suspected — this seems to only be an issue for a loud minority, conspiracy theorists, and people who don’t quite understand the tech.

In comparison, the California recall petition received over 2M signatures.
I have run a software company for more than 20 years, and I perfectly understand the tech. Countless experts have already given explanations and criticism of this backdoor; go outside the echo chamber and do your research, I will not give it to you for free. Repeating Apple's PR mantra of "you're holding it wrong, understand the tech" makes you look stupid and uneducated. If you want to present a technical argument, please do. But at this point in time even Apple understands that the "tech" is easy to fool with adversarial attacks, which is the reason for delaying it. And obviously the iPhone 13 is coming out soon, so they need a PR move.
 
For the millionth time: it’s not Apple’s job to do this!

Want to protect our children? Either donate funds to the FBI team dealing with this issue, or quietly lobby Congress to pass a law requiring ALL who store our photos to scan for CSAM. Apart from that, GTFO of my devices!

I’m a paying customer. I don’t like having a finger pointed at me: ‘Hey! Let me check you. You COULD be a criminal.’ This is not the way to treat your loyal customers.
Yes, it is Apple’s job. Apple isn’t legally allowed to store child abuse images; they are responsible for ensuring they don’t.
That means it’s Apple’s job to scan for abuse photos before they are uploaded to the cloud.
 
Obviously you can have your opinion on whether this feature is a good idea or not, but your description would suggest you don’t understand how it works. Apple don’t need to have any images to compare against for this system to work as they describe it. No third party needs access to your phone or personal data. For people to have a sensible debate on these features, it would be best if they tried to understand what is actually being proposed first.

I believe that Dropbox and Google Drive have used this type of system for years (including for other things, like copyright issues). Maybe Apple are being held to a higher standard because they use privacy in their marketing a lot. It does seem odd to me that this is getting so much attention when clearly other privacy-related issues are not.
Looks like you know nothing about what people are discussing here. They are discussing scanning your local photo album and eventually uploading your private photos to Apple for human review.
 