People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? Adobe does it with Creative Cloud:


That's just one example.
Most people don't, unfortunately.
But other companies doing the same doesn't justify Apple's new policy in any way, and I'm glad people are finally complaining. Mass surveillance like this shouldn't become the norm.
 
As has been stated before (several times), Apple currently will not view photos - just scan for the hashes of photos that some poor law-enforcement agent had to view, flag as illegal, and add to a database of similar images (a rough sketch of how that kind of hash matching works follows this list). But this has so many ramifications:

1. In terms of this conversation, it doesn't matter what Google, Adobe, or other companies are doing. This is about Apple rolling out this 'feature' on your Apple device.

2. Chester the Molester isn't going to keep his illegal collection of photos on iCloud anyway. The percentage of offenders who do this must be vanishingly small. I'm sure Apple already has these statistics and is aware of this. Why subject 99.999% of the population to this? Do police stop all cars on the road and check IDs to try to catch the 0.001%?

3. For the last two decades, we've seen tremendous losses of freedom begin with the 'protect the children' line... You can't really rally against that - are you a monster?! They know it, we know it, they know we know it. It's common practice because it works.

4. This doesn't even 'actively protect children', as the photos must already be cataloged and reside in a database of known perv images. For active abusers, there won't be any known hash to scan for. So they're only looking for images that are already widely shared.

5. Many think that what's on our phone is an extension of what's in our brain - it holds our thoughts (if you write them down), photos, locations, preferences, searches, contacts, and the feelings and life experiences of the friends who communicate with us. While no court order can get into your brain, one can get into your devices. But with this new 'feature', a court order isn't even needed. They automatically scan for illegal hashes or whatever.

6. This isn't really a 'slippery slope' - we've seen this technology used again and again in nefarious ways by govt agencies. Allowing a company to scan your private data 24/7 for whatever the govt today deems illegal is a big gaping hole in the floor called 'the future.' Who is to say what the govt will deem illegal in 2, 5, or 10 years?

7. The trend line isn't good. We already see people who promote views against the govt's line getting removed from social platforms (at the government's request) and labeled as 'extremists.' Didn't want a vaccine in 1999? You were just a fool - today, you're an extremist. Where will it end?

8. Imagine waking up one day and finding that your iPhone updated and iCloud was turned on by default with the new update (oops - we're sorry). Or that additional 'spyware' was installed, or that the govt sent Apple new things to search for. Law enforcement needs a warrant to search your property today - well, that just flew out the window, as all your data has already been captured.

9. Does anyone 'really' believe that the NSA would not utilize every bit of data it can get its hands on for its own purposes? Do you really think they're Girl Scouts selling cookies for merit badges? Will scanning of your hard drive come next? In the coming years we probably won't even have hard drives - everything will be cloud-based (we're already headed that way) - so essentially you've just handed over your freedom of privacy for the future.

10. Apple has made a big deal of privacy in the last few years. Funny (but sad) to see it was all BS.
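
For anyone curious how the matching described at the top of this post works mechanically, here is a minimal sketch. To be clear about assumptions: Apple's actual system uses its proprietary NeuralHash with private set intersection, not the simple "average hash" below, and the example database entry is invented.

```python
# Toy sketch of hash-based image matching. This is NOT Apple's NeuralHash/PSI
# protocol; a simple "average hash" just shows the general idea.
# Requires Pillow (pip install Pillow).

from PIL import Image

def average_hash(path, size=8):
    """Downscale to size x size grayscale, then emit one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of 64-bit hashes of known illegal images.
KNOWN_BAD_HASHES = {0x9F3B_0C51_A2E4_77D0}

def is_match(path, threshold=5):
    """Flag an image if its hash lands within `threshold` bits of a known hash."""
    h = average_hash(path)
    return any(hamming(h, bad) <= threshold for bad in KNOWN_BAD_HASHES)
```

The relevant property of a perceptual hash (unlike a cryptographic one) is that recompressing or resizing an image changes only a few bits, so near-duplicates still match - which is also why point 4 above holds: an image can only ever match if it is already in the database.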
 
In my country, it's illegal to possess pornographic materials, and the wording of the law includes "pornoaction", defined as "indecent acts", which ranges from bikinis to kissing. There was a case (I forget the details) about a couple whose private video was stolen and leaked. That couple was fined and jailed for longer than the people who spread the video itself.

You can get some idea of the implications if a black-box mass-scanning system from a private company is implemented in a country with those kinds of laws.
Lukashenko will also love this kind of mass-surveillance tool.

 
This is being pushed by organizations with a noble cause - so can we get some data on how often they are catching people using this?

Apple say we aren't allowed to have that information.

There can be no oversight of their algorithm.
 
Ah, so all this only applies when you are outside your home?

Also, are you unaware that CCTV cameras are already capable of matching faces against databases of known criminals? That it is literally one of the roles of a police officer to prevent crime - and that they can do so by many means? Do you fear their body-cams being used to record your actions?
Apple is not the police.
 
Anyone who believes Apple will stick to their script of only scanning what they said they will scan needs to have 'FOOL' tattooed on their forehead, because it is a well-known fact that tech companies lie not only to consumers but to governments. Google, Facebook, Apple, Microsoft, and Amazon have all been caught lying at one time or another, sometimes on numerous occasions. It's in a company's nature to tell the consumer it is doing one thing when in fact it is doing the complete opposite.

A few months or even years from now, iCloud users will start to complain that their stored pictures have been erased, and Apple will deny it's their fault. Then some security researchers will get involved, and they will report that Apple's CSAM scanning has quietly been running against everyone's iCloud account and accidentally erasing users' pictures due to a glitch in the scanning process. Then people will say, 'Hold on, Apple, you said you weren't going to do this type of invasive scanning on users' iCloud accounts.' Apple will then issue a PR release saying, 'Sorry, we will update our procedures so something like this does not happen again.'

It's going to happen; you just know it will.

Do me a favor: please jump into the time machine you apparently own and tell me which stock prices will increase by 1,000%.
 
These images must be removed from circulation.

And considering that the false-positive rate is one in a trillion, and there will be one trillion photos uploaded to iCloud this year, the fearmongering nonsense is ludicrous.
A better question is: what is the true-positive rate? What is the effective rate of catching people with child pornography? Doing things for theoretical benefit, or because it makes you look or feel good, is not usually enough. How much is actually being caught, for the trade-off of implementing technology that violates fundamental rights?
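
For what it's worth, the arithmetic behind this exchange is easy to make explicit. The numbers below are the posters' assumptions, not verified figures, and the account count is a pure guess; note also that Apple's published one-in-a-trillion claim was per account per year (after a multi-match threshold), not per photo.

```python
# Expected false positives = rate x number of trials (posters' numbers assumed).
fp_rate_per_photo = 1e-12        # "one in a trillion", read as per photo
photos_uploaded   = 1e12         # "one trillion photos uploaded this year"
print(fp_rate_per_photo * photos_uploaded)    # -> 1.0 expected false flag/year

# Under Apple's per-account reading of the same figure:
fp_rate_per_account = 1e-12      # one in a trillion per account per year
icloud_accounts     = 1e9        # hypothetical account count, pure assumption
print(fp_rate_per_account * icloud_accounts)  # -> 0.001 expected accounts/year
```

Neither number answers the true-positive question asked above; that would require detection statistics Apple has not published.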
 
Since Apple software is always bulletproof, there will never be false positives.

We continue to trust that the company that has broken our trust will never misuse this technology in a way that will harm us. Not even when an oppressive government forces them to do so.

Apple knows that they have their customers by the balls. They can do whatever they like and continue with their virtue signaling.
 
Third option: let it be known to Apple that people don't want to be the subjects of mass surveillance by an Apple product, until they back off from this very odd (who at Apple came up with this idea?) and ill-conceived scheme.

“Keep on trucking.” Right. Keep letting our rights to privacy get chipped away until we are just like China/Russia.

God, how I yearn for the good old days of 8-bit computers and no internet access.
 
People do realise that companies such as Google, Adobe, Facebook et al. already use some form of automated technology to scan for and detect CSAM? … That's just one example.
I think the difference is that everyone expects Google, Adobe, and Facebook to be completely morally and ethically bankrupt, while Apple seemed not to be.
 
All of these "features" are opt-in. If you want to use iCloud, you now know that the hashes - which give no information about a picture's content - will be checked against known hashes of CSAM, material that is highly illegal in most countries (and rightfully so). If you don't want that, don't use iCloud. Messages will be checked only if you configure this for your child. If you don't want that, don't do it.

And sorry, the argument that Apple could abuse this system if they wanted to doesn't fly, since if they wanted to they could backdoor your phone with the next update and nobody would notice until the damage was done. The same is true for Android, or literally any non-open-source operating system running on any networked device. Everyone who's the least bit informed knows this and has either accepted it or opted out of using closed hardware and software. Alternatives do exist. They may not be pretty, and may sometimes be impractical, but they do exist.

The question is: is it in Apple's interest to do this? Is Apple willing to risk its reputation, which is the foundation of its earning hundreds of billions of dollars annually? The answer is no, it is not. If widespread abuse of this kind were to become known - and it would become known, since hardly any company has so many eyes on what it does - it would cost Apple tens of billions of dollars.

Apple's business is sales-driven, and despite being highly profitable they occupy only a niche in the industry. They are not in the business of selling your data. They are in the business of selling you high-priced boutique hardware and high-priced services. You can refuse to fund them at any given moment. You can't refuse to fund Google unless you decide not to use the internet, since they practically own it.

Apple does and will continue to do what is best for its business. They follow the capitalist mandate rigorously. And this means they will not abuse this system or any other, since doing so would cost them money - more money than any company or state could offer them.
 
All of these "features" are opt-in. If you want to use iCloud, you now know that the hashes will be checked against known hashes of CSAM. … And this means they will not abuse this system or any other, since doing so would cost them money.
Thank you for your common sense.
 
Whilst it's a good idea to rid this world of the perverts, Apple hasn't thought this through. What about somebody who has young children and has taken pics of them in the paddling pool with not much on? There's nothing seedy about that; they are your children. I have young children and have thousands of photos on my iCloud, 99.9% fully clothed and the rest, said paddling-pool pics etc. I am not happy for anybody to be going through MY photos to check whether they are child porn. They are my private collection of the kids' childhood, and they are not for other people's eyes if I so choose.

This will backfire massively and achieves literally nothing, as anybody sick enough to watch or create these pics will, I would imagine, keep their stash offline anyway and share the pics manually. This is just an excuse to get a back door in, and I for one will not tolerate it. I will sell my Apple stuff, move back to Windows/Android, and keep my pics and vids of the family offline. I only moved from Windows/Android because of privacy.

Thank you for your contribution, 2021 account that after 3 days still hasn't understood how this all works.
 
Completely false. Apple employees never get to look in your library. When you trigger enough positives to cross a threshold, a reviewer gets to see a visually derived image of only the offending pictures. This is not a flexible Apple policy or a server-side variable; it's a baked-in feature of the encryption that happens on your device.
Yeah - a false positive means a photo somebody took in private is now being viewed by somebody who has no right to look at it.
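
To make the threshold mechanism from the quoted post concrete, here is a deliberately simplified sketch. The real design enforces the threshold cryptographically (threshold secret sharing inside encrypted safety vouchers, so the server cannot decrypt anything below it); the class below is an illustrative invention, and the threshold value is hypothetical.

```python
# Simplified illustration of threshold-gated review. In Apple's design the
# gate is cryptographic (threshold secret sharing), not an if-statement.
THRESHOLD = 30   # hypothetical number of matches before review is possible

class Account:
    def __init__(self):
        self.match_vouchers = []   # stands in for encrypted safety vouchers

    def record_match(self, visual_derivative):
        """Called only when an uploaded photo matches the hash database."""
        self.match_vouchers.append(visual_derivative)

    def reviewable_images(self):
        """Below the threshold, a reviewer gets nothing at all; above it,
        only the matched images' derivatives, never the whole library."""
        if len(self.match_vouchers) < THRESHOLD:
            return []
        return list(self.match_vouchers)
```

Note that the reply's complaint is untouched by the mechanism: once enough matches accumulate, any false positives among them are exactly what lands in front of a human reviewer.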
 
Whilst it's a good idea to rid this world of the perverts, Apple hasn't thought this through. … I only moved from Windows/Android because of privacy.
While I am sad to see a member of the Apple community leaving, I understand your decision to move towards companies with a much better track record for privacy and ethical conduct, such as Microsoft and Google.
 
To all the people who think this is Apple spying on you: spying is done to see what you're doing and to extract information. Since, for 99.99% of people, no information will be seen by Apple or leave your devices... how is this "spying"?

It's MUCH more like anti-malware software. There's a set of fingerprints - what you know better as virus definitions - sent TO your devices. How often does a malware scanner misidentify a normal file as a threat? I'd bet no one here has ever seen that. It's possible a few have, but it's unlikely.

Apple's devices are already "scanning" your photos... how do you think those name tags appear? How do you think it knows there's a tree, or that a photo is mostly orange?

This new system simply leverages things your iOS devices already do today, and just focuses on one particular aspect of possible abuse.

So, please... tell me... where's the 'spying' you're all screaming about?
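
The anti-malware analogy maps onto roughly the following pattern (a sketch of classic signature scanning only; the definitions set is invented, and real AV engines use heuristics and behavioral analysis well beyond exact hashes).

```python
# Signature scanning in the style of antivirus definitions: hash each file
# and look it up in a set of known-bad fingerprints shipped to the device.
import hashlib
from pathlib import Path

# Made-up SHA-256 "virus definitions" for illustration.
VIRUS_DEFINITIONS = {
    "deadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeef",
}

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def scan(folder: str):
    """Walk a folder and flag any file whose hash matches a definition."""
    for p in Path(folder).rglob("*"):
        if p.is_file() and sha256_of(p) in VIRUS_DEFINITIONS:
            print(f"flagged: {p}")
```

Where the analogy holds is that both schemes ship a fingerprint list to the device and match locally; where the thread disagrees is about who the match results ultimately get reported to.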
 
To all the people who think this is Apple spying on you: spying is done to see what you're doing and to extract information. … So, please... tell me... where's the 'spying' you're all screaming about?
Stop infringing people's rights, you monster!
 