It’s not a backdoor,

It is a backdoor if people other than Apple are able to specify "what is ok" and can request or force Apple to look for it and report back.

(all without the user knowing or consenting by the way)

Things like this always start as one thing and end up having unimaginable scope creep, access creep and ultimately very very dark outcomes.
 
Why does everybody talk about the government? When the system detects multiple known child abuse pics in an iCloud library, Apple will disable the account and raise an alert at NCMEC, a private NGO, not the government.

Also, I love it when people say they’re going to switch platforms because of this CSAM stuff, while ignoring that other companies have had these systems in place for years.

Google, Facebook and Microsoft put similar systems in place: Google implemented a "PhotoDNA" system in 2008 with Microsoft following suit in 2009. Facebook and Twitter have had similar systems in place since 2011 and 2013 respectively.

If nothing else, Apple is late to the party.

Because some people aren't able to feel good without being able to exert a tiny bit of power by having a good bash at Apple. That's been going on for years.
 
It is a backdoor if people other than Apple are able to specify "what is ok" and can request or force Apple to look for it and report back.

(all without the user knowing or consenting by the way)

Things like this always start as one thing and end up having unimaginable scope creep, access creep and ultimately very very dark outcomes.

By this logic anything is a backdoor, because at gunpoint you could be forced to specify "what is ok", or to push a silent update implementing a system like this without even telling the public.

The only way onward is
- fully open source constantly audited hardware
- fully open source constantly audited OS and software
- going fully self-hosted, no cloud, people should have a home server like they have a washing machine or an oven
- decentralized blockchain for contracts, transactions, file storage, etc.

I see no other way out.
Let the revolution begin.
Wait, you’re telling me people who are upset about this still use Facebook, Google Search, and non-open-source software (who knows what happens under the hood? who knows if a dictator pressured that company?), etc.?
 
This is a horrendous idea with so many ways this tech could go wrong.

Limiting it to the U.S. is not a solution and it’s obtuse of Apple to think so. Apple needs to stop now. Get rid of the feature, both the iCloud and Messages versions. No one wants this.
Why didn’t you first consider that this might be great? Seriously, you’re completely ignoring the specific nature and purpose of this, and the fact that it’s Apple doing it.
Snowden, as great as he is, has no children, does he? Does he even think of that? Does he even offer a better solution? No, he’s just being a mouthpiece for privacy, while not even thinking about what happened to the privacy of the children who ended up in the child pornography industry with their images put online. Holy mackerel.
 
So you mean to tell me the U.S. government will get to decide when it comes to handling our privacy?

Technology and Government should not work together when it comes to dealing with PRIVACY.

I bet you there’s more to this story.

Someone needs to start a petition to put a STOP to this.

View attachment 1815608

The message is clear: Do not store your stuff on iCloud. Make sure you order a 1TB iPhone this year. (You're going to be needing the mega storage.)
You keep posting these fun little pictures.
I’d like to see how that petition goes. Will going against this help or not, if you are not offering any alternative? You’re just whining and complaining, thinking only about your own personal privacy, and you’re not even considering the privacy of the children whose explicit, nonconsensual photos or videos are being shared with the world. Wow.
 
You own the phone but not the software. Scanning will still be done on your phone even with iCloud Photos off. Apple just won't be alerted.

That feels like a "gray area" justification to me. I'm trying to think of another area of life where owning the item and not having control over what it does is acceptable.

I guess we could go back and forth on that and ultimately end up at the conclusion of buying a different phone if you don't like it...
 
How many times do we have to go down this road? If another company does it, nobody cares. If Apple does it, the internet is on fire. In this particular case, multiple tech companies already do image scanning for CSAM: Google, Twitter, Microsoft, Facebook, and others use image hashing methods to look for and report known images of child abuse.
Do you know of any other phone manufacturer that does this on their hardware?
 
That feels like a "gray area" justification to me. I'm trying to think of another area of life where owning the item and not having control over what it does is acceptable.

I guess we could go back and forth on that and ultimately end up at the conclusion of buying a different phone if you don't like it...
I was just stating fact. As users of the OS, we are bound by the TOS because we don't own the software.
 
So let me try to understand this. You are against any and all measures to protect a child?! Why are some vocal critics so afraid of this? Are they hiding something?? How could anyone not support something that protects an innocent child?? Apple, once again, is doing what is right!

If you support any and all measures to protect children, there are a number of organisations known to have raped thousands of children that we could ban too, right?
 
Even considering development of such a system is the single worst decision made since the founding of Apple. The surveillance, privacy abuse, and security implications of this system are astounding. I hope every imaginable avenue of communication to Apple is being flooded with requests to scrap this big brother system and instead focus all developer time on features that actually improve privacy and security and prevent surveillance of any kind, in any country.
 
By having a threshold. One picture isn't enough; maybe they have set the threshold to 50.

My impression is that many of the people who have such pictures have thousands of them, even hundreds of thousands.

I read that WhatsApp reported 400,000 cases last year, and someone here at MacRumors wrote that Facebook reported 20 million cases. If true, you can probably disregard those who just have a few photos.

But it’s still an estimation based on a threshold. And an image is just a bunch of pixels. So mathematically there can be a collision with a legit picture if it has the same general “feel” but with an adult/legal subject.
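To illustrate the collision point, here is a toy perceptual hash in the spirit of "average hash" (aHash). This is not Apple's NeuralHash, and the 8x8 grids stand in for already-downscaled grayscale images; it just shows how two pixel-wise different pictures with the same general "feel" can hash identically.

```python
# Toy perceptual hash: threshold each pixel of an 8x8 grayscale grid
# against the grid's mean brightness, packing the bits into a 64-bit int.
# Hypothetical sketch, not any vendor's real algorithm.

def average_hash(grid):
    """grid: 8x8 list of grayscale values (0-255). Returns a 64-bit int."""
    pixels = [p for row in grid for p in row]
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (1 if p >= mean else 0)
    return h

# Two pixel-wise different "images" with the same bright-left / dark-right feel:
img_a = [[200] * 4 + [50] * 4 for _ in range(8)]   # mild contrast
img_b = [[255] * 4 + [0] * 4 for _ in range(8)]    # strong contrast

print(average_hash(img_a) == average_hash(img_b))  # True: a hash collision
```

Every pixel in both grids falls on the same side of its image's mean, so the bit patterns, and therefore the hashes, coincide even though the raw pixels differ everywhere.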
 
So let me try to understand this. You are against any and all measures to protect a child?! Why are some vocal critics so afraid of this? Are they hiding something?? How could anyone not support something that protects an innocent child?? Apple, once again, is doing what is right!
I weep for America and the future of privacy when I see people make the argument “well, if you don’t have anything to hide, what are you worried about”?

Which is the argument you’re making. That’s not how privacy works. If some random cop passes you on the street and asks to inspect your phone, you’ll hand it over, right? After all, you have nothing to hide.
 
The semantics of the “how” don’t really matter here. The point is that a supposedly privacy-oriented company just openly said that it will process your images for a purpose other than what users expected. That is the epitome of a contradiction, and it further tarnishes whatever pitch of privacy they have.
They (Apple, Google, Microsoft, Dropbox) already process your images in the cloud, and have for years. There is nothing new here other than that Apple is now doing it just before the images upload to their servers instead of after.
 
This intrusive way to search phones for vile material will go wrong; this type of technology always goes wrong. In the UK where I live, a photo of a maybe 7-year-old (not 10, as someone else said, yuck) at the beach, or of babies in the bath with mum and dad, could potentially flag the system one day if it's near enough to what looks like a known image, and a family will have their lives ruined by a mistake made by an automated creep going through their images. Facial recognition technology isn't good enough yet, even though the London Met uses it, and it has made mistakes; Apple's system will make mistakes too.

https://www.wired.co.uk/article/face-recognition-police-uk-south-wales-met-notting-hill-carnival

As others have mentioned, the potential for certain countries like Saudi Arabia to use this to persecute, say, the LGBTQ+ community, or so-called 'dissidents' who stand up to the corrupt and brutal government in Belarus, is huge; this will be abused by them at some point, I am sure. Apple has opened a huge can of worms here. I've never used iCloud; I just don't trust what isn't on a secure thumb drive saved safely at home, or at an off-premises location as a backup. For the first time I won't update to the latest iOS, even though I don't use iCloud, because I don't agree that this is the best way to stop paedophiles from swapping images. Soon all Apple devices will use this tech, and that's way too 1984-ish for me, let alone buying a new iPhone with it baked in.


Child abuse is horrendous; I went through violent physical abuse as a child, but this is not the way to fight it. So much for "what happens on your iPhone stays on your iPhone." There are better ways to fight paedophile gangs, I'm sure, ways we don't know about that the authorities already use, rather than this very intrusive way of rifling through your images. All it takes is one image wrongly classified and this will be a train wreck. In countries where the age of consent is different (16 in the UK), or where beach pictures or bath time with the kids are seen in a different light than in, say, the USA or Canada, the whole house of cards is already wobbling, and this could ruin Apple's reputation when, not if, it happens.

The fact that governments are loving this idea should say enough. At last, a way onto your phone, as Apple folds and gives governments around the world the control they want over individuals who like to be private for very meaningful reasons, in countries where their lives could be at risk. Also, "if you've done nothing wrong" really doesn't apply here: being gay in Saudi Arabia isn't wrong, but it sure will be easier to find out now, and that governments will get access is a given. It's not safe technology, and I for one, at this point in time, won't be loading iOS 15 or getting a new iPhone, because I don't trust technology like this not to f*ck up and cause immense harm to someone innocent, while vile images are, in all probability, not even swapped electronically, because paedophiles and other criminals probably don't trust the web with their vile and disgusting habits anyway. The EFF said it very well in the article below.

"Apple's filtering of iMessage and iCloud is not a slippery slope to backdoors that suppress speech and make our communications less secure. We’re already there: this is a fully-built system just waiting for external pressure to make the slightest change."

https://www.eff.org/deeplinks/2021/...t-encryption-opens-backdoor-your-private-life
For such a long post you clearly have no idea what you are talking about.
 
This system is not looking for child porn in general, only specific, known pictures which have been determined by law enforcement agencies to be illegal.

So your two scenarios will be fine under this system.
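Schematically, the known-hash matching plus the reporting threshold people keep describing works like this. The hash values, the threshold of 3, and the function names are all made up for illustration, and real systems use perceptual rather than exact hashes:

```python
# Hypothetical sketch of matching a photo library against a database of
# known-image hashes, with a threshold before any account is flagged.
KNOWN_HASHES = {0xDEADBEEF, 0xCAFEF00D}  # stand-ins for the known-CSAM database
THRESHOLD = 3                            # no flag until this many matches

def scan_library(photo_hashes, known=KNOWN_HASHES, threshold=THRESHOLD):
    """Count matches against the known set; flag only at or past the threshold."""
    matches = sum(1 for h in photo_hashes if h in known)
    return matches >= threshold

library = [0x1111, 0xDEADBEEF, 0x2222, 0xCAFEF00D]
print(scan_library(library))  # False: 2 matches, below the threshold of 3
```

A novel photo, whatever it depicts, has no hash in the known set and contributes nothing to the count; only accumulating multiple matches against already-catalogued images trips the threshold.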
That’s a BS Apple PR statement. Anyone who understands how these things work knows it’s complete and utter BS. Don’t listen to me; read what Edward Snowden said about it.
 