Kindly explain to me how this actually protects children.

If you walk through what this does and does not prevent, you may get what I am asking.
There have been several good posts on this in a few threads on this site.
It will protect children by catching perpetrators who make illegal child images and by catching perpetrators who distribute them.

If a criminal intends to use their iPhone to browse illegal images of children, something they can currently do with ease as there are no checks in place, having CSAM detection on the iPhone means they will get caught out.
 
Actually, Google Mail already does it. Why shouldn't Apple?
The pedo “industry” is doing a great job of putting the fear of God into the public!
 
You honestly believe private companies should start monitoring their users?

If the purpose is to detect and combat child abuse, then HELL YES, ABSOLUTELY!!!

Sounds like you may not realize how big and widespread of a problem this actually is.
 
Why not just cut to the chase and arrest everybody with a phone, just in case they are thinking about molesting children? You never know, right? This outweighs any privacy or logic. Just arrest everybody over the age of 18 and lock them up, and the children will all be safe.
 
The thing is, don't children deserve our protection? Is it not up to us, as adults, to find ways to protect children from harm? It's a damning question, but given the way people are defending their right to privacy, they need to ask themselves: is your right to privacy more important than the protection of a child? I think members here are too scared of being judged if they reply 'Yes' to that question.
Yes, my right to privacy is more important. At some point child protection measures become so intrusive that privacy is more important than the protection offered by the measures. I am quite certain that you agree on this. Otherwise, what would you say to my child protection plan where the entire population has to wear bodycams with 24/7 surveillance of every move? Would you accept this, or are you not willing to sacrifice everything for the protection of a child? I do not believe you would.

Protecting children does not trump everything. There is a line where other things become more important. We can argue whether the CSAM detection crosses the line (I think it does), but it is perfectly moral not to give absolute priority to child protection.
 
I don't care about the "children". Make the law against that sort of thing truly terrifying, like being boiled alive when caught. Law is about deterrence, not about trying to scan through photos. Those f**ers will do it anyway with a different phone, and the rest of us pay the price. Knowing Apple, they will screw it up anyway.
 
In one of the articles MR has published on the matter (I am not able to find it at present), I remember reading that Apple said images are encrypted, and as a result it would take a lot of computing power and programming to proactively scan images on iCloud servers. It is therefore much easier, simpler and quicker to scan for image hash values on a user's device, where there are only a few image files to scan rather than millions. Having to scan the servers on a daily basis would slow them down.

As for other tech companies scanning their cloud storage servers, I do not know if they encrypt images in the same manner that Apple does.
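For what it's worth, the on-device check being described boils down to "hash the photo, compare it against a list of known bad hashes." Below is a minimal Swift sketch of that general shape only; Apple's actual system uses a perceptual NeuralHash and a blinded hash database rather than a plain SHA-256 lookup, and the function and parameter names here are invented for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative only: a plain cryptographic hash stands in for Apple's
// perceptual NeuralHash, and the known-digest set stands in for the
// blinded database shipped with the OS.
func matchesKnownDigest(imageData: Data, knownDigests: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)                       // hash the photo bytes
    let hex = digest.map { String(format: "%02x", $0) }.joined()    // hex-encode the digest
    return knownDigests.contains(hex)                               // compare against the known set
}
```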
Adding a pool to my house would be much easier for me if the neighbors paid for it and did all the work.

Apple wants to scan for CSAM. They do not have to, they want to. If their servers are too slow for this, they need to buy more and faster servers, not exploit the hardware of other people.
 
As a father of two small children, a lot of you make me sick. Child abuse and the distribution of such material is a HUGE WIDESPREAD problem. After many years, Apple *finally* wants to implement a system that detects child abuse material on someone's phone to counter this problem. And you bunch of babies cry about your precious "privacy".

If you do not store child abuse images on your phone, how will this even affect you IN ANY WAY? And don't give me all this slippery slope BS about what this possibly *could* lead to. We are talking about a very specific piece of technology designed for one very specific purpose. When they are proposing 24/7 body cams for all adults or scanning phones for political content, we'll talk about that. BUT THEY ARE NOT. They are proposing detecting child abuse images on people's phones. They deserve applause.
 
The bad thing about this "delay" is that the framework will already be installed. Just not activated.

CSAM detection is really bad because it demonstrates that Apple doesn't care about privacy. Money talks.
Agreed, 100%. As someone said the other day, they'll quietly roll it out in a .1 update.

Also, I agree with you about Apple and privacy: I read somewhere that Apple accepts millions to make Google the default search engine in Safari. Even if they did it for free, it makes you wonder why on earth they would do that if they're so hell-bent on privacy.
 
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

Bollocks, you completely miss the point as with others...like Apple.

I think many people are truly unaware of the staggering prominence of child abuse in society, if people knew how common and widely distributed the material is they might throw some support behind this.

You assume too much, pretty sure most know exactly how bad it is.

Meanwhile, your government is actively tracking your location everywhere you go, QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner, every traffic light, monitoring your movement patterns through facial recognition & number plates. Every time you tap and buy something you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying “encryption!”

Your Government...China?
 
Personally I think Apple are going to do this regardless. They may change how they represent it and how much they tell the user but they clearly want this to happen.

The only reason I think they've put it on pause is iPhone 13 sales. Once the iPhone 13 is out and iOS 15 is on enough devices, they'll just push it out. It's not like anyone at that point, once on iOS 15, is going to say no to software updates from then on out.
Believe me, plenty of people will have bought their last iPhone if Apple still implements this monstrosity.

I, for one, will.
 
Still not clear: how could iMessage live scanning harm children?
What if the kid has abusive parents? Texting is now monitored more closely, bad on its own, but also has a chilling effect that’s broader
 
So that's where they should do their scanning.
They are only scanning on device for photos being uploaded to their server anyway.
It adds additional metadata to the upload.
If they keep scanning 100% server-side, then we have to accept we can't have true end-to-end encryption, meaning all photos are viewable, not just abuse photos.
If you have no abuse photos, this feature does not affect you one little bit. If you do have abuse photos, you’re sick and should be caught.
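To illustrate the "additional metadata" point above, here is a hypothetical Swift sketch of what rides along with an upload. The type names (SafetyVoucher, PhotoUpload) are invented for the sketch; in Apple's actual design the voucher is encrypted and only becomes readable server-side after a threshold of matches is crossed.

```swift
import Foundation

// Hypothetical shapes only: the photo upload is unchanged except for an
// opaque voucher attached alongside it.
struct SafetyVoucher: Codable {
    let payload: Data          // opaque, encrypted match information in the real system
}

struct PhotoUpload: Codable {
    let imageData: Data        // the photo itself, uploaded as before
    let voucher: SafetyVoucher // the extra metadata riding along with the upload
}
```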
 
What if the kid has abusive parents? Texting is now monitored more closely, bad on its own, but also has a chilling effect that’s broader

If a child has abusive parents, it is surely bad enough already. How would iMessage scanning make it worse for the child, especially considering that it only works if enabled by a parent via the Family account? 🤷🏻‍♂️
 
When you receive this much backlash over a feature intended to protect kids from sexual abuse material and prevent adults from distributing said material, you know you’re doing something right.

I think many people are truly unaware of the staggering prominence of child abuse in society, if people knew how common and widely distributed the material is they might throw some support behind this.

Meanwhile, your government is actively tracking your location everywhere you go, QR code check-ins show what places you visit and how long you stay. CCTV exists on every corner, every traffic light, monitoring your movement patterns through facial recognition & number plates. Every time you tap and buy something you reveal more of yourself. Surprisingly, none of this makes people revolt in protest, when it should, and yet the idea of Apple implementing a child-protection feature has everyone crying “encryption!”
Yeah you'd be correct if this were going to make any kind of significant dent in CSAM distribution. It won't. Apple has already told predators if they disable iCloud photos they won't be scanned. This will ultimately harm more children once weaponized as a political tool in China, Russia, etc. than it will ever save from scanning devices. It's a hollow attempt veiled as protection when it's really just a massive privacy invasion.
 
What if Apple gets hacked? Creating a backdoor into such a far-reaching detection system means it is possible Apple would not be aware of how its devices are being scanned and manipulated.

“Apple is inventing a world in which every product you purchase owes its highest loyalty to someone other than its owner. To put it bluntly, this is not an innovation but a tragedy, a disaster-in-the-making.”

To date, Apple has defended its CSAM detection system by saying only that it was poorly communicated. But in recent weeks, researchers who worked on a similar system for two years concluded “the technology was dangerous”, saying “we were baffled to see that Apple had few answers for the hard questions we’d surfaced.”


From another article:
Snowden points out that the entire system is easily bypassed, which undermines the stated aim behind its creation:

“If you’re an enterprising pedophile with a basement full of CSAM-tainted iPhones, Apple welcomes you to entirely exempt yourself from these scans by simply flipping the ‘Disable iCloud Photos’ switch, a bypass which reveals that this system was never designed to protect children, as they would have you believe, but rather to protect their brand. As long as you keep that material off their servers, and so keep Apple out of the headlines, Apple doesn’t care.”

And, for those of you already thinking ahead, Snowden points out there is an obvious next step to this process: governments compelling Apple to remove the option to Disable photo uploads to iCloud.

“If Apple demonstrates the capability and willingness to continuously, remotely search every phone for evidence of one particular type of crime, these are questions for which they will have no answer. And yet an answer will come — and it will come from the worst lawmakers of the worst governments. This is not a slippery slope. It’s a cliff.”
 
Too many people here have no actual idea of what Apple are proposing.

Too many people here are concerned about the rights of child abusers.
1) Sounds like an Apple problem, then. They announced a wildly controversial program without first taking the time to wargame the best way to do so without causing this exact sort of uproar. If Apple put such poor planning into the optics of the program, how much thought went into designing it in the first place?

2) No. That is a trash statement and you know it. There is a reason the stereotypical Pearl Clutching “Think of the children” meme is a thing. This sort of program is a literal “road to hell” in terms of privacy and potential abuses, both of which were the very reason Apple formerly declined any pressure from governmental agencies to install a back door into their tech. Once a door is there you have no guarantee it will stay closed to anyone but authorized visitors. Just the existence of that door in the first place opens your device up to invasion with nefarious intent.

3) Apple has the ability to perform these sorts of scans on their own servers and indeed has already been doing so. If preventing the transmission of sexual images of children is their goal, they already have an effective means of doing so without compromising the integrity of individual devices. Cracking an Apple server is a lot harder than compromising an iPhone. Just ask those hundred or so journalists who found theirs cracked by malware a month or so ago.

4) There comes a point where protection becomes intrusion, and that is where Apple is heading. Apple could have prevented this by making groups like the EFF part of the process from the start, and I’m frankly shocked at how poor a job Apple did planning not only the rollout of this feature but its implementation. The fact that Apple was taken so off guard by the massive backlash is not comforting: how could they expect any less from such a significant tonal shift with regard to respecting and protecting individual user privacy?
 
Apple wants to scan for CSAM. They do not have to, they want to. If their servers are too slow for this, they need to buy more and faster servers, not exploit the hardware of other people.

And there’s the problem. One thing that the “think of the children” brigade are determined to ignore is that folk don’t have a problem with the actual scanning. The problem is doing it on the phone.

Apple, Google, MS and Facebook have been running server-side scans for ages. But Apple doesn’t seem to catch anywhere near as many images as the others. So either pedophiles don’t use iCloud (unlikely) or Apple’s server-side scan doesn’t work as well as the competition’s.

The difference, I guess, is that the other companies run their services on their own hardware; iCloud sits on top of AWS and Google services. Now imagine the increased billing they’d get from Google and Amazon if they have to process every single file, as opposed to whatever they’re doing now that gives them such a poor hit rate.

I can see why they’d want their customers to shoulder the burden.
 