That's a fair point. But that's placing a LOT of trust in them, and basically making Apple the police in that situation. They hold all the power in that moment of deciding whether to shut you down and report you.

What advantage would there be for Apple to waste their time reporting innocent images to NCMEC? It won't go anywhere. I simply cannot understand where you're coming from here. Sorry.

#1 - A middle school girl, we'll say 13 years old, sends a photo of herself topless to her 14 year old boyfriend. The boyfriend is an idiot and sends it to a few friends. Because of iCloud's photo backup, the photo gets saved, analyzed, and reported to this manual review group. Then suddenly they're sitting in front of a topless photo of a 13 year old girl. That's child porn. What do they do? And the boys that simply received a text message... they get reported / shut down?

No, that photo would not be flagged because it's not a "known" child abuse image. Same thing goes for your 2nd situation with the video.

#3 - An adult is looking at pornographic pictures and saves one that ends up being an underaged girl - she's 17 but he thought she looked 21. He's now in possession of what could be considered child pornography, and for all we know it could be an image that matches something in the NCMEC database. Does his account deserve to get shut down and reported to NCMEC?

If that image is a known child abuse image, then of course it will be flagged. If people are downloading porn indiscriminately, then that's the risk they run. But keep in mind they've made it pretty clear that just one image is not going to be investigated. They said there's a certain number that have to be both flagged AND uploaded to iCloud before they review them. That makes sense, because most people who purposely download child porn aren't going to just download a few images. They normally have hundreds or thousands of images.
 
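The "certain number have to be both flagged AND uploaded" behaviour described above boils down to a simple tally. Here's a minimal sketch of that idea; the hashes, function name, and threshold value are all made-up stand-ins (Apple has not published its actual code or threshold):

```python
# Illustrative sketch of threshold-based flagging - NOT Apple's code.
# KNOWN_HASHES stands in for the NCMEC database of known-image hashes;
# REVIEW_THRESHOLD is a hypothetical number, not the real one.

KNOWN_HASHES = {"a1b2", "c3d4", "e5f6"}
REVIEW_THRESHOLD = 3

def should_escalate(uploaded_photo_hashes):
    """Count matches against the known list; only escalate to human
    review once the match count crosses the threshold."""
    matches = sum(1 for h in uploaded_photo_hashes if h in KNOWN_HASHES)
    return matches >= REVIEW_THRESHOLD

# A single stray match is ignored; a collection of matches is escalated.
print(should_escalate(["a1b2", "zzzz"]))                  # False
print(should_escalate(["a1b2", "c3d4", "e5f6", "zzzz"]))  # True
```

The point the poster is making maps onto this: one accidental match never reaches a reviewer, but an account holding many matching images does.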
Good! The fact that it's not a 1:1 hash comparison and it can find matches on cropped, color-adjusted, rotated, and transformed pictures means there is a chance for legit pictures shared between two consenting adults to be flagged. And software NEVER has bugs, right? It's just asking for disaster.
Not sure how this has anything to do with 2 adults sending their own pictures to each other. Apple is scanning against a known database of child abuse images. Your d**k pics you send would have a different hash.
 
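For the "your pics would have a different hash" point: with an ordinary cryptographic hash like SHA-256, changing even one byte of a file yields a completely unrelated digest (the avalanche effect), so a personal photo can't drift into a database match by being similar. A toy demo (the byte strings are arbitrary placeholders; Apple's actual system uses a perceptual NeuralHash, not SHA-256):

```python
import hashlib

# Two "images" that differ by a single trailing byte...
image_a = b"fake pixel data" + b"\x00"
image_b = b"fake pixel data" + b"\x01"

digest_a = hashlib.sha256(image_a).hexdigest()
digest_b = hashlib.sha256(image_b).hexdigest()

# ...produce wildly different digests: the avalanche effect.
print(digest_a)
print(digest_b)
print(digest_a != digest_b)  # True
```

This is also why exact hashing alone is useless here: any trivial edit to a known image would defeat it, which is exactly why Apple went with a perceptual hash instead, and why the non-exact-matching debate in this thread exists at all.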
So I have to put up with battery life being used, processor resources being consumed and storage space being taken up just so Apple can sniff through my photos to make sure I'm not a pedo? Personally I'm offended that Apple are treating me like a criminal and looking at my data without my consent.

Enjoy all the photos of the insides of my arcade machines Tim Cook you creepy voyeur.
 
Not sure how this has anything to do with 2 adults sending their own pictures to each other. Apple is scanning against a known database of child abuse images. Your d**k pics you send would have a different hash.

I think people really haven't bothered to read much around this. Just the headline, then cracked their knuckles and went full tinfoil hat keyboard warrior.

This image is on the main page of MacRumors.

[Image: apple-csam-flow-chart.jpg]
 
If that image is a known child abuse image, then of course it will be flagged. If people are downloading porn indiscriminately, then that's the risk they run. But keep in mind they've made it pretty clear that just one image is not going to be investigated. They said there's a certain number that have to be both flagged AND uploaded to iCloud before they review them. That makes sense, because most people who purposely download child porn aren't going to just download a few images. They normally have hundreds or thousands of images.

I missed that in the original article - and that does add a bit of reassurance.

In general, though, it still feels like it's taking too much of law enforcement's responsibility and pushing it onto a technology company.
 
On the bright side, I'm going to save thousands in the future when I buy a new smartphone by getting the 128 GB version instead of the 1 TB and 512 GB versions. Now that I'll no longer be trusting these devices with all my images and videos, there's no need for the big storage option for me. I'm ready to save that money. I didn't need every photo/video I took or saved from 2008 on me at all times anyway. If only I had known a few months ago I could have saved $700 by getting the base M1 iPad Pro.
 
I appreciate Apple's sentiment, but I can't imagine why Apple is pursuing this functionally anencephalic idea.

OK - so the scanning process matches your private photos against templates of child abuse photos using processing and memory local to your device.

First, how much memory and battery is that going to chew up?

Second, this is simply going to trigger an arms race between paedophiles and Apple. The paedophiles are going to edit/encrypt pictures, and eventually that will lead to new templates, more edits, more templates... until the number of templates becomes prohibitive. Unless of course the template matching does not require an exact match; in that case, there will be inevitable false positives.

Which leads to my third point: if your photo erroneously 'matches' a template, some human at Apple will review it. So some person could be looking at your private photos, perhaps of you, your kids or your partner, that have been misclassified. If that doesn't define creepy, I don't know what does.

And that's if the system works as advertised. What prevents the templates from being changed to detect memes, flags, political slogans, etc., allowing governments to abuse this type of surveillance as they have abused every other form of surveillance made available to them? Also, when is Apple going to start scanning our audio material for key words (oh, I forgot, the NSA do that), or scanning our text documents for verboten thoughtcrime, etc.?

This is such a monumentally stupid idea that I wonder if the government is pressuring Apple and other tech companies to engage in surveillance and Apple wanted to provoke a pre-emptive public reaction against it.

P.S. This sounds like a thread that in the past would have been put in PRSI *cough* because in addition to the Apple-specific news it raises general questions regarding the safety of the public versus the rights of individuals. Good luck keeping this thread just about Apple's hare-brained move.
 
What’s all over? Your crimes?
What a foolish comment... that's all you understood from this whole privacy concern? You think only sickos with kiddy porn are concerned??
If Apple can scan your photos and messages (for whatever reason), others can too... that means no privacy.

It's like leaving your house door open for the police to enter anytime they want. Guess what? Burglars can enter too!
 
On the bright side, I'm going to save thousands in the future when I buy a new smartphone by getting the 128 GB version instead of the 1 TB and 512 GB versions. Now that I'll no longer be trusting these devices with all my images and videos, there's no need for the big storage option for me.

Or if you have a Mac, just don't upload them to iCloud and it becomes a moot point; you can continue to store all your photos and videos on your phone and back them up to Photos on your Mac instead of iCloud.
 
So I have to put up with battery life being used, processor resources being consumed and storage space being taken up just so Apple can sniff through my photos to make sure I'm not a pedo? Personally I'm offended that Apple are treating me like a criminal and looking at my data without my consent.

Enjoy all the photos of the insides of my arcade machines Tim Cook you creepy voyeur.
So you believe Tim Cook will be personally reviewing photos of the insides of your arcade machines?
 
I just want to repeat this again:
1. For those "criminal" comments: slavery was legal back in the 1800s; look at where it is now. What's legal today could be illegal tomorrow. Look at what Sydney did: protesting was OK before, but illegal during lockdown.
2. For those "bash child porn" comments: this tech will do far more harm than good. It's not as if current technology is so bad at catching criminals that mass surveillance is required.
I'd like to declare that the death of privacy happened today. Everyone will be a criminal at some point, intentionally or not, and there's nothing to stop it from happening.
 
Not sure how this has anything to do with 2 adults sending their own pictures to each other. Apple is scanning against a known database of child abuse images. Your d**k pics you send would have a different hash.


And a cropped image from the database will have a different hash. It’s not a 1:1 hash comparison which can lead to false positives.
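The trade-off being argued here is easy to demonstrate with a toy perceptual hash: set one bit per pixel depending on whether the pixel is brighter than the image's average, and compare hashes by Hamming distance (number of differing bits). Small edits barely move the hash, which is what makes matching robust to colour tweaks - and also what creates the possibility of false positives. This is a deliberately simplified sketch with made-up pixel values, not NeuralHash:

```python
# Toy "average hash": 1 bit per pixel, set if the pixel is brighter
# than the image's mean. NeuralHash is a neural network, but the
# robust-matching-vs-exact-matching trade-off is the same.

def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(h1, h2):
    """Number of bit positions where the two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

original = [200, 30, 180, 40, 220, 25, 190, 35]
brightened = [p + 10 for p in original]           # mild colour adjustment
unrelated = [50, 210, 45, 200, 60, 230, 55, 190]  # different picture

print(hamming(average_hash(original), average_hash(brightened)))  # 0: still matches
print(hamming(average_hash(original), average_hash(unrelated)))   # 8: far apart
```

A real system declares a match when the distance falls under some cutoff rather than demanding equality; that cutoff is exactly where the false-positive risk the poster describes lives.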
 
others can too... that means no privacy.

That's a rather big stretch.

I'll have a stab at this line of thinking.

"If Apple can scan my photos using AI to find cats, dogs and faces, others can do it too, that means no privacy"
 
Not sure if people are too obtuse or if the EFF and Snowden are trying to fool everyone (looking at this thread, they're easy to fool, I guess), but iCloud has never been end-to-end encrypted. Only data that exclusively resides on your phone is. Apple has been sharing users' iCloud data (not iPhone data) with police since iCloud's inception. The "client-side processing" is Apple allowing the iCloud photos to be scanned. Was everyone living under a rock that they didn't know this about iCloud?
 
There’s something bigger going on here if Apple is acting as an arm of the government which this program absolutely is.

Apple may not be giving the golden key to the state but it’s sure accessing and providing the contents that the key was supposed to protect.

Do it here, do it there, eventually do it everywhere.

This marks the end of privacy, because we can't trust Apple to treat privacy, which they claim is a human right, as an absolute.
 
So if I want to cause trouble for someone, it's enough to make an Apple account with their info and fill iCloud with child porn. Easier and cheaper than sending drugs from the deep web to someone's home...
 
You sound a bit out there. Snowden is an enemy of the government because he is a traitor. Any government in human history would feel the same way. Russia is harboring him to act like they’re better than us, not because they agree with him. If he did to them what he did to the US then he’d simply be poisoned to death.

But no, he’s not going to be written out of history. You need to wake up and get a grip.
Ok fine. He will be written into history as a traitor. Which he is not. His government was being evil, and he acted with a higher purpose of right versus wrong to address it.

Snowden revealed that your own government already was treating you like a criminal, and you should be outraged. If you think I’m “out there” for thinking this, I’d doubt your own motivation.
 
As a long-time iPhone user, I'm shocked and angry at Apple's foolish new policy, probably another marketing stunt just to look good, at the cost of destroying customers' privacy. Unbelievable.

Just when I was trying to forget the whole Pegasus malware story and how absolutely USELESS the whole iPhone security/privacy claims really are, we hear about this stupid policy.

My next phone will not be an iPhone.
I bought iPhones because I thought I would have excellent privacy. All other features are available on Android.
 
Awful step by Apple. And the 'alert your parents if you view a nude' feature is some awful overreach, but typical of Apple's strangely prudish approach.
Yep! It is so symbolic of the mental illness that is modern American sensibilities.

Sharing false information about diseases on social media or downloading files so you can 3D print a gun? All good!

Looking at pictures of Michelangelo’s David one of the greatest works of art ever created? ALERT ALERT ALERT!
 
As a long-time iPhone user, I'm shocked and angry at Apple's foolish new policy, probably another marketing stunt just to look good, at the cost of destroying customers' privacy. Unbelievable.

Just when I was trying to forget the whole Pegasus malware story and how absolutely USELESS the whole iPhone security/privacy claims really are, we hear about this stupid policy.

My next phone will not be an iPhone.
I bought iPhones because I thought I would have excellent privacy. All other features are available on Android.

Google awaits your photos for full scanning :D
 
Why does this fanfare of criticism only come into force when Apple do something? Google (and the other tech giants) have been doing this since 2008.
 
I'm OK with the scan as long as it happens on the phone. And if I'm reading correctly about how it happens, it tallies the number of hits it gets, and if there are more than some threshold it notifies Apple, who then reviews it. This part concerns me. First, Apple has not been transparent about the review process. It needs to be, so that if something goes wrong they can be held accountable for mismanagement of the data. Second, they are not officials elected by the people. They are a company of private individuals. They should not be able to decide for us what is OK and what is not. This review should be governed by legal definitions and law enforcement.
Should we be working against the exploitation of children? Yes. Should we be putting how this is done in the hands of non-government officials and organizations? Absolutely not.
 