No, I believe Apple is being forced to create a backdoor; CSAM is just the excuse. On-device scanning can be expanded to cover literally anything. In the USA, re: terrorist threats, pictures of your massive gun stockpiles? In Muslim countries, photos of women not in burkas? If it can scan images, text is a breeze.

Apple has been a champion of privacy, so why are naked images of children suddenly so important to eradicate now? It's important to listen to what Apple is saying, but even more important is what Apple is NOT saying... If Apple was totally on board with this, we'd have known years ago. It wouldn't be a shocking 180° turnaround from their core value of privacy, which they heavily advertise worldwide. Once that hash scanning has been installed into iOS, the NSA can get anything on any iPhone anywhere. And the NSA doesn't talk about all the once-illegal/still-unconstitutional things they do.
I am one of those who believe this is a sop to the government. Apple is rolling out E2EE, and they need to "look" like they are doing something to combat child porn since so much data has gone dark. I would be surprised if they don't roll out E2EE in a couple of weeks; it makes perfect sense as an explanation for why this is happening.
 
It’s all fun and games when they’re coming for the pedophiles, because I’m not a pedophile.

It’s all fun and games when they’re coming for the atheists, because I’m not an atheist.

It’s all fun and games when they’re coming for the foreigners, because I’m not a foreigner.

It’s no longer fun and games, because I said something they didn’t like, and now they’re coming for me.
What?

You are equating child abuse with being a foreigner?

Come for the pedophiles! Arrest them! But with warrants and due process, not by violating the rights of everyone in the process.
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
Ridiculous! They are either unaware of this or have long since deactivated their iCloud Photo Libraries.

It's always how it starts: "FOR THE CHILDREN" or some other "worthy" cause.

And then it's everyone who displeases the local police force, the local anti-abortion crazies (hello there, Texas), or the local dictator.
 
No, they are not just matching exact copies; the system is designed to catch manipulated versions of known CSAM, i.e. it's scanning for pictures that "seem to be" those CSAM pictures. Which means your family photos can be false positives. That's why they have the 30-picture threshold. And the more pictures you have in iCloud, the more likely you'll end up getting falsely flagged.

Once you're flagged and your family photos are being reviewed, pray it's something that an American reviewer deems culturally appropriate in their country, or else...
No, they are matching hashes of known CSAM against hashes of photos you upload to iCloud. Apple says they are accurate to one in a trillion false positives, so the naked-kids-in-the-bath pictures or romping-through-the-sprinklers pictures will not get flagged. And then you have to have 30 of these images before they will have a human look. But if they are indeed matching with 1-in-1,000,000,000,000 accuracy, there is no way you will ever get in front of a human reviewer unless you have actual CSAM that exists in at least two databases in two different legal jurisdictions.
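If anyone wants to sanity-check what a 30-match threshold does to the odds, here is a quick back-of-the-envelope toy in Python. The per-image false-positive rate is a number I made up for illustration (Apple's published one-in-a-trillion figure is per account, per year), so treat this as shape-of-the-curve only:

from math import comb

def prob_account_flagged(n_photos, per_image_fp, threshold=30):
    # P(at least `threshold` coincidental matches among n_photos images),
    # treating each photo as an independent Bernoulli trial.
    head = sum(
        comb(n_photos, k) * per_image_fp**k * (1 - per_image_fp)**(n_photos - k)
        for k in range(threshold)
    )
    return 1.0 - head

# Assumed per-image false-positive rate of 1 in a million (illustrative only):
for n in (1_000, 10_000, 100_000):
    print(n, prob_account_flagged(n, 1e-6))
# At rates like this the result underflows to 0.0 in double precision --
# requiring 30 simultaneous coincidences is doing most of the work.

Both posts above are directionally right: the probability does grow with library size, but needing 30 independent coincidences crushes it to effectively zero at any per-image rate near what Apple claims.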
 
No child predators are uploading their illegal content to iCloud, let's be serious. This feature is just a huge misstep and a slippery slope for everyone ELSE's privacy. They were never going to catch child predators with this feature…
That's not what I have read; iCloud is apparently used for the distribution of CSAM regularly and widely. It makes sense if Apple has reported fewer than 300 people while Facebook has reported 20 million. If I were distributing CSAM, I would be looking at iCloud as a good distribution method.
 
Are you okay with everyone having to use a breathalyzer every time they start their car, because we need to make sure we get all those drunk drivers?
Are you okay with your car calling the police on you when you go over the speed limit, because speed kills, you know?
Are you okay with your new TV having a camera looking for drugs in your room, because we need to get rid of all those drug addicts?
These three examples are not the same. Because when it comes to CP, it's 100 percent wrong, 100 percent of the time.

You know what though, I would like cars to have a breathalyser. That would dramatically reduce the carnage from drink drivers.
Also, some countries in the world like Australia have far less speeding because they have cameras everywhere.
And there are plenty of things that can be done to reduce drug usage.
So there are things that can be done to reduce the issues. Is the way Apple is doing it the right way? Probably not.

Are you okay, a year down the road, when Apple expands the CSAM scanning to other illegal activities our government doesn't like, because those will be good causes too? Just wondering where your line in the sand is.
That's not a fact. That's a what-if. You are only guessing. We can think about worst-case scenarios, and that's important, but presenting it like it's a fact does everybody a disservice.
 
I understand why this is a slippery slope but I don’t like the idea of child predators breathing a sigh of relief.
Do you really think it could stop them?
Do you believe it won't be used against other people who are not aligned with how the government thinks?
 
Sorry. There was no glossing over. Your info was factually wrong.
And I would even ask about the bolded section. I believe you may be incorrect about the procedure there as well in terms of the photos being sent from your iPhone to Apple.

FROM APPLE:
Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC.

------------

It does not mention that it accesses the photos on your phone. Please provide literature where that is clearly stated; otherwise, you are making it sound like Apple is directly accessing your phone when that is seemingly not the case.
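Side note for the curious: "threshold secret sharing" in that Apple quote is a standard cryptographic primitive. Here's a toy Shamir-style sketch in Python of the core idea, that no single voucher reveals anything and only reaching the threshold unlocks reconstruction; the field, API, and parameters here are my own simplifications, not Apple's actual construction:

import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo field

def split(secret, threshold, n_shares):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    evaluate = lambda x: sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, evaluate(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 123456789  # stands in for the key that decrypts the safety vouchers
shares = split(key, threshold=30, n_shares=100)
print(reconstruct(shares[:30]) == key)  # True: 30 shares recover the key
print(reconstruct(shares[:29]) == key)  # False: 29 shares yield only noise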
From Apple.com…
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
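Putting those two quotes together, the described flow is roughly the shape below. To be clear, this is my own illustrative sketch: every name is invented, and it deliberately skips the hard parts (NeuralHash, and the private set intersection that hides the match result even from the device), because the point is only where each step runs:

import hashlib
from dataclasses import dataclass

# Blinded hash database shipped to the device inside iOS (entries hypothetical).
KNOWN_CSAM_HASHES = {"<blinded hash entries>"}

@dataclass
class SafetyVoucher:
    encrypted_match_result: bytes      # unreadable by Apple below the threshold
    encrypted_image_derivative: bytes  # the "additional encrypted data" in the quote

def perceptual_hash(image_bytes):
    # Stand-in only: the real system uses NeuralHash, which matches visually
    # similar images; SHA-256 here matches exact bytes and nothing else.
    return hashlib.sha256(image_bytes).hexdigest()

def opaque_encrypt(data):
    # Placeholder for the threshold-gated encryption Apple describes; in the
    # real design these bytes stay opaque until the account crosses the
    # threshold (see the secret-sharing toy above).
    return repr(data).encode()

def prepare_upload(image_bytes):
    # Runs ON the device, before the photo leaves for iCloud Photos.
    # (Real PSI would hide `matched` even here; this toy computes it openly.)
    matched = perceptual_hash(image_bytes) in KNOWN_CSAM_HASHES
    return SafetyVoucher(
        encrypted_match_result=opaque_encrypt(matched),
        encrypted_image_derivative=opaque_encrypt(image_bytes),
    )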
 
They'll try to sneak it in next year with a 15.x update, but until then, please buy the iPhone 13 (/s).

I know they don't want to host that stuff on their servers, but if they care about user privacy they won't do on-device scanning. Keep it so that images are scanned when someone uploads them; then flag the account immediately and, if it's illegal content, report them to the authorities.
 
One swallow doesn’t make a summer.
And the referenced article demonstrates the simplicity of the methods that can be used to groom children.
The article calls on big tech to do more (à la CSAM scanning), and nowhere does it support your assertion that paedophiles are particularly intelligent.

Paedophiles aren’t playing with a full deck…
 
I bet they did some "test" scans already and found out way too many people would be incorrectly referred to the police for the content found.

As others have said, this is also for governments, under the guise of "THE CHILDREN," and Russia, China, and Saudi Arabia must be licking their lips at this happening.

Never thought I’d say this but I just don’t trust Apple at all.

I knew they were sneaky and cheap with things like RAM, storage, OLED, and hardware in general, but I always trusted their privacy, even if it became a little OTT lately with the marketing.
 
They will end up CANCELLING IT. 🤫

It’s been a hot mess. Even good ol’ Craig (executive) admitted it. He knew Apple’s approach was wrong.

They won't cancel it; like the other comment said, they will wait till after the iPhone release so they don't hurt early sales.
 
From Apple.com…
... This voucher is uploaded to iCloud Photos along with the image.

Yeah. Exactly. You are arguing my point. It's not from your phone, it's from iCloud. I think we are actually arguing the same thing here, but you worded it quite poorly. The human review is triggered from iCloud, not from your phone.

This is what you wrote: "After a 30 hash match on your iPhone, if iCloud photo sharing is ON, a notification & an image from that iPhone will be sent to Apple for a person to review."
 
The search happens on your phone BEFORE your pictures are uploaded to iCloud. To do this, the search even uses a hashed database that gets installed ON your phone. And the human inspector seems to be able to look at your real, unhashed pictures.

I'm glad they're rethinking this whole idea. My private device should be untouchable, not treated as somebody else's search tool, good intentions or not.
 
These three examples are not the same. Because when it comes to CP, it's 100 percent wrong, 100 percent of the time.

You know what though, I would like cars to have a breathalyser. That would dramatically reduce the carnage from drink drivers.
Also, some countries in the world like Australia have far less speeding because they have cameras everywhere.
And there are plenty of things that can be done to reduce drug usage.
So there are things that can be done to reduce the issues. Is the way Apple is doing it the right way? Probably not.


That's not a fact. That's a what-if. You are only guessing. We can think about worst-case scenarios, and that's important, but presenting it like it's a fact does everybody a disservice.
Where I come from, a little drunk driving is not okay. Regardless, these were just some examples off the top of my head. I am sure you will agree that there are numerous good causes that require policing. So why, out of the blue, has it become Apple's obligation to police just this one thing?

If everyone is okay with Apple policing child porn, why isn't it right for them to start policing other things? Do you not agree that this tech, and/or other new tech that Apple could implement to police other things, would be equally useful and important? In other words, you are giving them the authority to police, so why shouldn't they expand their policing for the betterment of society? Since you are okay with your liberties being taken away and with living under a police state, you should be, and probably are, okay with that. I am not.

Knowing this, let's stick with child porn, then. Are you okay if the TV you buy comes with a camera to monitor your room for CP? Really, it's only going to be used just for that. Why? Because Sony says so, and it would be a "guess" to think otherwise.

My point was not that CP isn't 100% bad or that it shouldn't be policed. My point was: why are some people okay with random private companies that we buy products from deciding to police, and choosing what gets policed and how? If you allow Apple this right, you should be okay with other companies policing what they want, how they want. As I mentioned, we have police for this. The police should police, and Apple should stick to making computers and phones.
 
Think of it the other way around: you pay Apple for your phone, so how about you start to inspect Apple's finances for, say, tax evasion? You hack into their systems and look for financial and tax data in case they violated something. Whenever 30 of your lawyers believe they have found something relevant, you call the state prosecutor for tax fraud.
Good intentions, maybe, but Apple would not like it, for sure.
(Just fictional, for debate; I am not claiming Apple does anything wrong concerning taxes.)
 
Even if the first goal of this initiative looks legit, it can go wrong at any time. The main point is that Apple isn't (and shouldn't be) in a position to decide what's safe or not for people, because next year, for example, they could decide to "protect" children against whatever Apple/Google/Facebook consider politically incorrect. And that's very likely to happen, considering how strong Apple's political involvement is nowadays.
 
Pedophiles and sex traffickers everywhere are cheering their victory in swaying the opinion of the cowardly sheep that didn’t understand this useful technology. Apple just got sick of the incessant bleating.

I'm not so sure it was going to do all that much anyway. If you don't have iCloud Photos/Photo Stream turned on, it wasn't going to trigger anything (for now, at least). Paedophiles would have just changed phones. Like I said previously, it would be interesting to see the sales figures for Android phones in Hollywood once this is introduced.
 