Apple is not scanning your photos visually, only comparing hashes. Apple doesn't know what is in a photo until a hash matches. Pedophiles should be scared; other people shouldn't be.
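Conceptually, the on-device check is a fingerprint lookup, not content inspection. Here's a minimal sketch of the idea in Python (purely illustrative; Apple's real system uses its NeuralHash perceptual hash plus blinded matching, and every name and value below is made up):

```python
# Purely illustrative: the real design uses a perceptual hash (NeuralHash)
# and blinded set matching, not a plain SHA-256 lookup like this.
import hashlib

# Hypothetical database of known-bad image fingerprints (made-up entry).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(image_bytes: bytes) -> str:
    # The image content is never interpreted, only reduced to a digest.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_database(image_bytes: bytes) -> bool:
    # The only thing learned is whether the fingerprint is on the list.
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_database(b"raw pixels of a vacation photo"))  # False
```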
How do you not understand this? Nobody is afraid of their images being scanned. What people are worried about is the fact that once this door is open, it can never be closed. Apple cannot simply refuse a demand from any national government for the furtherance of this technology to be used in other applications. It WILL be abused. And there will be no turning back once it is released.

Honestly, if people like you don't understand the true issues of this situation, you really shouldn't be talking about it.
 
Apple would not jeopardize itself by doing that. If they do… a huge lawsuit is coming.
Agreed, but for that, people must realise that Apple is actually doing it... and without access to the servers, that's almost impossible without a judge's order. For example, right now we just suspect that all of our stuff in the cloud is monitored, but that's just speculation on our part, since nobody can come here and show us the code and the skeleton of that proof.
You can't file a lawsuit based on speculation and win it.
So yeah, I suspect Apple will address this in a gentle way to calm people down, but in the end they will still push it through hard. They have already stated that they don't give the government anything else but this... so yeah, it has already started.
I also suspect Apple will go even further and speak about it at the September event.
 
How is it you have a problem with that?
How can you not see how easily this system can be manipulated?

Apple has nothing to gain from getting rid of CSAM, but they have everything to gain from getting rid of "pirated" content.

Look at how the incentives are aligned here. Say you're watching a web rip of Ted Lasso on your iPad. Apple now has the incentive and the technology to report you. That's the next logical step.

Then governments want in on that. Because yeah, CSAM is bad, but if you're in the UK, the government considers transphobia and Islamophobia much worse problems, so obviously the database should include that too, right? And in China, memes critical of the government should be added to those databases, obviously. And in America? Wait, you still own that episode of Community where Señor Chang paints his face black? Off to the Gulag with you!

Obviously they're rolling out the technology under the guise of combating something that's universally reviled, but their real incentives are about copyright.
 
To the people who think those who don't have kiddie porn shouldn't be worried: first of all, I can almost guarantee you that those who actually have that stuff don't have it on their phones, and especially not stored in iCloud. So this scanning will be moot, because it won't actually catch any (or at most very few) sickos.
There are users and security experts, including people working in the field, who have stated that this thing would probably not do much. So it raises the question of why the sudden move to shoehorn this in on such a short timeline. Note that this wasn't even touched on or mentioned during WWDC, and there was no legal pressure on Apple about this. But Apple made a sudden decision, along with a "holier than thou" statement. It really makes you wonder about the actual purpose of setting up a system like this.
 
I think that Apple should stop implementing these kinds of techniques. It doesn't matter how you look at it, it's still a backdoor, and it creates an opportunity for authorities to demand that Apple "scan" for other things on a person's phone. Apple might find itself in a position where it can't refuse because

a. it's technically possible for Apple to do it
b. Apple can be forced to accept these requests because it needs to follow the law of that specific country

Even though Apple's intentions are noble, they are opening Pandora's box with this.

But do you think that if Apple stops implementing this technology, the technology will cease to exist?
 
So from what I understand, Apple takes the image hash and compares it to a hash database of known 'blacklisted' images. Now I haven't looked into the CSAM feature too deeply, but there are many ways of changing an image hash without 'changing' the image itself. For example, you could use steganography to make one small change to the image, and this would change the hash.
No. The hash algorithm in question is designed to be resistant to basic image manipulation. In particular, steganography, which typically doesn't change the appearance of an image, would not have any effect on the image hash.
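As a toy illustration of why (this is not Apple's NeuralHash, just a crude average-hash I wrote for this post; it needs Pillow): a cryptographic hash flips completely when a single pixel changes, while a perceptual hash shrugs off that kind of steganography-style tweak.

```python
# Toy comparison: a cryptographic hash vs. a crude perceptual average-hash.
# Not NeuralHash; just enough to show why a tiny pixel change defeats one
# and not the other.
import hashlib
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    # Shrink to 8x8 grayscale, then set one bit per pixel brighter than
    # the mean. Imperceptible pixel tweaks rarely change the result.
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

original = Image.new("RGB", (256, 256), (120, 60, 200))
tweaked = original.copy()
tweaked.putpixel((0, 0), (121, 60, 200))  # a least-significant-bit style edit

crypto_same = (hashlib.sha256(original.tobytes()).hexdigest()
               == hashlib.sha256(tweaked.tobytes()).hexdigest())
perceptual_same = average_hash(original) == average_hash(tweaked)

print(crypto_same)      # False: the cryptographic hash changes completely
print(perceptual_same)  # True: the perceptual hash is unaffected
```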
 
I think people are getting riled up for no reason. Come September, the new iPhone 13 Pro Private and Pro Max Private will take care of it. Pay extra for privacy and pick the Private models.

Calm down everyone. Apple can do no wrong. It’s an Apple forum, it does not behove us to bash Apple. I mean, look at how we treat anyone who has a problem with Apple that we don’t.

Just relax.
 
Apple needs to figure out another way to catch pedophiles. Scanning our photos, or hashing them, is not the right solution.

As consumers, why are we getting punished?
Apple is not trying to catch anybody. Apple is only complying with the law by not allowing illegal material to be stored on its servers.

IMHO, this will lead to E2EE for iCloud Photos, which are currently stored in iCloud where Apple holds the keys to decrypt the contents. Once Apple can prove to the authorities that it has a meaningful way to prevent unlawful material from being stored on its servers, it can proceed to enable E2EE.
 
...Obviously they're rolling out the technology under the guise of combating something that's universally reviled, but their real incentives are about copyright.

But why? Why bother with the CSAM step at all? If they wanted to use the technology for detecting copyright infringement (or anything else), why not just go straight for that? What would they stand to gain by using CSAM scanning as a first step towards that?
 
There are users and security experts, including people working in the field, who have stated that this thing would probably not do much. So it raises the question of why the sudden move to shoehorn this in on such a short timeline. Note that this wasn't even touched on or mentioned during WWDC, and there was no legal pressure on Apple about this. But Apple made a sudden decision, along with a "holier than thou" statement. It really makes you wonder about the actual purpose of setting up a system like this.
Because if they announced E2E encryption for iCloud Photos before doing this, there would be political blowback which, if sustained enough, might end with us all getting some US Senator's "solution" for child porn on iCloud, rather than a solution from technologists, cryptography experts, and privacy advocates.
 

Enough is enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please.


Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in the hands of government and law enforcement.

1. How about scanning Apple executives' iPhones for CSAM first? No one wants their privacy to be exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15?


Also, this guy needs to be FIRED from Apple. He is the mastermind behind this CSAM feature. What a joke!


Does anyone here have a game plan for how we can stop this crappy CSAM feature?


Don't turn on iCloud Photo Library. But quite honestly, as all this detects is child pornography, it makes me wonder why you wouldn't want it turned on.
 
And yeah, I'm a bit surprised by Apple's move, but if they only send a hash over the internet and you have done nothing wrong, why the worry?
From my understanding, no security voucher will be generated for an uploaded iCloud photo if its hash does not match the CSAM database. A security voucher is generated only for photos whose hashes match, and only photos that are being uploaded to iCloud are hashed. If there's no upload, no hash is generated, even if the photos stored on the device are indeed in the CSAM database.
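Put as code, the flow described above would look roughly like this (a sketch of my reading of the published description, not Apple's implementation; in the real design the device can't even tell whether a hash matched, since the matching is blinded, and the names below are invented):

```python
# Sketch of the upload-gated flow as described above. Invented names;
# Apple's actual voucher cryptography and blinded matching are far more
# involved than this.
import hashlib
from typing import Optional

KNOWN_CSAM_HASHES = {"known_hash_1", "known_hash_2"}  # hypothetical entries

def neural_hash(photo: bytes) -> str:
    # Stand-in for the real perceptual hash.
    return "hash_" + hashlib.sha256(photo).hexdigest()[:8]

def process_photo(photo: bytes, uploading_to_icloud: bool) -> Optional[dict]:
    if not uploading_to_icloud:
        return None          # no upload: nothing is hashed at all
    h = neural_hash(photo)
    if h not in KNOWN_CSAM_HASHES:
        return None          # no match: no security voucher
    return {"security_voucher_for": h}  # match: voucher accompanies the upload
```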
 
But why? Why bother with the CSAM step at all? If they wanted to use the technology for detecting copyright infringement (or anything else), why not just go straight for that? What would they stand to gain by using CSAM scanning as a first step towards that?
Exactly. I get fed up with people making a "slippery slope" argument - it means you cannot ever travel in any direction, ever.

Lower taxes = "carry this on and soon there will be no tax at all"
Raise taxes = "carry this on and we'll be in a communist country soon"

If people think preventing child pornography is fine - which everyone should - then they should be happy with this feature. If their concern is what Apple might do next, then raise it when Apple does whatever terrible thing next.
 
I think that Apple should stop implementing these kinds of techniques. It doesn't matter how you look at it, it's still a backdoor, and it creates an opportunity for authorities to demand that Apple "scan" for other things on a person's phone. Apple might find itself in a position where it can't refuse because

a. it's technically possible for Apple to do it
b. Apple can be forced to accept these requests because it needs to follow the law of that specific country

Even though Apple's intentions are noble, they are opening Pandora's box with this.

I can't get over the feeling Apple is doing this because of some pending porn legislation somewhere (GB or EU), some government coercion regarding market access somewhere else (China), or some broad decryption lawsuits or threats of monopoly breakup somewhere else (US, from the DOJ/FBI, or the new sideloading bill in the US Congress), and is either trying to comply with it, get ahead of it, or appease someone.

Fact is that by doing this, Apple is demonstrating a proof of concept.

Fact is that Apple won't be able to refuse when some government somewhere enacts a law, built on this proof of concept, that requires scanning for other images, symbols, words, or text strings, without Apple screening the results and with the results delivered directly to that government, or else risk indictments, civil suits, market closures, test cases, breakup, or regulatory threats and actions.

Fact is that Apple could be made to comply without being allowed to publicize its objections. For reference, just recall the National Security Letters that forbade/forbid companies from discussing the mere existence of being served such surveillance orders (and this is in the USA, which is theoretically more transparent than repressive countries, not to mention that it has a written Bill of Rights that many countries lack).

Fact is that Apple has encouraged us to think of, and use, our Apple devices as secure extensions of our brains, presumably subject to the same protections as our brains under its "Privacy As A Human Right" (BS) stance. Further, as a US company, its principles are presumably informed by the traditions and protections of the Bill of Rights, especially the 4th and 5th Amendments on search and seizure and self-incrimination, not forgetting that a person is presumed innocent and no search shall ensue without a judicial warrant justified by a reasonable suspicion of guilt.

Fact is that Apple has just (voluntarily?) become a willing extra-judicial adjunct of state security and law enforcement, with its plan to willfully perform warrantless searches, while thumbing its nose at the protections enshrined in the Bill Of Rights.

Fact is that Apple has already announced willingness (actually intention) to take further steps down the slippery slope by expanding to other countries and to 3rd party apps after starting with its own Photos app and iCloud services.

Fact is that millennia of human existence have given us some fundamental, immutable lessons: a) the state will try to overrun the rights of the individual on a pretext, b) mission creep is a real thing, c) moral zealotry is a dangerous thing, d) just because you can do it doesn't mean you should, e) appeasement doesn't appease, f) if you have stated principles, they must be inviolate, else they are not principles, g) doing the wrong thing for the right reason is still doing the wrong thing, and h) the road to hell is paved with good intentions.

I think most people would agree that CSAM is a scourge, but Apple is now so far on the wrong side of its rhetoric, its stated principles, the Bill of Rights, millennia of learning, and just plain good sense that one really wonders how Apple got itself tangled up in this issue, and how Apple could at turns be both so naïve and so arrogant as to think that a legal push won't now come to a statutory shove, one Apple won't be able to "vehemently refuse".
 
Your privacy isn't being impacted. Apple gains zero info on you from this. The only people it may impact are those who have EXACT CSAM matches.
An “Exact” hash match can still generate false positives.

But the biggest concern is that this will not stop at CSAM images. China will merrily demand that dissident images are added to this list and Apple will either comply or leave the market. The UK will want “offensive porn” (legal to do in your bedroom, but illegal to own images of :rolleyes:) added to the list and Apple will either comply or leave the market.

Repeat that across every country with some level of authoritarian government or spy-happy state apparatus and either Apple will cease to exist, having withdrawn from many markets, or their CSAM scanning will now be doing much more than just CSAM.
 
Who told Apple it was its job and mandate to catch pedophiles, and when?
If there is an official written order from some body, I would appreciate it being posted.
In the UK, since 2016, Apple would be committing a criminal offence if they divulged the existence of such an order. Yes, really.

As a communications equipment provider, they are also obliged to provide the technical means of obtaining private information.
 
The whole point of this isn't to find and prosecute child abusers; it is to get a system in place for monitoring everything we own digitally. By saying they are going after paedophiles, they have chosen a subject that is very hard to argue against (witness the posts earlier saying you must be a paedo if you are against it), so the public will think it is all a good idea, until their door gets kicked in over a meme about transgender people or whatever else their government takes offence at.
You never know quite how slippery the slope is until you are on it and it is too late, and if you give power to anyone, they will ALWAYS abuse it.
 
Exactly. I get fed up with people making a "slippery slope" argument - it means you cannot ever travel in any direction, ever.

Lower taxes = "carry this on and soon there will be no tax at all"
Raise taxes = "carry this on and we'll be in a communist country soon"

If people think preventing child pornography is fine - which everyone should - then they should be happy with this feature. If their concern is what Apple might do next, then raise it when Apple does whatever terrible thing next.

But sure, better to wait for the next incrementally terrible thing that follows the last incrementally terrible thing, just to be absolutely sure.
 