I understand the objections here: the system as planned relies on us trusting Apple. Personally I believe that Apple have, at least more than any competitors, earned that trust: rather than claim that they're throwing it away, I think they're spending it wisely.

If we can assume that Apple will act in good faith then this seems like a well balanced proposal. If we can't assume that Apple will act in good faith then this proposal isn't the problem.

The problem is that Apple may act in good faith, but when they are compelled by a court order, they will no longer be acting in good faith, but under orders.

Do you (not you specifically, you in general) honestly think Apple will pull out of China if China wants them to scan for something? Ignoring of course that China might already be doing it since Apple already caved and is housing Chinese iCloud servers inside China?

If the fact that Apple already caved to China isn't evidence enough, just wait until Russia does it. Or the US. Or Australia or Canada or the EU.

Apple won't withstand the pressure and it will have to cave.
 
The last 5 minutes of this really is a great take and great point by reasonable people who are not "extreme" in any way. Everyone should watch this.

Yes, the last five minutes … The battle isn’t with Android; it’s with apps and services that allow folk to switch away from Apple without losing their social media connections or their pictures, or their documents, their music …
 
The problem is that Apple may act in good faith, but when they are compelled by a court order, they will no longer be acting in good faith, but under orders.

Do you (not you specifically, you in general) honestly think Apple will pull out of China if China wants them to scan for something? Ignoring of course that China might already be doing it since Apple already caved and is housing Chinese iCloud servers inside China?

If the fact that Apple already caved to China isn't evidence enough, just wait until Russia does it. Or the US. Or Australia or Canada or the EU.

Apple won't withstand the pressure and it will have to cave.
What exactly do you think those countries could ask Apple to scan for?
 
Yes, the last five minutes … The battle isn’t with Android; it’s with apps and services that allow folk to switch away from Apple without losing their social media connections or their pictures, or their documents, their music …

Exactly. It’s a battle against lock-in, and Apple has been genius at keeping people locked into its ecosystem (myself included).
 
It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse.

Really? EVERYTHING? So are you advocating eliminating the 4th Amendment in the US? Cameras in your house on 24/7 so we can do "EVERYTHING possible to stop this kind of abuse"? Microphones active there 24/7? Recordings 24/7 to be able to verify?

No curtains in your home, after all, if you don't have something to hide, why would you need curtains? Or doors on the bathroom?
 
The problem is that Apple may act in good faith, but when they are compelled by a court order, they will no longer be acting in good faith, but under orders.

Do you (not you specifically, you in general) honestly think Apple will pull out of China if China wants them to scan for something? Ignoring of course that China might already be doing it since Apple already caved and is housing Chinese iCloud servers inside China?

If the fact that Apple already caved to China isn't evidence enough, just wait until Russia does it. Or the US. Or Australia or Canada or the EU.

Apple won't withstand the pressure and it will have to cave.

Well, if it’s the law, then they’ll have no choice.



 
Well, if it’s the law, then they’ll have no choice.

Then there’s the problem of state agencies getting around their own privacy laws by having agencies from other states help them spy on their own citizens.



Exactly.

So relying on Apple to be "doing the right thing" gives a false sense of security.
 
Tanks in Tiananmen Square for one thing.

Hong Kong protests.

The Taiwanese flag.

Rubber duck sculptures. (Used to reference the tanks)

Candle icons. (Used to mourn dead protesters)


Those are some of the things that are censored.
Ok, Tiananmen Square. That's one, maybe two pictures. Not enough to do anything.

Everything else you listed is absurd. Do you know how many pictures of protests, flags, ducks, and candles exist? How are you going to scan for them?
 
Wow, fast moving comment section!

I understand the objections here: the system as planned relies on us trusting Apple. Personally I believe that Apple have, at least more than any competitors, earned that trust: rather than claim that they're throwing it away, I think they're spending it wisely.
It's NOT about trusting Apple!

How many times do we have to say this? Your own government is the most dangerous player in all this.
It’s interesting to note that Apple solves problems. That’s at its very core.

How do they know that children are being exposed to explicit images and that they’ve got lots of CSAM images on their servers?
They’ve already looked and for them to propose this action I suspect the issue is HUGE otherwise they’d do nothing!
Says you. You have no idea if this is true. What, are you just assuming now?
I understand a lot of forum members have concerns about privacy issues but rather than bleating about privacy how about suggesting better alternatives to keep our children safe!
There you go, starting out with "I understand", but turning it right around to an insult against your own fellow forum members with the "bleating" comment, and then expecting people to trust your judgment and expecting people to even want to have a conversation with you.
Personally I think Apple are going to do this regardless. They may change how they represent it and how much they tell the user but they clearly want this to happen.

The only reason I think they've put it on pause is iPhone 13 sales. Once the iPhone 13 is out and iOS 15 is on enough devices, they'll just push it out. It's not like anyone at that point, once on iOS 15, is going to say no to software updates from then on out.
I don't care why Apple decided to do the right thing. I just care that they do the right thing. If they were worried about having a bad launch or not making revenue targets, then so be it.

But they need to shut down the program.
So you don't want Apple to use CSAM scanning? But is it ok with you that you can never delete emails, posts, tweets, and more from Apple, Google, Twitter, and others?
Okay, you confused me with your statements here. Can you plainly state what you mean? Are you for or against Apple using CSAM scanning?
It's mind-blowing how people are not prepared to do EVERYTHING possible to stop this kind of abuse. People need to get it into their heads that if they are online in any way, there is no such thing as complete privacy. Someone will be able to get into whatever you think you are protecting. Your location, everything: it's all tracked. You're fighting for an illusion.
It's mind-blowing how some people have no trouble with false accusations. It's not about "doing EVERYTHING possible to stop abuse". It's about NOT accusing innocent people of such a crime. And this use of CSAM scanning leaves way too many openings for that to happen.

At least in the US, our Founding Fathers felt that it was not permissible to have a legal system that punished people who were innocent of the charges against them. They felt that it was wrong that the people where they came from didn't even have the freedom to speak their minds, or to worship as they saw fit.

The Founding Fathers knew that oppressive regimes would not hesitate to put people away just for saying their opinion out loud. And they knew this because they had experienced it themselves.

And today, we have a lot of countries that STILL do that.

So I don't know why some people won't do EVERYTHING possible to PREVENT false accusations. I don't know why some people won't do EVERYTHING possible to PREVENT people from being shut down, shut up, or outright canceled.
 
They are only scanning on device for photos being uploaded to their server anyway.
For now. But anyway, that alone is not acceptable to me.
If they keep scanning 100% server-side, then we have to accept that we can't have true end-to-end encryption, meaning all photos are viewable, not just abuse photos.
First off, Apple has never announced end-to-end encryption of everything. Second, e2ee is not real e2ee if there's a back door sitting in front of it. Lastly, that's fine by me; I never asked for e2ee, and I think the server is where any scanning should take place.

If you have no abuse photos, this feature does not affect you one little bit. If you do have abuse photos, you’re sick and should be caught.
Yes, it does affect me: it's on my device scanning my photos if I upload them to iCloud. Big Brother.
 
“The responses to Apple's plans have been damning: over 90 organizations across the globe have urged the company not to implement them, for fear that they would lead to the censoring of protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children.”

I understand the other hypothetical risks listed, but why or how would Apple’s original CSAM plan “have disastrous consequences for many children”? 🤔
What, over 90 organizations (not just 90 people) come out against this, and you say you don't get why? Why not do a little research into why?

But here, I'll help you how this could have disastrous consequences for many children:

1. False accusation against a minor who receives or takes a photo that "looks like" a bad photo enough to get a hash match. Now we get into a court case where the government is going after a minor (or the minor's parents) over a photo?

2. A person who is a parent gets falsely accused for taking or receiving a photo that matches a hash. Now what? We go to court, or based on what some people are saying here maybe we just skip the court phase and directly put that parent in jail and we send all his or her kids off to state care.

There are probably a thousand other ways where this hurts children, but that's just the first two that I could think of.
 
What, over 90 organizations (not just 90 people) come out against this, and you say you don't get why? Why not do a little research into why?

But here, I'll help you how this could have disastrous consequences for many children:

1. False accusation against a minor who receives or takes a photo that "looks like" a bad photo enough to get a hash match. Now we get into a court case where the government is going after a minor (or the minor's parents) over a photo?

2. A person who is a parent gets falsely accused for taking or receiving a photo that matches a hash. Now what? We go to court, or based on what some people are saying here maybe we just skip the court phase and directly put that parent in jail and we send all his or her kids off to state care.

There are probably a thousand other ways where this hurts children, but that's just the first two that I could think of.

Photos you receive don't get scanned, photos that merely look like a bad photo don't get you in trouble, and one photo isn't enough to raise an alarm. Your worries are not based in reality.
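The "one photo isn't enough to raise an alarm" claim in the reply above refers to a match threshold. A rough sketch of how threshold-based flagging works is below; the hash values and the threshold are purely illustrative (the threshold was reported to be on the order of 30), and the real system used perceptual hashes with a cryptographic threshold scheme, not the plain set lookup shown here.

```python
# Sketch of threshold-based flagging: a single coincidental hash match never
# raises an alarm by itself; only crossing a preset threshold does.

MATCH_THRESHOLD = 30  # illustrative; reportedly "on the order of 30"

def count_matches(photo_hashes, known_bad_hashes):
    """Number of library hashes that appear in the known-bad database."""
    known = set(known_bad_hashes)
    return sum(1 for h in photo_hashes if h in known)

def account_flagged(photo_hashes, known_bad_hashes, threshold=MATCH_THRESHOLD):
    """True only when the match count meets the threshold."""
    return count_matches(photo_hashes, known_bad_hashes) >= threshold

# One stray match (e.g. a false positive) stays far below the threshold:
library = ["h1", "h2", "h3"]
database = ["h3", "h9"]
assert count_matches(library, database) == 1
assert account_flagged(library, database) is False
```

Whether the threshold adequately addresses the false-accusation scenarios described in the quoted post is, of course, exactly what the thread is arguing about.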
 
Please explain to me this line 'Just because someone is against this doesn't mean we're against finding and getting rid of this filth.'. If a person is against having CSAM on their device then they are against finding and getting rid of this filth. There is no middle ground here.

You can not say 'we are against it but we are not going to allow you to check our device'. It does not work like that no matter how you try to word it or spin it.
Let's end all wars by eradicating mankind! What, you are against eradicating all mankind? So you like wars! There is no middle ground there.

Seriously, there is a lot of middle ground there. The ends do not always justify the means. The on-device CSAM detection is the wrong method. This is no spin - your position is so absurd that I have trouble believing you are honest about it.


There is one positive thing for me in this debacle: Apple's transgression is so blatant, so offensive, that it has demolished the "think-of-the-children" excuse. Proponents are flailing in desperate disbelief as "think-of-the-children" no longer silences the room and lets them have their way.
 
That’s not what is happening.
To use your analogy:
Apple's proposal is to scan as the person is “actually leaving the house with said image”.
No scans take place if that photo is destined to remain on device and is not being uploaded to iCloud photos.

How hard is this to understand?

Apple does not document the CSAM match unless the photo is leaving the house (the phone) to go outside (upload to iCloud). Only then do the scan and documentation occur.
So it's okay if they search you every time you leave the house for illicit photos? It's not okay to me!
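The upload gate described in this exchange can be sketched as follows; the type and field names are hypothetical illustrations, not Apple's implementation.

```python
from dataclasses import dataclass

@dataclass
class Photo:
    data: bytes
    destined_for_icloud: bool  # hypothetical flag: queued for iCloud Photos?

def photos_to_scan(photos):
    """Per the claim above, only photos 'leaving the house' (being uploaded
    to iCloud Photos) get hashed and checked; photos that stay on device
    are never evaluated."""
    return [p for p in photos if p.destined_for_icloud]

# Only the photo flagged for upload is selected for scanning:
selected = photos_to_scan([Photo(b"a", True), Photo(b"b", False)])
assert len(selected) == 1
```

Whether that gate is reassuring is precisely the disagreement here: the scan still runs on the device, just conditioned on the upload flag.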
 
You seem to have a misguided notion that private companies are obligated to protect your constitutional rights. They are not. The government is obligated to do that. If you don’t like what Apple is doing, you are free to use another platform.
You're right, private companies are not obligated to protect my constitutional rights.

But I EXPECT them to not violate my rights, particularly my right to due process under the law. Or I'll fire them and find somebody else who doesn't violate my rights.
 
In all honesty, the on-device CSAM scanning makes no financial sense for Apple. Why on earth are they even considering doing it? In the long run Apple will lose some sales and market share because of this, but more importantly Apple has to answer to those who want to expand the scanning from CSAM to other types of material. The only logical move for Apple is to scrap any on-device scanning plans and go back to building “more secure” mobile devices. People have paid a premium for privacy, and that's what they are expecting. Literally having a device which spies on its owner and reports them to a third party, which might or might not report the user to law enforcement, isn't a good selling point.
This is the only remaining mystery, since by now we understand the technology and have seen feedback from interest groups and academia.

We've discussed this point in many of the other threads but are obviously hampered by our lack of inside data.

Below is a summary of the prevailing hypothesis, with a dash of Occam's razor applied as we go along:

1 - An assumption that Apple has suddenly turned into a company willing to throw years of consumer trust overboard for a noble social cause sits very poorly with the evidence, which shows Apple to be predominantly extraordinarily cautious and commercially cold-blooded and analytical. Therefore we accept that Apple has taken this initiative after careful commercial and legal evaluation.
2 - Once we accept the premise from 1, we must consider possible reasons for the action:
2.1 Apple projects increased revenues and sales from this initiative. Given the almost universally negative public feedback and Apple's lack of interest in the area to date, it's unlikely that's the reason.
2.2 A more likely explanation is that Apple is facing a potential loss from their inaction - as one of the few IT companies - towards removing CSAM from their services. We can't gauge the size, probability, or timetable of the loss from inaction, but given that Apple has taken action, we hypothesize it's substantial and likely. Let's define that weighted loss as WLoss(inaction)
2.3 @Nuvi is entirely correct, there is certainly a loss associated with the action taken. Let's define that weighted loss as WLoss(action)
2.4 Apple's analysis must have shown WLoss(inaction) > WLoss(action). CSAM scanning then gets implemented, to minimize the projected loss.

Without more information, we can't refine our analysis further. So, here we are 🥳.

I wonder how much Apple has actually gamed this out, and if we're on one of their playbooks or into unknown territory.

Please, can someone sue already - so we can get to discovery and get at those emails 😂
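The comparison in point 2.4 amounts to a simple expected-value inequality. The probabilities and magnitudes below are invented solely to illustrate its shape, since (as the post says) nobody outside Apple knows the real inputs.

```python
def weighted_loss(probability, magnitude):
    """Expected loss of an outcome: its probability times its cost."""
    return probability * magnitude

# Invented numbers, purely to show the shape of the hypothesis in 2.2-2.4:
wloss_inaction = weighted_loss(0.6, 10_000)  # e.g. regulatory/legal exposure
wloss_action = weighted_loss(0.9, 4_000)     # e.g. reputational/sales damage

# The hypothesis: Apple scans because WLoss(inaction) > WLoss(action).
assert wloss_inaction > wloss_action
```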
 
There is one positive thing for me in this debacle: Apple's transgression is so blatant, so offensive, that it has demolished the "think-of-the-children" excuse. Proponents are flailing in desperate disbelief as "think-of-the-children" no longer silences the room and lets them have their way.

This is exactly the correct take. Everyone should be appalled at the circumvention of the 4th Amendment via the private search doctrine while effectively wiretapping your device. The irony is that countries that have already eroded civil liberties and don't recognize the privacy rights of their citizens don't even waste time on the "stopping child abuse" head games, as that is always just a way to get a foot in the door. Apple effectively nullified every positive stance they took on user privacy in the past with this one act.
 
Tanks in Tiananmen Square for one thing.

Hong Kong protests.

The Taiwanese flag.

Rubber duck sculptures. (Used to reference the tanks)

Candle icons. (Used to mourn dead protesters)


Those are some of the things that are censored.

The alternative suggested, i.e., server-side scans, is equally susceptible to corruption of the source CSAM data sources. In addition, governments currently have access to all user iCloud content that isn't end-to-end encrypted, and that includes all photos, files, backups, etc. that users have stored there. Why on earth would they be modifying the CSAM sources when they can just scan everything that isn't end-to-end encrypted, which at the moment is the vast majority of user content? It makes zero sense.
 
Maybe the defenders of privacy need to talk with the families, relatives and even the children themselves who have been abused and explain to them why your privacy is more important than their protection.
No, we don't. You need to explain better how people won't be put in jail under false accusations. It really wasn't THAT long ago that people would be burned at the stake just for being ACCUSED of being a witch.

So maybe you should explain to the families, relatives, and children of people who were falsely accused and STILL ended up in jail or worse, killed just for being different.

Let's make this personal, just for argument's sake, of course. What safeguards are going to be put in place to protect YOU from getting accused just because some vindictive person you don't even know sent you a bad picture? Or haven't you even thought of that yet? Just how certain are you that you don't have any enemies in life? All it takes is one who is willing to frame you, and then wind up and turn loose your shiny new political machine (and it WILL be political, make no mistake) on you to handle everything else.

I haven't heard anything about that from you or anybody saying that this is permissible. Yeah, even YOU are at risk for being falsely accused, and then YOU become the subject of the witch hunt. And then oh man, you'll really learn what "canceled" means. Like your job? Say buh-bye. Like your family? Say buh-bye. Like your reputation? Say buh-bye. Like your freedom to move about within your city or country? You won't be doing that fun stuff anymore.

There are better ways to catch criminals of ALL types. Or have we just given up on trying?
 