This is always how it happens with big tech. That's what this issue fundamentally comes down to: the consumer's power has diminished to the size of plankton, and now the corporations have all the say. Apple can do whatever they want and will not suffer any consequences, especially when emotionally charged propaganda is behind them. If this were a much smaller company (start-up size) and there were many alternative options, they would be dead within a week, as most people do not accept this kind of surveillance technology.
Spot on with the propaganda part. The social justice world savior complex is a tough argument to fight because it’s a baited one.

“Think about the children!” is the oldest type of emotionally baited argument, and people have definitely caught on to its false premise. Still, some will say that if you’re on the opposing side, you clearly must want children to be targeted by rapists and pedos, or something, never mind debating the actual facts of whether or not this AI program is worth anything.

Not everyone toes the social justice line, because it isn’t all it’s cracked up to be, and these causes generally do little, if any, good to improve the world. Apple might want to start realizing this and get back to doing what they do best.

I think this will piss a lot of people off, and hopefully it’ll get to those who typically have their head in the sand and couldn’t care less what Apple does. Ignorance isn’t always bliss.
 
Slow boiling the frog.

They're starting with something that's universally reviled, then they'll add other stuff gradually.

Transphobia in the UK, insufficient reverence for black people in the US, homosexuality in Saudi Arabia etc.
And at every step of the way, people like you will say:

"Well as long as you're not a pedo / a racist / a homosexual / infringe on copyright / have impure thought / criticize the government you have nothing to fear!"
Exactly. People are too lazy or “busy” (lol) to bother standing up for themselves and just roll their eyes. Then, when it’s too late, they say “oh ****” like complete morons when they could have done something to stop it.

Are we really an advanced population or are we just lazy idiots?
 
In what way is "we're going to extend our current searching of your photos for CSAM materials on our iCloud servers to scanning your entire local computer for copyright material" an easier sell than just announcing it directly?

If this were Apple's end game, do you honestly think this would be the best possible way they'd come up with to approach it?

Well, just remember that Apple is wildly inconsistent when it comes to big announcements: on one side there is the polished brilliance of WWDC and the major product presentations, and on the other there are the brain-dead rollouts, lawsuits, and damage payments caused by the battery-aging software debacle.

This latest blunder is definitely on the clumsy end of the scale rather than the polished one.
 
Apple is going too far, but the criticism won't help. They'll do it, and people will still buy the devices anyway. And those who have something to hide, which sounds stupid because privacy should concern everyone, will just buy something else (as long as that is still possible). Everyone else lives with the potential danger and forgets about it over time. Apple has won.
 
Any collaborators of child abusers could just turn off iCloud Photos. It's hypocritical to tout this technology for the purpose it claims to serve while, in the dark, paving the way for authoritarian regimes like the CCP.
 
If you don't see a problem with this technology, Apple is definitely playing you. Pretty sure you don't want Apple looking at, scanning, analyzing, or identifying your wife's pictures. This CSAM stuff needs to be shut down. Apple needs to respect our privacy, period. Apple needs to figure out another way if they are really interested in catching pedophiles... God knows for what reason.
You clearly don't know how this works. You need to educate yourself on how CSAM detection works, and on how other companies have been doing this for years already; Apple is only just catching up. Use the Google Photos app? Use Gmail, Hotmail, Instagram, Facebook? They all hash the images you upload and match them against a database of known CSAM hashes... they don't scan the content of your photos, so your wife's and kids' pictures are safe.
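For anyone wondering what "matching against a database of hashes" actually means, here's a rough Python sketch of the general idea. It's purely illustrative: the hash list and file path handling are made up, and real services use perceptual hashes (PhotoDNA, NeuralHash) rather than a plain SHA-256 like this.

```python
import hashlib

# Hypothetical database of known-bad fingerprints supplied by a clearinghouse.
# The digest below is a made-up placeholder, not a real entry.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def file_hash(path: str) -> str:
    """SHA-256 hex digest of a file's raw bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_match(path: str) -> bool:
    """True only if the file's fingerprint appears in the known-hash database."""
    return file_hash(path) in KNOWN_HASHES
```

The point being: the service only learns whether your upload's fingerprint matches one already in the database; nothing here "looks at" the picture in any human sense. A plain cryptographic hash like this only catches bit-identical files, which is why production systems use perceptual hashes that survive resizing and recompression, but the matching logic is the same shape: fingerprint in, yes/no out.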
 
I agree with your comments. But in other threads you are such a cheerleader for spending lots of cash on the new iPhones etc. How do you reconcile your double personalities?

Enough is Enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please.


Where are you, Craig? You said this yourself at WWDC 2021. What happened?

“At Apple, we believe privacy is a fundamental human right,” said Craig Federighi, Apple’s senior VP of software engineering. “We don’t think you should have to make a tradeoff between great features and privacy. We believe you deserve both.”

This is a punishment and a big slap in the face for anyone who owns an iPhone. Whatever you are trying to accomplish with this, Apple, leave it in the hands of government and law enforcement.

1. How about the CSAM scanning runs on Apple executives' iPhones first? No one wants their privacy to be exposed. Please stop this nonsense and RESPECT our PRIVACY. It is our fundamental human right.

2. How come this CSAM stuff was not mentioned by Apple during WWDC 2021? Apple is up to something. Why now, when we are almost at the release date of iOS 15?


Also, this guy needs to be FIRED from Apple. He is the mastermind behind this CSAM scanning scheme. What a joke!

[image attachment]

Does anyone here have a game plan on how we can stop this crappy CSAM feature?​

CSAM detection itself is not the problem... it is the mechanism behind it that is ripe for potential abuse.
 
"Well as long as you're not a pedo / a racist / a homosexual / infringe on copyright / have impure thought / criticize the government you have nothing to fear!"
Yeah, but a lot of people are, and they use social media and messaging apps like WhatsApp. So the moment one of them sends something of questionable acceptability into a group you happen to be in, an app like WhatsApp is going to get it scanned by "the system" on your behalf if you aren't careful.

Don't forget, I use WhatsApp as an example because there are plenty of people who have that app save things directly into their Photos library upon receipt. If you aren't careful, you can be tagged as something simply through association and through failing to carefully manage, and understand the implications of, your app settings.
 
People are really taking this seriously and planning on leaving Apple. They don't care about iOS 15 anymore... They don't care about Apple's fall lineup.
Haha, yeah, and where are they going to go? Android? Google already does this and scans for CSAM. No difference.
 
Regardless of which side you stand on when it comes to CSAM, this is true:
Some employees are worried that Apple is damaging its industry-leading privacy reputation.
Fact is, there is a perception, rightly or wrongly, that Apple is walking back some of its privacy marketing with this move, and that's not good for the brand.
 
My issue with all of this is that I haven't seen much evidence that CSAM hash scans are catching people at a high enough rate to make it seem smart to put them on millions of iOS devices.

Regardless, this decision seems to be one proposed by someone who is not plugged into the daily conversation around Apple. The average consumer is not dumb anymore; people are getting more hip to technology. You can see what's happening with Facebook: people jumped to Twitter in droves and started to be more cautious of the FB platforms.
 
"We will refuse any such demands... except when the FBI ask us not to encrypt iCloud Backups.

Oh, and when the CCP ask us to hand over iCloud decryption keys"
How do you know the CCP doesn’t already have the keys belonging to Chinese accounts? For the most part, iCloud isn’t end-to-end encrypted; the cloud operator holds the decryption keys.
 
I can’t get over the feeling Apple is doing this because of some pending porn legislation somewhere (GB or EU), some government coercion over market access somewhere else (China), or some broad decryption lawsuits or threats of monopoly breakup somewhere else (US: DOJ/FBI, plus the new sideloading bill in Congress), and is either trying to comply, get ahead of it, or appease.

Fact is, by doing this, Apple is demonstrating a proof of concept.

Fact is that Apple won’t be able to refuse when some government somewhere enacts a law, built on expanding this proof of concept, that mandates scanning for other images, symbols, words, or text strings, without Apple screening the results and with the hits delivered directly to that government, or else risk indictments, civil suits, market closures, test suits, breakup, or regulatory threats and actions.

Fact is that Apple could be made to comply without being allowed to publicize its objections. For reference, just recall the National Security Letters that forbade/forbid companies from discussing the mere existence of being served such surveillance orders (this is in the USA, which theoretically is more transparent than repressive countries not to mention has a written Bill Of Rights that many countries lack.)

Fact is that Apple has encouraged us to think of, and use, our Apple devices as secure extensions of our brains, presumably subject to the same protections as our brains under their “privacy is a fundamental human right” rhetoric (BS). Further, as a US company, their principles are presumably informed by the traditions and protections of the Bill of Rights, especially the 4th and 5th Amendments on search and seizure and self-incrimination, not forgetting that a person is presumed innocent and that no search shall ensue without a judicial warrant justified by a reasonable suspicion of guilt.

Fact is that Apple has just (voluntarily?) become a willing extra-judicial adjunct of state security and law enforcement, with its plan to willfully perform warrantless searches, while thumbing its nose at the protections enshrined in the Bill Of Rights.

Fact is that Apple has already announced willingness (actually intention) to take further steps down the slippery slope by expanding to other countries and to 3rd party apps after starting with its own Photos app and iCloud services.

Fact is, millennia of human existence have tried to teach us that a) the state will try to overrun the rights of the individual on a pretext, b) mission creep is a real thing, c) moral zealotry is a dangerous thing, d) just because you can do it doesn’t mean you should, e) appeasement doesn’t appease, f) if you have stated principles, they must be inviolate, else they are not principles, g) doing the wrong thing for the right reason is still doing the wrong thing, and h) the road to hell is paved with good intentions.

I think most people would agree that CSAM is a scourge, but Apple is now so on the wrong side of its rhetoric, its stated principles, the Bill of Rights, millennia of learning, and just plain good sense that one really wonders how Apple got itself tangled up in this issue, and how it could be at turns so naïve and so arrogant as to think that a legal push won’t now come to a statutory shove, one Apple won’t be able to “vehemently refuse”.
At least Apple has been up front about this, unlike our own (US) government and their alphabet spy agencies. If the off switches actually work, we at least have ways to turn off Apple's scanning. The governments don't provide that option.
 
Apple is not asking you to prove your innocence. Apple is required to ensure that nothing illegal is stored on their servers, and Apple has already been scanning photos uploaded to iCloud Photos for CSAM since 2019.
The interesting thing to wonder is whether it only detects known CSAM pictures (already seen and catalogued in said database), or whether it scans the content of pictures, determines whether nudity or infantile attributes exist, and raises alarm bells on that basis…

One would be a privacy-protecting hash-comparison system that detects photos already known to the authorities. The other would be the screening of individual photos and the detection of features deemed objectionable (including in original pictures), which would be a less effective but more grave violation of the privacy values espoused by Apple.

I still haven’t seen a clear answer on this, only ever that pictures currently known and catalogued are detected, but nothing about original content that also contains CSAM.
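To make that distinction concrete, here's a toy Python sketch of a perceptual "difference hash" (dHash), the family of technique a known-image matcher relies on. This is not Apple's NeuralHash, and the 8x8 size and any distance cutoff are arbitrary choices of mine; it just shows how "same picture, slightly altered" differs from "different picture of similar content".

```python
from PIL import Image  # Pillow

def dhash(path: str, size: int = 8) -> int:
    """64-bit difference hash based on horizontal brightness gradients."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# A re-saved, recolored, or lightly cropped copy of the SAME source image tends
# to land within a few bits of the original's hash; an unrelated photo of a
# similar-looking scene generally does not. Nothing here "understands" what
# the picture depicts.
```

If the answer really is "only known, catalogued pictures", it is this kind of matching: it can recognise a specific image it has been told about, but it has no notion of what new, never-catalogued content shows. Flagging original material would require a completely different, classifier-style system.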
 
Don't turn on iCloud Photo Library. But quite honestly, as all this detects is child pornography, it makes me wonder why you wouldn't want it turned on.
I don't understand why anyone would.

If you don't have this material, then all it's doing is wasting your CPU and battery to determine you are innocent.

If you do have this material, then I assume you don't want to be caught (and thus don't want it turned on).

🤷‍♂️
 

Enough is Enough, Apple! It is incredibly distasteful for you to SPY on us as consumers. Apple, it will be in your best interest to give us an opt-out option for the CSAM feature, please. ... Does anyone here have a game plan on how we can stop this crappy CSAM feature?


What I read between the lines when the initial announcement was made was: "If you want privacy, don't buy Apple." They won't be able to avoid mission creep. We should appreciate that they had the decency to inform us in advance that they're going to start scanning our phones and computers...
 
That's not how it works. Have a read; Apple has put out a technical white paper on it.
Did you read and understand the white paper? I have not read it, but I have read elsewhere that much remains unclear; Apple supposedly does not spell everything out.
You can write papers that are incredibly complex and still miss the crucial things.
 
Slow boiling the frog.

They're starting with something that's universally reviled, then they'll add other stuff gradually.

Transphobia in the UK, insufficient reverence for black people in the US, homosexuality in Saudi Arabia etc.
And at every step of the way, people like you will say:

"Well as long as you're not a pedo / a racist / a homosexual / infringe on copyright / have impure thought / criticize the government you have nothing to fear!"
Agreed!

Folks who see things this way demonstrate their ignorance of, or willful rejection of, Niemöller’s warning.

 
A lot here depends on the thresholds and the specific features. Apple may very well have tuned it so strictly that images need to be close to identical to match. But semantic similarity means similarity of content, e.g. whether both images show similar people doing similar things. Analysing that is hard, and Apple would not have to bother with it if they only cared about visual similarity, minor alterations, and the like.
I see what you mean. I suppose the fallback is that Apple will manually compare the images if x number of images on someone's phone match these CSAM images, which would be a very unlikely scenario.

The examples they give in the paper are things like colour differences or cropping. But the language they use is about comparing "content". So are they comparing an exact palm tree or a face, or are they saying there is a palm tree and a face in these two locations?

I think Apple should demonstrate that their algorithm doesn't do what people fear it may be doing. I.e. take a normal picture of two people in specific poses: can I recreate that picture with different people and get a match?

I think if that could happen, then it's a huge problem. But I can't imagine Apple going for something that wasn't absolutely watertight here.
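On the fallback/threshold point: as I understand the descriptions, no single match does anything by itself; only an account crossing a preset number of matches would ever be surfaced for human review. A trivial Python sketch of that gate (the 30 is a placeholder I made up, not a confirmed figure):

```python
MATCH_THRESHOLD = 30  # placeholder value, not a confirmed figure

def count_matches(user_hashes: set, known_hashes: set) -> int:
    """How many of a user's image fingerprints appear in the known database."""
    return len(user_hashes & known_hashes)

def should_escalate(user_hashes: set, known_hashes: set) -> bool:
    """No single match does anything; only crossing the threshold triggers review."""
    return count_matches(user_hashes, known_hashes) >= MATCH_THRESHOLD
```

Which is also why the recreated-poses worry is really a question about the hash function itself (does it fire on a merely similar-looking scene?) rather than about this counting step.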
 
Or how about just not storing stuff in iCloud? Every month I just transfer pictures from my phone to my computer.
 
At least Apple has been up front about this, unlike our own (US) government and their alphabet spy agencies. If the off switches actually work, we at least have ways to turn off Apple's scanning. The governments don't provide that option.
I’m not much comforted by the “upfrontedness” of Apple as it feels more like due care to prevent a future civil suit from biting; “but we warned them.”
 
It’s the technology itself that is terrifying people, not Apple’s use of it, which I understand. You could essentially build a database of anything digital (pictures, movie files, PDFs), attribute a unique hash to each item, and scan for them across people’s devices.
Yep, how long until there is a UK court order to add copyrighted content to the database?
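And that's the crux: the matching step is completely content-agnostic. Purely as a hypothetical sketch (the blocklist entry below is a placeholder), the same few lines that flag known abuse imagery would flag anything else whose fingerprint gets added to the list, be it a film rip, a leaked PDF, or a protest image:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist: nothing in the matching code cares what these
# fingerprints represent; copyrighted films or banned documents work the same way.
BLOCKLIST = {"0" * 64}  # placeholder digest

def digest(path: Path) -> str:
    """SHA-256 hex digest of any file, regardless of type."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan(folder: Path) -> list[Path]:
    """Every file under `folder` whose digest appears in the supplied blocklist."""
    return [p for p in folder.rglob("*") if p.is_file() and digest(p) in BLOCKLIST]
```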
 
Don't turn on iCloud Photo Library. But quite honestly, as all this detects is child pornography, it makes me wonder why you wouldn't want it turned on.
If you haven't been able to read all the concerns expressed by so many people, then I hate to say it, but, at this point, you're not going to get it. Which says a lot about your critical thinking skills. It's also pretty disgusting to imply that people are pedos just because they don't like what's going on. That just says a lot about you as a person. And it doesn't say anything good at all.
 