And you have no right to do a walkthrough without getting permission first. Owning the property only grants you some rights, not absolute authority over the property.
When you upload a photo to Apple's servers, you're agreeing that they have the right to scan it -- whether they scan it on the server or use software that you've agreed to install on your phone is beside the point.
 
And if you knew even the first thing about what Apple is implementing here, you'd realize your privacy is being INCREASED with this move if you've been using iCloud for photos.
How is it increased? They know MORE about the content of your images.
 
What I find astonishing is so many here believe Apple executives just woke up one morning and decided to put this program in place all on their own, without any outside mandate or pressure from the government. And without pushing back.
 
You are renting space. If you rent a house/apartment you are the one responsible for what is in that house. If you keep drugs there, you will go to jail and nothing will happen to the landlord. Because privacy is expected, the landlord has no right to enter the house without your permission.

(Considering this will roll out first in the US) Furthermore, if we take into consideration the 4th Amendment, it extends to papers as well. It would stand to reason that privacy would extend to our devices too. And in the case of renting space on a cloud, why wouldn't it work as with real estate? Encrypt everything on the cloud so the 'owners' can't even access the data you store. They are merely renting you space (disk drives).

This is a matter of principle, and such changes shouldn't be rushed or discussed away from the public.

The rules regarding child porn are often strict: just possessing it, even without knowing it, can be a crime. The law in a lot of jurisdictions simply treats child porn differently from real estate.
 
At this point, I don't know how Apple can salvage this situation of its own making. It's actually worse than antennagate, since Steve Jobs handled that situation perfectly. Apple should backtrack and either say they're not going to release the CSAM tech, or spend the next month, or months, on a public relations blitz explaining and calming the fears everyone has.
 
So you’d rather that they do the scanning on their own servers in secret, instead of in the device itself where we will know instantly if they start comparing to taboo Chinese photos because security researchers have access to phones?


I honestly don’t get your point one bit.
Um...how about neither? Apple shouldn't be scanning ANY of my content. Whether it's on my phone or in the cloud, it is MY content created on a phone I paid for.
 
What I find astonishing is so many here believe Apple executives just woke up one morning and decided to put this program in place all on their own, without any outside mandate or pressure from the government. And without pushing back.
Or the alternative --- that so many are deciding that this was the result of mandate or pressure from the government... as if it's not possible that they actually might have altruistic reasons for this and came up with it all on their own? It's truly conjecture from both sides.
 
At this point, I don't know how Apple can salvage this situation of its own making. It's actually worse than antennagate, since Steve Jobs handled that situation perfectly. Apple should backtrack and either say they're not going to release the CSAM tech, or spend the next month, or months, on a public relations blitz explaining and calming the fears everyone has.
So you want Apple to tell everyone it’s hunky dory if you want to upload child porn onto their servers? Who do you think that will comfort?
 
The rules regarding child porn are often strict: just possessing it, even without knowing it, can be a crime. The law in a lot of jurisdictions simply treats child porn differently from real estate.
And how we deal with child porn possession should never rest on assuming anyone might have it.
 
I disagree. I paid for the storage on iCloud. Nothing should be happening. If it is then the attention this is bringing needs to stop the behavior.

As someone else already said, you don't own any server space on iCloud. You're renting it and entered into a legal agreement with Apple regarding it. Here's the document:


Note especially the following paragraph:

[Screenshot of the relevant paragraph from Apple's iCloud terms]
 
Um...how about neither? Apple shouldn't be scanning ANY of my content. Whether it's on my phone or in the cloud, it is MY content created on a phone I paid for.
They have a legal requirement to not host CSAM on their servers, which they own and you are renting. They get in trouble if they let you store it.
 
Or the alternative --- that so many are deciding that this was the result of mandate or pressure from the government... as if it's not possible that they actually might have altruistic reasons for this and came up with it all on their own? It's truly conjecture from both sides.

As an alternative... you believe that Cook, Federighi, and other Apple execs, on a whim or out of strong personal belief, decided to put this system in place all on their own. And felt confident that Apple customers would be just fine with it, with no adverse consequences to the company, because privacy isn't that much of a biggie after all?

Really?
 
As someone else already said, you don't own any server space on iCloud. You're renting it and entered into a legal agreement with Apple regarding it. Here's the document:


Note especially the following paragraph:

[Screenshot of the relevant paragraph from Apple's iCloud terms]
So the real issue is we need to fix how cloud storage agreements work. There is no excuse for this policy to exist as you quoted.
 
Not to mention people from Germany who keep disliking all my posts, when this system doesn’t even work in Deutschland.
To be fair, this shouldn’t come as a surprise given how people have been loudly misunderstanding things on purpose during a global pandemic for the last year and a half.
 
Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, and only does so if you are trying to upload onto Apple’s servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
It's multi-faceted:

1) I just don't like my phone spying on me. Full stop, I'm not OK with this.
2) Just because iCloud might be off today doesn't mean it won't get enabled accidentally during an update.
3) Just because it targets child porn in the USA, today, doesn't mean it won't target political rivals or "wrong think" tomorrow.
4) I don't know where these hashes come from. If I take my phone to China, and have a photo (or 30) of CCP opposition leaders on it, will I get arrested?

Ultimately, as it's written, everything Apple is doing with CSAM is probably no different than what they're already doing on iCloud; they're just moving the processing to the end user so that they can cut down on cloud computing costs. And I'm OK with that, really. What I'm not OK with is Apple building a framework that's ripe for abuse. What flies in one country doesn't fly in others (memes of the prophet Muhammad, Tiananmen Square, etc.), and it's ridiculously easy to take an already implemented feature and co-opt it for something else.
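For what it's worth, the thirty-match threshold people keep citing can be sketched in a few lines. This is a deliberately simplified illustration, not Apple's actual protocol: the real system uses a perceptual hash (NeuralHash) plus private set intersection and threshold secret sharing, so the server learns nothing at all until the threshold is crossed. All names here (count_matches, should_report, THRESHOLD) are made up for the example.

```python
# Simplified sketch of threshold-gated hash matching -- NOT Apple's real
# implementation, which hides matches cryptographically below the threshold.

THRESHOLD = 30  # matches required before any report is produced

def count_matches(upload_hashes, blocklist):
    """Count how many hashes of to-be-uploaded photos appear in the blocklist."""
    return sum(1 for h in upload_hashes if h in blocklist)

def should_report(upload_hashes, blocklist, threshold=THRESHOLD):
    """Produce a report only once the match count reaches the threshold."""
    return count_matches(upload_hashes, blocklist) >= threshold
```

Twenty-nine matches produce nothing; the thirtieth crosses the threshold. The worry in the list above is precisely that nothing in this structure cares *what* the blocklist contains.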
 