At which point they know the results and violate your privacy. So... what's your point?
If you're uploading anything to iCloud, let alone CSAM, you are waiving any rights to 100% privacy. Read the iCloud legal documents on Apple's website.
At which point they know the results and violate your privacy. So... what's your point?
When you upload a photo to Apple's servers, you're agreeing that they have the right to scan it -- whether they scan it on the server or use their software that you've agreed to install on your phone is beside the point.
And you have no right to do a walkthrough without getting permission first. Owning the property only grants you some rights, not absolute authority over the property.
How is it increased? They know MORE about the content of your images.
And if you knew even the first thing about what Apple is implementing here, you'd realize your privacy is being INCREASED with this move if you've been using iCloud for photos.
You are renting space. If you rent a house/apartment you are the one responsible for what is in that house. If you keep drugs there, you will go to jail and nothing will happen to the landlord. Because privacy is expected, the landlord has no right to enter the house without your permission.
(Considering this will roll out first in the US.) Furthermore, if we take the 4th Amendment into consideration, it extends to papers as well. It would stand to reason that privacy would extend to our devices too. And in the case of renting space on a cloud, why wouldn't that work the same way as with real estate? Encrypt everything on the cloud and the 'owners' can't even access the data you store. They are merely renting you space (disk drives).
This is a matter of principle, and such changes shouldn't be rushed or discussed away from the public.
I mean the cryptography. It's difficult to understand and even more difficult to explain if you don't have a background in the field.
It’s easy to explain, but difficult to justify.
I disagree. I paid for the storage on iCloud. Nothing should be happening. If it is then the attention this is bringing needs to stop the behavior.
If you're uploading anything to iCloud, let alone CSAM, you are waiving any rights to 100% privacy. Read the iCloud legal documents on Apple's website.
And for the experts that do understand, they're just as freaked out as the public.
I mean the cryptography. It's difficult to understand and even more difficult to explain if you don't have a background in the field.
How is it increased? They know MORE about the content of your images.
Um...how about neither? Apple shouldn't be scanning ANY of my content. Whether it's on my phone or in the cloud, it is MY content created on a phone I paid for.
So you’d rather that they do the scanning on their own servers in secret, instead of in the device itself where we will know instantly if they start comparing to taboo Chinese photos because security researchers have access to phones?
I honestly don’t get your point one bit.
Shall we do a yearly inspection of every house to find child porn?
The rules regarding child porn are often strict: just possessing it, even without knowing it, can be a crime. The law in a lot of jurisdictions simply treats child porn differently from real estate.
Or the alternative --- that so many are deciding that this was the result of mandate or pressure from the government... as if it's not possible that they actually might have altruistic reasons for this and came up with it all on their own? It's truly conjecture from both sides.
What I find astonishing is so many here believe Apple executives just woke up one morning and decided to put this program in place all on their own, without any outside mandate or pressure from the government. And without pushing back.
So you want Apple to tell everyone it’s hunky-dory if you want to upload child porn onto their servers? Who do you think that will comfort?
At this point, I don't know how Apple can salvage this situation of its own making. It's actually worse than antennagate, since Steve Jobs handled that situation perfectly. Apple should backtrack and either say they're not going to release the CSAM tech, or spend the next month, or months, on a public relations blitz explaining and calming the fears everyone has.
And how we deal with child porn possession should never come to assuming anyone might have it.
The rules regarding child porn are often strict: just possessing it, even without knowing it, can be a crime. The law in a lot of jurisdictions simply treats child porn differently from real estate.
Shall we do a yearly inspection of every house to find child porn?
I disagree. I paid for the storage on iCloud. Nothing should be happening. If it is then the attention this is bringing needs to stop the behavior.
And you have lost any credibility.
I’d be fine with that.
They have a legal requirement to not host CSAM on their servers, which they own and you are renting. They get in trouble if they let you store it.
Um...how about neither? Apple shouldn't be scanning ANY of my content. Whether it's on my phone or in the cloud, it is MY content created on a phone I paid for.
Or the alternative --- that so many are deciding that this was the result of mandate or pressure from the government... as if it's not possible that they actually might have altruistic reasons for this and came up with it all on their own? It's truly conjecture from both sides.
So the real issue is we need to fix how cloud storage agreements work. There is no excuse for this policy to exist as you quoted.
As someone else already said, you don't own any server space on iCloud. You're renting it and entered into a legal agreement with Apple regarding it. Here's the document:
[Screenshot of the iCloud legal agreement]
Note especially the following paragraph:
[Attachment 1818341: the relevant paragraph of the agreement]
To be fair, this shouldn’t come as a surprise given how people have been loudly misunderstanding things on purpose during a global pandemic for the last year and a half.
Not to mention people from Germany who keep disliking all my posts, when this system doesn’t even work in Deutschland.
It's multi-faceted.
Weird, because I haven’t seen a single coherent explanation of why it’s a problem if your own device scans your photos for child porn, and only does so if you are trying to upload onto Apple’s servers, and only produces information to Apple if you have at least thirty child porn photos that you are trying to upload.
By making our phones run an algorithm that isn’t meant to serve us, but surveils us, it has crossed a line.
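To make the mechanism described in the quoted post above concrete: the claim is that matching happens on the device, only for photos queued for iCloud upload, and that Apple learns nothing unless roughly thirty photos match the known database. Below is a deliberately simplified sketch of that threshold-gating idea. The type names, the plain-string hashes, and the loadKnownHashDatabase stub are all invented for illustration; the real system uses perceptual hashing (NeuralHash) and encrypted safety vouchers, not a simple set lookup.

```swift
import Foundation

// Deliberately simplified, hypothetical sketch of threshold-gated matching.
// NOT Apple's implementation: real matching uses perceptual hashes and
// encrypted safety vouchers, not plain string equality.

struct PhotoUpload {
    let id: UUID
    let imageHash: String   // hypothetical stand-in for a perceptual hash
}

// Hypothetical stand-in for the known-hash database shipped to the device.
func loadKnownHashDatabase() -> Set<String> {
    // Placeholder contents for illustration only.
    return ["hash-a", "hash-b"]
}

let knownHashes = loadKnownHashDatabase()

// Only photos queued for cloud upload are checked at all.
func matchCount(for queuedUploads: [PhotoUpload]) -> Int {
    queuedUploads.filter { knownHashes.contains($0.imageHash) }.count
}

// Nothing is reported unless the number of matches reaches the threshold
// (the "at least thirty photos" point in the quoted post).
let reportingThreshold = 30

func shouldProduceReport(for queuedUploads: [PhotoUpload]) -> Bool {
    matchCount(for: queuedUploads) >= reportingThreshold
}

// Example: a single accidental match produces nothing.
let uploads = [PhotoUpload(id: UUID(), imageHash: "hash-a")]
print(shouldProduceReport(for: uploads))   // false: 1 match is below the threshold
```

The point of the threshold in this sketch is that a single accidental match produces nothing; only a sustained pattern of matches crosses the reporting line.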