Damn, you know this is an absurd invasion of privacy when even people on this site are universally against it.
What is your theory on why Apple, after years of positioning themselves as leaders in privacy, would choose to implement this policy now?
I might be mistaken, but I believe that when the new macOS, Monterey(?), is released, it will have the same technology as the OS for iPads and iPhones. Tbh, I can’t make up my mind.
I have a few theories as to why that could be the case:
1.) One option is that they want to implement end-to-end encryption, and that is why they have to scan on the iPhone.
2.) Legislation is being prepared for this kind of surveillance, taking into account the EU report I linked here. The legislation might be coming up in more places than just the EU at the same time; it might be something they have discussed at some summit.
3.) There might be real interests in eroding digital privacy. Who the actors or lobbyists pushing for that would be, I truly have no idea. There might be multiple actors behind this.
4.) There are legitimate actors who think that in the digital age we have a way of combating child pornography, and who find such an invasion acceptable considering it is veiled under hashes, etc. These actors act on a noble cause and are simply blinded by their zeal to its privacy consequences. So I think of them as not malicious in intent, but unaware of the consequences.
In the end, I don’t think this is an agenda of Apple’s own. They might be either trying to find the least harmful solution or being forced into this. I can’t fathom what personal interest Apple would have in this, except, as someone mentioned, copyright.
And lastly, this is done only on iPhones. How will they manage it on Macs? That seems like a big oversight in detecting CSAM.
Thank you for taking the time to write a thoughtful post.
I’m not a lawyer, an expert, or even an American, but even I know the 4th Amendment doesn’t constrain the actions of private companies. I’m fairly sure that nothing else in the US Constitution does either.

I have asked some of our legal team, albeit not constitutional lawyers, to explain to me how this does not trample all over the 4th Amendment:
**********
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
**********
The consensus seems to be that, on one hand, it is a violation of constitutional rights, but on the other hand, Apple must have vetted this with some well-respected experts.
Yes, the surveillance CSAM feature will be dormant on every iPhone; it’s the Apple ID country setting that triggers the activation. If you’re one of those people who created a US-based Apple ID to have access to other content (a few people used to do that in the past), then you’ll be affected.

Not being an engineer, I am curious whether it is even possible to release the technology ONLY in US-based builds of the OS. From what I have been able to read, this technology seems incredibly complex and needs to be "baked in" to the OS.
Would I be paranoid to think the technology will be in every OS for every country, but dormant, and that at the appropriate time Apple will simply activate it? For example, Apple decides it is appropriate, necessary, or useful for some other reason, and sends out a simple minor update (remember, Apple is wonderfully opaque in detailing update contents).
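Mechanically, a "ships everywhere but dormant" feature is trivial to build: the code is present in every build and a single region check decides whether it runs. A minimal sketch of that idea, with all names invented for illustration (this is not Apple’s actual implementation or API):

```python
# Hypothetical sketch of a region-gated "dormant" feature.
# ENABLED_REGIONS could be updated remotely; changing it would
# "activate" code that already ships on every device.

ENABLED_REGIONS = {"US"}

def csam_scanning_active(account_region: str) -> bool:
    # The scanning code exists in every build; only this check
    # decides whether it actually runs for a given account.
    return account_region in ENABLED_REGIONS

print(csam_scanning_active("US"))  # True
print(csam_scanning_active("DE"))  # False (dormant until the region set changes)
```

The point of the sketch is that activation elsewhere would not require a new OS, only a change to the gating data.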
Or rather, you have such hard tunnel vision that you have given up foreseeing the trend of how things will go.
The CSAM system scans images, and it can be repurposed to scan for any other type of image in the future. The technology is ready at mass scale.
Next, the same technology can be used to scan video with some modification. Or maybe new technology would need to be developed; I don’t know, but it’s only a matter of time.
Text scanning and analysis never was, and never will be, a big deal for a computer as powerful as an iPhone 12.
Apple sells devices across hundreds of countries around the world, whereas a government generally only has full control over people residing in its own city, county, etc. Apple, as a private company, will not need a warrant to search anything on your device or stored in iCloud, whereas warrantless searches by governments are not exactly popular in Western countries.
But yeah, just keep holding on to what YOU believe; I hope someone else can get what I am trying to say.
That would be a back door, as there is no simple way to replace that database.

"Apple’s system doesn’t need a back door. It has a massive front door that’s wide open. Want to know what’s on someone’s phone? Okay, well, just replace this CSAM database with whatever other database you’re interested in. Done."
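The "front door" argument boils down to this: in a hash-matching system, the database, not the code, defines what gets flagged. A toy sketch of that separation, using a plain SHA-256 over file bytes as a stand-in (Apple’s actual system uses NeuralHash, a perceptual hash that matches visually similar images; all data below is made up):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Toy stand-in for a perceptual hash; real systems match
    # visually similar images, not just identical bytes.
    return hashlib.sha256(data).hexdigest()

def scan(photos: list[bytes], database: set[str]) -> list[int]:
    """Return the indices of photos whose fingerprint appears in the database."""
    return [i for i, p in enumerate(photos) if fingerprint(p) in database]

photos = [b"cat picture", b"protest flyer", b"beach photo"]

# The scanning code never changes; swapping in a different database
# changes what gets flagged.
csam_db = {fingerprint(b"known bad image")}
other_db = {fingerprint(b"protest flyer")}

print(scan(photos, csam_db))   # []
print(scan(photos, other_db))  # [1]
```

Whether the database can actually be swapped without Apple’s cooperation is exactly the point the two posters above disagree on.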
Exactly. They’re not just scanning iCloud anymore; they’re doing something worse, which is building a technology that is capable of scanning your stuff on YOUR property, irrespective of whether or not you have iCloud enabled. This sets a worse precedent than cloud scanning.
I don’t want a back door. Can we stop this by not updating to iOS 15?
How is privacy increased? They know MORE about the content of your images.
You mean like when Apple forced people who do not use iCloud to upload their photos without their consent, by enabling iCloud photo stream by default (without asking the user’s permission)?
Again, for the 100th time.
If you are talking about slippery-slope hypotheticals: WHY hasn’t this already happened, considering that cloud storage, and tech companies (including Apple) giving cloud data to other countries, has been happening since 2005? Why would governments find something NEW to scan when the data available to on-device scanning is an exact copy of the cloud data, and all the other data is still locked away by the user behind the Secure Enclave?
No one is answering this question directly, because it collapses their narrative: fundamentally, there is ZERO difference between the cloud data and the device data available to scan. There is ZERO new data for the device to scan compared to the cloud, and there are no new ways to scan that governments haven’t already requested for the cloud.
So, about your concern over scanning video: why hasn’t video scanning been requested for cloud storage already? HINT: video has already been scanned on cloud drives.
What I don’t understand is why Apple chose to build this feature in the first place. They have always been about privacy, and even took measures to make it easier for themselves to fight government requests for information. These features don’t benefit end-users (although I suppose one can make a case for the sexting detection for parents), and they completely open Apple up to government requests. Where did this come from?
You will not get what you want, and I am not joking.
It doesn’t matter how you try to twist it. Apple has tried to send a very specific message about privacy all those years.
If you do not mind, please answer this: why do you think Apple should protect children from child pornography? Do you think it’s Apple’s role?
If yes, should it also protect them when they search for other topics, like suicide, guns, etc.? I would really appreciate it if you would answer those questions.
Also, iCloud backup isn’t really a one-to-one device backup either, and some people choose not to enable it, whereas a local scan will always be able to find the data because, surprise, it is on the device.