Normally I'm down for telling Uncle Sam to piss off and get out of my business... but I also don't like creating a utopia for child pornographers and terrorist plots either.
"Pedophiles prefer iPhone"
Just because something *could* be used nefariously is not justification to violate everyone's privacy, especially when there are considerably more legitimate uses of the same technology.

Criminals will be criminals whether Apple encrypts their data or not. If the government were truly concerned about pedophiles, Epstein's client list would be public by now and prosecutions underway, and a number of members of Congress would possibly be in the spotlight as well. That, to me, is enough to prove that giving the government unfettered access to my private information (which has nothing to do with any crimes), "for the sake of the children", isn't going to make a damned bit of difference in whether they catch or prosecute more pedophiles, and those in positions of power certainly won't suddenly become any more likely to be arrested and tried. There's still a member of the Royal Family who all but confessed walking around as a free man.
 
I for one would like to store information in Notes that I would not want a hacker to ever access, so this is a very positive development for me.

But a hacker could still access it. For vital information, don't ever store it in the cloud.
 
If you have HomePods or Apple TVs, you cannot enable the feature unless you remove those devices from your iCloud account or wait until release versions of all of the OSes come out. Final releases usually come a few days, or at most a week, after the RC, so I'd wait until all the releases are out.
This is helpful. I wondered how my photos would work on Apple TV without an update. It would have been helpful if that were stated in an article somewhere.
 
You are really overthinking this. I appreciate privacy, especially from government eyes, as much as anyone else, but the reality is they are already able to get around E2E encryption if you're truly a target. The Pegasus flap is undeniable evidence of that. If you think NSO was the only player, or even the biggest, that alphabet agencies have at their disposal, then think again. Odds are very good we don't even know of their most powerful tools yet. And if your concern is that a government can remotely deactivate ADP so as to get to your data, you should realize that if they can do that, they already have access to your data right from your phone.

You're never going to stop every attack vector or plug every security hole. Vigilance is good, but it should be aimed at what is truly effective, not channeled into paranoia towards Apple about things Apple cannot control. The fact is that for most users this is a step forward in terms of security, and I applaud the move by Apple to do it. It also highlights why Apple a year ago wanted to scan photos on-device before upload, which I opposed then and still oppose as an overly intrusive breach of people's personal privacy.

If Apple can't access your keys, then they can't hand them over to anyone, and it's doubtful Apple has code established to remotely disable this encryption, but time will tell. Even if they don't, someone else like NSO will come up with a way and governments will find ways to exploit it on targets of interest. The only sure way to prevent that is (A) don't make yourself stand out as a potential target, and (B) don't use your phone for sensitive (legal or not) or illegal purposes.

Even if you don't use your phone for those things, utilize the speed bumps at your disposal to protect your privacy, because even if they aren't foolproof, they can still deter certain actors based on the perceived cost/benefit ratio. It's similar to locking your front door so thieves will move on to look for an unlocked door, unless they are certain picking the lock or breaking down your door is worth the risk and effort.
Could be that I am overthinking this. But the statement "If Apple can't access your keys" is just plain wrong; Apple states in the overview document that they "don't" access your keys, not that they can't.

So maybe this standard is good enough for you, but Apple can and should do better. Are they probably limited by the government? Yes. But that does not make it right. Furthermore, it will never get better without an expectation of it getting better. More people need to have this expectation of privacy or we will continue to lose to illegal government actions.

So while your comments are on point IMO, they do not lead to a better future.
 
If they don't access or store keys, how are they to "have access" to said keys?
The document says they "don't" access the keys, not that they "can't" access the keys. "Don't access" implies "Don't normally access". Otherwise they would have said "Can't" or "Will never" access the keys.

Words have meaning and you can be sure this document was reviewed at the highest level by lawyers and Apple's government handlers.
 
"On the flip side, governments don't seem especially thrilled about the new option given to users." In the US at least, the Constitution protects against "unreasonable search and seizure", which would seem to prevent random sniffing of stuff by law enforcement. Get a warrant; if there is a reason to search, then prove it to a judge, duh.

Now I do totally get that even with a warrant, unless the police are able to get the decryption key, they are out of luck. But realistically, there are usually a lot of other sources out there to provide evidence. And this makes cloud storage no different than hard encryption of external disks. Same stuff, still encrypted, and still inaccessible without the keys or some serious hacking skills. I don't want some sketchy President who subverts the Constitution for his own ends to get his hands on anything for political purposes. So, the dilemma.

Didn't people say that Congress should address this with laws? Oh yeah, I see why that didn't work: no laws yet.
 
Could be that I am overthinking this. But the statement "If Apple can't access your keys" is just plain wrong; Apple states in the overview document that they "don't" access your keys, not that they can't.

So maybe this standard is good enough for you, but Apple can and should do better. Are they probably limited by the government? Yes. But that does not make it right. Furthermore, it will never get better without an expectation of it getting better. More people need to have this expectation of privacy or we will continue to lose to illegal government actions.

So while your comments are on point IMO, they do not lead to a better future.
If you read it, Apple doesn't have the keys; they are "on-device". Apple disclaims any ability to recover your data. They have a mechanism where you can share the keys with someone else for recovery, because they can't do it for you.
 
The document says they "don't" access the keys, not that they "can't" access the keys. "Don't access" implies "Don't normally access". Otherwise they would have said "Can't" or "Will never" access the keys.

Words have meaning and you can be sure this document was reviewed at the highest level by lawyers and Apple's government handlers.


Advanced Data Protection for iCloud

Advanced Data Protection for iCloud is an optional setting that offers Apple’s highest level of cloud data security. When a user turns on Advanced Data Protection, their trusted devices retain sole access to the encryption keys for the majority of their iCloud data, thereby protecting it with end-to-end encryption. For users who turn on Advanced Data Protection, the total number of data categories protected using end-to-end encryption rises from 14 to 23 and includes iCloud Backup, Photos, Notes and more.

Advanced Data Protection for iCloud will be available to U.S. users by the end of 2022 and will start rolling out to the rest of the world in early 2023.

Conceptually, Advanced Data Protection is simple: All CloudKit Service keys that were generated on device and later uploaded to the available-after-authentication iCloud Hardware Security Modules (HSMs) in Apple data centers are deleted from those HSMs and instead kept entirely within the account’s iCloud Keychain protection domain. They are handled like the existing end-to-end encrypted service keys, which means Apple can no longer read or access these keys.
can no longer = can't

Basically, the iCloud servers had the keys; when you turn this on, those keys get deleted from the servers.

Apple could in theory change the code to disable Advanced Data Protection, but until they say "we're getting rid of it", then the status quo is that they don't have your keys, your devices do.
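
To make the "servers no longer have the keys" idea concrete, here's a rough Swift/CryptoKit sketch of the general pattern. This is purely my own illustration, not Apple's actual code or API: the server only ever stores ciphertext, and only whoever still holds the service key can open it.

```swift
import CryptoKit
import Foundation

// Toy illustration of the pattern described above (not Apple's actual code):
// data is sealed with a per-service key; whoever holds that key controls access.
// With ADP on, only the user's devices hold it.
do {
    // 1. A service key is generated on the device.
    let serviceKey = SymmetricKey(size: .bits256)

    // 2. Data uploaded to the cloud is encrypted with that key.
    let note = Data("my private note".utf8)
    let sealed = try AES.GCM.seal(note, using: serviceKey)
    let ciphertextForServer = sealed.combined!   // all the server ever stores

    // 3. Without the service key, the ciphertext is opaque. Only a device
    //    that still holds `serviceKey` can do this:
    let recovered = try AES.GCM.open(
        try AES.GCM.SealedBox(combined: ciphertextForServer),
        using: serviceKey
    )
    print(String(data: recovered, encoding: .utf8)!)   // "my private note"
} catch {
    print("crypto error: \(error)")
}
```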
 
If you read it, Apple doesn't have the keys; they are "on-device". Apple disclaims any ability to recover your data. They have a mechanism where you can share the keys with someone else for recovery, because they can't do it for you.
If the user can flip a switch which sends the keys to Apple, you really think that Apple cannot flip the same switch? Of course they can. Duh.
 


can no longer = can't

Basically, the iCloud servers had the keys; when you turn this on, those keys get deleted from the servers.

Apple could in theory change the code to disable Advanced Data Protection, but until they say "we're getting rid of it", then the status quo is that they don't have your keys, your devices do.
If the user can flip a switch which sends the keys to Apple, you really think that Apple cannot flip the same switch? Of course they can. Duh.
 
If the user can flip a switch which sends the keys to Apple, you really think that Apple cannot flip the same switch? Of course they can. Duh.
So what do you want Apple to do? They CAN flip a switch if they program it to. The only way for this to not be true is if they permanently remove their ability to update their software. Which is a bad idea, of course: if we don't have software updates, we don't have security updates, and we become very susceptible to things like Pegasus, which rely on unpatched vulnerabilities.
 
I for one would like to store information in Notes that I would not want a hacker to ever access, so this is a very positive development for me.
If you lock the notes within the Notes app, they will be effectively end-to-end encrypted, without having to turn it on for everything.
 
So what do you want Apple to do? They CAN flip a switch if they program it to. The only way for this to not be true is if they permanently remove their ability to update their software. Which is a bad idea, of course: if we don't have software updates, we don't have security updates, and we become very susceptible to things like Pegasus, which rely on unpatched vulnerabilities.
It is simple; apps that care about security have been doing it for years. Use public/private keys. Let the user control the keys. That way the keys are never accessible by Apple, or only the public part of the key is known to Apple.

Or just quit making this feature sound like it is more secure than it is. Apple wants users to think they don't have access so they don't have to deal with lost keys, not because they don't have access.
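
For what it's worth, here's a minimal Swift/CryptoKit sketch of the "let the user control the keys" model I'm describing, where a server would only ever need the public half of the key pair. This is my own illustration of the general pattern, not anything Apple actually ships:

```swift
import CryptoKit
import Foundation

// Illustrative sketch of a "user controls the keys" model (not Apple's API).
// The private half of the key pair never leaves the user; a server would
// only ever need to hold the public half.

// Encrypt *to* a recipient's public key using an ephemeral key agreement.
func encrypt(_ plaintext: Data,
             to recipient: Curve25519.KeyAgreement.PublicKey) throws -> (ephemeralPublicKey: Data, ciphertext: Data) {
    let ephemeral = Curve25519.KeyAgreement.PrivateKey()
    let shared = try ephemeral.sharedSecretFromKeyAgreement(with: recipient)
    let key = shared.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                             sharedInfo: Data("demo".utf8), outputByteCount: 32)
    let box = try ChaChaPoly.seal(plaintext, using: key)
    return (ephemeral.publicKey.rawRepresentation, box.combined)
}

// Only the holder of the matching private key can reverse it.
func decrypt(_ message: (ephemeralPublicKey: Data, ciphertext: Data),
             with privateKey: Curve25519.KeyAgreement.PrivateKey) throws -> Data {
    let senderPublic = try Curve25519.KeyAgreement.PublicKey(rawRepresentation: message.ephemeralPublicKey)
    let shared = try privateKey.sharedSecretFromKeyAgreement(with: senderPublic)
    let key = shared.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                             sharedInfo: Data("demo".utf8), outputByteCount: 32)
    return try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: message.ciphertext), using: key)
}

do {
    let userPrivate = Curve25519.KeyAgreement.PrivateKey()   // stays with the user
    let userPublic  = userPrivate.publicKey                   // all a server would hold

    let sealed = try encrypt(Data("only my private key can read this".utf8), to: userPublic)
    let recovered = try decrypt(sealed, with: userPrivate)
    print(String(data: recovered, encoding: .utf8)!)
} catch {
    print("crypto error: \(error)")
}
```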
 
I previously enabled Add Recovery Contact to enable a friend to send me a recovery code if I was locked out.

Does Recovery Key work in the same way and can both options co-exist?

I think I feel safer with Recovery Key, as I would always have access to that. With a Contact, there's always a chance the contact might not be available.
 
It is simple; apps that care about security have been doing it for years. Use public/private keys. Let the user control the keys. That way the keys are never accessible by Apple, or only the public part of the key is known to Apple.

Sounds great until a customer loses their key and calls support.
 
It is simple; apps that care about security have been doing it for years. Use public/private keys. Let the user control the keys. That way the keys are never accessible by Apple, or only the public part of the key is known to Apple.

Or just quit making this feature sound like it is more secure than it is. Apple wants users to think they don't have access so they don't have to deal with lost keys, not because they don't have access.
But according to you, Apple can take your key. When you generate the key you will have to write it down, but Apple could program the system to take the key and upload it to the cloud. So you defeated your own argument; the only way, in your view, to make it secure is to move to a separate computer system and image the phone remotely.
 
I previously enabled Add Recovery Contact to enable a friend to send me a recovery code if I was locked out.

Does Recovery Key work in the same way and can both options co-exist?

I think I feel safer with Recovery Key, as I would always have access to that. With a Contact, there's always a chance the contact might not be available.
I believe the key will work in the same manner. If I am not mistaken, you can make use of the Recovery Key as well as a Recovery Contact.
 
If you lock the notes within the Notes app, they will be effectively end-to-end encrypted, without having to turn it on for everything.
Oh yeah, you're right! Amongst the giant stash of notes I have in my Notes app, there are a few that I lock up, like combinations for locks I have, bank info if I need it at a moment's notice, etc. And even better that it allows you to choose either an alphanumeric passcode or biometrics with Face ID or Touch ID.
 
But according to you, Apple can take your key. When you generate the key you will have to write it down, but Apple could program the system to take the key and upload it to the cloud. So you defeated your own argument; the only way, in your view, to make it secure is to move to a separate computer system and image the phone remotely.
You obviously don't know how public/private keys work.
 
You obviously don't know how public/private keys work.
I do know how they work. I actually use GPG on the Linux command line, and I work with TLS certificates.

Let's try this again. You think any kind of Apple-created software that has access to your private key at any point is bad. Fine, then how do you decrypt the iCloud backup? Are you saying you want every user to download their backup files onto a non-Apple system, enter their key there, and use that system to image the iPhone? Probably not very realistic, right? Even a Mac wouldn't be allowed to image the iPhone, because it was created by Apple!

In short, how do you decrypt the backup files? With the private key. How do you enter the private key to decrypt the backup? Into the iPhone itself? Then Apple would have the ability to take your key, upload it to iCloud, decrypt the backup, and do whatever nefarious things you think they would do!

So please tell us what you think Apple should do. At some point, you have to be able to do something with the backup, right? At some point you have to use the private key to decrypt it, right? Do you think we all should use a Linux system, download packages, decrypt files on the command line, and then send the commands to image the iPhone over USB? Is that very realistic to you? Because I'm not seeing an option: even if you handle the keys yourself, you would likely put them into the iPhone to restore from your backups, so you're back at square one. Apple-created software will take your key and use it, but you think that's not good.
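
To put that in concrete terms, here's a minimal sketch (hypothetical function and parameter names, not Apple's API) of what any restore has to do: the moment you supply your key, the restoring software holds the raw key material in memory, because decryption is impossible without it.

```swift
import CryptoKit
import Foundation

// Minimal sketch of the point above (hypothetical names, not Apple's API):
// whatever software performs the restore must hold the raw key material in
// memory at the moment it decrypts anything.
func restoreBackup(ciphertext: Data, userSuppliedKey keyData: Data) throws -> Data {
    // The moment the user types or pastes the key, the restoring software has it.
    let key = SymmetricKey(data: keyData)            // expects 32 bytes for AES-256
    let box = try AES.GCM.SealedBox(combined: ciphertext)
    return try AES.GCM.open(box, using: key)         // impossible without the key in hand
}
```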
 
So it appears that it’s just iMessage backups that I have the encryption key for - not the content of the messages themselves. That’s concerning to me. I’ll stick with Signal.
 
I was reading Apple’s document on how this feature works and came across this paragraph.


After the service key rotation is successful, new data written to the service can’t be decrypted with the old service key. It’s protected with the new key which is controlled solely by the user’s trusted devices, and was never available to Apple.

Does that mean only new data is protected and old data can still be decrypted by Apple, or is that covered by the "key rotation"?
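
In case it helps frame my question, here's my rough understanding of what key rotation means mechanically, sketched in Swift/CryptoKit (purely illustrative, not Apple's code). It shows why new data can't be opened with the old key, but it doesn't tell me what happens to the data that was already uploaded:

```swift
import CryptoKit
import Foundation

// Rough illustration of "service key rotation" (conceptual only, not Apple's code).
// New writes are sealed with the new, device-held key, so the old key is useless
// against them. What happens to data already encrypted under the old key is
// exactly the open question above.

let oldServiceKey = SymmetricKey(size: .bits256)   // the key the server once knew
let newServiceKey = SymmetricKey(size: .bits256)   // generated after rotation, device-only

do {
    // Data written after rotation is sealed with the new key only...
    let newRecord = try AES.GCM.seal(Data("written after rotation".utf8), using: newServiceKey)

    // ...so attempting to open it with the old key fails with an authentication error.
    _ = try AES.GCM.open(try AES.GCM.SealedBox(combined: newRecord.combined!), using: oldServiceKey)
} catch {
    print("old key cannot decrypt new data: \(error)")
}
```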
 
So it appears that it’s just iMessage backups that I have the encryption key for - not the content of the messages themselves. That’s concerning to me. I’ll stick with Signal.
I believe this change to enable Advanced Data Protection deals only with backups. iMessage itself has been encrypted for years.
 