So is this one of those things that will affect nobody but is good for clickbait and to get people all worried? It's one thing if someone finds something serious and Apple just ignores them but that wasn't the case here. What's the point of making this public other than needlessly worrying people and embarrassing Apple?
 
There are a lot of people reacting to this who really have no idea what this is all about.

  • Having FileVault full-disk encryption enabled makes this a non-issue, and everyone should be using this if they are concerned about their data protection.
  • SHA-256 is a strong algorithm. Unbreakable, no, but weak... also no.
  • Apple likely made this change because iOS devices are getting much larger storage capacities, so this faster algorithm made sense to adopt from a user experience perspective.
It's not like they suddenly decided not to encrypt at all. At the end of the day, using a unique, hard-to-guess (random) passphrase is always best, no matter the algorithm used. The sketch below shows the difference the article is talking about.
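To make it concrete, here's a minimal Python sketch (my own illustration, not Apple's code) comparing a single SHA-256 pass against PBKDF2 with the 10,000 iterations older backups reportedly used:

[CODE]
import hashlib
import os
import timeit

password = b"correct horse battery staple"
salt = os.urandom(16)

# Single SHA-256 pass: one hash computation per password guess,
# so offline guessing is very fast.
weak_key = hashlib.sha256(salt + password).digest()

# PBKDF2-HMAC-SHA256 with 10,000 iterations (the count reportedly used by
# older iOS backups): each guess costs roughly 10,000x more work.
strong_key = hashlib.pbkdf2_hmac("sha256", password, salt, 10_000)

# Time 1,000 guesses under each scheme to see the gap.
t_weak = timeit.timeit(lambda: hashlib.sha256(salt + password).digest(), number=1000)
t_strong = timeit.timeit(lambda: hashlib.pbkdf2_hmac("sha256", password, salt, 10_000), number=1000)
print(f"single SHA-256: {t_weak:.4f} s per 1,000 guesses")
print(f"PBKDF2 x10,000: {t_strong:.4f} s per 1,000 guesses")
[/CODE]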
NOPE ... we must NEVER say or do anything that might embarrass Apple ... they sit at the right hand of that great Wizard-in-the-Sky and are all-knowing.

If you owned a company and made it into similar headlines, you'd want to avoid embarrassment. Why should Apple be held to a different standard?
Not that I needed another reason to stay on iOS 9.3.5.

This has nothing to do with iOS. This is the latest version of iTunes.
 
Simply because the keychain should already be encrypted on the iPhone, and when exporting the keychain, that (database?) file should be safe from hackers without any further backup encryption.
The entire backup *is* encrypted on the iPhone. iTunes is only used to set the encryption keys on the phone (based on your backup password). As I wrote before, if you use iTunes encryption, no information leaves the phone unencrypted. If you don't use iTunes encryption, the backup does not include keychain data.
Likely stupid question here: when you back up to your computer via iTunes, isn't there an option to make the backup UNencrypted? Last I looked, I even thought that was the default.
Yes.
Second dumb question, people are saying that encrypted backups have the Keychain info inside protected by ONLY the backup password. Does that mean that keychain info is completely unencrypted in an unencrypted backup?
If you don't use encryption, no keychain data is included in the backup.
Finally, couldn't the change have been related to the new massive 256GB storage option on the iPhone 7? Maybe the new algorithm allows for better or less CPU-intensive compression of the backup data. Or maybe they just did some tests and the old algorithm was too slow for those huge phones.
Using PBKDF2 has no impact on the actual data encryption/decryption. It is only used to compute the encryption key from the user's password, so the overall performance impact is small and independent of the amount of data to be encrypted. It is very unlikely that this change is intentional.
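Roughly, a sketch of that point (my own illustration, not Apple's backup code; it assumes the third-party cryptography package for the AES part):

[CODE]
import hashlib
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

password = b"my backup password"
salt = os.urandom(16)

# The key derivation runs once per password entry; its cost is fixed and
# does not depend on how much data follows.
key = hashlib.pbkdf2_hmac("sha256", password, salt, 10_000, dklen=32)

# The bulk encryption uses the derived key; its cost scales with the data,
# not with the KDF iteration count.
nonce = os.urandom(12)
backup_data = os.urandom(50 * 1024 * 1024)  # stand-in for ~50 MB of backup data
ciphertext = AESGCM(key).encrypt(nonce, backup_data, None)
print(f"encrypted {len(ciphertext)} bytes with a key derived in one KDF pass")
[/CODE]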
 
So? Being a programmer doesn't automatically make you an all-knowing, logical human.

True, but if you're working on an encryption mechanism in iOS (Apple's most important OS) you're probably going to know that changing the encryption mechanism to a much weaker one would be a bad thing - this isn't where they're going to put their first-year programmers (and then not review the heck out of the code change).

Another programmer here - this is very odd. Mistakes happen all the time, but changing an encryption mechanism isn't just an "oh, I forgot to put the brace there" item - breaking it outright so it didn't work at all would fit that theory (or having it still use the old mechanism but not work so well).

This would seem much more likely to be an intentional change - change the encryption mechanism and its supporting code to use this much weaker protocol.

I love Apple, but when it comes to user privacy at this level the entire governmental world wants them to do stuff like this - that's the part of the world that controls Apple's market access. So we have to watch this stuff like hawks if we care about it. JMHO...
 
No. That would be highly insecure, since it would imply that you could simply retrieve most items from the device keychain unencrypted from a paired computer without any protection. Search for "backup keybag" in the iOS Security Guide.

Fair enough; I'll check it out but that does seem reasonable (although a simple encryption scheme could be used between the two devices).

Of course it is possible. E.g. they could have a function that uses a simple hash in some cases and PBKDF2 in others depending on some parameter.

That would still involve a more substantial source code change than just messing up {}s. For example,
high_level_encrypt(key, data, AES256) would become something like high_level_encrypt(key, data, PBKDF2). Plausible, but not very likely, I think. (It also incurs another function call overhead; why not just call the desired encryption function directly?)
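For what it's worth, here's a hypothetical Python sketch of the "one function, a parameter decides" idea from the exchange above; every name in it is invented for illustration, but it shows how small the textual change between the two behaviors can be:

[CODE]
import hashlib

def derive_backup_key(password: bytes, salt: bytes, method: str = "pbkdf2") -> bytes:
    """Hypothetical key derivation helper; not Apple's code."""
    if method == "pbkdf2":
        # Deliberately slow: each password guess costs ~10,000 hash operations.
        return hashlib.pbkdf2_hmac("sha256", password, salt, 10_000)
    if method == "sha256":
        # A single hash pass: switching to this branch is a tiny diff in the
        # caller but makes offline guessing thousands of times faster.
        return hashlib.sha256(salt + password).digest()
    raise ValueError(f"unknown key derivation method: {method}")
[/CODE]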
 
But a long, strong password would still take an impractical amount of time to crack, right?

(Not justifying the change at all.)
 
That would still involve a more substantial source code change than just messing up {}s. For example,
high_level_encrypt(key, data, AES256) would become something like high_level_encrypt(key, data, PBKDF2). Plausible, but not very likely, I think. (It also incurs another function call overhead; why not just call the desired encryption function directly?)
Function call overhead for a function that is only called once whenever a user enters a password? I think it's not me who needs to work on their programming skills. :p
But a long, strong password would still take an impractical amount of time to crack, right?
Yes, given a strong enough password, a single round of SHA-256 still provides good security. The main issue is that cracking not-so-strong passwords becomes a lot easier without a computationally intensive key derivation function.
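Some back-of-the-envelope numbers, just to show how the two factors interact (the guesses-per-second figure is an assumption for illustration, not a benchmark):

[CODE]
# Assumed attacker speed against a single SHA-256 pass; the PBKDF2 case is
# simply that rate divided by the iteration count.
GUESSES_PER_SEC_SHA256 = 6_000_000
PBKDF2_ITERATIONS = 10_000

def years_to_exhaust(keyspace: float, guesses_per_sec: float) -> float:
    return keyspace / guesses_per_sec / (3600 * 24 * 365)

cases = [
    ("8 lowercase letters", 26 ** 8),
    ("20 random alphanumerics", 62 ** 20),
]
for label, keyspace in cases:
    fast = years_to_exhaust(keyspace, GUESSES_PER_SEC_SHA256)
    slow = years_to_exhaust(keyspace, GUESSES_PER_SEC_SHA256 / PBKDF2_ITERATIONS)
    print(f"{label}: {fast:.2e} years (single SHA-256) vs {slow:.2e} years (PBKDF2)")
[/CODE]

The weak password goes from a matter of hours to years of work; the long random one is out of reach either way.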
 
So is this one of those things that will affect nobody but is good for clickbait and to get people all worried? It's one thing if someone finds something serious and Apple just ignores them but that wasn't the case here. What's the point of making this public other than needlessly worrying people and embarrassing Apple?

What's the point of any hacker/researcher publicly revealing the vulnerability of a system? To make sure they fix it.
 
Why should Apple be held to a different standard?


Cook and Apple raised the flag first on privacy - from fortune.com

"In defying the order, Cook takes the position that he’s protecting his customers against evil-minded hackers and, without saying so, against governments that may or may not be motivated by noble intentions. He’s thus aligning himself with very broad constituencies. He’s also defending the Apple engineers who are proud to have created the invulnerable security features and who would be deeply demoralized if forced to write new code to defeat them."

Let's hear Apple's response to the issue that MR has reported on.
 
Cook and Apple raised the flag first on privacy - from fortune.com

"In defying the order, Cook takes the position that he’s protecting his customers against evil-minded hackers and, without saying so, against governments that may or may not be motivated by noble intentions. He’s thus aligning himself with very broad constituencies. He’s also defending the Apple engineers who are proud to have created the invulnerable security features and who would be deeply demoralized if forced to write new code to defeat them."

Let's hear Apple's response to the issue that MR has reported on.
Apple's response is in the article. It is even in the headline. They will fix this.
 
Then what about wireless?
I believe intercepting a wireless signal is a lot easier than cutting through a wire and tapping something nasty.
Depends. Wireless communication is known to be easy to sniff, so there's arguably more focus on encrypting it (e.g. Wi-Fi has WPA2). I know that attackers can very easily monitor traffic on an unsecured Wi-Fi network. An attacker on a typical IP network, wired or wireless, can sometimes run MitM attacks without any wire splitting/tapping, for example through ARP poisoning/spoofing or by hosting a rogue DHCP server and router.
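As a small defensive illustration of the ARP poisoning part: the attack works by rebinding the gateway's IP address to the attacker's MAC in your machine's ARP cache, so one crude way to notice it is to watch whether the cached MAC for your gateway changes. This sketch just parses the standard arp tool's output (macOS/Linux); the gateway IP is an assumed example:

[CODE]
import re
import subprocess
import time

GATEWAY_IP = "192.168.1.1"  # assumption: change to your own router's address
MAC_RE = re.compile(r"(?:[0-9a-f]{1,2}:){5}[0-9a-f]{1,2}", re.IGNORECASE)

def gateway_mac() -> str | None:
    """Return the MAC currently cached for the gateway, if any."""
    out = subprocess.run(["arp", "-n", GATEWAY_IP], capture_output=True, text=True).stdout
    match = MAC_RE.search(out)
    return match.group(0).lower() if match else None

last = gateway_mac()
while True:
    time.sleep(5)
    current = gateway_mac()
    if current and last and current != last:
        print(f"gateway MAC changed from {last} to {current} - possible ARP spoofing")
    last = current or last
[/CODE]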
 
That would still involve a more substantial source code change than just messing up {}s. For example,
high_level_encrypt(key, data, AES256) would become something like high_level_encrypt(key, data, PBKDF2). Plausible, but not very likely, I think. (It also incurs another function call overhead; why not just call the desired encryption function directly?)

The whole backup format of iOS 10 devices has changed. If you compare it with a backup made from a device running an earlier version of iOS, the structure and format of the backup are completely different.

So maybe the original source code was effectively binned and rewritten from scratch, but the new code just wasn't as well written in terms of security?
 
Function call overhead for a function that is only called once whenever a user enters a password? I think it's not me who needs to work on their programming skills. :p

Joke noted and appreciated! However, I am a bit of a performance nut, so unnecessary function calls irk me. :)
 
Depends. Wireless communication is known to be easy to sniff, so there's arguably more focus on encrypting it (e.g. Wi-Fi has WPA2). I know that attackers can very easily monitor traffic on an unsecured Wi-Fi network. An attacker on a typical IP network, wired or wireless, can sometimes run MitM attacks without any wire splitting/tapping, for example through ARP poisoning/spoofing or by hosting a rogue DHCP server and router.

Has anyone noticed that Apple is directing AEBS users to not create hidden networks? Because iOS 10 has a hard time finding those saved network credentials. Really? Because they're hidden from all but the tweakiest hackers? It seems odd. BTW, how does everyone like iCloud just grabbing all your documents and uploading them to iCloud? The opt-out language seems purposely confusing. Yes, I unchecked those boxes. Odd stuff. My 4th gen iPod never had a problem with a hidden network on a 2011 AEBS. I really don't want to have to buy yet another appliance.
 
Has anyone noticed that Apple is directing AEBS users to not create hidden networks? Because iOS 10 has a hard time finding those saved network credentials. Really? Because they're hidden from all but the tweakiest hackers?
Assuming by "hidden" you mean suppressing SSID broadcast, this is completely useless. It provides zero additional security, and is in many cases counterproductive because it may prevent other APs in your neighborhood from doing a proper channel survey and cause them to choose a channel that interferes with yours.

To secure your Wi-Fi network, make sure to pick WPA2 with AES-only encryption, and use a long, random passphrase (at least 20 characters).
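If you need one, here's a quick sketch using Python's cryptographically secure secrets module to generate that kind of passphrase:

[CODE]
import secrets
import string

# 24 random characters from letters and digits, comfortably past the
# 20-character suggestion above.
ALPHABET = string.ascii_letters + string.digits
passphrase = "".join(secrets.choice(ALPHABET) for _ in range(24))
print(passphrase)
[/CODE]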
It seems odd. BTW, how does everyone like iCloud just grabbing all your documents and uploading them to iCloud? The opt-out language seems purposely confusing.
Are you talking about macOS Sierra's iCloud Drive? That's actually an opt-in option.
 
Not really anything to worry about unless your Mac is shared as well...

Besides, isn't SHA-256 also what banks and other secure sites use to transmit personal info...?

And we think this is less secure only from the standpoint that it was more secure in iOS 9... as in, "this is less secure because it can be brute-forced more quickly."

Maybe true, but then so can any other website you probably have personal info stored on.

... probably worrying over nothing.

Still secure in my book. But if you want to be picky, yeah, I guess Apple should fix it.
 