This is not going away, and it seems that it will reach a conclusion sooner rather than later. I think it's inevitable that Apple will eventually be forced to unlock an iPhone. There are too many cases, and unfortunately they can't win them all.

Then all our electronic media will be hacked.
 
good, another chance for discovery of what Cellebrite actually used.

There's been no proof that Cellebrite was involved. Any company could have created tools similar to theirs.

And do you really think the FBI would let their method be confirmed? They will shout "national security" and demand it never be told. They'd probably win that one.

Are you a suspected criminal under investigation?

This isn't about your phone.

One does not have to be a criminal for it to be about "your phone".

What the FBI asked for was the first step down the path to demanding a built-in back door that could be exploited by any hacker. With the amount of potentially embarrassing or identity-theft-enabling data the average person has on a smartphone, it would be a disaster.

What did I tell you? The FBI went to court in the first place because, while they knew they could unlock an iPhone 5c or lower,

I won't say that they knew. Yes they likely had discovered a possible method but it probably wasn't cheap enough, fast enough or guaranteed enough for their tastes and they assumed they could get better by going direct to Apple.

And yes they likely wanted a precedent as well.

I hope whatever exploit they used to unlock the phone has been patched and that Apple (continues to?) make(s) the iPhone more secure and harder to crack in future.

If the theories are correct about what was used, it was patched about 3 iPhones ago. Because Apple does actually GAF about us common folks and our data security.

It's no big secret what was used.

It's not confirmed that this was the method but yes it seems most likely it was this or a variant
 
Whether we like it or not there appears to be bipartisan support for regulating encryption technologies. Even those running for President have said Apple needs to unlock the phone. Given the present state of the world's political instability some form of regulation is inevitable.
 
The battle for Helm's Deep is over; the battle for Middle Earth is about to begin.
 
Are you a suspected criminal under investigation?

This isn't about your phone.

Go troll somewhere else. There can't be a single tech blog/site reader who doesn't understand the wider consequences of backdoors, weak security/encryption, etc.
Whether we like it or not there appears to be bipartisan support for regulating encryption technologies. Even those running for President have said Apple needs to unlock the phone. Given the present state of the world's political instability some form of regulation is inevitable.

You mean since there is instability, everyone should be forced to use bulletproof security to keep us all safe? Right?
 
Go troll somewhere else. There can't be a single tech blog/site reader who doesn't understand the wider consequences of backdoors, weak security/encryption, etc.

You mean since there is instability, everyone should be forced to use bulletproof security to keep us all safe? Right?
No, just stating my opinion on what is going to happen. How one responds to the inevitable is another topic.
 
Yes. I just read up on what the secure enclave actually is. It's an entirely new beast of security.

They're neat, but secure enclaves were not new when Apple started using one in 2013.

Heck, Samsung first implemented a secure enclave in their Exynos chips, starting back in 2012 with the Galaxy S3. It's used to help secure their NSA derived Knox kernel.

"Just one phone"

One at a time, yes. In other words, no need for Apple to give anyone a universal tool. The FBI even said that Apple could keep each phone in their possession.

But the FBI never said it was going to only ever be "just one phone". The internet started that nonsense.

Seriously though, this sets a dangerous precedent. Before you know it, Apple will be ordered to remove the secure enclave on their phones.

Asking Apple to help unlock their own device is nowhere near the same thing. And Apple has done so many times for law enforcement in the past.

This case is more about Tim Cook feeling that he had to refuse publicly because the request was so public. The rest of the intel community must be rolling their eyes at the FBI's ineptness.

If you want to know what will happen if Apple ever develops an alternate version of iOS that allows for unlocking phones belonging only to "suspected criminals under investigation", read this article. The TSA master key debacle is a blueprint scenario for the inevitable misuse of backdoors.

The alternate version would be useless to anyone outside of Apple, as Apple's public facing servers would not sign the update.

And if someone had access to those keys, then they don't need the alternate version at all, since they could create and install their own at any time.

This is not going away, and it seems that it will reach a conclusion sooner rather than later. I think it's inevitable that Apple will eventually be forced to unlock an iPhone. There are too many cases, and unfortunately they can't win them all.

And Tim Cook has said that Apple will comply if they are legally ordered to.

The warrant to access the info is totally legal. The only thing at question is whether Apple can be forced to help.
 
The alternate version would be useless to anyone outside of Apple, as Apple's public facing servers would not sign the update.

And if someone had access to those keys, then they don't need the alternate version at all, since they could create and install their own at any time.

Yup, the alternate version would be useless as is. But several jailbreak versions of iOS have proven that iOS packages can be manipulated and tweaked in order to work properly despite all of Apple's safeguards.

After all, iOS (no matter what version) is a software asset, and as such: a) there is no guarantee it will be contained forever by Apple; b) it's all but guaranteed that it will present points of attack and/or vulnerabilities to someone (skilled) that looks long enough.
 
this is already likely losing apple customers. any relative of a deceased loved one whose passcode apple refused to crack - upon hearing this FBI narrative - will see no reason that apple should/would/could not get into their phones. doesn't matter the specifics or details we as interested parties can parse and deconstruct. in the media's reporting, apple is now a 'friend' of the FBI and an 'enemy' of dead people's families.
 
this is already likely losing apple customers. any relative of a deceased loved one whose passcode apple refused to crack - upon hearing this FBI narrative - will see no reason that apple should/would/could not get into their phones. doesn't matter the specifics or details we as interested parties can parse and deconstruct. in the media's reporting, apple is now a 'friend' of the FBI and an 'enemy' of dead people's families.

Even the dead have the right to privacy. I'm not sure I'd want my parents having unfettered access to my phone.
 
Yup, the alternate version would be useless as is. But several jailbreak versions of iOS have proven that iOS packages can be manipulated and tweaked in order to work properly despite all of Apple's safeguards.

Jailbreaking (loading your own special code) is quite different from being able to load a desired Apple signed piece of OS code that has never been loaded on that device before. It'd be like trying to downgrade once a version is no longer signed, and the only way that has ever been done was back with old pre-iOS 4 versions where the original hardcoded signing key was captured. Apparently that's no longer possible.

IF someone can figure a way around that, then they likely don't even need Apple's special version; they can make their own.

Thus this whole "OMG it's a universal backdoor" wailing is just Chicken Little stuff. It only works on a device in Apple's lab. Encryption is still in place. A strong password will still take years to brute force. Heck, newer devices apparently have a hardware five second retry delay, which would add almost two orders of magnitude of delay.

So brute forcing on newer devices is a dead end unless someone used a dumb "1234"-type passcode.

Now, if I were the government, I'd have found a way to retrieve the device UID from the chip instead. Researchers have done this for similar embedded IDs by watching for telltale electromagnetic emissions with itty bitty antennas. Once you have the UID, then the need to oh so very slowly brute force on the device itself disappears, and supercomputers can be brought into play with a copy of the encrypted file system. No Apple necessary.
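To make the UID point concrete, here's a minimal Python sketch of why extracting it changes everything. The KDF, iteration count, and UID value here are all hypothetical stand-ins (the real derivation uses a hardware AES key-tangling step, not plain PBKDF2), but the structure is the same: once the UID is off the chip, guessing can move off the device.

```python
import hashlib

def derive_key(passcode: str, uid: bytes, iterations: int = 1_000) -> bytes:
    # Stand-in for the real hardware key tangling; PBKDF2 with a
    # hypothetical iteration count, purely for illustration.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, iterations)

# If an attacker recovers the UID (say, via the electromagnetic side
# channel described above), every guess can be replayed off-device at
# whatever speed the attacker's hardware allows:
uid = bytes.fromhex("00" * 32)      # hypothetical extracted 256-bit UID
target = derive_key("1234", uid)    # key protecting the file system copy

for guess in (f"{n:04d}" for n in range(10_000)):   # every 4-digit passcode
    if derive_key(guess, uid) == target:
        print("passcode recovered:", guess)
        break
```

On the device, each of those guesses is throttled by hardware; off the device, the loop runs as fast as the attacker's cluster can hash.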

If Apple really cared about personal privacy, they'd have kept their warrant oversight lawyers in the loop by cooperating. That way they'd be the first to see if something was being abused. You can't blow the whistle on something you do not know about.
 
You get it. Mass surveillance has always had the goal of suppressing political dissent. The "extreme" cases are meant to win acceptance for the practice, so it can be abused later.

I trust most of the people in the FBI, but if they are given too much surveillance power, as night follows day there will be political pressure to abuse it. 'Absolute power corrupts absolutely' and all of that...
 
Yup, the alternate version would be useless as is. But several jailbreak versions of iOS have proven that iOS packages can be manipulated and tweaked in order to work properly despite all of Apple's safeguards.

After all, iOS (no matter what version) is a software asset, and as such: a) there is no guarantee it will be contained forever by Apple; b) it's all but guaranteed that it will present points of attack and/or vulnerabilities to someone (skilled) that looks long enough.

If you use a long alphanumeric passcode, it doesn't matter how skilled a jailbreaker is, or what Apple does; nobody's getting your stuff.

The passcode needed to decrypt doesn't reside on the phone, and I think you still need an unlocked phone to jailbreak anyway.

There is no public jailbreak for 9.2 or 9.3, though rumored ones exist.

Eventually, even unlocked phones will be very, very hard to root. It's already much harder now than it was a few years ago.
this is already likely losing apple customers. any relative of a deceased loved one whose passcode apple refused to crack - upon hearing this FBI narrative - will see no reason that apple should/would/could not get into their phones. doesn't matter the specifics or details we as interested parties can parse and deconstruct. in the media's reporting, apple is now a 'friend' of the FBI and an 'enemy' of dead people's families.

If the person wants you to access their phone after they die, they'll make sure you can. Otherwise it's none of your business.
 
Jailbreaking (loading your own special code) is quite different from being able to load a desired Apple signed piece of OS code that has never been loaded on that device before. It'd be like trying to downgrade once a version is no longer signed, and the only way that has ever been done was back with old pre-iOS 4 versions where the original hardcoded signing key was captured. Apparently that's no longer possible.

IF someone can figure a way around that, then they likely don't even need Apple's special version; they can make their own.

Thus this whole "OMG it's a universal backdoor" wailing is just Chicken Little stuff. It only works on a device in Apple's lab. Encryption is still in place. A strong password will still take years to brute force. Heck, newer devices apparently have a hardware five second retry delay, which would add almost two orders of magnitude of delay.

Well, you have 30+ years as a device engineer, so I believe you have pretty solid knowledge of what you're talking about.

I myself have been working in IT security for the past 15 years, and I can attest that backdoors are generally a (very) bad idea. Having a door, no matter how secure, where there used to be only a solid wall inherently adds a vulnerability to the system. Period. Time and again I have witnessed such mechanisms be abused, and in ways the original proponents never even dreamed of. One principle in IT security that most people overlook is that the defender has to be 100% effective, 100% of the time, whereas the attacker has to succeed only once.

So, it's cool if you chalk it up to "chickenlittleism" when I say a backdoor would be catastrophic in this case. I'll still be rooting for this possibility never to materialize.

Thanks for the interaction. Have a fantastic week.
 
Again, another attempt by the FBI to force a private company to do the FBI's dirty work for them. I could see this being a reasonable request if it were a simple matter of information, but not for creating new code.

EDIT: It just struck me...

FBI: we need access to the phone of a terrorist - it's life or death
FBI: we need access to the phone of a drug dealer - it's about crime
FBI: we need access to the phone of a political activist - it's about being American

This is exactly the problem: once you can compel people to do the government's work for them, you can compel them to do anything.
 
I myself have been working in IT security for the past 15 years, and I can attest that backdoors are generally a (very) bad idea. Having a door—no matter how secure—where there used to be only a solid wall is inherently adding a vulnerability to the system.

Normally I'd totally agree. However, with respect, the reason why I'm saying many people are going Chicken Little over this, is because they're just making a knee jerk reaction, instead of actually thinking logically about it all.

First off, the government will likely continue to try to figure out a way in, no matter what, whether or not Apple helps.

Secondly, it's not really a back door, since encryption is still in place, as are the major retry delays. It's simply as if someone turned off options to delete everything after X bad tries. It still can take years to brute force. And no, brute force is not a back door. If it were, then half the websites and devices on the planet have "back doors".
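A toy model may help show what "turning off the options" means here, and why the encryption itself is untouched. Everything below is illustrative; the delay figures only approximate published iOS behavior, and the class names are made up:

```python
# Approximate delays (seconds) applied after repeated failed attempts.
FAILURE_DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}

class PasscodeGate:
    """Toy lock-screen policy: the passcode check and the encryption
    behind it are identical in both configurations; only the retry
    policy (wipe + delays) differs."""
    def __init__(self, wipe_after_10=True, escalating_delays=True):
        self.failed = 0
        self.wipe_after_10 = wipe_after_10
        self.escalating_delays = escalating_delays
        self.wiped = False

    def try_passcode(self, guess: str, real: str):
        if self.wiped:
            raise RuntimeError("device wiped: data gone")
        delay = FAILURE_DELAYS.get(self.failed, 0) if self.escalating_delays else 0
        if guess == real:
            self.failed = 0
            return True, delay
        self.failed += 1
        if self.wipe_after_10 and self.failed >= 10:
            self.wiped = True
        return False, delay

# Stock policy: a brute-force run is dead after 10 wrong guesses.
stock = PasscodeGate()
for n in range(10):
    stock.try_passcode(f"{n:04d}", "7391")
print(stock.wiped)          # True: the guessing loop triggered the wipe

# Requested build: same encryption, same passcode check, limits off.
relaxed = PasscodeGate(wipe_after_10=False, escalating_delays=False)
found = next(f"{n:04d}" for n in range(10_000)
             if relaxed.try_passcode(f"{n:04d}", "7391")[0])
print(found)                # 7391: brute force now merely slow, not fatal
```

Note that even in the relaxed configuration, nothing decrypts without the right passcode; the change is only in how many wrong guesses the device tolerates.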

--

Most importantly, people totally ignore the fact that the FBI said Apple could keep the device and its special OS in their Apple labs. So let's look at what Tim Cook said with his FUD hand waving:

"In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

Only if it leaked from Apple, and if someone also has the signing keys.

So what it boils down to, is Cook is claiming that Apple is internally incapable of keeping the version... and the keys... a secret.

And if that's true, then we're already screwed even without the FBI's version, since Cook's meaning is that his employees could sneak out with a special version, and if so, then that means they already can do so with Apple's code and/or signing keys... which can be used to make the same kind of version.

TL;DR - Cook's warning does not mean it's the FBI that cannot be trusted... because the FBI would not have access to the version! Therefore it's Apple itself that Cook mistrusts. Not a great message on his part.
 
Normally I'd totally agree. However, with respect, the reason why I'm saying many people are going Chicken Little over this, is because they're just making a knee jerk reaction, instead of actually thinking logically about it all.

First off, the government will likely figure out a way in, no matter what, whether or not Apple helps.

Secondly, it's not really a back door, since encryption is still in place, as are the major retry delays. It's simply as if someone turned off options to delete everything after X bad tries. It still can take years to brute force. And no, brute force is not a back door. If it were, then half the websites and devices on the planet have "back doors".

--

Most importantly, people totally ignore the fact that the FBI said Apple could keep the device and its special OS in their Apple labs. So let's look at what Tim Cook said with his FUD hand waving:

"In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession."

Only if it leaked from Apple, and if someone also has the signing keys.

So what it boils down to, is Cook is claiming that Apple is internally incapable of keeping the version... and the keys... a secret.

And if that's true, then we're already screwed even without the FBI's version, since Cook's meaning is that his employees could sneak out with a special version, and if so, then that means they already can do so with Apple's code and/or signing keys... which can be used to make the same kind of version.

TL;DR - Cook's warning does not mean it's the FBI that cannot be trusted... because the FBI would not have access to the version! Therefore it's Apple itself that Cook mistrusts. Not a great message on his part.

I follow what you're saying; I'm not sure I agree. First of all, encryption is only as strong as its implementation; writing software that removes the PIN lock security safeguards creates a backdoor, as it makes a brute force attack possible where previously it was not.

Secondly, Cook is being pragmatic: no computer system is totally secure, even Apple's. Even if there's only a small chance someone could break in and steal the software, given enough time it would be almost inevitable that they would. It's not a matter of trust, it's simply a matter of time. And whilst it would be difficult to install without the digital key, if they could find an exploit it may be possible. Isn't that how jailbreaking works?

Thirdly, don't underestimate the legal aspects. Creating a precedent that requires Apple employees to work for the US government is chilling. Apple is not a law enforcement agency; it is not their job to investigate criminal activity. Remember, we all work voluntarily; co-opting those same employees and forcing them to work for the government is somewhat Orwellian. Also, since computer code is protected free speech, such an order would require those same employees to speak on behalf of the government, surely unconstitutional. And if Apple employees can be forced to work for the government, why not anyone else!
 
They're neat, but secure enclaves were not new when Apple started using one in 2013.

Heck, Samsung first implemented a secure enclave in their Exynos chips, starting back in 2012 with the Galaxy S3. It's used to help secure their NSA derived Knox kernel.

The Knox kernel was first introduced in February 2013, and the first device it worked on was the S4. It was NSA certified, not derived. It is not the same thing either: it is a disk partition that runs the mobile software as a virtualization, thereby creating sandboxes. The purpose is to mimic the sandboxes iOS has had built into the software since 2008. Unfortunately, it doesn't really work as written. They don't even own a patent on it; they are using the tech from another company. Apple applied for their patent for a secure enclave in August 2012 and received it a year later, after it was released. http://www.infoworld.com/article/26...-about-samsung-knox-for-android-security.html

But the FBI never said it was going to only ever be "just one phone". The internet started that nonsense.

Yes. The FBI director said in interview after interview that it was about one phone, and even told other agencies as much. It was not until he was under oath in front of Congress that his story changed.

Asking Apple to help unlock their own device is nowhere near the same thing. And Apple has done so many times for law enforcement in the past.

The difference is they used to maintain keys to the OS, but stopped doing so because of the inherent risk of having them on their servers. Apple is one of the few major tech companies that have not suffered a major breach of their servers. They know that it can happen. The less information they store, the safer our devices.
 
If the person wants you to access their phone after they die, they'll make sure you can. Otherwise it's none of your business.

you're absolutely right, but tell that to the tons of soccer moms and elderly customers that make up a good portion of the apple consumer base - they often aren't aware of how it all works, and whether they can get into a phone or not after a death is a big deal to some of them. T&C or not.

"i just heard from Martha at the book club that apple CAN get into a 5c which is what Doug had when he passed. i guess if he'd been a terrorist apple would have helped me get the contacts i desperately needed while i was grieving..."

i'm not arguing the specifics or the policy, i'm just saying that apple is already losing value/trustworthiness in some of their customers' eyes thanks to the FBI's public fracas. it will continue if recent reports are any indicator of what's to come.
 
I follow what you're saying; I'm not sure I agree. First of all, encryption is only as strong as its implementation; writing software that removes the PIN lock security safeguards creates a backdoor, as it makes a brute force attack possible where previously it was not.

I also understand what you're saying, but I think it's just not the right term for simply turning off retry limits. Back when you could brute force an Apple iCloud account because there were no retry limits, no one called it a back door.

The problem is that calling something a "back door" inherently biases the reader, because it usually means a secret, guaranteed, quick way in. That's of course scary. In this case, even if the software retry limits were disabled, it could take over half a decade to crack a good password on an older iOS device, and over three centuries on a newer one with the five second hardware delay. That's not a very scary "back door", especially since the brute force attack must run on the device itself. It's not like someone could take your device for years without you noticing :)

Of course, with a poor password, it could take just hours. For that matter, it's possible to guess a few of the most often used ones even within the retry limit. Guessing is not a backdoor, though it's just a variation of brute force.
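For anyone who wants to check those figures, here's the back-of-envelope arithmetic, assuming roughly 80 ms per on-device key derivation and the rumored 5-second hardware delay (both figures are approximations from public reporting, not confirmed specs):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

def worst_case_years(keyspace: int, seconds_per_try: float) -> float:
    # Worst case: the correct passcode is the last one tried.
    return keyspace * seconds_per_try / SECONDS_PER_YEAR

six_char = 36 ** 6   # 6 characters drawn from a-z plus 0-9

print(round(worst_case_years(six_char, 0.08), 1))  # 5.5  -> "over half a decade"
print(round(worst_case_years(six_char, 5.0)))      # 345  -> "over three centuries"
print(round(10 ** 4 * 5.0 / 3600))                 # 14   -> a 4-digit PIN falls in hours
```

The last line is why a short numeric passcode is the weak link: the keyspace, not the delay, dominates.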

Secondly, Cook is being pragmatic: no computer system is totally secure, even Apple's. Even if there's only a small chance someone could break in and steal the software, given enough time it would be almost inevitable that they would. It's not a matter of trust, it's simply a matter of time.

Exactly my point. If Cook is right that Apple cannot be trusted to keep such a version or the keys safe, then even without the FBI request, we're already at risk of something happening. Nothing is perfect. Risks are part of everything.

And whilst it would be difficult to install without the digital key, if they could find an exploit it may be possible. Isn't that how jailbreaking works?

Jailbreaking is about being able to load non-Apple files without a key, mostly to try to elevate privileges so you can then load other apps. However, jailbreaking cannot make the bootloader load an Apple OS version without an Apple key. Apple's done a good job locking that down. That's also why there is no way to downgrade to an unsigned version.
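That signing requirement can be sketched as follows. The HMAC construction and key name are stdlib stand-ins for illustration only; the real boot chain uses asymmetric signatures (Apple signs with a private key the device never holds), but the gatekeeping logic is the point: an image boots only if it verifies against a key the attacker doesn't have.

```python
import hashlib
import hmac

APPLE_SIGNING_KEY = b"hypothetical-signing-key"   # never leaves Apple

def sign_image(image: bytes) -> bytes:
    # What Apple's signing server does for an official build.
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def bootloader_accepts(image: bytes, signature: bytes) -> bool:
    # What the bootloader checks before handing control to the OS.
    expected = hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"iOS 9.3 official build"
sig = sign_image(official)

print(bootloader_accepts(official, sig))            # True: signed build boots
print(bootloader_accepts(b"tweaked build", sig))    # False: modified image refused
```

Tampering with even one byte of the image invalidates the signature, which is why jailbreaks attack running software rather than the boot-time check.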

Thirdly, don't underestimate the legal aspects. Creating a precedent that requires Apple employees to work for the US government is chilling. Apple is not a law enforcement agency; it is not their job to investigate criminal activity.

Again, yep, that's exactly what I've been saying. This is not about whether or not it's legal or moral for the government to try to get into the phone. They have a warrant, therefore it is legal and moral. Warrants are a bedrock of our legal system.

It's about whether Apple can be forced into helping. It's also, according to Cook's Time magazine interview, about him being personally miffed that the FBI took their request public, which left him no choice but to publicly fight back.

Remember, we all work voluntarily; co-opting those same employees and forcing them to work for the government is somewhat Orwellian. Also, since computer code is protected free speech, such an order would require those same employees to speak on behalf of the government, surely unconstitutional. And if Apple employees can be forced to work for the government, why not anyone else!

Apple has for years helped (and continues to help) the government break into many iPhones: by providing a custom version of iOS that bypassed the lock screen, and by giving law enforcement access to iCloud backup info and any other info that's not encrypted.

The FBI was just dumb going public in this case. Not the least because there were other options, and because of public backlash, but also because well encrypted devices are ridiculously hard to brute force on the device itself, even with Apple's help. They really need to find a better, faster method if possible.

Reader opinions may vary and that's fine. Regards.
 
Again another attempt by the FBI to force a private company to do the FBI's dirty-work for them. I could see this being reasonable request if this was a simple matter of information, but not for creating new code.

EDIT: It just struck me...

FBI: we need access to the phone of a terrorist - it's life or death
FBI: we need access to the phone of a drug dealer - it's about crime
FBI: we need access to the phone of a political activist - it's about being American

Go listen to the speech President Obama gave at South by Southwest a few weeks ago, he specifically mentions needing to get into people's phones for "enforcing tax laws". As complicated as our tax code is, anyone who pays taxes could be investigated and probably found in violation of something. Thankfully, the IRS is 100% non-political and would NEVER target political activists (tongue firmly in cheek).
 
I also understand what you're saying, but I think it's just not the right term for simply turning off retry limits. Back when you could brute force an Apple iCloud account because there were no retry limits, no one called it a back door.

The problem is that calling something a "back door" inherently biases the reader, because it usually means a secret, guaranteed, quick way in. That's of course scary. In this case, even if the software retry limits were disabled, it could take over a half decade to crack a good password on an older iOS device, and up to over three centuries on a newer one with the five second hardware delay. That's not a very scary "back door", especially since the brute force attack must run on the device itself. It's not like someone could take your device for years and you won't notice :)

Of course, with a poor password, it could take just hours. For that matter, it's possible to guess a few of the most often used ones even within the retry limit. Guessing is not a backdoor, though it's just a variation of brute force.

OK, I see what you're saying: it's not a backdoor in the traditional sense, as there would still be an obstacle to entry. Whatever it's called, though, it's still a weakening of security that presents an "easy" way to get into the phone, especially since most phones only have a four-digit passcode. Of course, going forward, Apple could mandate an alphanumeric code, which might mitigate the issue somewhat.


Exactly my point. If Cook is right that Apple cannot be trusted to keep such a version or the keys safe, then even without the FBI request, we're already at risk of something happening. Nothing is perfect. Risks are part of everything.

It's not a zero-sum game, though: Apple having to keep a less secure version of iOS on their systems increases that risk.

Again, yep, that's exactly what I've been saying. This is not about whether or not it's legal or moral for the government to try to get into the phone. They have a warrant, therefore it is legal and moral. Warrants are a bedrock of our legal system.

It's about whether Apple can be forced into helping. It's also, according to Cook's Time magazine interview, about him being personally miffed that the FBI took their request public, which left him no choice but to publicly fight back.

The court order makes it legal not moral, it's legal to execute prisoners in some states, whether that's moral or not is a different question. Legality and morality are not always the same thing and in this case I'd argue strongly against it being so. It's also questionably legal when there is a statute on the law books, CALEA, that appears to expressly deny the FBI the power they are seeking here.


Apple has for years helped (and continues to help) the government break into many iPhones: by providing a custom version of iOS that bypassed the lock screen, and by giving law enforcement access to iCloud backup info and any other info that's not encrypted.

The FBI was just dumb going public in this case. Not the least because there were other options, and because of public backlash, but also because well encrypted devices are ridiculously hard to brute force on the device itself, even with Apple's help. They really need to find a better, faster method if possible.

Reader opinions may vary and that's fine. Regards.

I was aware Apple had previously helped the FBI and other LEAs gain access to iPhones; I had thought, however, that was down to them having the encryption keys for versions of iOS earlier than 8, and iCloud backups not actually being encrypted at rest. I was not aware that they had written special versions of iOS that bypass the lock screen, unless you mean those cases where they had the encryption keys already, i.e. iOS 7 or earlier. Accessing something for which they already have the encryption key is entirely different from seeking to get around the encryption by weakening the security safeguards.

The FBI wasn't just dumb; they materially misrepresented themselves in the San Bernardino case when they argued it was only about this one phone, when they knew it was about much more than that. They were disingenuous/dishonest in the extreme when their director said he didn't care about the precedent, when he knew (a) that he did, and (b) that it was irrelevant whether he cared or not: if a precedent was set, a precedent was set.

It troubles me greatly that those charged with protecting our safety feel they can lie with impunity. If you or I lied to the FBI in such a way it might be considered a felony.

Regards and respectfully

R.
 