Sir, I won't disagree with you about which are the more plausible scenarios... if such a tool were let out along with Apple signing tools, which I don't think would happen.

Have we seen the source code for iOS get out? For MacOS? For Windows? For Photoshop? For Word? Never. In many decades. It's very reasonable to assume that if Apple possessed a tool (be it an application or a modified version of iOS) that decrypted an encrypted iOS device, this tool would NOT get out. Is there a chance? Yes. Is it an acceptable risk? In my mind, yes.

I will disagree that the worse scenario is the case of a less secure OS.

I feel the same.

For one thing, we've had such OSes for a long time, iOS included. In fact, a lot of this feels like Apple desperately handwaving to direct attention away from the fact that people jailbreak iOS, that updates constantly fix buffer overruns that are iOS security holes, and that the fingerprint sensor is wide open to fake finger attacks.

I don't see that so much. This case has put Apple in a very tough spot. If they give in and do what the government wants, people question iOS security and they've been selling security as a feature for a while now. Since they chose to fight, they now have to make their case in the court of public opinion in addition to a real court. In some ways I'm glad this is happening because I think it's a good conversation for the public to have. The rhetoric coming out of Apple is a bit extreme, but at this point they have no choice.

For another, people getting hacked is nothing compared to personal injuries or national attacks.

100% agree.

Do we have to tread carefully? Yes. But I think we have to tread the path, instead of pulling into a turtle shell. Just as with any other warrant, there have to be checks, but at the same time, we cannot throw out warrant powers altogether.

This issue has been grossly distorted in my opinion. Apple continues to link the mere existence of a decrypt tool with a total privacy meltdown (cancer? really???) when in all reality the existence of a such a tool would in no way affect 99.999% of iOS users. It would not install a backdoor on a user's iPhone. It would not give the government a master key to your phone. It would not allow the government to surveil you at will (beyond all the information you already volunteer freely when you access the Internet). The only thing such a tool would do is decrypt an iPhone that is physically present in Cupertino when the government possesses a legal warrant. I fail to see how this will destroy privacy and security forever.

Before we can have a serious debate about this issue, we must all answer one question: There is never ever *ever* any compelling reason for law enforcement/government to view the contents of an electronic device, yes or no? I think most reasonable people would say no. So if we can start there, we can have a conversation about how to mitigate risk, which is precisely the conversation we SHOULD be having.
That's not a reality, just a possible outcome. If Apple wins this current case in court then it will only help them when they resist whatever laws get passed. If the courts say that Apple can't be compelled to develop a hack into its own operating system, they'll be likely to side with Apple in a future case where the government wants Apple to install spyware on all its devices.

Just your possible outcome...and one that is far less likely when "national security" is invoked. Have you been paying attention this past decade or two?
 
Before we can have a serious debate about this issue, we must all answer one question: There is never ever *ever* any compelling reason for law enforcement/government to view the contents of an electronic device, yes or no? I think most reasonable people would say no

The problem lies in designing a system that is both secure and not secure at the same time. It's not possible.
 
Thanks for responses.

Do you not believe then that it's possible that this coveted piece of potential software would leak out or get stolen? Rendering a huge hole in the security of devices?

I think I may be understanding your and others' point of view...But its validity completely rests on the security of a company.

When have we seen source code for any major OS or application stolen or leaked? Is it possible? Of course. It's also possible that someone other than Apple can break iOS encryption. Are either likely to happen? No.
 
The problem lies in designing a system that is both secure and not secure at the same time. It's not possible.

I don't agree. The problem lies in determining what level of risk you are willing to accept in order to balance the needs of privacy and public welfare. In my opinion, Apple being in sole possession of a heavily secured tool that decrypts a phone when present in Cupertino and connected to a specific device after biometric authentication from multiple Apple executives is a risk I am willing to accept.

Does the mere existence of such a tool render iOS less secure? Yes. In a meaningful way? No, in my opinion. Keep in mind, Apple has already said it could create such a tool today. So whatever security "weakness" exists that would allow them to decrypt the phone already exists on every iPhone. The only risk is that tool getting out of Apple.
 
Do you not believe then that it's possible that this coveted piece of potential software would leak out or get stolen? Rendering a huge hole in the security of devices?

I think I may be understanding your and others' point of view...But its validity completely rests on the security of a company.

The validity of Apple's argument already rests on the security of a company! :)
  • Apple is claiming that such a tool could leak. (*)
  • This tool only works if you have access to Apple's signing keys.
  • Therefore Apple is actually claiming that their own secret keys could leak.
  • Which means that ANYONE could easily create and install the same software bypasses.
  • Therefore we are already in the exact same insecure position that Apple is doing a Chicken Little over.
(*) Of course, it's not a general purpose tool at all. It's a one-off signed per-device OS version. Won't work anywhere else. It's just like when Apple stops signing an OS version now: you cannot install and use it.
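The "one-off signed per-device OS version" idea above can be sketched in a few lines. This is a toy model only: Apple's real scheme is asymmetric (Apple signs with a private key, the device verifies with a burned-in public key), and the key and device IDs here are made-up stand-ins. An HMAC plays the role of the signature so the example stays self-contained:

```python
import hmac
import hashlib

# Hypothetical stand-in for Apple's private signing key.
SIGNING_KEY = b"stand-in-for-apple-signing-key"

def sign_build(firmware: bytes, device_id: bytes) -> bytes:
    # The signature covers the firmware AND the target device's ID,
    # so the signed blob is only valid for that one device.
    return hmac.new(SIGNING_KEY, firmware + device_id, hashlib.sha256).digest()

def device_accepts(firmware: bytes, device_id: bytes, sig: bytes) -> bool:
    # Each device checks the signature against its OWN ID, modeling
    # how a per-device build fails verification anywhere else.
    expected = hmac.new(SIGNING_KEY, firmware + device_id, hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

fw = b"custom build without the retry limit"
sig = sign_build(fw, b"ECID-0001")

assert device_accepts(fw, b"ECID-0001", sig)        # the target device
assert not device_accepts(fw, b"ECID-0002", sig)    # any other device
```

The point of the sketch is the binding, not the crypto: because the device ID is inside the signed data, re-using the build on another phone requires a fresh signature, which only the signing-key holder can produce.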

The problem lies in designing a system that is both secure and not secure at the same time. It's not possible.

Well, as a devil's advocate:

iOS is already in that position because it allows anyone to use the correct passcode to get in. Learn someone's passcode and the iOS secure system is no longer "secure". Ditto for their fingerprints.

Also, the sheer fact that Apple possesses the source code to their own OS, plus signing keys, plus the ability to provision the Secure Enclave and Elements in their own devices, already likely means that we're vulnerable if that info got out.

Apple is using scare tactics in what should be a more level headed discussion. That's the sign of someone who isn't sure of their position. Same as when they got prior art banned from the courtroom. Instead of letting juries and courts use logic and common sense, Apple's lawyers go way overboard trying to stack the deck in their favor.
 
Just your possible outcome...and one that is far less likely when "national security" is invoked. Have you been paying attention this past decade or two?
I have been paying attention. That is why I applaud Apple for resisting government overreach.

I don't expect to ever be accused of threatening "national security", but in case it ever happens, I want to be in control of some of my digital information. Even if I'm never accused, I don't want all of my data to be accessible by other countries (for their "national security") or by cyberterrorists, or by identity thieves. At the same time, I want to participate in the modern world. So long as Apple and other companies are fighting against the governments to protect my privacy, then the issue of my privacy is still a public issue.

I don't want Apple to stop fighting and start routinely complying with all requests for data (or to be forced to get out of the way so that third parties can access my data directly)
 
iOS is already in that position because it allows anyone to use the correct passcode to get in. Learn someone's passcode and the iOS secure system is no longer "secure". Ditto for their fingerprints.

That's like saying a lock isn't secure if you have the key.
 
I have been paying attention. That is why I applaud Apple for resisting government overreach.

I don't expect to ever be accused of threatening "national security", but in case it ever happens, I want to be in control of some of my digital information. Even if I'm never accused, I don't want all of my data to be accessible by other countries (for their "national security") or by cyberterrorists, or by identity thieves. At the same time, I want to participate in the modern world. So long as Apple and other companies are fighting against the governments to protect my privacy, then the issue of my privacy is still a public issue.

I don't want Apple to stop fighting and start routinely complying with all requests for data (or to be forced to get out of the way so that third parties can access my data directly)

Do you realize how immature you sound? If the government arrives at your house and hands you a search warrant, are you going to fight and not comply? We're not talking about some cop calling Apple and being like "hey, unlock this phone for me". We're talking about a legal search warrant issued by a judge. Are you seriously telling me that you have some right to just ignore this? Are you telling me that you should be able to keep your precious "digital information" secret at all costs, warrant or no warrant?
 
This is a marketing opportunity that only comes once in awhile.
Eddy should send a few billion to terrorists for making it happen.
Apple will be able to milk this for years of free publicity.
Apple Akbar.
Hmm, I recognise that logic...

 
I think you might want to see someone about your paranoia.
It's not just my paranoia. Among the tech community, corporate elites, and pretty much everyone who posts on forums online, my stance is shared by the overwhelming majority. How can you not question your stance if for no other reason than you are in such a small minority? Even if you don't understand why you're wrong, shouldn't the overwhelming opinion of others make you seek to understand better?

I get that just because everyone else thinks one way doesn't mean you should too. Especially when everyone else is a moron. But the camp you're opposing isn't full of morons. It's the people who actually know what they're talking about. You're siding with politicians, lawyers, and law enforcement over technology leaders, principal engineers, and tech pioneers. On a technology issue. Even if we completely remove the details from the argument, how can you not see the flaw in which side you're on?

The prosecutors have no idea what they're talking about, or worse, they know exactly what they're trying to accomplish and will continue to try to dupe the layman into giving up his rights all the way to the bitter end.

You sound like an intelligent person. Either admit you're pro government and genuinely don't believe that privacy and freedom are what's best for human kind (which is a fair argument to make - people can't make bad choices and do bad things when they're not free to make choices or act sans surveillance), or do more research and realize why you're on the wrong side. Regurgitating the nonsense spouted by the FBI and its prosecutors adds nothing to the real argument. That crap is only meant to confuse the morons and elicit their obedience.
 
That's like saying a lock isn't secure if you have the key.

Well, yes. As long as a key exists for it, the strongest lock in the universe is obviously not secure from being unlocked. The lock's (or OS's) security actually comes with how well the key is secured.

Now, Apple already holds the keys to all versions of iOS. Do users consider the versions of iOS that they currently install on their device, to be insecure simply because Apple holds the keys? Of course not. They trust Apple to keep the keys safe.

Likewise, the version that the FBI wants to install is keyed just for that device. And Apple holds the key for it JUST LIKE THEY DO FOR ALL THE OTHER VERSIONS OF IOS.

So if Apple is claiming that the key to the specific version for each FBI device request could be taken, then they are by definition claiming that the keys to all other versions currently in use could also be taken.

So basically, Apple is claiming they're not trustworthy enough to hold the keys to their own OS. It's a self-defeating argument.
 
So you're fine with Apple providing tools to enable crime and simply washing their hands of all responsibility? Nice.
The notion of iPhones as a "crime enabling" tool for (terrorist) crime is ludicrous.

So if one "isn't fine" with "tools to enable crime", shall we go and abolish the internet, forbid the distribution and printing of books, etc.? An iPhone is first and foremost a device for communicating and storing information. Any form of communicating, transmitting, or storing information can and will "enable" crime, whether it be terrorist-recruiting videos on the internet or the chemistry books from the local library. These literally give terrorists the blueprints for their operations.


Have we seen the source code for iOS get out? For MacOS? For Windows? For Photoshop? For Word? Never. In many decades.
That’s just flat-out wrong.
Parts of the Windows source code have been leaked, as (allegedly) Photoshop’s as well.
 
Well, yes. As long as a key exists for it, the strongest lock in the universe is obviously not secure from being unlocked. The lock's (or OS's) security actually comes with how well the key is secured.

My point is that it's therefore a useless example when deciding how secure a lock is.

Now, Apple already holds the keys to all versions of iOS. Do users consider the versions of iOS that they currently install on their device, to be insecure simply because Apple holds the keys? Of course not. They trust Apple to keep the keys safe.

Likewise, the version that the FBI wants to install is keyed just for that device. And Apple holds the key for it JUST LIKE THEY DO FOR ALL THE OTHER VERSIONS OF IOS.

So if Apple is claiming that the key to the specific version for each FBI device request could be taken, then they are by definition claiming that the keys to all other versions currently in use could also be taken.

So basically, Apple is claiming they're not trustworthy enough to hold the keys to their own OS. It's a self-defeating argument.

They hold the key that is necessary to sign and push a software update on the device, not the key to unlock the device.
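That distinction is worth making concrete. In iOS the data-protection key is derived from the user's passcode entangled with a per-device hardware UID, which is separate from the update-signing key. The sketch below is a simplified stand-in (the UID value and iteration count are made up; Apple's actual derivation is more elaborate), using stdlib PBKDF2 to show why holding the signing key does not hand you the decryption key:

```python
import hashlib

# Hypothetical per-device hardware UID, fused in at the factory and
# never exported. The real iOS derivation entangles the passcode with
# a UID like this inside the hardware.
DEVICE_UID = b"unique-hw-id-fused-at-factory"

def data_key(passcode: str) -> bytes:
    # Iterated PBKDF2 makes each guess deliberately slow, and the UID
    # salt means guesses must run against this specific device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, 100_000)

# Only the correct passcode reproduces the key that decrypts the data;
# no signing key appears anywhere in this derivation.
assert data_key("1234") != data_key("1235")
```

This is why the FBI's ask was for a signed build that removes the guess limit, not for "the key to the phone": the unlock key does not exist anywhere until the right passcode is entered.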
 
Likewise, the version that the FBI wants to install is keyed just for that device.
If a "custom" version of iOS can be installed on one iPhone, it can be installed on any other iPhone. And easily so. After all, iPhones are all factory made to the same specifications and run the same OS. And that's what's going to happen: More and more government request will continue piling up, for more and more phones.
 
If the government arrives at your house and hands you a search warrant, are you going to fight and not comply? We're not talking about some cop calling Apple and being like "hey, unlock this phone for me". We're talking about a legal search warrant issued by a judge.

This isn't the reality we live in, though. There is a difference between knocking on your door and showing you a warrant vs. hacking your data and taking it. When they knock on your door, you can ask to see that warrant. Hell, you even have the freedom to pull out your gun and fight back, if that's how strongly you oppose whatever injustice they're trying to do.

But with digital info, you may not even be aware they're taking it. There is no opportunity for you to object, to request to see that warrant. And there is no oversight to hold them accountable to having had that warrant. You can't even fight back, should you choose. The single way you can keep your digital privacy is through it being impossible for them to take it. The government, law enforcement, and most human beings will circumvent rules when it suits them, especially when there is no accountability. Widespread use of Stingrays without warrants, illegal wiretapping.. these are real things.

Are you seriously telling me that you have some right to just ignore this? Are you telling me that you should be able to keep you precious "digital information" secret at all costs, warrant or no warrant?

You can't be compelled to reveal personal knowledge (not in America, at least). That info is locked away in a system no one can crack. Our founders didn't have the foresight to realize the same could be said for external knowledge stores, but we now have the opportunity to recognize that sameness and treat that information similarly.

We will one day have brain augmentations, implants that do what our current smartphones do. It might be far off, but it's inevitable. What isn't far off, however, is the decision which will establish whether or not the data in those augmentations is private or monitored by the government. In a future world where you will not be able to compete, work, or enjoy life on the same level as those utilizing these enhancements, do you really want people have to decide between privacy and poverty? Is that what freedom is supposed to look like? That's what's at stake here.
Agree 100%. And the reality is, if Apple doesn't do this, there will be legislation and ALL encryption will be weakened. The tech industry has an opportunity now to do the right thing and come up with a reasonable compromise. If they don't, we most certainly will see lawmakers with virtually no understanding of technology draft legislation that weakens security for everyone.
There is no compromise. Despite your optimism and/or misunderstanding, this is a black and white case. Either data is secure, or it is not.
So you're fine with Apple providing tools to enable crime and simply washing their hands of all responsibility?
Are you really so naive as to think this is at all related to Apple or that the outcome of this at all prevents future crime? Criminals have infinite other tools at their disposal. Apple isn't the only maker of phones, encryption, or messaging apps. The FBI is either really stupid or knows full well that a backdoor on the iPhone will do not a damn thing to thwart future terrorist activities. No, they know exactly what they're doing; they're paving the way for the surveillance state.
 
Do you realize how immature you sound? If the government arrives at your house and hands you a search warrant, are you going to fight and not comply? We're not talking about some cop calling Apple and being like "hey, unlock this phone for me". We're talking about a legal search warrant issued by a judge. Are you seriously telling me that you have some right to just ignore this? Are you telling me that you should be able to keep your precious "digital information" secret at all costs, warrant or no warrant?
The search warrant doesn't entitle the cops to force me to dig up my own backyard if they're looking for something that might be buried there. And it doesn't entitle them to dig up my backyard or make me do it if the warrant they have is for my neighbor's house. If I have a secret underground bunker (or just a storm shelter) that they don't know about, I'm not obligated to tell them about it.

I realize that the warrant is issued by a judge. Your desire to strip me of my rights doesn't hold any weight. If they demand I do something that they're not legally entitled to do, I have the right to refuse. Then a judge or judges who are senior to the judge who issued the warrant are likely to rule on whether I am right or the cops are right.

Apple has reason to think what the court is ordering them to do is illegal (and why are you are supporting the illegal order?), so they are appealing the order. It's their legal right to do so, and you want them to give up their rights to appeal an illegal order.
 
They hold the key that is necessary to sign and push a software update on the device, not the key to unlock the device.

That is correct. Ditto for the version the FBI wants. It's not a version that can be used anywhere else, or that contains a key to unlock the device.

If a "custom" version of iOS can be installed on one iPhone, it can be installed on any other iPhone.
No, not in this case. Each custom version would be keyed to one iPhone.

Apple has already said this is possible.

And easily so. After all, iPhones are all factory made to the same specifications and run the same OS. And that's what's going to happen: More and more government request will continue piling up, for more and more phones.

Yes, more requests will pile up, and Apple would be asked (and paid handsomely!) to create a version for each device.

This is not about some global backdoor. It's about whether or not Apple should be forced into creating a group just to comply with all the requests they'll probably get.

Mind you, they (like many other companies) no doubt already have dedicated people for other government requests, such as for Prism ones.
 
It's not a version that can be used anywhere else, or that contains a key to unlock the device.

No, not in this case. Each custom version would be keyed to one iPhone.
Apple has already said this is possible.
That might be true at face value but just technically speaking.
As I said, all iPhones are made to the same specifications.
The only thing preventing this software build from running on other iPhones is (to put it in simpler terms) the relation between software signing keys and the device's serial number(s).

Once you've figured out that math, which will be easy since Apple holds the root keys anyway, producing "other versions" will be a matter of a basic recompile. Put a few numbers into a calculator, then hit the "compile" button. And this is the only economically sensible way for Apple anyway, given it would be dealing with hundreds of phones. It basically amounts to Apple being able to create "unique" backdoors at the cost and time of a few key presses.

Once you've figured it out, repeating the feat will be incredibly easy. Which is essentially what one could describe as a "global backdoor".
 
Once you've figured it out, repeating the feat will be incredibly easy. Which is essentially what one could describe as a "global backdoor".

Yes it could be made repeatable. No doubt Apple ALREADY has a similar tool for signing normal iOS releases. Yet nobody seems to worry about that tool, even though it could be used to sign a malicious GLOBAL version... unlike a tool that only does ONE device at a time.

Changing the passcode try limit is apparently doable. So is adding a way to enter the passcode remotely. But that's just coding.

What really matters is being able to sign the output, and the ability to do that already exists and could be lost/stolen just as easily (or not easily) as a device-specific signer. Heck, the current oft-used tool would be FAR more useful, too, since it would be more general.
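For context on why removing the try limit matters so much: once unlimited guessing is allowed, the passcode space is tiny. A back-of-envelope estimate, assuming the roughly 80 ms per-guess hardware delay widely cited for the iPhone's key derivation (the exact figure is an assumption here):

```python
# Worst-case brute-force time for numeric passcodes once the retry
# limit and escalating delays are removed, leaving only the ~80 ms
# hardware-bound key-derivation cost per attempt (assumed figure).
PER_GUESS_SECONDS = 0.08

for digits in (4, 6):
    guesses = 10 ** digits
    hours = guesses * PER_GUESS_SECONDS / 3600
    print(f"{digits}-digit passcode: {guesses:,} guesses, worst case ~{hours:.1f} h")
```

Under these assumptions a 4-digit passcode falls in well under an hour and a 6-digit one in about a day, which is why the guess limit and wipe-after-ten-tries policy, not the passcode itself, are doing the real security work.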
 
Well, yes. As long as a key exists for it, the strongest lock in the universe is obviously not secure from being unlocked. The lock's (or OS's) security actually comes with how well the key is secured.

Now, Apple already holds the keys to all versions of iOS. Do users consider the versions of iOS that they currently install on their device, to be insecure simply because Apple holds the keys? Of course not. They trust Apple to keep the keys safe.

Likewise, the version that the FBI wants to install is keyed just for that device. And Apple holds the key for it JUST LIKE THEY DO FOR ALL THE OTHER VERSIONS OF IOS.

So if Apple is claiming that the key to the specific version for each FBI device request could be taken, then they are by definition claiming that the keys to all other versions currently in use could also be taken.

So basically, Apple is claiming they're not trustworthy enough to hold the keys to their own OS. It's a self-defeating argument.
So Apple doesn't trust themselves to hold such a "dangerous" tool... Why is that a terrible statement to make? Sounds like a reasonable and well-thought out conclusion.
 
So Apple doesn't trust themselves to hold such a "dangerous" tool... Why is that a terrible statement to make? Sounds like a reasonable and well-thought out conclusion.

The point is: Apple is already trusted to keep their CURRENT tools secret which make possible the very modifications the FBI wants.

They have the source code, they have the ability to update both the main OS and the secure enclave code, and they have the signing keys for all of it.

As for holding on to each specific device version, Apple says that part of their cost of doing each device's version, is the destruction of each version after the file data is extracted. Which is just being dramatic:

The FBI said it would be okay if Apple kept that device and the special version in their possession, and simply allowed an FBI passcode-guessing computer to access it remotely. (Which begs the question: why doesn't Apple just do the brute force unlock for them, and give the decoded file contents back, just like they did for years before now? Answer: because it's bad PR.)
 
The point is: Apple is already trusted to keep their CURRENT tools secret which make possible the very modifications the FBI wants.

They have the source code, they have the ability to update both the main OS and the secure enclave code, and they have the signing keys for all of it.

As for holding on to each specific device version, Apple says that part of their cost of doing each device's version, is the destruction of each version after the file data is extracted. Which is just being dramatic:

The FBI said it would be okay if Apple kept that device and the special version in their possession, and simply allowed an FBI passcode-guessing computer to access it remotely. (Which begs the question: why doesn't Apple just do the brute force unlock for them, and give the decoded file contents back, just like they did for years before now? Answer: because it's bad PR.)
Okay, valid points.

Now, what about legal precedent? Apple does this "just this once" and then has to continue to make concessions to a government wanting to spy on "terrorists". Snowden already revealed the vast wiretapping and spying the government does on citizens merely because they're "suspicious", so why should we trust our government to be responsible with the power to spy on anyone it wants? That's the slippery slope argument being made, and Apple is fighting the legal precedent that this one 5c case would set if the FBI wins.

(Fact check about Snowden, I may be misrepresenting it)
 
Okay, valid points.

Now, what about legal precedent? Apple does this "just this once" and then has to continue to make concessions to a government wanting to spy on "terrorists".

I think we're well past that question.
http://www.c-span.org/video/?405442-1/hearing-encryption-federal-investigations
Go to 32:30 for preamble.
Then Comey contradicts himself after 34:00 saying that the FBI will use this case as a precedent to unlock other phones.
Once public knowledge, this "backdoor" will be the #1 target for cybercriminals.

Even without this tool, cybercrime losses are estimated at half a TRILLION dollars this year with growth projections to 2+ Trillion by 2019:
http://www.securitymagazine.com/articles/86352-cybercrime-will-cost-businesses-2-trillion-by-2019

It's simply naive to believe that Apple or any organization can be trusted to keep any keys secure if the ROI is high enough.
If there's a way in, it's only a matter of time.
 