I don't like Tim Cook's privacy stance in this particular case. On one hand, he says terrorism should be destroyed. On the other hand, he is supporting it by not helping the gov.
I disagree. If he helps the government do something illegal in the name of security, is that really providing long-term security? No, it provides only temporary security from these threats, not from the long-term threat of the government - because where does it stop?
 
Well, I don't think it will get that far. Even if Apple appeals the district court judge's order, no judge likes to see another judge's order disobeyed, especially as publicly as Apple did with its open letter. I could be wrong, of course, but legally I don't see how Apple wins this one. And even if they do, it will provide more impetus for legislators to craft laws that compel such compliance. In a way, that might be worse. It seems to me that whether Apple wins or loses this little fight, they lose the war.
I disagree, because this is bringing the issue to the public's attention, and as a result WE can win the war and potentially even put control of the government back into the hands of the people, where according to the Constitution it SHOULD be.
 
I don't like Tim Cook's privacy stance in this particular case. On one hand, he says terrorism should be destroyed. On the other hand, he is supporting it by not helping the gov.

What in the Hell are you even talking about? Our freedoms are based in the unequivocal fact that there are times when the ends do not justify the means. This is one of those times. Forcing Apple to give up the security and privacy of all of their customers is not worth the inconsequential data that will be recovered from this dead terrorist's phone.
 
I don't like Tim Cook's privacy stance in this particular case. On one hand, he says terrorism should be destroyed. On the other hand, he is supporting it by not helping the gov.
Uh no ... he isn't. This isn't about handing over information to the government, which they've done before. They're telling Apple to create a backdoor into their devices, which will compromise the security of hundreds of millions of users all across the world. This isn't just a United States issue ... it is a world issue. This would compromise device security across the planet. So this is a lot bigger than it appears to be.
 
Tim Cook should be thrown in jail for instructing his company to ignore a lawfully issued order from a judge.

There are no ifs, ands, or buts about it, regardless of your stance on encryption.

America is a country of laws, laws which make it the greatest nation the planet has ever seen. Tim Cook has decided it is his right to break the social contract we all agree to as part of being in a civilized society.

Hey, maybe I will just stop paying taxes because it is my human right not to pay taxes. Let's see how far that goes.


Looks like someone just took AP Government. The social contract has been broken: it's no longer rule by the people, for the people, but rather rule by corporations, banks, and the rich. You have to be pretty blind to think laws should be followed blindly, and the founding fathers themselves stated that when the government no longer represents the people, it should be stood up against.
 
But in fact, they could.
The encryption key isn't linked to the password: if it were, everything would have to be re-encrypted every time you change your password.
No. The way it works is that a random key is used to encrypt the storage. That key is then encrypted using another key derived from the passcode. So if you change your passcode, only the first key needs to be re-encrypted.
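The two-level key scheme described above can be sketched roughly like this. This is a toy illustration only: the XOR "wrap" stands in for real AES key wrapping, and actual devices also mix a hardware UID into the derivation. It just shows why changing the passcode only re-wraps the file key rather than re-encrypting storage.

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Derive a 32-byte wrapping key from the passcode
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def xor_wrap(key: bytes, wrapping_key: bytes) -> bytes:
    # Toy stand-in for real key wrapping; XOR is its own inverse
    return bytes(a ^ b for a, b in zip(key, wrapping_key))

salt = os.urandom(16)
file_key = os.urandom(32)  # the random key that actually encrypts storage

# Wrap the file key under a key derived from the old passcode
wrapped = xor_wrap(file_key, derive_key("1234", salt))

# Changing the passcode only re-wraps the file key; storage is untouched
rewrapped = xor_wrap(file_key, derive_key("9876", salt))

# Either passcode-derived key recovers the same underlying file key
assert xor_wrap(wrapped, derive_key("1234", salt)) == file_key
assert xor_wrap(rewrapped, derive_key("9876", salt)) == file_key
```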
What they would have to do is remove the password verification (short-circuiting the check with || 1), sign the OS with their own keys, and add a check in the source for the device ID. Result: no password/authentication, so full access to the phone.
Besides, they could even keep the phone themselves while it's being searched, so the FBI couldn't get the modified iOS version. It's not a security issue as long as you don't release it to the public: it's a fork of iOS.
It doesn't really matter if Apple does it inhouse or not. The problem is that once the precedent is established and the weakened iOS version created, the feds will request it again for other cases, and other governments will also demand access.
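The "|| 1" bypass mentioned above amounts to forcing the passcode check to always succeed, gated to one hard-coded device ID. A minimal sketch of the idea (all names here are hypothetical; the real check lives in signed firmware, not application code):

```python
import hashlib

# Hypothetical ID of the single device the modified build should unlock
TARGET_DEVICE_ID = "this-one-specific-device"

def passcode_ok(entered: str, stored_hash: str, device_id: str) -> bool:
    if device_id == TARGET_DEVICE_ID:
        return True  # the "|| 1" bypass, limited to one device
    # Normal path: compare the entered passcode against the stored hash
    return hashlib.sha256(entered.encode()).hexdigest() == stored_hash

stored = hashlib.sha256(b"1234").hexdigest()
print(passcode_ok("0000", stored, TARGET_DEVICE_ID))    # True: bypass active
print(passcode_ok("0000", stored, "some-other-device")) # False: normal check
```

The sketch also shows why signing matters: anyone can write this ten-line change, but only code signed with Apple's keys will boot on the hardware.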
 
But in fact, they could.
The encryption key isn't linked to the password: if it were, everything would have to be re-encrypted every time you change your password.
What they would have to do is remove the password verification (short-circuiting the check with || 1), sign the OS with their own keys, and add a check in the source for the device ID. Result: no password/authentication, so full access to the phone.
Besides, they could even keep the phone themselves while it's being searched, so the FBI couldn't get the modified iOS version. It's not a security issue as long as you don't release it to the public: it's a fork of iOS.

Now, why has nobody ever thought of this? Simple: they can't sign the modified iOS version to run on the device - only Apple can.
Because the SECOND they do this, every single government agency in the world will demand Apple start doing this every time they want access to a phone. Even to phones they may have stolen from enemy countries. Can you imagine North Korea stealing an iPhone from a U.S. official and then demanding that Apple help them unlock it to get government secrets?

It's not as far fetched as you might think if the FBI wins this one.
 
Because the SECOND they do this, every single government agency in the world will demand Apple start doing this every time they want access to a phone. Even to phones they may have stolen from enemy countries. Can you imagine North Korea stealing an iPhone from a U.S. official and then demanding that Apple help them unlock it to get government secrets?

It's not as far fetched as you might think if the FBI wins this one.
That isn't even a scenario. The problem is that if Apple creates a backdoor, then other people will eventually be able to find out how to use that backdoor ... not just the FBI. That is one of the many problems this would cause. It throws security completely out the window. It's completely idiotic.
 
That isn't even a scenario. The problem is that if Apple creates a backdoor, then other people will eventually be able to find out how to use that backdoor ... not just the FBI. That is one of the many problems this would cause. It throws security completely out the window. It's completely idiotic.
Except that backdoor wouldn't exist except on the very device they want to unlock. It has nothing to do with every other iPhone in the world: this is a version that would be installed on only one device.
 
Except that backdoor wouldn't exist except on the very device they want to unlock. It has nothing to do with every other iPhone in the world: this is a version that would be installed on only one device.

The point isn't the backdoor, it's the precedent. If Apple were to lose this or cave, then it would be harder for them to stand firm on security when it's your phone the FBI wants into.
 
Apple will eventually be forced to comply. A few "random" reviews of Apple's current tax practices by the IRS and something will be found amiss.
I have full faith that Apple's massive team of accountants can handle the paper pushers at the IRS.
 
According to this, Apple has hired freedom-of-speech lawyers to fight the case:

http://pocketnow.com/2016/02/19/iphone-encryption-case

Still completely with the FBI and the DOJ on this one, though; it will be an interesting fight. But I'm not American, so perhaps I have a different view on it all. I know in the UK they would be forced to comply or face jail, or the agencies would just hack the device anyway.

According to McAfee, there already exists at least one way to get around the primary concern of the FBI: the auto-wipe after 10 failed attempts. It requires physical control of the device - which they have.

The other two concerns are the delays after several failed attempts before you can try again, and having to manually enter each code you wish to try.

The first two issues would be solved by McAfee's route (which involves resetting the memory that holds the attempt count); the last cannot.

The FBI wants a custom firmware to disable the first two issues and enable a computer to enter the codes so they can crack it faster. A package which, if it got out, would make everyone's phone weaker. And as was mentioned elsewhere, the chances of them letting Apple be the one to hold it during the cracking so as to minimize it getting out - if they'd even let Apple be the one to install the firmware - are slim.
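Back-of-envelope arithmetic shows why the delays and manual entry matter so much. The roughly 80 ms per guess is the hardware-enforced key-derivation time Apple has documented for iOS; the one-guess-per-hour ceiling is a simplification of the escalating lockout schedule, so treat both numbers as rough assumptions:

```python
# A 4-digit passcode has 10,000 possible combinations
codes = 10_000

# With the requested firmware, only the ~80 ms hardware key-derivation
# time per guess remains, and a computer can feed in the codes.
fast_seconds = codes * 0.08
print(f"electronic entry, no delays: ~{fast_seconds / 60:.0f} minutes")

# With the escalating lockouts left on (and only the wipe disabled), the
# worst case is dominated by the 1-hour delay per attempt after repeated
# failures -- call it one guess per hour as a rough ceiling.
slow_years = (codes - 9) / (24 * 365)
print(f"manual entry with delays: ~{slow_years:.1f} years")
```

Minutes versus more than a year: that gap is the entire point of the requested firmware.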

Today, with a court order, one really can't expect the right to privacy from government surveillance, whether it is the tapping of communications, or even recording or photographing someone, including when they believe they are in private situations. There really isn't anything here to change that precedent. You are violating the law to refuse compliance.

I think the argument comes from the inferred, and unfortunately therefore hypothetical and fallacious argument that:

(Apple cooperating with government when complying with a court order on a known terrorist) = (total and constant government invasion of personal privacy of everyone)

There isn't a company in the world (gun manufacturers/sellers I'm looking at you) that can stand between criminals and their prosecution by the government by presenting the argument that it invades privacy of someone that a court has ordered to be monitored, etc.

If you disagree with this, then change the laws, but don't believe that public opinion is the forum.

The difference here is that Apple is being compelled to create a tool to hand over information not theirs.

When a court order is issued to a company, it has almost always been to hand over something they possess - objects or their own actual records - or occasionally to provide some assistance. All of which Apple has provided to the best of its abilities.
 
Oh, sorry, I thought you were doing an impression of the "Apple's a bunch of commie, terrorist sympathizers!" thing that's popping up.

Oh no, I didn't realize my comment could have been interpreted both ways until you pointed it out. Thanks!
 
According to McAfee, there already exists at least one way to get around the primary concern of the FBI: the auto-wipe after 10 failed attempts. It requires physical control of the device - which they have.

The other two concerns are the delays after several failed attempts before you can try again, and having to manually enter each code you wish to try.

The first two issues would be solved by McAfee's route (which involves resetting the memory that holds the attempt count); the last cannot.

The FBI wants a custom firmware to disable the first two issues and enable a computer to enter the codes so they can crack it faster. A package which, if it got out, would make everyone's phone weaker. And as was mentioned elsewhere, the chances of them letting Apple be the one to hold it during the cracking so as to minimize it getting out - if they'd even let Apple be the one to install the firmware - are slim.

I have read conflicting information on UK news sites, where the FBI clearly stated it has no wish to be involved in the process of installing this software, nor to know how it works, nor to be present during the process, and requests that Apple perform the development and installation of the software at its HQ only and then delete it.
So it will be very interesting reading the real evidence when this goes further up the legal chain.
 
The point isn't the backdoor, it's the precedent. If Apple were to lose this or cave, then it would be harder for them to stand firm on security when it's your phone the FBI wants into.
I get that. I'm simply referring to the "it's stupid because it would create a backdoor" and "Apple can't do it anyway!" arguments.
 
A marketing strategy even though Apple has consistently emphasized the security of iOS for years?
The whole consistent emphasis on security is a marketing strategy. Apple knows that people will feel better about using Apple products if they feel their data is secure. And having a secure platform makes companies more comfortable having Apple devices connected to their enterprise systems. It allows development of features like Apple Pay.

If Apple deliberately compromised their own system, it would be bad for their marketing strategy. And incidentally also bad for Apple customers.
 