FWIW, it appears the device can make a guess about every 240 ms. That means a little over four guesses per second.

For a four-digit passcode (10,000 possibilities), that's only 40 minutes.

For a six-digit passcode (1,000,000 possibilities), it's close to three days.

Seven digits, as someone has suggested: almost 28 days.

For an eight-character password, assuming 95 possible characters*, it's over 50 million years.

The key point here is that this appears to be an online attack, so speed is limited by whatever the Secure Enclave allows (the enforced 80 ms delay) plus additional latency from its surroundings. Someone who still has an all-digit passcode should perhaps be slightly worried, though. (Leaving aside that I'm fairly confident Apple will fix this soon.)

*) 26 lower-case letters, 26 upper-case letters, 10 digits, and some special characters. In practice, I'm guessing iOS actually allows quite a few more than that, making it even harder to crack. But let's assume these are the characters commonly used in a password.
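Those figures are easy to check with a back-of-the-envelope script (the 240 ms per guess rate is my estimate from above, not an official spec):

```python
# Worst-case time to try every passcode at ~240 ms per guess.
GUESS_SECONDS = 0.24

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Seconds needed to exhaust all passcodes of the given length."""
    return alphabet_size ** length * GUESS_SECONDS

print(worst_case_seconds(10, 4) / 60)              # 4-digit PIN: 40 minutes
print(worst_case_seconds(10, 6) / 86_400)          # 6-digit PIN: ~2.8 days
print(worst_case_seconds(10, 7) / 86_400)          # 7 digits: ~27.8 days
print(worst_case_seconds(95, 8) / (86_400 * 365))  # 8 characters: ~50 million years
```

(These are worst-case numbers; on average the attacker hits the right code after trying half the space, so halve them for the expected time.)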
 
Wow, that's some mind boggling logic. You don't care if a criminal gets his hands on your unlocked iOS device?

FWIW, in ten years I had never lost one of my iOS devices and considered myself very competent at keeping them secure, almost to the point of being cocky about it. Of course, a few weeks ago I forgot my iPad on a plane. It only has a 6-digit passcode but I'm extremely happy that I had the feature activated.

I think it's kinda like leaving your unlocked car outside your house with the keys inside. You can say you're willing to chance it and just chalk up the loss if something happens, but you'd feel pretty stupid if a drunk got inside and crashed it into your house while you're sleeping.

Funny you should say that. I leave my keys in my car most of the time. If someone is gonna take it, they are gonna take it. That is what insurance is for. The convenience outweighs the very minuscule risk. Same with the phone.
 
I get the strange feeling we all hate Cellebrite: https://en.wikipedia.org/wiki/Cellebrite


I use a strong password only in the cloud, which is easy for hackers to get at. Much less so on a local device, unless I choose to. And quite frankly, if you are serious about protection, there are never exceptions, not even a slip of the tongue, in any case at all, no matter who it is.

I still use 4 digits on my phone, which is also probably very insecure... the other defenses I use are way more important.
 
It's a catch-22: if Apple engineers a process by which they can get into the phone through a backdoor, then hackers will very quickly work out how to exploit it. You can't design something like that so that it can only be used by a certain population of people.

No, it's actually not a catch-22. Let's say someone breaks into your house and steals everything and kills your dog. There's a camera next door pointed right at your house and the neighbor has the entire video of the whole thing. But it's Tim Cook, and he won't give you or law enforcement access to anything, and he'll keep the video camera recording 24x7. Not much different.
There's no need for hyperbole. No one is refusing them the ability to uphold the law. Law enforcement has always had barriers to accessing suspects' communications. In the past there was no record kept of past communications, so law enforcement could only investigate on a go-forward basis. People could - and still can - communicate face to face in private without law enforcement having access to their conversation. If an encrypted text message log has returned us to that, then so be it. They still have access to all traditional (pre-smartphone era) investigative techniques, PLUS they effectively have a 24/7 tracking device on all of us via cell tower records, PLUS for any traditional SMS and phone call data they have all the phone company records of that metadata, regardless of how well encrypted our devices are.

Crimes continue to be investigated and solved. Sure, there are some barriers that technology has created, but there are also lots of new avenues it has opened. Personally I feel strong encryption has many more benefits to everyone's daily lives than weakening it would provide against a minority of criminal activity.


The only people afraid are the ones with something to hide. I, for one, welcome anyone willing to help solve crime and will not buy product from those that shield criminals.
 
I’ll be in the lobby with a legal pad and a pen. :D


Ok, but maybe somebody will rob your house, or hit and run your car outside, or kidnap your cat. And they'll probably have an iPhone, maybe even yours. Wouldn't it be nice to catch them, or do you consider that a donation to those less fortunate ;)
Why wait? Just like cracking an iPhone passcode there could be valuable hours passing in which we haven't solved crimes. Please post all your user names and passwords here now.

Because there's no crime, you schmuck.
 

LMAO!!!!!
Sorry, I have been a recipient of a home invasion. I was not concerned about a phone other than its ability to dial 911.
Oh, and the ******* I shot in my home.

Sorry, no. If you can come up with a way for the legitimate authorities to gain access without black hats, criminals, and state organizations leveraging the same, I would agree.
 

It's not that Tim Cook won't give access. It's that Tim Cook currently doesn't have the tools to give access. Apple stopped storing encryption keys, removing its capability to unlock data stored locally on people's phones. At issue here is whether Apple has the responsibility to have a prior capability in place to give access when needed.

With attitudes like yours, who needs backdoors? Wanting privacy isn't an abnormal trait that is peculiar to criminals. It's the foundation of a democratic society. Just because one doesn't want the government getting its hands on their information doesn't mean they have something to hide. It's just that they believe - rightly so, I might add - that it's none of the government's business!

The government can get a warrant based on probable cause to search someone's home. That means that if the owner locked the door, the government has every right to do what it takes to get into that person's home, even if it involves breaking down the door. That's all a warrant implies. The warrant doesn't guarantee that the police will get what they're searching for. The homeowner doesn't have to tell the police exactly where to look to find what they're looking for. It's still perfectly lawful for the homeowner to use a lock that's hard to pick, install a door that's hard to kick down, or use any other means that makes it as hard as possible for anyone to get into the house without unlocking the lock with the right key.
 

I believe the issue with encryption is the fact that the authorities are unable to access the content of these devices WITH a warrant. A search warrant isn't worth the paper it's printed on if the door has a lock that cannot be broken.
 
I guess I'm one of GrayKey's first victims. Reading about it, I changed my trusted 4-digit passcode to an alphanumeric string, which I promptly forgot. Yeah, I felt like a total idiot.
Being in transition between machines, I had backed up only select folders, which, surprise surprise, didn't include MobileSync. How's that for an idiot?! Such distraction earns you the title of Chief Supreme Idiot. And I guess, being one, I want to thank GrayKey for that.
 

Unless you’re engaging in some illegal activities or planning a terrorist attack, chances are you won’t have to worry about GrayKey.
 
I've been reading this discussion with interest and have a few thoughts:
1. While we could, even 5 years ago, maintain some separation between our online presence and our private life, secure documents, and financial life, I don't think that is possible any longer.
~The government and many employers require all of our banking information as a precursor to direct deposit.
~ Sensitive documents, e.g., mortgage docs, income tax docs, medical results, and family information are routinely e-mailed and must be kept, at least temporarily, accessible to complete various processes online. Other documents - copies of our SS cards, drivers' licenses, birth certificates, IDs, passports, and other identification - are also transmitted online to government agencies. Even photos can be used by criminals.
~ Educational documents, e.g., transcripts and student loans, with all personal information, are transmitted electronically.
~ Resumes for employment are kept available online for transmission.
2. I really don't think that it is realistic today to not have sensitive information on our phones that we used to keep in safety deposit boxes at the bank.
3. Security should then be considered a top priority for most of us. Yes, I just switched to a 12-character alphanumeric password and have made minor changes to several other passwords in the Apple ecosystem, including iTunes and iCloud, and added an encryption key on my Mac.
4. My iPhone 6s was stolen about 2 years ago and reported within a few minutes (I had to borrow a phone :0( ). The police observed while I tracked it on my iPad, watching it leave town until the thieves figured out they were being tracked and turned it off. I was able to remotely erase all data and it became a brick. I didn't get it back. Even had the police recovered it, I wouldn't have allowed them to access the contents of my phone, for simple privacy reasons, even as the "good guy".
5. People who are around kids and early teens should protect their phones just as judiciously as they would a paper copy of their birth certificate, DL, or official University Transcript. Phones with this much sensitive information on them are not toys and should always be kept in a safe place.

Bottom line: security should be the first priority for those of us who value it, who know the danger of unfettered government control over our lives (even if we're doing nothing wrong :rolleyes:), and who have to conduct the business of life online. It was inconvenient to get into the safe, go to the bank, visit the safety deposit box, or find the paperwork in the past. A tiny bit of inconvenience to use a more secure, complex password is a minor trade-off for the massive convenience of having a computer in your pocket with all of your information in one place. I, for one, hope Apple continues to prioritize security.
 
I've been reading this discussion with interest and have a few thoughts:
1. While we could, even 5 years ago, maintain some separation between our online presence and our private life, secure documents, and financial life, I don't think that is possible any longer.
~The government and many employers require all of our banking information as a precursor to direct deposit.
~ Sensitive documents, e.g., mortgage docs, income tax docs, medical results, and family information are routinely e-mailed and must be kept, at least temporarily, accessible to complete various processes online. Other documents - copies of our SS cards, drivers' licenses, birth certificates, IDs, passports, and other identification - are also transmitted online to government agencies. Even photos can be used by criminals.
~ Educational documents, e.g., transcripts and student loans, with all personal information, are transmitted electronically.
~ Resumes for employment are kept available online for transmission.
In most cases it cannot. Still, for direct deposit all my employer or other institution needs is a routing and account number for that one account, not all my bank info. While a lot of this information is emailed (and should be kept secure), most items like SS cards, passports, government-issued IDs, etc. are either mailed or picked up in person. Information transmitted is encrypted or sent via secure electronic documents. This type of information is seldom, and should never be, transmitted unencrypted. Electronic transmission has replaced snail mail in many instances. Online presence is really just another form of communication. Instead of a piece of physical paper, we now use virtual electronic paper. ;)

2. I really don't think that it is realistic today to not have sensitive information on our phones that we used to keep in safety deposit boxes at the bank.
I cannot think of anything in my Safe Deposit Box I also have on my phone. Not sure what you would have duplicated :cool:


3. Security should then be considered a top priority for most of us. Yes, I just switched to a 12-character alphanumeric password and have made minor changes to several other passwords in the Apple ecosystem, including iTunes and iCloud, and added an encryption key on my Mac.
That is always a challenge. Personally I use two-step verification, an 8-digit passcode, and a password manager - too many accounts with too many requirements. All your devices should be encrypted if possible, IMO. ;)


4. My iPhone 6s was stolen about 2 years ago and reported within a few minutes (I had to borrow a phone :0( ). The police observed while I tracked it on my iPad, watching it leave town until the thieves figured out they were being tracked and turned it off. I was able to remotely erase all data and it became a brick. I didn't get it back. Even had the police recovered it, I wouldn't have allowed them to access the contents of my phone, for simple privacy reasons, even as the "good guy".
Remote wipe and/or X number of fails wipes the device should be standard. That said, keep your devices away from kids :eek:.


5. People who are around kids and early teens should protect their phones just as judiciously as they would a paper copy of their birth certificate, DL, or official University Transcript. Phones with this much sensitive information on them are not toys and should always be kept in a safe place.
I would say even more critical: accidental wipes via "too many failed attempts to unlock" (had that happen :( ), physical damage, and making sure critical apps have secondary security (most do).


Bottom line: security should be the first priority for those of us who value it, who know the danger of unfettered government control over our lives (even if we're doing nothing wrong :rolleyes:), and who have to conduct the business of life online. It was inconvenient to get into the safe, go to the bank, visit the safety deposit box, or find the paperwork in the past. A tiny bit of inconvenience to use a more secure, complex password is a minor trade-off for the massive convenience of having a computer in your pocket with all of your information in one place. I, for one, hope Apple continues to prioritize security.

Apple has been good on the security side for the most part. Now if they would place as much emphasis on well developed "it just works" devices ... :oops:
 


Nice try. Tim Cook does indeed have access. They write the code, they can do whatever they want. No audit. No disclosure. It's like saying "I checked 'private' in Facebook, so it's private, right?"
 
Nice try. Tim Cook does indeed have access. They write the code, they can do whatever they want. No audit. No disclosure. It's like saying "I checked 'private' in Facebook, so it's private, right?"

No it is not the same (though it appears that many people made just that assumption, and are now finding out that it wasn't the case.)

And just because they wrote the code doesn't mean they have access.

Consider the concept behind the use of PGP in email. Having the encryption key doesn't allow you to decrypt it - you need a separate decryption key. They may have designed the system such that they don't see the second key for a variety of reasons, the biggest being that this is the way the customers want it designed.

I have nothing criminal to hide, however I would like to know that the security of the device is such that I can legally and ethically use it to contact colleagues regarding patients. Not all those who desire security and privacy do so for criminal reasons.
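The PGP point is worth spelling out: in public-key cryptography the encrypting and decrypting keys are mathematically different, so whoever wrote the software can still be unable to decrypt. Here is a toy textbook-RSA sketch (tiny made-up primes, purely illustrative and wildly insecure - real systems use 2048-bit keys and padding):

```python
# Textbook RSA with toy primes: the encryption key alone cannot decrypt.
p, q = 61, 53
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public (encryption) exponent
d = pow(e, -1, phi)          # private (decryption) exponent (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)          # anyone can encrypt with (e, n)
assert pow(ciphertext, e, n) != message  # re-applying the public key fails
assert pow(ciphertext, d, n) == message  # only the private key recovers it
```

Apple's Data Protection isn't per-message RSA like this, of course; the point is only the asymmetry. If the system is designed so the second key never leaves the device, or never exists on Apple's side at all, then writing the code does not confer the ability to decrypt.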
 

You are either a mathematical expert who understands all there is to know about PGP including the inner workings and theory behind the algorithms, or an extremely gullible person. Those that control the code control the access. Period.
 

Sure, the one who writes the code controls the access but that control is only there when the code is being written. Apple took away its access already. The code is already written and in the hands of users.

At issue here is whether Apple should have written the code to give itself access so that if law enforcement wanted, Apple could get information off someone's phone and hand it over to law enforcement. I think the answer is no because that special way to get access can be used by the bad guys too.
 


Specifically what proof do you have that "Apple took away its access already"? Did you actually see the code in question?
 
Huh? This drives Apple to be more secure from which you benefit.
Are you serious? The fact that Apple will work to try to prevent these sorts of hacks is a side effect of what these guys are doing, not some noble intention.
Why? They shattered the myth of the 'unbreakable' iOS security?
Because people who dedicate their lives to hacking security systems, causing untold billions of dollars and countless hours of human effort to be wasted in order to try to prevent such hacks, are scumbags.
 

You are changing the parameters of your original statement.

You first indicated that “Tim Cook does have access”, then later state that Apple controls access. The latter statement is correct, but does not invariably imply the former.

Simply stated, Apple made a choice to design the software so they can’t decrypt data that they have selected to remain private. This is how they exercised their control.

They have chosen to have their software design reflect the requests of the majority of their users (remember that not all of their customers reside in the US, and not all share your views regarding government and privacy.) Apple has also clearly and repeatedly told people that they are free to take their business elsewhere if they don’t like Apple’s approach. Perhaps this is something you should consider if you don’t agree with Apple’s approach to the privacy issue - voting with your wallet can be quite powerful if enough people do it.
Specifically what proof do you have that "Apple took away its access already"?

This is likely the same amount of proof you have that Apple continues to have access to the data.

(Having said this, security blogs, white papers, and reports seem to indicate that iOS is relatively secure, and poses an increasingly difficult target due to security measures and encryption. When one also considers the amount of money that Apple is leaving on the table by not monetizing user data [e.g. Siri’s relatively poor performance] then it is reasonable to presume that Apple puts a premium on privacy. This, in turn, would suggest that they would be more likely to remove their own access to data, rather than leaving a back door open.)
 