Them getting into the phone isn't the issue; it's how it went down. Apple said they didn't want to help because of the risk of the method getting out into the wild. They could have just helped and kept it quiet, and we may never have known; instead they practically dared the creation of a method that they now have no control over themselves.

You are obviously just in your own imagined world, happily disregarding the real facts.
 
I believe the FBI's concern is how information that should be restricted to those systems ended up on her server.

It certainly is if you're passing on classified information through the emails on it. Which has already been shown to have happened.

I could be mistaken, but I don't think it's been proven that classified info from that other 'super classified system', whatever it is, ended up on her server. It seems like a widespread issue, too; she's really not the first secretary to do it.
 
Well, that plan didn't work out so well because now Congress is looking into legislation that will require manufacturers to have ways to give the government access when needed. If a bill like this passes, it doesn't matter what Apple does, they will have to comply with the law.

https://www.macrumors.com/2016/04/08/senate-draft-encryption-bill-dangerous/

That seems an odd law to try to pass - a bit like making lock manufacturers legally responsible for being able to open one of their locks at any time. If you give the ability to lock down the device to the user, I don't see how you can force the manufacturer to unlock it.
 
I could be mistaken, but I don't think it's been proven that classified info from that other 'super classified system', whatever it is, ended up on her server. It seems like a widespread issue, too; she's really not the first secretary to do it.
According to Wikipedia, "65 emails were found to contain information classified as 'Secret', more than 20 contained 'Top-Secret' information, and the rest contained 'Confidential' information". News stories from outlets like the New York Post reported that an unspecified number of emails were beyond "Top Secret" (whatever that means) as they originated from "special access programs". The State Department won't release 22 of the emails due to the secret content and, meanwhile, Fox News claims to have discovered that Hillary is an agent of Satan. Ok, I made that last part up.

It is my understanding that the FBI wasn't as concerned about the server itself as about what information made it to the server and how. However, with them releasing few details, it is hard to tell.

I will reiterate that Hillary, as a seasoned politician, knew the risks of running her own server and just didn't care, as it made things more convenient for her.
 
That seems an odd law to try to pass - a bit like making lock manufacturers legally responsible for being able to open one of their locks at any time. If you give the ability to lock down the device to the user, I don't see how you can force the manufacturer to unlock it.

It's not so strange, actually. If you take a look at CALEA (Communications Assistance for Law Enforcement Act) passed in the mid-1990s, it is very analogous.

https://en.wikipedia.org/wiki/Communications_Assistance_for_Law_Enforcement_Act

That law required communications companies to arrange their systems in such a way as to allow lawful law enforcement access. There is a similar arrangement for banks so the government can investigate bank fraud and money laundering.

My point was that by Apple not cooperating with the FBI in the first place, this got the attention of Congress. In fact, I seem to recall one senator scolding Bruce Sewell at the recent senate hearings on this issue, saying in effect that if Apple did not cooperate, a law might be passed and that "Apple will not like it". My own view is that Congress will not let one company impede the legitimate investigations of law enforcement, and that one way or another, at the end of the day, Apple will have to comply. I hope I'm wrong, and that there is a mutually satisfactory solution for all. Encryption is important for users like us. But so are legitimate law enforcement investigations. I'm hoping a middle ground can be reached.
 
Them getting into the phone isn't the issue; it's how it went down. Apple said they didn't want to help because of the risk of the method getting out into the wild. They could have just helped and kept it quiet, and we may never have known; instead they practically dared the creation of a method that they now have no control over themselves.
These things never stay quiet forever. It was just a few years ago that it was revealed that the cell carriers were working with the government, providing access to detailed call metadata without any warrant. Security is a core feature that Apple touts - especially as their products now store electronic payment information (Apple Pay), health information (HealthKit), etc. If Apple can't stay compliant with various card processor security requirements, federal security requirements like HIPAA, all of the European security requirements, etc., then they expose themselves to limitless lawsuits.

Frankly, if Apple tells me that my data is encrypted on my phone, I don't want a hidden asterisk after that statement that says that it excludes information sharing with the government and to facilitate that, they have integrated backdoors or other measures that make it easier for hackers to access the data as well.

One of the founding fathers of the United States warned us what would happen when you trade your rights to the government in exchange for their promise of security...
 
These things never stay quiet forever. It was just a few years ago that it was revealed that the cell carriers were working with the government, providing access to detailed call metadata without any warrant. Security is a core feature that Apple touts - especially as their products now store electronic payment information (Apple Pay), health information (HealthKit), etc. If Apple can't stay compliant with various card processor security requirements, federal security requirements like HIPAA, all of the European security requirements, etc., then they expose themselves to limitless lawsuits.

Frankly, if Apple tells me that my data is encrypted on my phone, I don't want a hidden asterisk after that statement that says that it excludes information sharing with the government and to facilitate that, they have integrated backdoors or other measures that make it easier for hackers to access the data as well.

One of the founding fathers of the United States warned us what would happen when you trade your rights to the government in exchange for their promise of security...


Trump 2016
 
It's not so strange, actually. If you take a look at CALEA (Communications Assistance for Law Enforcement Act) passed in the mid-1990s, it is very analogous.

https://en.wikipedia.org/wiki/Communications_Assistance_for_Law_Enforcement_Act

That law required communications companies to arrange their systems in such a way as to allow lawful law enforcement access. There is a similar arrangement for banks so the government can investigate bank fraud and money laundering.

So I wonder how this law would work with the various apps that store personal information, bank details etc and allow the user to encrypt the information? Let's assume that the encryption used is effectively unbreakable - the ability to read this information would be beyond Apple, and yet...?

Suppose there's a phone being used by terrorists, and they have a code whereby 'banana' means 'bomb' and 'fruitbowl' means 'plane' - I can just imagine an angry FBI agent, waving a copy of an email in Tim Cook's face shouting "THEY SAY THEY'VE PUT THE BANANAS IN THE FRUITBOWL!! WHAT DOES THAT MEAN!?!?!?! YOU HAVE TO TELL ME - IT'S THE LAW!!!!"
 
So I wonder how this law would work with the various apps that store personal information, bank details etc and allow the user to encrypt the information? Let's assume that the encryption used is effectively unbreakable - the ability to read this information would be beyond Apple, and yet...?

Suppose there's a phone being used by terrorists, and they have a code whereby 'banana' means 'bomb' and 'fruitbowl' means 'plane' - I can just imagine an angry FBI agent, waving a copy of an email in Tim Cook's face shouting "THEY SAY THEY'VE PUT THE BANANAS IN THE FRUITBOWL!! WHAT DOES THAT MEAN!?!?!?! YOU HAVE TO TELL ME - IT'S THE LAW!!!!"

Well, I don't think CALEA applies here because that law was directed to phone companies. At the time the legislation was enacted, device makers were excluded from the law. The new draft legislation from the Senate Intelligence committee basically covers the device manufacturers that CALEA excluded back in the 1990s.

Your question about apps, particularly independent apps that use end-to-end encryption, is a good one. No doubt today there is a need for encryption, especially with data like bank transactions or health information as you point out, and in light of all the hacking exploits. Encryption is valuable and in many cases essential. But not just the good guys use it. Bad guys use it too, for keeping info on money laundering, drug dealing, kidnapping, etc. I don't know what the answer is, but there has to be some middle ground where law enforcement can break the encryption on the phone itself and in the encrypted apps pursuant to a lawful court order.
 
meanwhile, Fox News claims to have discovered that Hillary is an agent of Satan. Ok, I made that last part up.
Hahahaha!! Today you win the internet! Remember, Fox News was the first to report, years ago, that then-Senator Obama had 2 black kids... IN WEDLOCK! :eek::eek::eek::eek:

I will reiterate that Hillary, as a seasoned politician, knew the risks of running her own server and just didn't care, as it made things more convenient for her.
Agree!
 
Considering that I once unlocked a friend's phone when it was sitting on a table at a restaurant by absent-mindedly playing with it, and he asked how I unlocked it. I said I honestly didn't know; I was just fiddling with it and it unlocked.

So that means there's definitely something real simple and easy that's not a back door, just something they overlooked. It used to work by swiping up and telling it you wanted to access the camera, then backing out of the camera, and you were back at the home screen.

Or they could have taken the dead guy's hand off and used his desiccated dead finger to unlock it.
 
Considering that I once unlocked a friend's phone when it was sitting on a table at a restaurant by absent-mindedly playing with it, and he asked how I unlocked it. I said I honestly didn't know; I was just fiddling with it and it unlocked.

So that means there's definitely something real simple and easy that's not a back door, just something they overlooked. It used to work by swiping up and telling it you wanted to access the camera, then backing out of the camera, and you were back at the home screen.

Or they could have taken the dead guy's hand off and used his desiccated dead finger to unlock it.

5c doesn't have Touch ID
 
That seems an odd law to try to pass - a bit like making lock manufacturers legally responsible for being able to open one of their locks at any time. If you give the ability to lock down the device to the user, I don't see how you can force the manufacturer to unlock it.

Could be a useful analogy. Suppose that the government passed a law that required lock makers to keep a key for every lock they ever manufacture. The company can implement this however they like. Locks last a long time, so, if every lock has a unique key, that is a lot of keys that they have to store very, very safely, and, produce a duplicate every time the government shows up with a search warrant. This facility could add a large cost to every lock. OTOH, the company could just build locks that are also opened by a single master key, and, give the key to the government. Cheap. Problem solved, right?

To sell locks, the lock company needs to convince people that their locks can be used to protect personal information. It may not have to be impossible to get to it, but, it does need to be difficult and expensive to get to that information-- and stay that way for a while.
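The tradeoff in the analogy above can be sketched as a toy model. All class names here are hypothetical, purely to illustrate the two escrow designs: per-lock unique keys (costly vault, contained leaks) versus a single master key (cheap, but one leak opens everything).

```python
# Toy model of the two key-escrow designs from the lock analogy.
# Not how any real vendor works; names are made up for illustration.
import secrets

class UniqueKeyEscrow:
    """One random key per lock; the vendor must store and protect all of them."""
    def __init__(self):
        self.vault = {}  # lock serial -> key; storage (and liability) grows per lock

    def make_lock(self, serial: str) -> bytes:
        key = secrets.token_bytes(16)
        self.vault[serial] = key
        return key

    def duplicate_for_warrant(self, serial: str) -> bytes:
        # A leak of one stored key exposes only that one lock.
        return self.vault[serial]

class MasterKeyEscrow:
    """One master key opens everything: cheap, but a single leak opens all locks."""
    def __init__(self):
        self.master = secrets.token_bytes(16)

    def make_lock(self, serial: str) -> bytes:
        return self.master  # every lock shares the same secret

vault = UniqueKeyEscrow()
vault.make_lock("A1")
vault.make_lock("A2")
print(len(vault.vault))  # vault size scales with the number of locks sold
```

The "cheap" master-key design is exactly the single point of failure the post warns about: the vendor's cost savings become everyone's shared risk.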
 
FBI hacked the iPhone but they still can't decide what to do about Hillary's email server...

They already have access to the server; no need to beg anybody to crack it.
FBI = Amateur hackers. Cellebrite are the pros.

Pros that only charge $15,000? What a great deal.
Wasn't the ability to "hack" the phone just the ability to clone the iOS image so it could run in some type of emulator? You could then run multiple copies of the image and try as many possible PINs as you could until you unlocked one of the clones.

Yup, I think that's right.
Could be a useful analogy. Suppose that the government passed a law that required lock makers to keep a key for every lock they ever manufacture. The company can implement this however they like. Locks last a long time, so, if every lock has a unique key, that is a lot of keys that they have to store very, very safely, and, produce a duplicate every time the government shows up with a search warrant. This facility could add a large cost to every lock. OTOH, the company could just build locks that are also opened by a single master key, and, give the key to the government. Cheap. Problem solved, right?

To sell locks, the lock company needs to convince people that their locks can be used to protect personal information. It may not have to be impossible to get to it, but, it does need to be difficult and expensive to get to that information-- and stay that way for a while.

I don't think it's a good analogy, not that I don't agree with you guys. Locks can be broken with enough firepower or brute physical strength, unless you're using locks made out of adamantium or something. The government probably already has special tools to break open any lock. Even if the lock is unbreakable, they can probably get into the building or the container (what have you) by other methods. So the government would not need to pass any law requiring lock makers to keep spare keys.

Encryption is so much more potent. It's a different beast. Throwing more physical force at the problem doesn't work on encryption the way it does on a lock.
 
Wasn't the ability to "hack" the phone just the ability to clone the iOS image so it could run in some type of emulator? You could then run multiple copies of the image and try as many possible PINs as you could until you unlocked one of the clones.

All the brute force attacks must be run on the target device itself.

This is because the entered passcode is tangled with a device-specific key that cannot be read, only caused to be combined. The result is then stretched by multiple iterations that are designed to take around 80 ms each time.

This is why it could take literally years to find a good six-character passcode.

What you might be thinking of, is cloning the part of memory that holds the retry count before wiping, so that it could be put back to zero on the device every time you almost reach the retry limit.
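The math behind that "literally years" claim is simple to sketch. The only input is the roughly 80 ms per-guess derivation mentioned above; the rest is arithmetic, so treat the numbers as illustrative rather than measured:

```python
# Toy estimate of brute-forcing a passcode when every guess must run
# through on-device key derivation taking ~80 ms per attempt.

SECONDS_PER_YEAR = 365 * 24 * 3600

def worst_case_years(alphabet_size: int, length: int, per_guess_s: float = 0.08) -> float:
    """Years needed to try every possible passcode, one derivation per guess."""
    total_guesses = alphabet_size ** length
    return total_guesses * per_guess_s / SECONDS_PER_YEAR

# 6-digit numeric PIN: 10**6 guesses -> well under a year (roughly a day)
print(f"6-digit PIN:         {worst_case_years(10, 6):.4f} years")
# 6-character lowercase+digit passcode: 36**6 guesses -> several years
print(f"6-char alphanumeric: {worst_case_years(36, 6):.2f} years")
```

The average time is half the worst case, and this ignores the escalating delays and the wipe-after-ten-tries counter; the PIN case is only feasible at all if that counter is defeated first, which is exactly what the memory-cloning idea above is about.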
 
First world problem: Apple doesn't need to make a media frenzy, but that doesn't mean anyone who helps the FBI needs to follow Apple down that 'hush-hush' path..

Apple strikes again..

Tim wants to be in charge of the whole world..... Let the FBI do their job. If they want to report something and let everyone in on it, so be it...

Apple won't gain anything by keeping things under wraps, except that many want to know what's going on.

On the other side, I found it shocking that Apple would even think that, if they are "trying to keep all users secure".

By definition, keeping out of the media when it comes to the law IS the reverse of what they did.
 

The author of that article was guessing, and he guessed wrong.

RF-based sensors like Touch ID can be used with a dead finger for at least 15 minutes. That time can be extended if the finger is infused with chemicals.

Some other sensors do look for a pulse and/or body temperature, but those are notoriously inaccurate when the person has been exposed to cold.
 