Error 53 can occur without any repairs made by any party, and Apple would still not be able to fix it.

That's the real issue.

Spot on. Any of us who have never had our phones repaired can get an error 53, anytime. Your phone will remain bricked until you can get an appointment, which could be days or a week away depending on how busy your local store is (if there is one), and then you may have to wait possibly up to another week to get it back.

Seems many are cool with being without their phone for up to two weeks because a hardware failure caused error 53. I'm not. I'd like a message saying a component has failed and that I should take it in for repair, while the phone continues to operate.
I agree everyone SHOULD worry about security, not everyone does, and some prefer being able to take a device to anyone for repair. So, to accommodate that...

Is the U.S. ruled by lawyers? Pretty much. You can look up the number of Senators, Congress members, and Presidents with law degrees.

I don't get it. In the land of the free, home of the brave, to quote a classic, you are complaining about having the choice of where to have a device you bought repaired? And you would prefer everyone was forced to repair it at Apple?

Dude, my parents emigrated from a communist country; their way of life and circumstances were all in the best interest of their "security", with no lawyer problems.

You have no idea how lucky we are! No government or company should be allowed to play big brother while pretending our security is paramount. Trust me, they don't care at all; we are just pawns.

FYI: being choice-free under communism sucked! You were super, super secure! So secure you could not even get out; the illusion was that it was in your best interest....

If you don't want the choice, fine, though never give up that right!

P.S. There's a first: I'm sticking up for the USA way of life against an American... :p
 
Wow. People who jailbreak their devices are anarchists and perverts?

Yes. The people who do it are anarchists. Doing it is a perversion of the device itself.

My friend jailbroke his Amazon Firestick with the express intent of being able to download and watch movies without paying for them. That makes him an anarchist in my book.
Error 53 can occur without any repairs made by any party, and Apple would still not be able to fix it.

That's the real issue.

Funny, that. Apple says you should call AppleCare if you encounter Error 53. Why waste a phone call if it can't be fixed?
 
There is a simple solution. If the iOS security scan finds abnormalities in the Touch ID sensor, then it should display an error code and description and, importantly, disable the fingerprint security feature but keep the phone working, like an iPhone 5 home button without the sensor. You don't need a genius to figure this out.
 
There is a simple solution. If the iOS security scan finds abnormalities in the Touch ID sensor, then it should display an error code and description and, importantly, disable the fingerprint security feature but keep the phone working, like an iPhone 5 home button without the sensor. You don't need a genius to figure this out.

This isn't what happens. The hardware subsystem detects a fault in its chain of trust at a critical point in key generation. iOS, or iMyDangerousJailbreak, or iWhateverOtherOS you've managed to install then fails to boot. As it should. You don't need to be a genius to figure this out.
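As a rough illustration of what a hardware chain of trust does at that critical point, here is a toy challenge-response check. All names here are invented; this sketches the general technique, not Apple's actual design:

```python
import hashlib
import hmac

def verify_component(paired_key: bytes, challenge: bytes, response: bytes) -> bool:
    """A component proves it still holds the factory-paired secret by
    answering a fresh challenge with an HMAC over it. A swapped-in part
    that lacks the secret cannot produce a valid response."""
    expected = hmac.new(paired_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

def boot(paired_key: bytes, sensor_respond) -> str:
    """If the sensor fails validation at this critical point, refuse to
    finish booting rather than continue with an untrusted part."""
    challenge = b"fresh-random-nonce"  # would be freshly generated each boot
    if not verify_component(paired_key, challenge, sensor_respond(challenge)):
        return "error 53"
    return "boot ok"
```

A genuine, paired sensor answers the challenge correctly and boot proceeds; any other sensor fails the check, regardless of which OS is installed on top.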
 
This isn't what happens. The hardware subsystem detects a fault in its chain of trust at a critical point in key generation. iOS, or iMyDangerousJailbreak, or iWhateverOtherOS you've managed to install then fails to boot. As it should. You don't need to be a genius to figure this out.


That alleged "chain of trust" was not needed prior to touchID and is only necessary to support fingerprint swipes. The consumer should have the option to not take advantage of the fingerprint option and use non-TouchID-encoded home buttons.

If the added touchID allows a security exploit which can potentially bypass passcode entry security, then TouchID should be disabled and/or not be used at all by anyone who is security-conscious.
 
That alleged "chain of trust" was not needed prior to touchID and is only necessary to support fingerprint swipes. The consumer should have the option to not take advantage of the fingerprint option and use non-TouchID-encoded home buttons.

If the added touchID allows a security exploit which can potentially bypass passcode entry security, then TouchID should be disabled and/or not be used at all by anyone who is security-conscious.

It's not an alleged chain of trust. It is a chain of trust involving key exchanges. Assuming that it would even be possible to 'disable the touch ID', any more than it's possible to disable your mouth (or any other random device connected to the serial bus), that won't help anyone at all if they want to use Apple Pay. Get the phone fixed so that it works properly if it goes wrong. This is not a decision that iOS can make in order to pick and choose features, by the way. iOS is outside this chain of trust, as good security design dictates.
 
Edit: forget it. I've deleted my response entirely. Just another person to throw on my ignore list. No sense in arguing with ignorance.
That alleged "chain of trust" was not needed prior to touchID and is only necessary to support fingerprint swipes. The consumer should have the option to not take advantage of the fingerprint option and use non-TouchID-encoded home buttons.

If the added touchID allows a security exploit which can potentially bypass passcode entry security, then TouchID should be disabled and/or not be used at all by anyone who is security-conscious.
The argument I see many making is that it can't be disabled. Ok. Fine. Let's say that's true. Why did it take Apple this long to come up with this security measure? We have had Touch ID for over two years now. I've asked this multiple times in this thread and nobody has even attempted to respond. Everyone just keeps shouting that this is in the name of security and how we are all stupid for not seeing that.
 
It's not an alleged chain of trust. It is a chain of trust involving key exchanges. Assuming that it would even be possible to 'disable the touch ID', any more than it's possible to disable your mouth (or any other random device connected to the serial bus), that won't help anyone at all if they want to use Apple Pay. Get the phone fixed so that it works properly if it goes wrong. This is not a decision that iOS can make in order to pick and choose features, by the way. iOS is outside this chain of trust, as good security design dictates.

The consumer should have the choice not to have to use TouchID to make purchases, as TouchID/thumbprint should only be a convenience. If a customer prefers to use old-fashioned passcode entry, then he/she should have that option without requiring fingerprint entry, and should have the latitude to use a home button which does not include the touchID "features". If a third-party button can create a security risk pre-passcode entry, then I have to question the security model for touchID.
 
Edit: forget it. I've deleted my response entirely. Just another person to throw on my ignore list. No sense in arguing with ignorance.
The argument I see many making is that it can't be disabled. Ok. Fine. Let's say that's true. Why did it take Apple this long to come up with this security measure? We have had Touch ID for over two years now. I've asked this multiple times in this thread and nobody has even attempted to respond. Everyone just keeps shouting that this is in the name of security and how we are all stupid for not seeing that.

Agreed, plus I don't know if people realize this or not, but touch-id is possibly one of the worst forms of authentication: someone can access your phone while you are drunk/drugged/sleeping; just grab your finger and put it on the phone. Plus, as a bonus, cops can force you to open your phone with touch-id (they can't force you if you just have a pin-code, but if you have touch-id, they can).

This is why touch-id is meant as a convenience secondary feature, and this is why some important things such as restarting the phone mandates pin-code, NOT touch-id.

Touch-id is not secure. Apple's motives for bricking the phone must be other than just "security".
 
This is not a decision that iOS can make in order to pick and choose features, by the way. iOS is outside this chain of trust, as good security design dictates.

That's fine, because iOS does not have to make the final choice. The user can.

The user is already able to make a decision to turn off Touch ID, and use only the passcode if they wish for unlock and Apple Pay.

Thus there is no security reason why iOS cannot ask the user: "Before I brick your device, do you wish to go back to using just your passcode until the sensor can be trusted again?"

Of course, it'd be a lot friendlier if Apple simply used common sense and had iOS do that on its own, since it would have no effect on security.
 
It's not an alleged chain of trust. It is a chain of trust involving key exchanges. Assuming that it would even be possible to 'disable the touch ID', any more than it's possible to disable your mouth (or any other random device connected to the serial bus), that won't help anyone at all if they want to use Apple Pay. Get the phone fixed so that it works properly if it goes wrong. This is not a decision that iOS can make in order to pick and choose features, by the way. iOS is outside this chain of trust, as good security design dictates.

Let's assume you are correct. Then why only the 6/6+ and 6S/6S+? I can use my Mini4 and Air 2 to go buy stuff.... Your description and alleged functionality make no sense unless this function only exists on some devices, not all. I am looking for real answers (very few so far), and the express zealotry you are displaying contains very little in the way of verifiable fact. A big piece on Apple's side is missing. Until we have all the relevant factual information, your account is allegedly correct. Not factually correct.

What has been documented so far:
Error 53 has been occurring for a while and is triggered on the iPhone 6/6+ and 6S/6S+ only. It can be caused by third-party repairs, the use of third-party or scavenged parts, and can even occur on a device that has had no repairs at all. It is currently not repairable by authorized dealers. A user who encounters an error 53 will lose all their information; there is no verification whether this information is wiped or just not accessible to the user. Some authorized repair places have supposedly been given the information on how to fix it.
 
So, this just happened to me. I had a third party repair the screen, and Touch ID did not work. Restored, and bricked phone. I chatted with Apple and they insisted that it wouldn't require a new phone, but I've made an appointment for the Apple Store this weekend and I am preparing for the worst. This thread has way too many people yelling at each other for me to go through it all, but has anyone had this problem with Apple resolving it without requiring them to buy a new phone?
 
Agreed, plus I don't know if people realize this or not, but touch-id is possibly one of the worst forms of authentication: someone can access your phone while you are drunk/drugged/sleeping; just grab your finger and put it on the phone. Plus, as a bonus, cops can force you to open your phone with touch-id (they can't force you if you just have a pin-code, but if you have touch-id, they can).

This is why touch-id is meant as a convenience secondary feature, and this is why some important things such as restarting the phone mandates pin-code, NOT touch-id.

Touch-id is not secure. Apple's motives for bricking the phone must be other than just "security".
I had read about police officers being able to "force" fingerprint input. I am still curious whether something like this would hold up in court. I am no lawyer, so I don't know, but if a password can't be compelled, and all a thumbprint is, in terms of Touch ID, is a coded password, it would seem to be the same to me. Again, this would need to be challenged and, to my knowledge, this hasn't happened.
That's fine, because iOS does not have to make the final choice. The user can.

The user is already able to make a decision to turn off Touch ID, and use only the passcode if they wish for unlock and Apple Pay.

Thus there is no security reason why iOS cannot ask the user: "Before I brick your device, do you wish to go back to using just your passcode until the sensor can be trusted again?"

Of course, it'd be a lot friendlier if Apple simply used common sense and had iOS do that on its own, since it would have no effect on security.
Is this scenario you are suggesting possible? I ask because, at this point, multiple people have quoted me saying it is not possible (though never in any detail as to why). It seems to make sense to me to disable the functionality of Touch ID, but I have people telling me that a malicious third-party device can still pull data from the Touch ID slot. Not knowing enough about hardware or software engineering, one can tell me practically anything, but the logical side of me wants to refute this notion with every fiber of my being lol. It just doesn't make sense to have a piece of hardware render an entire smartphone useless if it is damaged or the chain of trust is broken. But then I second-guess myself. After all, what do I know?!
 
I had read about police officers being able to "force" fingerprint input. I am still curious if something like this would hold up in court. I am no lawyer, so I don't know, but if a password can't be input, and all a thumbprint is, in terms of touch ID, is a coded password, it would seem it would be the same to me. Again, this would need to be challenge and, to my knowledge, this hasn't happened.

Did you read the article? It DID hold up in court. A fingerprint is physical and you can be compelled to give it. Similar to how you can be compelled to give DNA. Now I would hope a warrant would be needed, but you can be compelled nonetheless. You cannot be compelled to give your passcode, even with a warrant, as that is "testimony" (it is in your mind) and protected (in the US) by the 5th Amendment.



Mike
 
Is this scenario you are suggesting possible? I ask because, at this point, multiple people have quoted me saying that is not possible (though never in any detail as to why). It seems to make sense to me to disable the functionality of Touch ID but I have people telling me that a malicious third party device can still pull data from the Touch ID slot.

Well, thinking more about it...

The sensor itself cannot attack anything, since it's treated as an input peripheral just like, say, the barometer or compass. Assuming iOS has no buffer overrun bugs, that is. (I.e. whether the OS can be attacked by sending a larger than usual image packet that overflows memory. Normally programmers watch for that, but then again, Apple fixes buffer overrun bugs almost every update.)

Anyway, what Apple says is:

"The fingerprint sensor is active only when the capacitive steel ring that surrounds the Home button detects the touch of a finger, which triggers the advanced imaging array to scan the finger and send the scan (*) to the Secure Enclave. The raster scan is temporarily stored in encrypted memory within the Secure Enclave while being vectorized for analysis, and then it’s discarded."

(*) Note that the image scan is normally encrypted with a key so that only the Secure Element can read it.

So here's an "attack" scenario that actually supports Apple's bricking in some ways. It requires:

1. The installation of a hacked sensor which always sends the fingerprint image in the clear.
2. Malicious code installed that has taken over the OS and can intercept the image data.

Thus while the Secure Enclave would continue to ignore the unencrypted sensor data, meaning an attacker could not gain access directly, the previously corrupted software could read the fingerprint images being sent by the sensor in the clear and forward them to a third party to make a fake finger for later... which of course cannot work until a real sensor is put back in and resynced by Apple!

So it relies on installation of some kind of OS virus AND the corresponding sensor that sends data in the clear... and then putting back the real sensor later on. Woof. Seems like a lot of effort, especially since the user has to be targeted for this to work. In which case grabbing a fingerprint from a used glass seems easier to accomplish without having to grab and tamper with the victim's phone ahead of time.

Not to mention that if malicious code has taken over the OS, you've got much bigger problems :)
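The rejection step in that scenario (the Secure Enclave ignoring data from an unpaired or tampered sensor) can be sketched roughly. All names are invented, and a simple MAC stands in for the authenticated encryption a real design would use:

```python
import hashlib
import hmac

PAIRED_KEY = b"shared-secret-set-at-factory"  # known only to sensor and enclave

def sensor_send(scan: bytes) -> tuple:
    """A genuine sensor authenticates each scan with the paired key."""
    tag = hmac.new(PAIRED_KEY, scan, hashlib.sha256).digest()
    return scan, tag

def enclave_receive(scan: bytes, tag: bytes) -> str:
    """The enclave drops anything that does not carry a valid tag, so a
    hacked sensor sending data in the clear is simply ignored."""
    expected = hmac.new(PAIRED_KEY, scan, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, tag):
        return "rejected"
    return "accepted"  # vectorize, match, then discard the scan
```

A compromised OS sitting in the middle could still copy whatever bytes pass by, which is exactly the (contrived) leak described above, but the enclave itself stays unfooled.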
 
Did you read the article? It DID hold up in court. A fingerprint is physical and you can be compelled to give it. Similar to how you can be compelled to give DNA. Now I would hope a warrant would be needed, but you can be compelled nonetheless. You cannot be compelled to give your passcode, even with a warrant, as that is "testimony" (it is in your mind) and protected (in the US) by the 5th Amendment.



Mike

You can't be compelled to tell them what finger you used though... So, use the pinky on your opposite hand to unlock and I'm thinking they won't get in. A bit uncomfortable to handle a phone like that, but that's the price of security.
 
Did you read the article? It DID hold up in court. A fingerprint is physical and you can be compelled to give it. Similar to how you can be compelled to give DNA. Now I would hope a warrant would be needed, but you can be compelled nonetheless. You cannot be compelled to give your passcode, even with a warrant, as that is "testimony" (it is in your mind) and protected (in the US) by the 5th Amendment.



Mike
I didn't read the article, but mostly because I assumed (obviously erroneously) that this was the same story I heard years ago, where it was in question whether the officer's actions were legal and there was no outcome (I wasn't aware it ever made it to court). That's a shame, and I would think it was left up to "interpretation". Shame.

Edit: I just read it. It was a Virginia court ruling, so I am curious, if this ever went to the Supreme Court level (did it? the article doesn't say), what the outcome would be.

Interestingly, the precedent was a case in '88 where it was ruled that a person CAN be required to produce a KEY to a safe but not a PASSCODE held in his/her mind, the key being physical. Interesting that there is no distinction between a physical inanimate object and one's finger, at least in that judge's mind, but you bring up the example of DNA, so I suppose that would be the same.

Anyway, thank you for the response. It prompted me to do a little more research and a learned a few things today. :)
You can't be compelled to tell them what finger you used though... So, use the pinky on your opposite hand to unlock and I'm thinking they won't get in. A bit uncomfortable to handle a phone like that, but that's the price of security.
Based on the precedent that I am reading about, you CAN be compelled to do that. The precedent is being required to provide a physical key to a safe. I presume I can't throw a bunch of keys at them and say "one works". Or even if I can, you only have ten fingers for them to try. If this sort of thing holds up, I am sure they can get a subpoena, or warrant, or whatever legal paperwork is needed to try all of your fingers. You don't have to tell them which one; you just have to provide a hand.

Now, we all know that the touch ID requires a passcode after 5 failed attempts (as good as touch ID is now, it would be neat, moving forward, to change the number of attempts to something lower, like one or two). So I suppose you could use the wrong finger quickly five times, after which point no amount of tries will get you through without the passcode.
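That lockout behaviour can be modelled as a simple attempt counter (a toy model; the real policy is enforced inside the Secure Enclave, not in app code):

```python
MAX_ATTEMPTS = 5  # iOS falls back to the passcode after 5 failed matches

class BiometricLock:
    """Toy model of biometric lockout: too many failed fingerprint
    attempts disables biometrics until the passcode succeeds."""

    def __init__(self, enrolled_finger: str, passcode: str):
        self.enrolled_finger = enrolled_finger
        self.passcode = passcode
        self.failed = 0

    def try_finger(self, finger: str) -> str:
        if self.failed >= MAX_ATTEMPTS:
            return "passcode required"   # biometrics locked out
        if finger == self.enrolled_finger:
            self.failed = 0
            return "unlocked"
        self.failed += 1
        if self.failed >= MAX_ATTEMPTS:
            return "passcode required"
        return "try again"

    def try_passcode(self, code: str) -> str:
        if code == self.passcode:
            self.failed = 0              # passcode success re-enables biometrics
            return "unlocked"
        return "wrong passcode"
```

So, as suggested, five deliberate wrong-finger touches lock biometrics out, and after that even the correct finger is refused until the passcode is entered.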
 
And frankly he's not sure doing a screen repair would even fix it. They have to be able to calibrate the new screen, and if it's well and truly bricked, that would likely fail.

It isn't a screen repair issue. The Touch ID sensor isn't being resynced to the processor's secure cache, so the system can't be sure there isn't a security breach. The better solution would be to tell the user Touch ID isn't secure, and disable it. That way, the phone would still mostly work. Apple resyncs the parts when they do the replacement.
 
Better to rent out that hardware you pay for, then, to protect you from your bad ideas, right?

If this had been the case in the past, you wouldn't have the ability to change out screens when you shattered them, just as one example.
The discussion I've been hearing has been regarding unauthorized third-party repairs. If a screen were part of the security of the phone, security-conscious people wouldn't want some fly-by-night outfit (perhaps from China or Russia?) supplying repair parts that invalidate the phone's security.
 
Why did it take apple this long to come up with this security measure? We have had Touch ID for over two years now. I've asked this multiple times in this thread and nobody has even attempted to respond. Everyone just keeps shouting that this in the name of security and how we are all stupid for not seeing that.

The issue has been around for a long time, but it's only recently that people are bleating about it. It wasn't the most recent firmware change that enabled this. It's the fact that the firmware was being changed, a process which requires new keys to be generated as part of generating the encrypted filing system. A restore from iTunes would also trigger new key generation.
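A rough model of why key regeneration is the trigger (invented derivation and names, not Apple's actual scheme) is that validating the paired sensor is a step inside the key-generation path, so a mismatched part only surfaces when a restore or firmware change forces new keys:

```python
import hashlib

def regenerate_fs_keys(device_uid: bytes, sensor_id: bytes,
                       factory_paired_id: bytes) -> bytes:
    """Hypothetical restore-time path: key derivation proceeds only if the
    sensor still matches the factory pairing record; otherwise the restore
    aborts, which is when the user first sees the error."""
    if sensor_id != factory_paired_id:
        raise RuntimeError("error 53: sensor failed validation during restore")
    # derive a fresh filesystem key bound to this device and its paired sensor
    return hashlib.pbkdf2_hmac("sha256", sensor_id, device_uid, 100_000)
```

On this model, a phone with a swapped sensor keeps working day to day (minus Touch ID) and only bricks at the next restore or update, which is consistent with the reports in this thread.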
why only the 6/6+ and 6S/6S+?

Proof that error 53 only happens on these models, please.

As for verifiable fact, here is just one definitive statement made by Apple:

The chip in your device also includes an advanced security architecture called the Secure Enclave which was developed to protect passcode and fingerprint data. Fingerprint data is encrypted and protected with a key available only to the Secure Enclave. Fingerprint data is used only by the Secure Enclave to verify that your fingerprint matches the enrolled fingerprint data. The Secure Enclave is walled off from the rest of the chip and the rest of iOS.
 
It isn't a screen repair issue. The Touch ID sensor isn't being resynced to the processor's secure cache, so the system can't be sure there isn't a security breach. The better solution would be to tell the user Touch ID isn't secure, and disable it. That way, the phone would still mostly work. Apple resyncs the parts when they do the replacement.

This doesn't really make sense. If it were true, then all third-party screen replacements would have this issue. The only thing I can think of is that the home button flex cable is getting damaged during incompetent repairs.

It wasn't the most recent firmware change that enabled this. It's the fact that the firmware was being changed, a process which requires new keys to be generated as part of generating the encrypted filing system. A restore from iTunes would also trigger new key generation.

You run into the problem before your software gets changed, though, because TouchID gets disabled right away. This week, when I got my screen replaced, TouchID went straight to the "failed" screen when attempting to turn it on. A software update also failed to work. Apple told me to do a restore, and bam, Error 53. They said "I was afraid this might happen," which really pissed me off. Then why did you tell me to do it? At least before I had a mostly working phone!
 
That's fine, because iOS does not have to make the final choice. The user can.

You're really missing the point of having a chain of trust implemented in hardware. iOS is subservient to it, and cannot/should not boot if it errors out. Before you dictate to the world how security should be implemented, go and educate yourself by researching chains of trust. Google will help, and here's a little slide deck describing particular implementations. Then you can get onto stuff like key management, security protocols and the like.

https://www.trustedcomputinggroup.o...08341B656D48802/SANS Webcast Presentation.pdf
 