"Face ID is designed to work with hats, scarves, glasses, contact lenses, and many sunglasses. Furthermore, it's designed to work indoors, outdoors, and even in total darkness."
Total darkness? You mean it unlocks before the screen is turned on and lights up your face?
 
As I said in another thread, it sounds strange that Apple's order was to "reduce Face ID accuracy", since, production-wise, there is no such thing as a single "Face ID" part; there are many components that contribute to it.
What they probably did was to change the accuracy requirements for one or more of the sensors involved in the technology. We don't know how this will affect the final system. As far as we know, an improvement in the machine learning software could compensate for the less precise sensors...

Wow gee, the voice of reason has appeared. I totally agree. Love all of the armchair experts in this thread completely falling for Bloomberg's clickbait. No one outside of Apple and the manufacturers has any clue what the technical requirements are. Reducing accuracy in this context could mean so many different things, none of which necessarily has any impact on the end-user experience.
 
"Face ID is designed to work with hats, scarves, glasses, contact lenses, and many sunglasses. Furthermore, it's designed to work indoors, outdoors, and even in total darkness."
Total darkness? You mean it unlocks before the screen is turned on and lights up your face?

Possibly. Why does this surprise you?
 
There is absolutely no need for this technology, other than Apple trying to leverage the hardware they know will be in the phone to reduce components (and costs).

The cost and number of components for Face ID are significantly higher than the Touch ID system it replaces. Please try again.
 
Sigh. No. Glasses distort light paths even if they do not block light. I have not seen a demonstration of the iPhone recognising somebody with glasses, let alone recognising somebody wearing glasses and also recognising them wearing contacts. So, in point of fact, we do not know the answer to my question.

EDIT: getting fed up of making the point, so...

[Attached image: photo of a person wearing glasses]


Look at the right side of the picture, where the right edge of the right lens almost bends light around the head. Not to mention IR scatter off the frame itself.
I wonder if this bending of the light is the reason the IR scanner in our Samsungs hurts my eyes and was starting to hurt my husband’s eyes. We both had to quit using iris scanning because of the pain. The scanner worked but was a bit slower with our glasses on. We probably got exposed to the light longer than the average person not wearing glasses. I don’t know.

I don't think Apple's facial scanners will work that badly, but I do think they should have had the courage to face the naysayers and put a Touch ID sensor on the back Apple logo or the side power button (assuming it doesn't violate Sony's patent) and give users a choice.

It’s cool that they want to lead with this innovation but their arrogant insistence on dragging customers away from still perfectly viable options like Touch ID and wired accessories, while the newest technologies are still not yet fully optimal, is starting to wear on my nerves.
 
Yes, you can enter a passcode, but I prefer to avoid that hassle, particularly on an expensive phone. The question is how often you will have to do that, particularly if you wear glasses. So far as I know, Apple has not released that information. We all know the false positive rate cited in the keynote. So, if you know the false negative rate from Apple literature, please give the number and cite the source.
Sweet Jeebus, please stop with the glasses, contacts, and other unrelated "impediments" to Face ID. Uggh, still with the false negative. It's not a hill to die on, trust me. The false negative rate is going to vary greatly based on how your phone is held in relation to your face. Certain angles are going to be better for authentication, and certain angles are going to work better for different people.

Also please point out the fundamental misunderstandings I have about the system, but be aware I have studied the visual system as a neuroscientist, I work with people who study face perception, I am familiar with neural network modelling (including knowing many of the people who first worked on back-propagation and created the first neural nets for speech synthesis and speech recognition), and I understand a certain amount of signal detection theory (which applies in this case).
I'm not saying you're not a scientist. But if you are a scientist, I'm of the opinion you're not doing what a scientist should. I say that because you've opted not to do the one fundamental thing a good scientist does: research. A cursory Google search will provide you with information about how Face ID works and what data it's actually capturing. I am not a scientist, but I am most definitely willing to help you get started because your familiarity with neural network modeling (including knowing people who know stuff about unrelated stuff) isn't apropos or necessary.
Face ID doesn't care about your face per se. It cares about the math of your face: The TrueDepth camera captures accurate face data by projecting and analyzing over 30,000 invisible dots to create a depth map of your face and also captures an infrared image of your face. A portion of the A11 Bionic chip's neural engine — protected within the Secure Enclave — transforms the depth map and infrared image into a mathematical representation and compares that representation to the enrolled facial data.
So neither glasses nor contact lenses will change that math. The caveat there is polarized glasses. Contact lenses don't even factor in. There's plenty of information out there regarding how Face ID works and about the state of facial recognition in general. As a scientist, that should have been where you started. My opinion, of course.
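To make the "math of your face" idea concrete, here's a minimal sketch of the comparison step, assuming the representation is a fixed-length vector and the match is a similarity score against a threshold. The names (FaceEmbedding, cosineSimilarity) and the 0.9 threshold are invented for illustration; Apple has not published how its representation or comparison actually works.

```swift
import Foundation

// Hypothetical fixed-length "mathematical representation" of a face, standing in
// for whatever the neural engine actually produces. Invented for illustration.
struct FaceEmbedding {
    let values: [Double]
}

// Cosine similarity between two embeddings: 1.0 means they point the same way.
func cosineSimilarity(_ a: FaceEmbedding, _ b: FaceEmbedding) -> Double {
    precondition(a.values.count == b.values.count, "embeddings must have equal length")
    let dot  = zip(a.values, b.values).reduce(0.0) { $0 + $1.0 * $1.1 }
    let magA = sqrt(a.values.reduce(0.0) { $0 + $1 * $1 })
    let magB = sqrt(b.values.reduce(0.0) { $0 + $1 * $1 })
    return dot / (magA * magB)
}

// Accept the probe only if it is close enough to the enrolled representation.
// The threshold is the knob that trades false accepts against false rejects.
func matches(probe: FaceEmbedding, enrolled: FaceEmbedding, threshold: Double = 0.9) -> Bool {
    return cosineSimilarity(probe, enrolled) >= threshold
}
```

On that view, glasses only matter to the extent that they perturb the depth map and infrared image enough to push the similarity score below the threshold.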
 
This is the most ridiculous rumor I’ve ever heard.

About as stupid as saying “The A11 is difficult to produce, therefore Apple made last minute changes and removed 2 of the smaller cores.”

You don’t just change the design of complex components on the fly. This is pure BS made up for God only knows what purpose. Sounds like some people are truly afraid of the iPhone X and FaceID and need to scare people with false rumors.
 
"Apple overcame its production challenges by quietly telling suppliers they could reduce the accuracy of the face recognition technology to make the iPhone X easier to manufacture."

Quietly scratches the iPhone X off list of possible purchases.
 
You're absolutely right about TouchID, but the Bloomberg piece does mention that Apple put their suppliers under a lot of time pressure to shrink the concept of Microsoft's Kinect tech into a module measured in millimeters and then produce it in mass quantity.

Technically, this tech was never Microsoft's. Microsoft created its own tech with the Kinect 2.0.
 
It is amazing how fans of a company are unwilling to accept that it did something negative, no matter what. The article makes sense to me. There were reports earlier that they could not produce a high enough yield on these parts that have to be mated to work together. Then all of a sudden it's reported that the problem is solved. Now this article says that they loosened the spec on the parts, which would mean to me they loosened how tightly the parts needed to be paired. I would fully expect from this that the accuracy of FaceID was reduced.

As someone who's worked in manufacturing, this all seems logical. There is a concept called Design for Manufacturability that they apparently didn't follow very well in the design of the X. What is unknown is what impact the reduction in accuracy will have on the user. It could mean that it's less secure but in turn has fewer false fails. It could mean it's harder to pass and has more false fails. It could have no impact at all. I will say that no one outside Apple's engineering department will ever know the answer to that question, since no one will have access to an X made to the original spec, or if they do they won't know it. The bottom line is going to be reviews from people who have the final product.

I still stand by my position that removing TouchID for FaceID was a dumb design move. Now that I've had the Note 8 for a while and it has multi-factor unlock, I like how quickly I can unlock thanks to the race that occurs... first one wins.
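For what it's worth, that "first one wins" behaviour is just a race between independent checks. Here's a minimal sketch of the pattern in Swift concurrency, with faceMatch() and fingerprintMatch() as invented stand-ins rather than anything Samsung or Apple actually ships:

```swift
// Invented placeholder checks; each returns true on a successful match.
func faceMatch() async -> Bool {
    // ...run the face recognition pipeline...
    return false
}

func fingerprintMatch() async -> Bool {
    // ...read and match the fingerprint...
    return true
}

// Race the factors; the first one to report success unlocks the device.
func unlockWithFirstSuccessfulFactor() async -> Bool {
    await withTaskGroup(of: Bool.self) { group in
        group.addTask { await faceMatch() }
        group.addTask { await fingerprintMatch() }
        for await didMatch in group {
            if didMatch {
                group.cancelAll()   // stop the slower factor once one has won
                return true
            }
        }
        return false                // every factor finished without a match
    }
}
```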
 
From the article:

"The dot projector is at the heart of Apple’s production problems. In September, the Wall Street Journal reported that Apple was having trouble producing the modules that combine to make the dot projector, causing shortages. The dot projector uses something called a vertical cavity surface-emitting laser, or VCSEL. The laser beams light through a lens known as a wafer-level optic, which focuses it into the 30,000 points of infra-red light projected onto the user’s face. The laser is made of gallium arsenide, a semiconductor material, and the lens is constructed of glass; both are fragile and easily broken. Precision is key. If the microscopic components are off by even several microns, a fraction of a hair’s breadth, the technology might not work properly, according to people with knowledge of the situation."

...

"The fragility of the components created problems for LG Innotek Co. and Sharp Corp., both of which struggled to combine the laser and lens to make dot projectors. At one point only about 20 percent of the dot projectors the two companies produced were usable, according to a person familiar with the manufacturing process. LG Innotek and Sharp slowed the production process down in an effort to prevent breakages and ensure the components were assembled with the required level of precision. "

...

"To boost the number of usable dot projectors and accelerate production, Apple relaxed some of the specifications for Face ID, according to a different person with knowledge of the process. As a result, it took less time to test completed modules, one of the major sticking points, the person said."
Yes, I read both entire articles too. The first two paragraphs you quoted say nothing about relaxed accuracy specifications. The first discusses what a precision piece of equipment the laser is. The second discusses that it’s fragile, causing low yields. Nothing anywhere about compromising accuracy.

The third paragraph does use the phrase "relaxed some specifications," but only says that this resulted in less testing time. The article says nothing about how this led to a sacrifice in FaceID accuracy. You're making a huge assumption, but I forgive you since you didn't write these articles.
 
What group has the ability to prove the accuracy isn’t 1,000,000:1? How do you even quantify that?

From what I've read, there are very large computer generated face sets, with predefined similarity values, that can act as a learning or testing source.
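Roughly, that's how you'd quantify it: collect or generate a very large number of impostor pairs (different faces), score each pair, and count how many clear the acceptance threshold. A toy sketch under those assumptions, with made-up scores and a made-up threshold standing in for real data:

```swift
import Foundation

// Given similarity scores for pairs of *different* faces (impostor pairs),
// estimate the false acceptance rate at a given threshold.
func estimatedFalseAcceptanceRate(impostorScores: [Double], threshold: Double) -> Double {
    let falseAccepts = impostorScores.filter { $0 >= threshold }.count
    return Double(falseAccepts) / Double(impostorScores.count)
}

// To say anything meaningful about a claimed 1-in-1,000,000 rate you need well over a
// million impostor comparisons, which is where huge computer-generated face sets help.
// Uniform random scores stand in for real pair scores here.
let scores = (0..<5_000_000).map { _ in Double.random(in: 0..<1) }
let far = estimatedFalseAcceptanceRate(impostorScores: scores, threshold: 0.999999)
print("Estimated false acceptance rate: \(far)")
```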

Anything Apple, and especially iPhone, is always good clickbait. But I don't perceive Bloomberg being nefarious here. They are just doing what news orgs do... gather information from sources and then present it in a report.

Yes, but others are quickly piling on clickbait headlines, and Bloomberg jumped to some conclusions as well.

Here's the source "accuracy" evidence from their article:

To boost the number of usable dot projectors and accelerate production, Apple relaxed some of the specifications for Face ID, according to a different person with knowledge of the process. As a result, it took less time to test completed modules, one of the major sticking points, the person said.

So it depends on exactly what specifications were relaxed in order to speed up the tests. Do they actually reduce accuracy, or just relax some overly tight manufacturing specs?
 
"Face ID is designed to work with hats, scarves, glasses, contact lenses, and many sunglasses. Furthermore, it's designed to work indoors, outdoors, and even in total darkness."
Total darkness? You mean it unlocks before the screen is turned on and lights up your face?
No.

It does not use visible light, it uses infrared light. When you raise the phone or touch to wake, it beams infrared light to your face to verify your identity, then unlocks the phone upon verification. It won’t light up your face.
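For apps, all of that infrared capture and matching is handled by the system; a developer just asks LocalAuthentication for a biometric check and gets back success or failure. A minimal sketch (the reason string and print statements are placeholders):

```swift
import LocalAuthentication

func unlockSensitiveContent() {
    let context = LAContext()
    var error: NSError?

    // Ask whether biometric authentication is available at all on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown error")")
        return
    }

    // On iPhone X this prompt is Face ID; on Touch ID devices the same call uses the fingerprint sensor.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, evaluationError in
        if success {
            let method = context.biometryType == .faceID ? "Face ID" : "Touch ID"
            print("Authenticated with \(method)")
        } else {
            print("Authentication failed: \(evaluationError?.localizedDescription ?? "unknown error")")
        }
    }
}
```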
 