OK, well, I've never said this before, but this is something Steve Jobs would never have allowed. I guess we'll see what kind of effect this has.
I think you're onto something, in that Apple may not be getting the dot projector quality they want. But they designed the system to use a different pattern of dots each time a face is scanned, so maybe they just downscale from 30,000 possible dots to 20,000 and adjust the patterns. The patterns and scans are unique to your iPhone's combination of sensors and security key, so the system is still well protected even if they only end up using two-thirds of one component's capability; a hacker can't exploit that hardware because it's all in sync and doesn't work at all if anything is changed.

As I said in another thread, it sounds strange that Apple's order was to "reduce FaceID accuracy", since, production-wise, there is no such thing as "FaceID"; there are many components that contribute to it.
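Just to illustrate the idea (this is purely a sketch of the speculation above, not anything Apple has documented; the dot counts, the device key, the per-scan nonce, and the selection scheme are all assumptions):

```python
import hashlib
import random

TOTAL_DOTS = 30_000   # full dot-projector capability, per the speculation above
USED_DOTS = 20_000    # hypothetical reduced dot count

def dot_pattern(device_key: bytes, scan_nonce: bytes) -> list[int]:
    """Pick a pseudo-random subset of dot indices tied to this device's key
    and to the individual scan, so every scan uses a different pattern."""
    seed = hashlib.sha256(device_key + scan_nonce).digest()
    rng = random.Random(seed)
    return rng.sample(range(TOTAL_DOTS), USED_DOTS)

# Two scans on the same device yield different 20,000-dot patterns,
# and another device (different key) can't reproduce either of them.
pattern_a = dot_pattern(b"device-secret", b"scan-001")
pattern_b = dot_pattern(b"device-secret", b"scan-002")
print(len(pattern_a), pattern_a[:5])
print(pattern_a[:5] != pattern_b[:5])
```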
What they probably did was change the accuracy requirements for one or more of the sensors involved in the technology. We don't know how this will affect the final system. For all we know, an improvement in the machine learning software could compensate for the less precise sensors...
That's less than a 3% difference, which is in the noise. Humans trying to detect a 3% difference is a futile attempt to notice the benefit of a slightly larger screen.

Nope, I make speakers, not phones.

But only the Pixel 2 has a lower screen-to-body ratio than the iPhone X among the current crop of "bezel-less" flagships:
Apple iPhone X 81.49%
LG V30 83.24%
Samsung S8 83.6%
Xiaomi Mi Mix 2 82.63%
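For what it's worth, the gap being called "in the noise" is easy to check with a quick calculation using just the ratios quoted above (the Pixel 2 figure isn't listed here, so it's left out):

```python
# Screen-to-body ratios quoted above (percent)
ratios = {
    "Apple iPhone X": 81.49,
    "LG V30": 83.24,
    "Samsung S8": 83.6,
    "Xiaomi Mi Mix 2": 82.63,
}

iphone_x = ratios["Apple iPhone X"]
for name, ratio in ratios.items():
    if name == "Apple iPhone X":
        continue
    diff = ratio - iphone_x          # absolute gap in percentage points
    rel = diff / iphone_x * 100      # relative difference in percent
    print(f"{name}: +{diff:.2f} points ({rel:.1f}% relative)")

# Largest gap is the S8 at about 2.1 points (~2.6% relative), i.e. under 3%.
```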