"Most reports from iPhone X users say that Face ID disappears as soon as you set it up, unlocking things when you want them, and locking them when you don’t. Probably the best reason for disabling attention awareness is the sunglasses scenario mentioned above, or if you’re a blank-eyed zombie. For everything else, there’s probably not much point switching away from the default."

From the same article you linked. The speed of Face ID has never been an issue except when wearing sunglasses. You still haven't made a point that you can actually back up regarding Face ID.
"which speeds up the Face ID process and unlocks your iPhone X faster". Do you really think that this is not about the speed? Just think about it. Even if the sensor and the software needed for attention detection is as fast as the 3D scan, for the sensor to see your eyes state one needs to position the phone in front of the eyes. And this takes time. 3D detection does not need it. That would explain the reports that Google's version of face id is faster and works at wider angles.
 
Sorry but I couldn't find any information to support your claim.

The M1 (not the U1 you mentioned earlier) is the always-on motion co-processor. It is used to detect even small motions of the phone, allowing the LED dot array to be turned on before the screen is touched.

To me it seems that you just don't understand how the Soli radar works.
Maybe this will help you.

While there are many uses for this system, in this discussion, it is being used to activate the illuminators before the screen is touched.
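To make the "small motions of the phone" idea concrete, here is a minimal Swift sketch using CoreMotion's public device-motion API. The 50 Hz interval, the 0.05 g threshold, and the printed "wake" action are illustrative assumptions for the example, not Apple's actual always-on wake logic.

```swift
import Foundation
import CoreMotion

// Illustrative only: watch for small motions of the phone itself, the kind
// of event an always-on motion co-processor can feed into wake heuristics.
let motionManager = CMMotionManager()

func startSmallMotionDetection() {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 50.0  // 50 Hz, an assumed rate

    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let a = motion?.userAcceleration else { return }
        // userAcceleration excludes gravity, so even a slight pick-up or
        // nudge of the device produces a non-zero vector.
        let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
        if magnitude > 0.05 {  // threshold chosen purely for the example
            print("phone itself moved - the kind of trigger that could wake the illuminators")
        }
    }
}
```

Note that everything here only fires when the device itself moves; it says nothing about a hand approaching a stationary phone, which is the distinction raised further down the thread.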
 
"which speeds up the Face ID process and unlocks your iPhone X faster". Do you really think that this is not about the speed? Just think about it. Even if the sensor and the software needed for attention detection is as fast as the 3D scan, for the sensor to see your eyes state one needs to position the phone in front of the eyes. And this takes time. 3D detection does not need it. That would explain the reports that Google's version of face id is faster and works at wider angles.

Of course it is faster, but no one will notice because it was never slow... unless you are wearing sunglasses. Both articles you referenced stated that explicitly. You are talking about a difference of nanoseconds or milliseconds in normal use. I've tested toggling attention required on and off 5 times today. There is no noticeable difference. Period.
 
You either totally don't get it or are intentionally lying. It is because Samsung's sensor reads a 3D picture that it doesn't work with some screen protectors. Not flat screen protectors, as you claim, but tempered glass screen protectors. Tempered glass prevents capacitive sensing. Because of that, tempered glass protectors have holes, which are obviously closer to the sensor than the finger and thus are sensed by the sensor instead of the fingerprint.

You’re the one who doesn’t get it. If it reads your fingerprint in 3D then no screen protector should ever fool it. It should simply fail to unlock because it didn’t get an accurate reading of your print to match with the version stored on the phone. The fact it unlocks shows there’s a serious flaw in how it reads your prints.

The default behavior of any biometric that doesn’t obtain a highly accurate version of your finger (or face) should be to deny access, not to say “close enough, it’s probably the owner” and unlock anyway.
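For what it's worth, that deny-by-default policy can be expressed in a few lines. The type names, the optional match score, and the 0.95 threshold below are invented for illustration; this is not any vendor's actual API.

```swift
// Hypothetical sketch of a deny-by-default biometric decision.
enum BiometricDecision { case unlock, deny }

func decide(matchScore: Double?, strictThreshold: Double = 0.95) -> BiometricDecision {
    // No reading, a partial reading, or anything under the strict threshold
    // all fall through to deny; unlocking is the exception, never the default.
    guard let score = matchScore, score >= strictThreshold else { return .deny }
    return .unlock
}
```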
 
Of course it is faster, but no one will notice because it was never slow... unless you are wearing sunglasses. Both articles you referenced stated that explicitly. You are talking about a difference of nanoseconds or milliseconds in normal use. I've tested toggling attention required on and off 5 times today. There is no noticeable difference. Period.
I wear regular glasses and also have prescription sunglasses. I’ve never noticed any difference in how my iPhone works with either pair of glasses (or no glasses).
 
You’re the one who doesn’t get it. If it reads your fingerprint in 3D then no screen protector should ever fool it. It should simply fail to unlock because it didn’t get an accurate reading of your print to match with the version stored on the phone. The fact it unlocks shows there’s a serious flaw in how it reads your prints.

The default behavior of any biometric that doesn’t obtain a highly accurate version of your finger (or face) should be to deny access, not to say “close enough, it’s probably the owner” and unlock anyway.
With a screen protector (with holes) the sensor does not read or register the fingerprint; it can only see the holes (in 3D), and the hole pattern does not change from finger to finger. It is a secure device when used properly, but of course Samsung has to update their software to reject fingerprint registration when the obstacle is present.
 
So I’m curious. What do you tell them?

I would have thought that by now the Android OEMs would have figured out both Face ID and under-screen Touch ID, but that isn't the case.

In the case of under-screen fingerprint scanners, this is probably why we haven't seen it from Apple. The technology for this is just a lot harder than the current state of the art.

I do not argue; I offer them Premium Technical Support for $19.99/mo and wish them well. My job is Internet Technical Support, not Apple Evangelist.
Do you have any Google apps (Gmail, Maps, etc.) on your iPhone? Then they get you too. Same with apps from the other companies you have on your phone. It’s inescapable to be tracked!
I use a VPN, and do not use any Google apps other than Waze, which I only share location with while the app is running.
 
With a screen protector (with holes) the sensor does not read or register the fingerprint; it can only see the holes (in 3D), and the hole pattern does not change from finger to finger. It is a secure device when used properly, but of course Samsung has to update their software to reject fingerprint registration when the obstacle is present.
Wow, you’re so far off base on this it’s not even funny.

Have you even seen the demo videos of this? This is how one user did it:

  • User registers their thumb print without a screen protector.
  • They unlock with the thumb to show it works.
  • They try a different finger and it won’t unlock.
  • They place a plastic case on the screen and, after only two tries, the other finger unlocks the phone.
You think this is normal? They didn’t register their fingerprint after installing a screen protector - they used a screen protector to allow an untrained finger to work.
 
You’re the one who doesn’t get it. If it reads your fingerprint in 3D then no screen protector should ever fool it. It should simply fail to unlock because it didn’t get an accurate reading of your print to match with the version stored on the phone. The fact it unlocks shows there’s a serious flaw in how it reads your prints.

The default behavior of any biometric that doesn’t obtain a highly accurate version of your finger (or face) should be to deny access, not to say “close enough, it’s probably the owner” and unlock anyway.

I suspect that what is happening is that the finger was registered with the screen protector on? If so, then the hashed feature vector probably ends up being something very sparse - imagine "0000000000000010000".

Then whenever any finger touches it, from anyone, it creates a new hashed feature vector, say "0000000001000000000", and the matcher says "yeah, close enough."

Because the software is too dumb to reject the registered finger's feature vector as being unrealistic/too simple.

If they are truly registering WITHOUT the protector, then I have no idea how one would write software that ******.
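A toy version of that "close enough" failure, using the two 19-character bit strings above. The Hamming-style similarity measure, the 0.8 acceptance threshold, and the minimum-feature check are all assumptions for illustration; this is not Samsung's real matcher.

```swift
// Hypothetical sketch: a near-empty template plus a naive similarity check.

/// Fraction of positions at which two equal-length bit-vectors agree.
func similarity(_ a: [Int], _ b: [Int]) -> Double {
    let agreements = zip(a, b).filter { $0.0 == $0.1 }.count
    return Double(agreements) / Double(a.count)
}

// Template enrolled through an obstruction: almost no features captured.
let enrolled       = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0]
// A completely different finger also yields a near-empty vector.
let strangersProbe = [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0]

let score = similarity(enrolled, strangersProbe)   // 17/19 ≈ 0.89
let accepted = score > 0.8                         // true: the stranger unlocks

// The obvious fix is what the post above suggests: refuse to enrol a
// template this sparse in the first place.
let featureCount = enrolled.filter { $0 == 1 }.count
let enrolmentOK = featureCount >= 5                // false: registration rejected
```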
 
I do not argue; I offer them Premium Technical Support for $19.99/mo and wish them well. My job is Internet Technical Support, not Apple Evangelist.

I use a VPN, and do not use any Google apps other than Waze, which I only share location with while the app is running.
Question for you. You use a VPN, use no Google apps and, I assume (please excuse me if I'm wrong), no social media apps. Do you feel that you are a typical iPhone user?
 
They didn’t just admit it unlocks with the eyes closed; they featured it in ads, I think at the launch. They reckon it is faster that way. True, lowering security standards is convenient and makes for faster operation. Microsoft made that mistake once.
 
The iPhone's attention requirement is active by default, and by disabling it you yourself accept the risk. You can leave your phone entirely unprotected if you want... again, you accept the risk. The "critical problem" is, as you have already said, the lack of attention protection entirely. That is not the "end of it" at all; that is the beginning. Google fixing the problem is the end of it.

And you are using the word "everybody" extremely loosely and without any backing as to a consensus of advice.
It’s funny, as I don’t know how anybody can defend this as being anything but a security nightmare.

It seems no different to Samsung’s own facial recognition.
 
Wow, Apple Pay doesn't bother to check your ID at the actual transaction time and only needs your phone to be unlocked?

What a complete garbage feature; can't believe people claim Apple cares about security.

Google Pay has always needed my fingerprint at the actual transaction time.

Wow, your ignorance is astounding. I should've known I was reading garbage when I saw this post was liked by macfacts, another conveyor of falsehoods.
 
Given that, due to press coverage, Google is now going to change this to either require eyes-open "attention" or give you the option to have it turned on or off, this whole story is now a non-issue.
Personally I hope they put in a toggle switch so that, unlike Apple, Pixel owners will have the choice.

Problem solved, nothing to see here, move along people :)
 
Given that, due to press coverage, Google is now going to change this to either require eyes-open "attention" or give you the option to have it turned on or off, this whole story is now a non-issue.
Personally I hope they put in a toggle switch so that, unlike Apple, Pixel owners will have the choice.

Problem solved, nothing to see here, move along people :)
Apple does give you the option to have required attention on or off, and no, it’s not "nothing to see here" until they actually fix the issue.
 
Question for you. You use a VPN, use no Google apps and, I assume (please excuse me if I'm wrong), no social media apps. Do you feel that you are a typical iPhone user?
I do Facebook from a computer only, as I run marketing campaigns for businesses. I am most definitely not a typical iPhone user; I’ve managed security for multiple publishers over the years, so I have always tried to protect my own privacy.
 
The M1 (not the U1 you mentioned earlier) is the always-on motion co-processor. It is used to detect even small motions of the phone, allowing the LED dot array to be turned on before the screen is touched.

While there are many uses for this system, in this discussion, it is being used to activate the illuminators before the screen is touched.
Motion co-processors on iPhones incorporate sensors like the gyroscope, barometer, accelerometer, compass, etc.
They can detect all kinds of motion, but only in the context of physical touch or manipulation of the phone.
You said it yourself: "detect even small motions of the phone, allowing the LED dot array to be turned on".
So I still don't understand the claim that it can replicate the principle of how the Soli chip works on the Pixel 4.
I honestly can't replicate it: I move my hand close to the iPhone X and nothing happens. I have to tap the screen or pick up the phone to have it turn on.
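To spell out the distinction being argued here, the two kinds of "wake" signal can be sketched as follows. Both types, both fields, and both thresholds are invented for the example; neither corresponds to a real Apple or Google API.

```swift
// Hypothetical sketch of the two different kinds of wake signal.

/// What an inertial co-processor (M-series) can report: the phone itself moved.
struct DeviceMotionEvent { let accelerationMagnitude: Double }

/// What a radar like Soli can report: something moved near a stationary phone.
struct NearbyMotionEvent { let approximateDistanceMeters: Double }

func shouldWake(inertial: DeviceMotionEvent?, radar: NearbyMotionEvent?) -> Bool {
    // An iPhone only ever produces the first kind of event, so a hand
    // approaching a phone lying still on a desk generates nothing to act on.
    if let e = inertial, e.accelerationMagnitude > 0.05 { return true }
    // A Pixel 4 can additionally act on this branch; an iPhone cannot.
    if let e = radar, e.approximateDistanceMeters < 0.3 { return true }
    return false
}
```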
 
Not just you... tech hasn't been very exciting in a while, iPhones included. We are getting incremental upgrades but nothing revolutionary. Companies are running out of ideas, I think.
Well, I think we have reached the limitations of affordable, broadly available tech in smart devices. In 2007, when the iPhone launched, it had so many things never before seen in a ‘phone’, and so many other things not yet seen then have subsequently been added by either Apple or Android hardware manufacturers.

If I’m honest, every major Apple product that launches is pretty exciting. I’m already excited for the next iPad Pro. Some people probably think that there’s no reason I should be excited for Apple product launches but it comes down to one thing for me: Every Apple product I use works well and improves productivity for me in some way. I genuinely enjoy using my iPhone, my Apple Watch, my iPad Pro, my MacBook Pro, my AppleTV and my AirPods. So when a new version of any of those comes out, I’m excited.
 
Motion co-processors on iPhones incorporate sensors like the gyroscope, barometer, accelerometer, compass, etc.
They can detect all kinds of motion, but only in the context of physical touch or manipulation of the phone.
You said it yourself: "detect even small motions of the phone, allowing the LED dot array to be turned on".
So I still don't understand the claim that it can replicate the principle of how the Soli chip works on the Pixel 4.
I honestly can't replicate it: I move my hand close to the iPhone X and nothing happens. I have to tap the screen or pick up the phone to have it turn on.
Yes, Raise to Wake will wake the phone and authenticate immediately. I turned it on to test, but normally it’s off.
 