Back when people thought Apple was doing an in-display fingerprint sensor, their patents suggested it might be image based as well.
An RF signal from the sensor ring passes through your fingertip and is measured by a tiny antenna array in the center part. This gives a 3D image of the fingerprint's ridges, valleys and pores.
Which is why a 3D copy works as well. It's also why a wet finger fails: the water provides a short-circuit signal path across the finger surface instead of forcing the signal to pass internally.
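For intuition, here's a toy model of why that matters (this is not Apple's actual signal processing; the coupling values are invented): ridges in contact pass the ring's signal to the sensing elements strongly, valleys weakly, and a water film bridges everything so the contrast the matcher needs disappears.

    import numpy as np

    # Toy model: received signal strength at each sensing element.
    # Ridges touching the sensor couple the drive signal strongly,
    # valleys (air gaps) couple it weakly. Values are illustrative only.
    RIDGE_COUPLING = 1.0
    VALLEY_COUPLING = 0.2

    def sensor_image(ridge_mask, wet=False):
        """ridge_mask: boolean array, True where a ridge contacts the sensor."""
        img = np.where(ridge_mask, RIDGE_COUPLING, VALLEY_COUPLING)
        if wet:
            # A conductive water film shorts the signal across the surface,
            # so every element sees roughly the same strong coupling and the
            # ridge/valley contrast needed for matching disappears.
            img = np.full_like(img, RIDGE_COUPLING)
        return img

    ridges = np.random.rand(8, 8) > 0.5      # fake fingerprint pattern
    dry = sensor_image(ridges)
    wet = sensor_image(ridges, wet=True)
    print("dry contrast:", dry.max() - dry.min())   # ~0.8 -> usable image
    print("wet contrast:", wet.max() - wet.min())   # 0.0 -> sensor fails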
Synaptics' device includes its own encryption and secure processor.
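As a rough sketch of what an on-sensor secure processor buys you (a generic match-in-sensor illustration, not Synaptics' actual protocol; all names here are invented): the enrolled template and the matching stay on the sensor, and the host only ever sees an authenticated yes/no.

    import hmac, hashlib, os

    class MatchInSensor:
        """Generic match-in-sensor sketch: the enrolled template never
        leaves the device, and match results are authenticated with a
        key shared only between sensor and host at pairing time."""
        def __init__(self, pairing_key):
            self._key = pairing_key          # provisioned once, kept on-sensor
            self._template = None            # enrolled print, stays on-sensor

        def enroll(self, fingerprint_features):
            self._template = fingerprint_features

        def authenticate(self, fingerprint_features, host_nonce):
            matched = fingerprint_features == self._template   # stand-in matcher
            msg = host_nonce + (b"\x01" if matched else b"\x00")
            tag = hmac.new(self._key, msg, hashlib.sha256).digest()
            return matched, tag              # host verifies tag, never sees template

    # Host side: send a fresh nonce, check the MAC on the yes/no answer.
    key = os.urandom(32)
    sensor = MatchInSensor(key)
    sensor.enroll(b"alice-print")
    nonce = os.urandom(16)
    ok, tag = sensor.authenticate(b"alice-print", nonce)
    expected = hmac.new(key, nonce + (b"\x01" if ok else b"\x00"), hashlib.sha256).digest()
    print(ok and hmac.compare_digest(tag, expected))   # True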
No, that was a fan myth.
Their security white papers talk generically about checking temperature, pulse, flexing, etc., but give no details about the actual methods used in this sensor, other than claiming an AI-based scheme that rejects 99% of spoof attempts.
This claim will be tested quickly, no doubt!
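For context on what a figure like that measures (a toy sketch, not their AI; the score distributions are made up): you pick a threshold on a liveness score and report the fraction of spoof presentations that fall below it, traded off against how many genuine fingers you wrongly reject.

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in liveness scores from some classifier: higher = more "live".
    live_scores  = rng.normal(0.8, 0.1, 10_000)   # genuine fingers
    spoof_scores = rng.normal(0.3, 0.15, 10_000)  # fake fingers / 3D copies

    threshold = 0.55
    spoof_reject_rate = np.mean(spoof_scores < threshold)   # the kind of number a "99%" claim reports
    false_reject_rate = np.mean(live_scores  < threshold)   # the cost you pay for it
    print(f"spoofs rejected: {spoof_reject_rate:.1%}")
    print(f"real fingers rejected: {false_reject_rate:.1%}")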
--
I was an early supporter of Face ID, and it's still a good method. However, its sensors take up frontal space.
So I can easily see under-screen print sensors being popular on phones where the front is used for a notchless display.