This isn't the same thing at all. TouchID is used for authentication only. FaceID is used for authentication as well, but as they showed at the keynote, the face tracking behind it is also exposed for in-app use beyond authentication: it drives Animoji, and Snapchat uses it for those masks. So apps can use the face tracking for purposes well beyond authentication.
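To be concrete about what "in-app use" means here, this is a minimal sketch of what a third-party app would do with ARKit's face-tracking API (ARFaceTrackingConfiguration), which is the pathway that exposes face data to apps, separate from FaceID authentication itself. The class name is just illustrative:

```swift
import UIKit
import ARKit

// Sketch: starting an ARKit face-tracking session, the API that hands face
// geometry/expressions to third-party apps (distinct from FaceID unlock).
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking only works on TrueDepth-camera devices (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // faceAnchor.geometry and faceAnchor.blendShapes describe the user's
            // face shape and expressions; mask/Animoji-style effects are driven
            // from exactly this data.
            _ = faceAnchor
        }
    }
}
```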
We also know from the keynote and the hands-on videos that the facial tracking can tell whether your eyes are open or closed, and whether you're looking at the phone or away from it. Combined with the in-app access to face tracking, that makes it entirely possible to build apps that require you to actively watch ads, autopausing and waiting for you to look back. With your TouchID example, even if an app could use TouchID that way (and I don't think it can, since it's authentication only), all it could guarantee is that you pressed your thumb to the sensor at the end of the ad. It can't guarantee that you actually watched the ad. With face tracking, they can actually require that you physically watch it. Huge, huge difference.
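As a rough illustration of how the "autopause" idea could work in principle: ARKit reports per-expression blend-shape coefficients, including eye blinks, and whether the face is currently tracked at all. The ad player, the monitor class, and the 0.5 blink threshold below are all hypothetical, not from any real ad SDK:

```swift
import ARKit
import AVFoundation

// Hypothetical sketch: pause an ad when no tracked, eyes-open face is visible.
// Assumes the app already runs a face-tracking ARSession (as in the sketch above).
class AdAttentionMonitor: NSObject, ARSessionDelegate {
    let player: AVPlayer

    init(player: AVPlayer) {
        self.player = player
        super.init()
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Blend-shape coefficients run from 0 to 1; high blink values mean the
        // eyes are (mostly) closed. 0.5 is an arbitrary illustrative threshold.
        let leftBlink  = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let rightBlink = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        let eyesClosed = leftBlink > 0.5 && rightBlink > 0.5

        if !face.isTracked || eyesClosed {
            player.pause()          // viewer looked away or closed their eyes
        } else if player.rate == 0 {
            player.play()           // resume once an eyes-open face is back
        }
    }
}
```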
Even if app developers don't use this tech to force you to watch ads, they can use it to track which ads you are watching and build up data on you. If I had to have the X, I'd be disabling FaceID for sure.
Face ID is not the same as using the front-facing camera. You can already use the camera to do the things they demoed on stage; Snapchat is the obvious example. I saw no indication that, beyond Face ID authentication, there is anything in the API that lets a developer respond to whether you're looking directly at the screen or not.
Besides, if an app maker decided to do what you're suggesting, wouldn't you just stop using the app?