I love that he gets into such detail. That’s interesting about the 4ms improvement they got using something they apparently can’t expose as a public method without some risk.
Now I’m curious about “mid-frame event processing”, but googling it hasn’t immediately turned up anything.
As things are drawn to the screen, events are triggered at various points:
- Start of Frame Drawing
- Mid-Frame Drawing (half-draw done?)
- End of Frame Drawing
Apps can likely subscribe to these events to perform work before and after frames (screens) are updated.
It's likely that Apple has been able to subscribe to mid-frame events within its own apps, but isn't ready to make that available (safely) to third parties; a rough sketch of the public side of this is below.
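For context, here's a minimal Swift sketch of the hook that *is* public today: CADisplayLink, which fires once per display refresh, i.e. roughly the "Start of Frame Drawing" event above. The class name and callback are illustrative, not an Apple API, and note there is no public mid-frame callback, which is presumably the part Apple keeps for its own apps.

```swift
import UIKit

// Illustrative class name; not an Apple API.
final class FrameEventSubscriber: NSObject {
    private var displayLink: CADisplayLink?

    func start() {
        // CADisplayLink is the public per-frame hook, fired in step with the display refresh.
        let link = CADisplayLink(target: self, selector: #selector(onFrame(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func onFrame(_ link: CADisplayLink) {
        // timestamp: when the previous frame was shown
        // targetTimestamp: the deadline for the frame being prepared now
        let frameBudget = link.targetTimestamp - link.timestamp
        print("Have \(frameBudget * 1000) ms to gather input and draw this frame")
        // Latency-sensitive work (e.g. reading the latest Pencil samples) goes here.
        // There is no public mid-frame callback; that appears to be what Apple
        // reserves for its own apps.
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}
```

In practice, drawing apps combine a per-frame hook like this with the coalesced and predicted touches that UIKit delivers on UIEvent, which is how third parties currently claw back some of the latency without a mid-frame entry point.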
I am glad that Apple is going down this route. I recently had the chance to use a Pixelbook for an extended time, and the high pen latency in third-party apps was a distinct problem.
Yah, I think it's awesome that Apple competes against itself. They already had the industry-leading tablet+pen(cil) implementation, but needed to improve it further.