In the images of the MacBook Touch posted above, the virtual keyboard on the second screen looks identical to the physical keyboard, complete with a touchpad. I can't see why something like that wouldn't work for all current OS X applications.
An on-screen keyboard would obscure part of the screen. Perhaps the part of the screen you want to type into.
And for all Mac apps it would have to be on-screen 24/7, because all Mac apps accept keyboard input all the time. The let's-remap-tablet-inputs-to-desktop-apps approach has been tried before. It sucks.
Think of the natural gesture for scrolling a window. Now think of the natural gesture for moving a window. Now think of the natural gesture for selecting a block of text. In all three cases it's the same: place finger on screen and slide. Untangling and resolving those ambiguous gestures is not something the Mac OS can do.
For a touch interface to work (and not suck) it needs applications built from the ground up with touch in mind.
C.