Whether Apple should open up hardware-level access to third-party developers is a valid question of design philosophy and security, so I can't blame Apple right away.
This API doesn't actually allow the third-party developer to talk directly to the hardware and program it to do arbitrary things; the OS is still the only thing allowed to do that.
Rather, the API takes frames of H.264-compressed video as function arguments and delivers frames of raw, uncompressed video back through a callback.
The application software doesn't need to know anything about the unique low-level characteristics of the underlying video cards, and it doesn't "talk to" the video cards directly.
This is a good start.
Apple has a few "next steps" they should take:
1) Expand the API to work transparently with a wider variety of video cards. If they do this, Adobe Flash automatically gains access to the new cards without Adobe having to modify its software at all.
2) Follow Microsoft's lead with the DXVA framework: expand this API to work with any hardware-accelerated codec installed in the video subsystem, and fall back automatically to the OS's built-in software codecs when no hardware path exists for a particular codec.
Basically, they could approach both of these objectives in either of two ways:
1) They could expand this API.
2) They could fold this API into QTKit as a lower-level entry and exit point to that system, offering all the benefits of QTKit's codec flexibility without the lock-in of using QuickTime's container all the way from file I/O through to on-screen delivery.