Yeah, but you know, that's where all the marketing speak starts. There is a distinction between "HDR" and an "HDR standard". I haven't looked at the exact math, but if I understand it correctly, what the HDR standards do is basically map colors to brightness, hence also the requirement for 10 bits per channel in the data specification (it's a different question whether the actual panel needs to support 10-bit color, no idea what the standards say there). I suppose they must also be losing some color information, since if you want to encode both color and brightness, you have to make some sacrifices (but maybe they do it in a way where some colours just can't be brighter than others, no idea). Anyway, by using such a color/brightness encoding, one can describe an image where different parts have a vastly higher brightness contrast than with "normal" images. But that's just a way of encoding stuff. If you want, an HDR standard is basically a "compression specification" for color and brightness.
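To make that "map brightness into the encoding" idea a bit more concrete, here's a rough Python sketch of the PQ transfer function (SMPTE ST 2084, the curve used by HDR10). It takes absolute luminance in nits and turns it into a 10-bit code value; I'm using full-range quantization for simplicity, so treat it as an illustration, not a spec-accurate implementation:

```python
# Rough sketch of the SMPTE ST 2084 "PQ" inverse EOTF, the transfer
# function used by HDR10: it maps absolute luminance (in nits) to a
# nonlinear 0..1 signal that is then quantized to 10-bit code values.
# Full-range quantization shown for simplicity; real video typically
# uses limited range.

M1 = 2610 / 16384          # ST 2084 constants
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> int:
    """Luminance in cd/m^2 (0..10000) -> 10-bit code value (0..1023)."""
    l = max(0.0, min(nits / 10000.0, 1.0))   # normalize to 0..1
    lp = l ** M1
    n = ((C1 + C2 * lp) / (1 + C3 * lp)) ** M2
    return round(n * 1023)

# The curve spends most of its code values on the darker end, where
# human vision is most sensitive -- that's the "compression" part.
for nits in (0.1, 1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> code {pq_encode(nits)}")
```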
What I am more concerned about is how these brightness contrasts are actually displayed. What you'd really want here, as mentioned before, is individual per-pixel brightness control. It seems that high-end TVs achieve this either by using OLED (where brightness can be controlled per pixel, even though the range is usually not great) or by using multiple zonal LED backlights instead of just one uniform one. And if you don't have the physical capability to control brightness, you can resort to an array of tricks such as adjusting overall brightness per frame (so that brighter scenes appear much brighter) and/or changing colors (tone mapping) to exploit particularities of human color and brightness perception.
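For illustration, here's roughly what one of those tricks looks like, sketched in Python rather than actual shader code: the classic Reinhard tone-mapping operator plus a per-frame "exposure" knob. The names and the exact operator are just my pick, real games use all sorts of variants:

```python
# Minimal sketch of Reinhard tone mapping, the kind of trick games use
# on ordinary SDR displays: HDR scene luminance (which can be >> 1) is
# squashed into the 0..1 range the panel can actually show. The
# per-frame "exposure" knob is what makes bright scenes feel brighter.

def reinhard(hdr_value: float, exposure: float = 1.0) -> float:
    """Map a linear HDR value (0..inf) to a displayable 0..1 value."""
    v = hdr_value * exposure       # per-frame brightness adjustment
    return v / (1.0 + v)           # Reinhard operator: compresses highlights

# A value of 8.0 (8x brighter than "white") still fits on screen,
# it just gets squeezed toward 1.0 instead of clipping.
for v in (0.1, 1.0, 8.0, 100.0):
    print(f"linear {v:>6} -> display {reinhard(v):.3f}")
```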
The latter tricks have been employed by games for quite some time now, since all you need is an appropriate shader. You don't actually need to support any of these HDR standards or have any special hardware. Still, I don't see how all this is more than a clever hack, similar to how we used to employ palettes to pretend we had a lot of colors (while in fact we had only 256 or fewer). "Real HDR" means floating-point color channels with a proper dynamic range and hardware that can actually display the full range of brightness in each pixel individually. Once we have tech that allows every single pixel to go between 0 and 5000 nits (or more), we'd really have real HDR. And only little need for HDR standards.
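Just to spell out the floating-point point, a toy comparison (made-up values, only meant to show clipping versus headroom):

```python
# Tiny illustration of 8-bit vs. floating-point color: an 8-bit channel
# tops out at "white" (255), while a float channel can carry values far
# beyond 1.0 that a hypothetical per-pixel-5000-nit panel could
# actually reproduce.

def to_8bit(linear: float) -> int:
    """Quantize a linear 0..1 value to 8 bits; anything brighter clips."""
    return min(255, round(max(0.0, linear) * 255))

sun_glint = 20.0              # 20x brighter than reference white in the scene
print(to_8bit(sun_glint))     # 255 -- indistinguishable from plain white
print(float(sun_glint))       # a float buffer simply keeps the value
```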
P.S. It is very much possible that I completely misunderstood what HDR standards do, so feel more than free to correct me if what I wrote is total nonsense (it often is).
P.P.S. And of course Apple already offers APIs to work with HDR (which they more humbly and adequately call EDR):
https://developer.apple.com/library...OSX/WhatsNewInOSX/Articles/MacOSX10_11_2.html (last item)