This article from iMore gives a good rundown on Dolby Vision on the new iPhone:
Why Apple is putting Dolby Vision cameras on the iPhone 12
"So, if Dolby Vision is so great, why doesn't everything use it?
Well, first, you have to license it from Dolby, which costs money. That's why some companies still use HDR10, or the newer, better, freely licensable HDR10+.
Second, you have to compute it, which used to require special cameras, often dual exposures, and a beefy editing rig to put it all together and output it.
Now, as of this week, Apple is doing it on a phone. On. A. Phone.
And in 10-bit. For context, most cameras record in 8-bit. I record my videos in 10-bit. RAW is typically 12-bit or more. More bits mean more data in the video, so if you need to fix white balance, exposure, or saturation in post, you have much more to work with.
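To make the bit-depth point concrete, here's a minimal sketch of the arithmetic: each extra bit doubles the number of tonal levels each color channel can represent, which is where the extra grading latitude comes from. (Illustrative only; not tied to any specific camera or codec.)

```python
# Tonal levels per color channel at common recording bit depths.
# Each additional bit doubles the number of distinguishable levels.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} levels per channel")

# 8-bit:  256 levels per channel
# 10-bit: 1024 levels per channel (4x the levels of 8-bit)
# 12-bit: 4096 levels per channel
```

So a 10-bit pipeline has four times the tonal resolution of an 8-bit one per channel, which is why banding is less likely to appear when you push exposure or white balance in post.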
That's obviously a ton more data to process, but the iPhone 12 can handle it, thanks to the A14 Bionic system-on-a-chip, or SoC, which pulls all that data off the camera sensor, crunches it, adds the Dolby Vision metadata, and saves it all in real time. In. Real. Time.
It's the first time Apple has had compute engines capable of doing it. And it's legit mind-blowing."