
jjjjjooooo

Original poster
Sep 15, 2017
For those who may be unaware, there is a new HDMI 2.1 standard (http://www.hdmi.org/manufacturer/hdmi_2_1/) with some good features that should start shipping on 2018 TVs.

However, the new 4K Apple TV is limited to 2.0. That may not be a big deal, since most of the new features are forward-looking, but it may be worth keeping in mind for those considering buying a new TV or an Apple TV. (The Xbox One X, releasing November 7th, is expected to be the first device with 2.1, so it's possible Apple could have included it or waited a couple of months.)

According to news reports (https://www.cnet.com/news/hdmi-2-1-what-you-need-to-know/), it's possible some features can be enabled on 2.0 hardware through a firmware update, although there is no guarantee.

Apple users may have to wait until the next Apple TV for this and an improved remote, presumably at least two years from now, unless Apple gets serious and releases that 60" television set next year.

One feature I'm curious about is VRR (variable refresh rate), which is used for gaming, but I'm wondering if it can also be used to solve the 24p problem, as opposed to relying on TV makers to reverse the telecine properly.
 
So there's a new technology coming out in the future that will eventually make existing technology obsolete??? This is a groundbreaking discovery!!!
 
One feature I'm curious about is VRR (variable refresh rate), which is used for gaming, but I'm wondering if it can also be used to solve the 24p problem, as opposed to relying on TV makers to reverse the telecine properly.

That feature is like FreeSync/G-Sync on PCs. It's primarily intended for gaming, to stop screen tearing etc.
 
If it won't be available on TVs until 2018, why would Apple have included it in this version of Apple TV?
I'm assuming Apple will sell this version for at least two years.

Why do you think Microsoft is including it?
That feature is like FreeSync/G-Sync on PCs. It's primarily intended for gaming, to stop screen tearing etc.
Right, but I'm hoping it can be used to solve the 24p problem the same way ProMotion on the iPad apparently does.

Otherwise Apple TV owners need TVs that can extract a 24p source from a 60p signal, which apparently the majority of 2017 TVs can't do (http://www.rtings.com/tv/tests/motion/24p).
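To make the 24p problem concrete, here's a rough back-of-the-envelope sketch in Python (purely illustrative, not anything a TV or the Apple TV actually runs) of why 24 fps film judders inside a fixed 60 Hz signal, and why a native 24 Hz or VRR output wouldn't:

```python
import math

def frame_hold_times_ms(content_fps, output_hz):
    """For each source frame, how long it stays on screen (ms) when the output
    refresh rate is fixed at output_hz. Purely illustrative arithmetic."""
    hold_ms = []
    for i in range(content_fps):
        # Number of whole output refreshes that elapse while this frame is current.
        refreshes = (math.floor((i + 1) * output_hz / content_fps)
                     - math.floor(i * output_hz / content_fps))
        hold_ms.append(round(refreshes * 1000 / output_hz, 1))
    return hold_ms

# 24 fps inside a fixed 60 Hz signal: frames alternate between 2 and 3 refreshes
# (3:2 pulldown), i.e. ~33 ms and ~50 ms on screen -> the uneven cadence seen as judder.
print(frame_hold_times_ms(24, 60)[:6])   # [33.3, 50.0, 33.3, 50.0, 33.3, 50.0]

# Native 24 Hz output (or a VRR display tracking the source): every frame ~41.7 ms.
print(frame_hold_times_ms(24, 24)[:6])   # [41.7, 41.7, 41.7, 41.7, 41.7, 41.7]
```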
 
If this was a $2000+ TV or receiver, I'd say people should consider more what the lack of HDMI 2.1 may mean for them in the future. We're talking about a cheap (relative to other A/V components) streaming box though, and it's such a new spec that practically no other A/V components have it yet either.
 
Right, but I'm hoping it can be used to solve the 24p problem the same way ProMotion on the iPad apparently does.
HDMI 2.1 is not required for 24p output. That has long been part of the existing HDMI standards. If Apple wanted to, they could easily implement this today.

In general, when devices claiming support for HDMI 2.1 start appearing next year, it does not mean that they will support every new 2.1 feature right away. HDMI 2.1 is mainly about futureproofing the standard, e.g. by introducing support for resolutions even higher than 4K. This is obviously not relevant for a device like the ATV. HDMI 2.1 features will be introduced selectively and piecemeal over the next few years.
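To put rough numbers on the futureproofing point (back-of-the-envelope only; the real spec accounts for exact blanking intervals, link encoding and DSC, and the 20% blanking allowance below is just an assumption):

```python
# Rough, illustrative bandwidth estimates for uncompressed RGB/4:4:4 video.

def approx_gbps(h, v, hz, bits_per_component, blanking_overhead=1.2):
    """Approximate data rate in Gbit/s; blanking_overhead is an assumed ~20%
    allowance for blanking intervals, not an exact spec value."""
    bits_per_pixel = 3 * bits_per_component          # R, G, B (or Y, Cb, Cr)
    return h * v * hz * bits_per_pixel * blanking_overhead / 1e9

HDMI_2_0_GBPS = 18   # max TMDS rate of HDMI 2.0
HDMI_2_1_GBPS = 48   # max FRL rate of HDMI 2.1

for label, (h, v, hz, bpc) in {
    "4K60, 8-bit":   (3840, 2160, 60, 8),
    "4K120, 10-bit": (3840, 2160, 120, 10),
    "8K60, 10-bit":  (7680, 4320, 60, 10),
}.items():
    print(f"{label}: ~{approx_gbps(h, v, hz, bpc):.0f} Gbit/s "
          f"(HDMI 2.0 limit {HDMI_2_0_GBPS}, HDMI 2.1 limit {HDMI_2_1_GBPS})")

# Approximate results:
#   4K60, 8-bit:   ~14 Gbit/s -> within HDMI 2.0, roughly today's ceiling
#   4K120, 10-bit: ~36 Gbit/s -> needs HDMI 2.1
#   8K60, 10-bit:  ~72 Gbit/s -> beyond even 48 Gbit/s at full 4:4:4;
#                                HDMI 2.1 uses chroma subsampling and/or DSC here
```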

There is one feature though that might be relevant: HDMI 2.1 introduces official support for the transmission of dynamic HDR metadata. This might be required for the new HDR10+ standard that Samsung is pushing together with some partners. But there are rumors that the required protocol changes can also be implemented retroactively on existing HDMI 2.0a devices via firmware updates. Anyway, so far there is no indication that Apple is planning to support HDR10+.

Another potentially interesting feature in the near term is eARC. It's an extended version of ARC that will finally allow the transmission of more advanced audio formats such as lossless codecs and Dolby Atmos on the return channel. So if you're in the market for a TV and plan to use ARC for a soundbar or AVR, it might be worth waiting for that.
 

Dolby Vision uses HDMI 2.0a for 4K video. I don't see why HDR10+ (basically extending HDR10 with dynamic, i.e. scene-by-scene, metadata, according to a SMPTE standard... the one Dolby sticks to as well, btw) would not work over HDMI 2.0a.

It's more a question of supporting it.

I think Netflix does both HDR10 and DV (Dolby Vision).
AFAIK Amazon Prime Video does the same.

There's talk that Amazon will support HDR10+ as well.

I'm not sure if HDR10+ will really take off. The issue is that DV costs royalties for device makers... I guess on the display (TV) side.
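For anyone wondering what "dynamic metadata" actually buys you, here is a loose sketch of the difference between HDR10's static metadata and the per-scene metadata of HDR10+ / Dolby Vision. The field names are simplified stand-ins, not the real SMPTE ST 2086 / ST 2094 structures:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StaticHDRMetadata:            # HDR10: one set of values for the whole title
    max_content_light_level: int    # MaxCLL, nits
    max_frame_avg_light_level: int  # MaxFALL, nits

@dataclass
class SceneMetadata:                # HDR10+ / DV: values per scene (or per frame)
    start_frame: int
    peak_brightness: int            # nits the tone-mapping curve should target
    avg_brightness: int

@dataclass
class DynamicHDRMetadata:
    scenes: List[SceneMetadata]

# With static metadata, a 500-nit TV tone-maps the whole movie against one
# worst-case peak (say 4000 nits), so dark scenes get squeezed unnecessarily.
static = StaticHDRMetadata(max_content_light_level=4000, max_frame_avg_light_level=400)

# With dynamic metadata, the TV can tone-map each scene against its own peak.
dynamic = DynamicHDRMetadata(scenes=[
    SceneMetadata(start_frame=0,    peak_brightness=120,  avg_brightness=40),   # dim interior
    SceneMetadata(start_frame=2400, peak_brightness=4000, avg_brightness=600),  # sunlit exterior
])
```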
 
The current HDR status after IFA and Apple's event, courtesy of Flatpanels.dk:
[image: hdr.jpg]
Seems like HDR10 and DV are the way to go in the future. And HLG if you like TV broadcasting. Why we are still getting new standards, such as Advanced HDR by Technicolor, is beyond me. Even HDR10+ appears to be a lost cause, although the combined forces of Samsung and Amazon may be sufficient. At least the latter have shown before that they are willing to go their own way, and they have sufficient market support (and increasing, through Amazon Prime video streaming!) that it could become a minor standard.

So there's a new technology coming out in the future that will eventually make existing technology obsolete??? This is a groundbreaking discovery!!!
What a negative post. Current technology will not be obsolete, and if you don't like technology development, there are plenty of Amish societies who would love to take you in.
 

Thx. That helps. I recently upgraded my Onkyo receiver to a TX-NR676, which is fully HDMI 2.0a on all inputs and can do both HDR10 and DV (bypass).

Now I just have to find a nicely priced 4K TV for Xmas doing DV and HDR10. So far DV is only slowly being introduced by TV makers. I don't believe in "firmware upgrade" promises by TV makers, because of their poor SW support; one year max is usually what you get. Also keep in mind that DV requires HW support.

For me HLG is just dead. Broadcasters here in Europe are painfully slow to adopt new standards. The transition to HD took ages. Most if not all HD broadcasts are still stereo. 4K and HDR might still take 5-10 years.

For HD the argument was often "production costs, and the SD cameras have not been written off yet". Especially the commercial stations are the last ones to make the jump to better video quality.
 
Dolby Vision uses HDMI 2.0a for 4K video. I don't see why HDR10+ (basically extending HDR10 with dynamic, i.e. scene-by-scene, metadata, according to a SMPTE standard... the one Dolby sticks to as well, btw) would not work over HDMI 2.0a.
Dolby Vision uses a trick to "hide" the metadata in the video stream to achieve compatibility with existing HDMI connections.
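The actual Dolby Vision carriage over HDMI is proprietary and considerably more involved, but the general idea of tunnelling metadata inside the pixel values themselves, so that a plain HDMI 2.0 link never has to know about it, can be sketched like this (toy example only, with made-up metadata):

```python
# Toy illustration of hiding metadata in pixel values; not the real DV format.

def embed_bits(pixels, metadata: bytes):
    """Hide metadata bits in the least significant bit of each pixel value."""
    bits = [(byte >> i) & 1 for byte in metadata for i in range(8)]
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # overwrite the LSB with a metadata bit
    return out

def extract_bits(pixels, n_bytes):
    """Recover n_bytes of metadata from the pixel LSBs."""
    data = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (pixels[b * 8 + i] & 1) << i
        data.append(byte)
    return bytes(data)

frame = [128] * 64        # pretend this is one line of pixel values
meta = b"\x0f\xa0"        # pretend per-frame tone-mapping metadata
assert extract_bits(embed_bits(frame, meta), 2) == meta
```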
 
However, the new 4K Apple TV is limited to 2.0.
Actually, it's 2.0a.

Dolby Vision uses HDMI 2.0a for 4K video. I don't see why HDR10+ (basically extending HDR10 with dynamic, i.e. scene-by-scene, metadata, according to a SMPTE standard... the one Dolby sticks to as well, btw) would not work over HDMI 2.0a.
Because HDR10 is 10-bit while Dolby Vision is 12-bit, which maxes out the bandwidth of the cable and uses a trick to get dynamic metadata to work, and HDR10+ dynamic metadata requires the extra bandwidth of HDMI 2.1.
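A quick sanity check of the bandwidth side of that claim, using the standard 4K60 timing (4400 x 2250 total pixels including blanking) and the 10/8 TMDS coding overhead; approximate figures only:

```python
# Approximate check of how much of HDMI 2.0's 18 Gbit/s 4K60 uses at each bit depth.

PIXEL_CLOCK_4K60 = 4400 * 2250 * 60      # CTA-861 timing incl. blanking: 594 MHz
HDMI_2_0_LIMIT_GBPS = 18                 # total TMDS bandwidth of HDMI 2.0

for bits_per_component in (8, 10, 12):
    raw = PIXEL_CLOCK_4K60 * 3 * bits_per_component           # RGB / 4:4:4
    tmds = raw * 10 / 8                                        # 8b/10b line coding
    fits = "fits" if tmds <= HDMI_2_0_LIMIT_GBPS * 1e9 else "does NOT fit"
    print(f"4K60 4:4:4 @ {bits_per_component}-bit: "
          f"~{tmds / 1e9:.1f} Gbit/s on the wire -> {fits}")

# ~17.8 Gbit/s at 8-bit (just fits), ~22.3 at 10-bit and ~26.7 at 12-bit (do not fit),
# which is why HDMI 2.0 devices fall back to 4:2:2 / 4:2:0 for 10/12-bit 4K60 HDR.
```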
 
It's now simply a race to get product out, which will decide this. If Dolby can get their studio partners to release enough reference-quality Dolby Vision material (and players!) over the next 6 months, then the enthusiast market may sway towards buying DV-enabled televisions, which may force Panasonic and Samsung into support or face losing high-end sales. If HDR10+-enabled equipment and content can make an impact (in terms of availability and quality), then DV will probably die off due to the licensing costs. HDR10+ basically needs to work over existing HDMI ports via firmware updates (that are actively pushed out) and needs to work as an extension of HDR10 on UHD Blu-ray discs so said discs are still within the UHD Blu-ray specification.

Who knows though, perhaps Dolby will offer a sweetener of reduced licensing costs if partners eschew HDR10+ support (after all, as HDR10+ is an open standard, why would LG, Sony etc. not support it?).
 