I'm still not impressed by any of these HDR standards. When I got my first TV that had it, I expected to be wowed, but I can't really tell much of a difference when it's on or off. 🤷‍♂️
 
  • Like
Reactions: Mike_Trivisonno
I'm still not impressed by any of these HDR standards. When I got my first TV that had it, I expected to be wowed, but I can't really tell much of a difference when it's on or off. 🤷‍♂️
HDR requires a very good TV with a very high luminance output to be able to reap its benefits. Just because it has an HDR sticker on it doesn't mean it's actually HDR capable.
 
I'm still not impressed by any of these HDR standards. When I got my first TV that had it, I expected to be wowed, but I can't really tell much of a difference when it's on or off. 🤷‍♂️
Sounds like you don't have an actual HDR TV, but rather a normal TV that's marketed as HDR by television manufacturers (which is all too common, unfortunately).
 
To be fair, Samsung is likely more visible because it supports HDR10+ exclusively on its TVs.

HDR10+ as a standard is bigger than Samsung, though.

Amazon, 20th Century Fox, Panasonic, and Samsung together updated the base universal HDR10, which Apple supports.

Apple just never implemented the update. :(

HDR10+ has relatively wide backing: Blackmagic, Arm Ltd., Onkyo, Broadcom, Google, Plex, Qualcomm, Technicolor, Warner Brothers, Mediatek, Unisoc, etc.

Samsung is more visible because Samsung created it after a spat with Dolby over a $3-per-TV license. It is NOT an update to HDR10, which is created and maintained by the CTA. Its name is like HD DVD: unrelated to what the name leads you to believe.

20th Century no longer uses it. And Dolby Vision also carries the base HDR10 data in its payload. Your comment reads like a copy/paste of a Wikipedia page that hasn't been updated since 2017. It's Samsung, Amazon, and Paramount+. Warner on movies. And Hulu when it carries some Warner movies. Panasonic, Hisense, and TCL have it, but they each also have DV.
 
  • Like
Reactions: SFjohn and nutmac
"HDR10+ support
The latest generation of high dynamic range technology is now supported in the Apple TV app."

It must have been a typo. Whoever wrote it meant to say "not" instead of "now," and Apple posted it. ;)
 
  • Haha
Reactions: enigmatut
Maybe Apple is using the engineering resources to get the library tab to finally work with large libraries.
 
It really sucks that they don't support some of these standards.
To be fair, I think HDR10+ is just silly.

HDR10+ was primarily funded and created by Samsung as a move against Dolby. HDR10 was created by the Consumer Technology Association (CTA), which was neither consulted nor involved in the creation of HDR10+.

HDR10+ more or less replicates Dolby Vision (just as Samsung's smartphones copied the iPhone), and contrary to common understanding, it requires manufacturers to pay an annual fee ($2,500 to $10,000). To be fair, that fee is dramatically cheaper than Dolby Vision's (about $2 per device).

Since HDR10+ is largely identical to Dolby Vision, and nearly every modern TV with HDR10+ (other than Samsung's) also has Dolby Vision, it is an essentially redundant and unnecessary standard.
 
  • Like
  • Love
Reactions: matrix07 and SFjohn
HDR requires a very good TV with a very high luminance output to be able to reap its benefits. Just because it has an HDR sticker on it doesn't mean it's actually HDR capable.
HDR is mainly about 3 things:
  • Saturated, vibrant color: Displays capable of a wide color gamut at 10-bit depth (although the spec allows for 12-bit). Quantum dot displays are particularly good at rendering vibrant color.
  • Deep blacks and contrast: Local dimming with many zones (e.g., mini LED or XDR), or ideally OLED or micro LED.
  • Bright highlights: Capable of very bright highlights, at least 600 nits, preferably higher than 1000 nits.
And to get great HDR performance, most displays need some calibration. Rtings is a great resource for getting more out of your display.
 
  • Like
Reactions: vinegarshots
Exactly. Having only about 32% of all TV market share (the largest share of the television pie), Samsung TVs should not be fed a royalty-free, open-source alternative to Dolby Vision. Stick with an option NOT supported on about a third of all televisions everywhere.

Sarcasm aside: we should not automatically hate this because Samsung is involved. Apple still uses things from Samsung and we seem to be OK with it when Apple chooses Samsung over other alternatives for parts & pieces. And this particular thing would make some key offerings from Apple work better with the highest market share brand of televisions out there.

Objectively, Dolby Vision is considered the superior option (and I'm glad Apple TV supports it), but that makes no difference to those in the 32%... unless they want to dump their television and buy another. Through an Apple TV-focused lens, competition like Roku boxes support BOTH, so by NOT supporting HDR10+, Apple is leaving something fairly tangible to competing offerings... at least for 32% of TV buyers.
Everything you just said, but put it on Samsung for refusing to support DV. I can't fault Apple for supporting the better, more popular format. I can blame Samsung for refusing to support a format it doesn't control. HDR10+ vs. DV is just this generation's Blu-ray vs. HD DVD.
You’ve missed the conclusion from RTINGS.
HDR10+ is sometimes the only dynamic HDR available. See this lengthy list. By refusing to support it, you’re back down to HDR10 (static HDR).
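The static-vs-dynamic distinction can be sketched with a toy tone mapper (nothing here is real spec math; the nit values and the linear scaling are made up for illustration). HDR10 ships one static MaxCLL value for the whole film, so a modest display has to compress every scene as if the full mastering peak could appear at any moment, while HDR10+/Dolby Vision carry per-scene peaks that let darker scenes pass through untouched:

```python
def tone_map(pixel_nits, content_peak_nits, display_peak_nits=600):
    """Toy tone mapper: squeeze the content's peak into the display's range.
    (Real EETFs use perceptual curves; this linear scale is only for illustration.)"""
    if content_peak_nits <= display_peak_nits:
        return float(min(pixel_nits, display_peak_nits))  # fits: pass through
    return pixel_nits * display_peak_nits / content_peak_nits

# Static HDR10: one MaxCLL (say 4000 nits) governs the whole movie,
# so a 500-nit highlight in a dim scene gets crushed down to 75 nits.
static = tone_map(500, content_peak_nits=4000)

# Dynamic metadata (HDR10+/DV): this scene's own peak is only 500 nits,
# so the same highlight survives at its full brightness.
dynamic = tone_map(500, content_peak_nits=500)
```

Same pixel, same display; only the metadata granularity differs, which is why dynamic HDR matters most on mid-brightness TVs.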

Apple screwed up by not including both formats.

Hulu uses HDR10+. On ATV 4K, you can only get basic HDR10.

Paramount+ uses HDR10+. On ATV 4K, you’re degraded down to HDR10.

Add Prime Video, Google Play, and YouTube as HDR10+ services.

It’s not as cut-and-dried as you’re claiming. All modern TVs (and thus content players) should include both.
I didn’t miss it. I said that HDR10+, by its existence, fragments the market. Most, if not all, of those movies would be DV if Samsung put DV on its TVs. Samsung wants to sell its DV-less TVs and control the HDR format. That makes them worse than Apple, because Samsung has a vested interest in the popularity of the format. Apple just wants to give its users the optimal experience, and choice is not always best for people. The best way Apple can do that is to pick a side.

And as far as your examples go...

Hulu, Paramount+, Google Play, and Prime all support Dolby Vision. The only one on your list that doesn't support DV is YouTube, which also doesn't support 5.1 surround sound.
 
Last edited:
It’s probably a selling point for their next AppleTV hardware release?
 
Dolby could help end this minor format war by making the standard free, saving us all a headache. Of course they won't do that, because they want to make money (the consumer is a secondary concern; they didn't develop this technology out of the goodness of their hearts). What we need is a royalty-free dynamic HDR standard, and at the moment the closest thing we have to that is HDR10+. Personally, I don't know why the body that maintains the HDR10 standard doesn't just release an open-source, royalty-free dynamic version and put this annoyance to rest; it seems rather short-sighted of them.

Since, as far as I am aware, HDR10+ can be implemented in software, it would have been considerate of Apple to support both formats for the sake of its customers.
 
Last edited:
  • Like
Reactions: DiceMoney
To be fair, I think HDR10+ is just silly.

HDR10+ was primarily funded and created by Samsung as a move against Dolby. HDR10 was created by the Consumer Technology Association (CTA), which was neither consulted nor involved in the creation of HDR10+.

HDR10+ more or less replicates Dolby Vision (just as Samsung's smartphones copied the iPhone), and contrary to common understanding, it requires manufacturers to pay an annual fee ($2,500 to $10,000). To be fair, that fee is dramatically cheaper than Dolby Vision's (about $2 per device).

Since HDR10+ is largely identical to Dolby Vision, and nearly every modern TV with HDR10+ (other than Samsung's) also has Dolby Vision, it is an essentially redundant and unnecessary standard.
Then why not support it? Whether or not you like it is irrelevant; I want my equipment to support every standard it can.
 
  • Like
Reactions: DiceMoney
So everyone here is talking about televisions, but the article is about HDR10+ support on an iPhone, iPad, or Mac, not a television. Do we really care that a movie watched on a tiny screen in less-than-optimal conditions isn't in HDR10+?
 
If it supports Dolby Vision, it supports the latest generation high dynamic range technology and it does not need HDR+.

Unfortunately, this is not true. As others have pointed out, HDR10+ is basically an alternative to Dolby Vision. For example, Prime Video supports HDR10+; if your TV does not support it, playback falls back to HDR10, not Dolby Vision.
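That fallback is just format negotiation: the player picks the best format present in both the stream and the display's capability list. A hypothetical sketch (the format labels and preference order here are my own, not any real player API):

```python
# Assumed ranking, best first; real players have their own ordering rules.
PREFERENCE = ["dolby_vision", "hdr10_plus", "hdr10", "sdr"]

def pick_format(stream_formats, display_formats):
    """Return the highest-preference format supported by both sides."""
    for fmt in PREFERENCE:
        if fmt in stream_formats and fmt in display_formats:
            return fmt
    return "sdr"  # lowest common denominator

# A title mastered in HDR10+ (with its mandatory HDR10 base layer),
# played on a device that only does Dolby Vision and HDR10:
pick_format({"hdr10_plus", "hdr10"}, {"dolby_vision", "hdr10"})  # -> "hdr10"
```

The stream has no DV layer to fall back to, which is why a DV-capable device still ends up with static HDR10 on HDR10+-only titles.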

It's silly that this format was even created, and it was created because Samsung didn't want to pay to license Dolby Vision. It further segments the market for no reason at all.
 
  • Like
Reactions: Marshall73
HDR requires a very good TV with a very high luminance output to be able to reap its benefits. Just because it has an HDR sticker on it doesn't mean it's actually HDR capable.
Very little content available via streaming services is actually in 4K, much less 4K HDR. And when it is available, the content maker might not have done a good job of creating it. The guy who originally posted he could not tell the difference might be thinking he is watching 4K HDR even though most of what he is likely watching isn't HDR, and most of it isn't even 4K. Just because one has a 4K HDR capable TV doesn't mean that everything gets magically converted to 4K HDR, but some people are under that incorrect assumption. Such is the power of marketing! Most content on streaming services is still HD SDR. A 4K HDR TV can't magically make HD SDR content look like 4K HDR.
 