
miatadan

macrumors regular
Original poster
Apr 23, 2006
Sudbury, ON, Canada
Right now, an average TV such as the Sony X90J series (per RTINGS.com):
SDR Brightness
Real Scene Peak Brightness: 519 cd/m2

HDR Brightness:
Real Scene Highlight: 784 cd/m2

A higher-end TV with mini-LED, such as the Samsung QN90A (per RTINGS.com):
SDR Brightness
Real Scene Peak Brightness: 1088 cd/m2

HDR Brightness
Real Scene Peak Highlight: 1810 cd/m2

Now, for those who say they can find 27" monitors with HDR and a high refresh rate in the $300-500 range: those monitors do not have good HDR performance.

A good example of this is the Asus ROG Strix XG27UQ, according to RTINGS.com:

SDR Real Scene Peak Brightness: 449 cd/m2
HDR Real Scene Highlight: 472 cd/m2

The Apple Studio Display is 600 nits brightness but not HDR, because high-quality HDR requires 1,000 nits sustained (full screen). Even the Asus ProArt PA27UCX-K is $2,999 US, and the PA32UCX-PK is $4,499 US (close to the Apple Pro Display XDR in pricing).

High refresh rates are only really needed by gamers.

For me, wide colour (P3) and the available reference modes (HDTV Video, NTSC Video, and the two Digital Cinema P3 modes) are more important than just HDR by itself.

Your average 4K monitor does not support Dolby Vision or proper HDR1000.

Besides that some prefer the build quality of the Apple displays over cheap plastic.

Dan
 
What is the advantage of HDR? And why do people want such bright displays?
I understand the value of colour accuracy, and good contrast, but isn't it more useful to have these features at lower brightness?
 
The Apple Studio Display is 600 nits brightness but not HDR, because high-quality HDR requires 1,000 nits sustained (full screen). Even the Asus ProArt PA27UCX-K is $2,999 US, and the PA32UCX-PK is $4,499 US (close to the Apple Pro Display XDR in pricing).

Your average 4K monitor does not support Dolby Vision or proper HDR1000.

Displaying HDR content in a satisfying manner isn't just a question of peak brightness. It is also, and IMO more importantly, a question of contrast, colour gamut, and processing pipeline to convert the "absolute" values into something that makes sense for your display technology and the lighting of the environment you're watching the content in.
In darker environments you probably don't need to go much above 600 nits peak for a portion of the screen, for now; most graders don't even bother going higher, as it makes for a rather uncomfortable experience.

VESA HDR certification separates the scale into regular and "True Black" ones to acknowledge that HDR content can be reasonably well displayed on screens with a lower peak brightness provided they also have lower black levels and better local contrast :D. https://displayhdr.org

This doesn't reach superbly high brightness values across a large portion of the screen, but in darker environments you can expect, IMO, a better HDR experience than the XDR display (and conversely the latter will prove better in brighter environments) : https://www.dell.com/en-us/shop/ali...3dw/apd/210-bcye/monitors-monitor-accessories

The Studio Display being a regular IPS display doesn't have enough contrast to properly display HDR content, that's all :D.

What is the advantage of HDR? And why do people want such bright displays?

The main advantage is that it's a format that started its life as an "absolute" one (i.e. a given digital value corresponds to a specific actual brightness on your display) and that requires more stringent capabilities on the display side, from which we all benefit, even for SDR content (such as a larger colour gamut or a higher contrast ratio).
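
To make the "absolute" part concrete, here is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF, the transfer function HDR10 and Dolby Vision content is encoded with: a normalized code value maps directly to an absolute luminance in nits, regardless of the display. The constants come from the published spec; the snippet is just an illustration, not a production colour pipeline.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF: a normalized signal value
# in [0, 1] maps to an absolute luminance in cd/m2 (nits), up to 10,000 nits.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_to_nits(signal: float) -> float:
    """Convert a normalized PQ code value (0..1) to absolute luminance in nits."""
    e = signal ** (1 / M2)
    y = (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)
    return 10000.0 * y

# The same code value always means the same brightness, whatever the panel:
for v in (0.25, 0.5, 0.75, 1.0):
    print(f"PQ {v:.2f} -> {pq_to_nits(v):8.1f} nits")
# roughly 5, 92, 983 and 10,000 nits -- which is why a 600-nit panel has to
# tone-map anything brighter than it can actually reproduce.
```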

If you watch HDR "as the artist intended" then you'll only rarely need to go much above 600 nits for a portion of the screen. Most HDR content is graded for darker environments, and most graders don't use the higher nits values when doing so, at least for now.

Where it's quite interesting to be able to reach a higher peak brightness is when you don't watch HDR content in a darker environment. While HDR - theoretically - is an "absolute" format, in practice most people will watch HDR content under varied lighting conditions. In brighter environments you may want to shift the entire content brighter, if possible, to preserve the visible dynamic range.

Since most displays are incapable of reproducing the full range anyway (at least locally), in practice the "absolute" nature of HDR content is rarely displayed as intended, whether that's because of a lack of full-display or local contrast (LED TVs), a lack of brightness across a large enough portion of the screen (OLED), a limited colour gamut, the tone mapping necessary to alleviate these limitations, etc. And on top of that you can add TV manufacturers' annoying tendency to pile their own - undesirable - processing on top of it, and poor calibration out of the factory.
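
For what "tone mapping" means in practice, here's one textbook curve (the extended Reinhard operator) applied to luminance. It's only a sketch of the general idea, not what any particular TV, monitor, or macOS actually does, and the 1,000-nit content peak is an assumed figure.

```python
# Sketch of luminance tone mapping with the extended Reinhard curve:
# compress scene luminance (in nits) so that the content's peak lands on the
# display's peak while mid-tones are left mostly alone. Purely illustrative;
# real displays use their own proprietary curves.
def tone_map_nits(scene_nits: float,
                  display_peak_nits: float = 600.0,            # e.g. a Studio Display-class panel
                  content_peak_nits: float = 1000.0) -> float:  # assumed grading peak
    l = scene_nits / display_peak_nits
    l_white = content_peak_nits / display_peak_nits
    mapped = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
    return mapped * display_peak_nits

for nits in (100, 400, 1000):
    print(f"{nits:5d} nits in the grade -> {tone_map_nits(nits):6.1f} nits on a 600-nit panel")
# ~91, ~298 and 600 nits: highlights get squeezed, which is exactly the
# departure from "as the artist intended" described above.
```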
 
Exactly. HDR400 is crap. I really hate this kind of marginal, false advertising. To me, black level is even more important. That's why OLED displays always look so good even when the peak brightness is only about 700-800 nits: because black is really black. The shortcoming of the Studio Display is that it has no local dimming zones, so black is always a shade of grey. The max brightness is actually not bad at 600 nits.
 
I apologize for resurrecting this old thread but I have finally received my long delayed Studio Display. I am not at all a professional but I have used calibration tools and software to calibrate all my TVs and monitors for over a decade.

My understanding of HDR displays:
  1. High Color Depth: Ideally true native 10-bit for smooth and natural color gradient. 12-bit for Dolby Vision but I don't think any consumer displays are capable. Many consumer grade displays are 8-bit with Frame Rate Control, dithering to simulate 10-bit.
  2. Wide Color Gamut: Ideally 90% or higher on DCI-P3 color space.
  3. Deep Contrast: Deep contrast ratio, such as infinite:1 on OLED and 4000:1 or higher for LCD with local dimming.
  4. High Peak Brightness: Ideally 1,000 nits or higher. Most HDR content is mastered at 1,000 nits maximum, some as high as 4,000 nits.
From PC Magazine's reviews of Apple Studio Display and Pro Display XDR:

Apple Studio Display vs. Apple Pro Display XDR
Color Depth: 10-bit vs. 10-bit
DCI-P3 Color Gamut: 99.0% vs. 98.7%
Contrast Ratio: 970:1 vs. 12,460:1
Peak Brightness: 587 nits vs. 1,561 nits

As is to be expected, the Studio Display's contrast ratio is embarrassingly bad. It is utterly incapable of displaying deep blacks with rich shadow details.
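
Putting rough numbers on that: from the PC Magazine figures above you can back out the full-screen black level each panel implies (a back-of-the-envelope sketch that ignores local-dimming behaviour):

```python
# Back-of-the-envelope sketch: the black level implied by a measured peak
# white and static contrast ratio (figures from the PC Magazine table above).
def implied_black_level(peak_white_nits: float, contrast_ratio: float) -> float:
    return peak_white_nits / contrast_ratio

print(implied_black_level(587, 970))      # Studio Display: ~0.61 nits
print(implied_black_level(1561, 12460))   # Pro Display XDR: ~0.13 nits
# An OLED's black level is effectively 0 nits, which is why it still looks
# convincing at a lower peak brightness.
```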

But I think 587 nits is more than respectable. For SDR, screens in a dark room are calibrated at 100 nits; under typical office lighting conditions, 120 nits. If the display is near sunlight or bright indoor lighting, the figure will be much higher.

When the display is enabled for HDR and is displaying HDR content, higher-luminance images (e.g., a sunset, headlights, a lamp) are allowed to exceed the SDR calibrated limit, for that eye-popping experience.

It is sad that Apple chose not to enable HDR on the Studio Display. Most HDR computer displays have far worse HDR specs than the Studio Display.
 
I'm keeping the old resurrected thread alive a bit longer because I find the topic interesting. I'm curious about something I've seen when comparing different monitors. This is something that I've not seen anyone else comment on.
The Wikipedia page, as well as many other sites, say the Apple Studio Display "does not support HDR content". In my quick tests, I'm not sure that's entirely true.

I have 3 monitors to compare: an Apple Studio Display, an LG 4k 27UL850, and an Asus ProArt PA27UCX-K (a DisplayHDR 1000 monitor).
When playing HDR video examples from YouTube, the HDR indicator shows up on both the Asus ProArt and the Apple Studio Display, but not on the LG 4K (with HDR toggled off in Mac settings). After turning off the HDR option on the Asus ProArt, the YouTube HDR example no longer shows HDR playing; it simply shows it's playing 4K. There is no HDR on/off option for the Studio Display in macOS Ventura.
If you turn on "stats for nerds" on YouTube for the HDR video example, both the ProArt and the Studio Display show they are detecting the smpte2084 (PQ)/BT.2020 color space, and for this video they both run at about a 30-35 Mbps data rate, consistent with YouTube HDR content. The LG SDR monitor playing the same HDR video example shows it's using the BT.709 color space and running around a 15-19 Mbps data rate, consistent with SDR 4K video.
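
If you want to check what a local file itself is flagged as, rather than relying on YouTube's overlay, something like the sketch below works; it assumes you have ffprobe (part of FFmpeg) installed, and the file name is just a placeholder.

```python
# Sketch: read a video file's colour metadata with ffprobe (FFmpeg must be
# installed and on PATH). HDR10/PQ material typically reports
# color_transfer=smpte2084 and color_primaries=bt2020; plain SDR reports bt709.
import json
import subprocess

def color_metadata(path: str) -> dict:
    out = subprocess.run(
        [
            "ffprobe", "-v", "error",
            "-select_streams", "v:0",
            "-show_entries", "stream=color_space,color_transfer,color_primaries",
            "-of", "json",
            path,
        ],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["streams"][0]

print(color_metadata("example_hdr_clip.mp4"))  # placeholder file name
# e.g. {'color_space': 'bt2020nc', 'color_transfer': 'smpte2084', 'color_primaries': 'bt2020'}
```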

To me, although the Studio Display does not meet all the VESA DisplayHDR 600 specs, this would seem to indicate the Studio Display is seeing and using HDR metadata. At least it's doing so with YouTube HDR videos. As you said, many advertised HDR monitors have worse specs than the Studio Display. Apple seems to have included some HDR capabilities but rightly did not label it as a true HDR display.
 
To me, although the Studio Display does not meet all the VESA DisplayHDR 600 specs, this would seem to indicate the Studio Display is seeing and using HDR metadata. At least it's doing so with YouTube HDR videos. As you said, many advertised HDR monitors have worse specs than the Studio Display. Apple seems to have included some HDR capabilities but rightly did not label it as a true HDR display.
I calibrated my Studio Display's brightness setting to 121 nits on a 100% SDR white pattern, the closest I can get to my ideal brightness of 120 nits (one shift + option + F1 step below 50% brightness). I disabled both the Automatically adjust brightness and True Tone settings.

I played Color Bars BT2020 100% Blue Filter 4K UHD HDR10 from YouTube, placing i1Display Pro on its 100% white color bar. The measurement jumped from 121 nits to 250 nits.

I then opened Photos app and played two Dolby Vision video clips of my kid skiing and playing tennis (recorded with iPhone 13 Pro). Its peak brightness on the brightest part jumped to over 200 nits.

If I temporarily set my Studio Display to 100% brightness, I measure 535 nits on 100% SDR white pattern. YouTube's 100% HDR measures 563 nits.

So it does seem Studio Display is capable of HDR, but at greatly reduced capacity. Full HDR should map 100% to the highest peak brightness possible.
 
So it does seem Studio Display is capable of HDR, but at greatly reduced capacity. Full HDR should map 100% to the highest peak brightness possible.


Look for "EDR". Apple has enabled some of their more recent panels to display HDR content to a limited extent above the nominal white value.
 