FYI - I have my Pro Display XDR connected to my 2019 MBP 16", and I get DSC status: Enabled. Just in case someone is wondering.
Thanks. I have already received info for the following connection types from other people:
  • Dual HBR2 (5K) (Thunderbolt only, Alpine Ridge, or Blackmagic eGPU before firmware update)
  • HBR2 DSC (6K) (Navi GPUs such as the 5300M or 5500M in the MBP 16", 5700XT, W5700, W5700X, RTX GPUs)
  • Dual HBR3 (6K) (Thunderbolt only, non-Navi GPUs, macOS only until PCs get new Titan Ridge drivers or firmware)
The product IDs are ae22 for the first two and ae2e for the last one. ae22 has an overlay EDID using ae23 product ID. ae2e has an overlay EDID using ae2f product ID. I am still wondering where ae21 or ae2d might come from.
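If you want to check which of these you got, the display's product ID shows up (in decimal) in the IO registry on macOS. A quick way to read it; the exact property layout varies by GPU driver, so treat the grep as an assumption:
Code:
# DisplayProductID is shown in decimal: 44578 = 0xae22, 44590 = 0xae2e
ioreg -lw0 | grep -i DisplayProductID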

I am looking for info about a non-DSC single cable connection (such as a Moshi USB-C to DisplayPort cable from a Radeon VII or RX 580, or a Vega II with a Cable Matters USB-C to DisplayPort adapter or a Thunderbolt 3 dock with DisplayPort output). There is a question of whether this mode allows HBR3.

Another thread elsewhere has experiments attempting to use YCbCr modes with 4:4:4, 4:2:2, or 4:2:0 chroma subsampling. Windows was used with an Nvidia Pascal GTX card because Pascal supports DisplayPort 1.4 (required for HBR3 and 4:2:0 testing) but not DSC (a display will prefer DSC over chroma subsampling when both are available), and the Nvidia control panel offers control over color format (RGB/YCbCr), chroma subsampling (4:4:4, 4:2:2, 4:2:0), and bit depth (6, 8, 10, 12 bpc). The tests were unable to get any YCbCr mode or HBR3 to work. What did work is 6 bpc RGB to get 5K using only HBR2.
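For context, a back-of-the-envelope check of why 6 bpc is the only RGB depth that fits 5K60 in HBR2; the ~7% CVT-RB blanking overhead is an assumption, not the XDR's exact timing:
Code:
awk 'BEGIN {
  link = 4 * 5.4e9 * 0.8            # HBR2: 4 lanes x 5.4 Gbps, 8b/10b coding
  pclk = 5120 * 2880 * 60 * 1.07    # 5K60 pixel clock, ~7% blanking assumed
  for (bpc = 6; bpc <= 10; bpc += 2) {
    rate = pclk * 3 * bpc           # 3 color channels
    printf "%2d bpc RGB: %5.2f Gbps (%s)\n", bpc, rate / 1e9, (rate <= link ? "fits HBR2" : "exceeds HBR2")
  }
}'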

Also, I am wondering if HBR2 DSC works with USB through a Sunix UPD2018, Huawei VR 2 Computer Connection Cable, or Wacom Link Plus (for people with a card that supports DSC but doesn't have a USB-C port, like the RTX or W5700 cards; the USB connection should enable brightness control, preset selection, etc. in macOS, and maybe in Windows if you can get the Boot Camp drivers installed, which is an open question for PCs).
 
Hardware calibration AND/OR profiling is touted as coming soon. See: https://www.apple.com/mt/pro-display-xdr/specs/

“Hardware calibration” is a specific term in color correction. It refers to the ability to upload corrective LUTs to the monitor’s internal driver circuitry, bypassing the OS and graphics driver. It’s not something that can be added later through software updates.

High-end monitors (Eizo, NEC, some BenQ) usually have this feature. As far as I know, the XDR doesn’t. It’s a ‘dumb’ display in that it entirely relies on an attached macOS computer to manage its color output.
 
I always understood hardware calibration differently: the ability to set a specific target in hardware (luminosity, gamma, gamut, and color temperature), followed by profiling (measuring color patches to capture the slight deviations from the specified target and saving them in an ICC profile so they can be corrected in software).
What you describe with Eizos et al. is correct, but it is not the complete path to color-accurate viewing, as profiling is always needed to correct for the specific differences of every unique monitor. Profiling also corrects changes that happen because of aging components (backlight and color shifts).
 
It’s a ‘dumb’ display in that it entirely relies on an attached macOS computer to manage its color output.
It has a USB interface so it's not so dumb. Have you seen the output from DisplayDiagnose?
Code:
/System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/DisplayDiagnose -a > DisplayDiagnose.txt 2>&1
 
I always understood hardware calibration differently: the ability to set a specific target in hardware (luminosity, gamma, gamut, and color temperature), followed by profiling (measuring color patches to capture the slight deviations from the specified target and saving them in an ICC profile so they can be corrected in software).
What you describe with Eizos et al. is correct, but it is not the complete path to color-accurate viewing, as profiling is always needed to correct for the specific differences of every unique monitor. Profiling also corrects changes that happen because of aging components (backlight and color shifts).

You can use ICC profiles with a monitor set to its natural gamut (and obviously adjust luminosity and white point to their correct values for the target you’re chasing), or you can just create a proper LUT with those corrections in place and upload it to the monitor to run directly. The latter tends to be more accurate, depending on the LUT generation process, and you need it for any signal path that bypasses the graphics driver (e.g. via a Blackmagic card). It has the added bonus of not relying on any OS at all, meaning you can plug the display into any other PC or Mac and begin work immediately.
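For anyone wondering what such a LUT actually contains: at its simplest, it's just a per-channel remapping table. A minimal sketch; the gamma numbers are invented for illustration, not measurements of any real panel:
Code:
# 256-entry 1D LUT remapping a hypothetical native ~2.4 gamma to a 2.2 target
awk 'BEGIN { for (i = 0; i < 256; i++)
  printf "%3d -> %3d\n", i, int(255 * (i / 255) ^ (2.2 / 2.4) + 0.5) }'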

As you may know, video work never uses ICC profiles. That’s something that only exists in print-land, and has its own set of challenges and inaccuracies.
It has a USB interface so it's not so dumb. Have you seen the output from DisplayDiagnose?
Code:
/System/Library/Extensions/AppleGraphicsControl.kext/Contents/MacOS/DisplayDiagnose -a > DisplayDiagnose.txt 2>&1

I haven’t. Can you upload a LUT or not?
 
I haven’t. Can you upload a LUT or not?
No. I'm just saying there's a lot of info that can be read from the display, so maybe there's a way to write info to the display too, but we have to wait for Apple to make the software to do that (maybe there'll be an XDR firmware updater; I think I've already seen different firmware versions in the DisplayDiagnose output that different people have posted).
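If anyone wants to compare firmware versions from their own dump, something like this should pull out the relevant lines (assuming the dump labels them as such):
Code:
grep -i firmware DisplayDiagnose.txt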
 
In the original post, it was noted that macOS doesn't offer any display modes between the native @2x and the @1x.

This is obviously different to many other displays (such as 5K iMacs and the 5K LG) where there's a mode that's about 25% larger, which is then scaled down to fit the display.

When I unboxed my XDR I didn't realise this and would have returned it for that reason, because the native @2x looks comically large to my eyes (I'm coming from a 5K iMac where I always ran the 25% larger mode).

The good news is that it's possible to force a 25% larger mode if you have a tool that can interact with the CoreGraphics APIs directly. I used Hammerspoon, because it's my project, but presumably SwitchResX could do it too. I'm now running 3840x2160@2x, i.e. the framebuffer is 8K (7680x4320). Works great even on the 580X.
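For anyone who wants to try it, a minimal sketch from the shell, assuming you've installed Hammerspoon's command-line tool via hs.ipc.cliInstall():
Code:
# ask Hammerspoon to set a 3840x2160 HiDPI (@2x) mode on the main display
hs -c 'print(hs.screen.mainScreen():setMode(3840, 2160, 2))'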
 
That’s been done already and it’s here:

Thanks, I have also succeeded in getting 6K from a PC laptop.
I've seen AGDCDiagnose for 0xae22 but not that one. Can you send me the output so I can compare?
I'm using a PC laptop to drive the XDR with native 6K resolution, and everything looks perfect. I wonder if there is any similar diagnose script that I can run in Windows or Linux to compare with these results? Thanks.
 
I'm using a PC laptop to drive the XDR with native 6K resolution, and everything looks perfect. I wonder if there is any similar diagnose script that I can run in Windows or Linux to compare with these results?
What laptop? Are you using Intel Ice Lake, AMD Navi, or Nvidia RTX? Anything else doesn't support DSC and would require Thunderbolt 3 for dual HBR3 but Windows Thunderbolt drivers don't allow dual HBR3 as far as I know.

Maybe try GPU-Z and look at the advanced tab? It hasn't been updated to show DSC though.
CRU and Monitor Asset Manager (moninfo.exe) can get the EDID of the display. sysfs in Linux might have some info (especially for Intel graphics).
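On the Linux side, something like this is a starting point; connector names like card0-DP-1 vary by GPU and port, so treat the paths as assumptions:
Code:
# per-connector state exposed by the DRM subsystem
for c in /sys/class/drm/card*-DP-*; do
  echo "$c: $(cat "$c"/status), enabled=$(cat "$c"/enabled)"
done
# raw EDID bytes for one connector, if a display is attached
xxd /sys/class/drm/card0-DP-1/edid | head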
 
What laptop? Are you using Intel Ice Lake, AMD Navi, or Nvidia RTX? Anything else doesn't support DSC and would require Thunderbolt 3 for dual HBR3 but Windows Thunderbolt drivers don't allow dual HBR3 as far as I know.

Maybe try GPU-Z and look at the advanced tab? It hasn't been updated to show DSC though.
CRU and Monitor Asset Manager (moninfo.exe) can get the EDID of the display. sysfs in Linux might have some info (especially for Intel graphics).
My Razer laptop has an Nvidia RTX GPU, so I assume it supports DSC and that's why it could drive the monitor at 6K.

[Attached screenshots: CPU-Z.PNG, GPU-Z.PNG, Monitor Asset Manager.PNG, CRU.PNG]

My Razer laptop has an Nvidia RTX GPU, so I assume it supports DSC and that's why it could drive the monitor at 6K.
I think so. GPU-Z says you are connecting with the HBR2 link rate (5.4 Gbps), which requires DSC to do 6K 60Hz 12bpc (YCbCr 4:2:0 8bpc over HBR2 can do 6K too, but it's not as good as DSC, and the XDR does not support YCbCr).
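Rough numbers on why, with the same ~7% blanking assumption as before and a 12 bpp DSC rate (a common choice, not a confirmed XDR figure):
Code:
awk 'BEGIN {
  link = 4 * 5.4e9 * 0.8             # HBR2: 17.28 Gbps usable
  pclk = 6016 * 3384 * 60 * 1.07     # 6K60 pixel clock, ~7% blanking assumed
  printf "uncompressed 12 bpc RGB: %.1f Gbps\n", pclk * 36 / 1e9
  printf "DSC at 12 bpp (3:1):     %.1f Gbps\n", pclk * 12 / 1e9
}'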

Both CRU and MAM show virtually empty EDIDs (no timings, only one extension block). Does MAM show any more info using the Active option?

With the EDID you're seeing, you should not be able to get 6K. Maybe the EDID that gets used is elsewhere. Maybe the Boot Camp driver works around this.

Something strange may be happening in Windows with Nvidia and the XDR. Another way to get the display to work (without the Boot Camp driver) is to use CRU to override the EDID, but you still need the Boot Camp driver to set brightness or use presets.
 
Thanks. I have already received info for the following connection types from other people:
  • Dual HBR2 (5K) (Thunderbolt only, Alpine Ridge, or Blackmagic eGPU before firmware update)
  • HBR2 DSC (6K) (Navi GPUs such as the 5300M or 5500M in the MBP 16", 5700XT, W5700, W5700X, RTX GPUs)
  • Dual HBR3 (6K) (Thunderbolt only, non-Navi GPUs, macOS only until PCs get new Titan Ridge drivers or firmware)
The product IDs are ae22 for the first two and ae2e for the last one. ae22 has an overlay EDID using ae23 product ID. ae2e has an overlay EDID using ae2f product ID. I am still wondering where ae21 or ae2d might come from.

I am looking for info about a non-DSC single cable connection (such as a Moshi USB-C to DisplayPort cable from a Radeon VII or RX 580, or a Vega II with a Cable Matters USB-C to DisplayPort adapter or a Thunderbolt 3 dock with DisplayPort output). There is a question of whether this mode allows HBR3.

Another thread elsewhere has experiments attempting to use YCbCr modes with 4:4:4, 4:2:2, or 4:2:0 chroma subsampling. Windows was used with an Nvidia Pascal GTX card because Pascal supports DisplayPort 1.4 (required for HBR3 and 4:2:0 testing) but not DSC (a display will prefer DSC over chroma subsampling when both are available), and the Nvidia control panel offers control over color format (RGB/YCbCr), chroma subsampling (4:4:4, 4:2:2, 4:2:0), and bit depth (6, 8, 10, 12 bpc). The tests were unable to get any YCbCr mode or HBR3 to work. What did work is 6 bpc RGB to get 5K using only HBR2.

Also, I am wondering if HBR2 DSC works with USB through a Sunix UPD2018, Huawei VR 2 Computer Connection Cable, or Wacom Link Plus (for people with a card that supports DSC but doesn't have a USB-C port, like the RTX or W5700 cards; the USB connection should enable brightness control, preset selection, etc. in macOS, and maybe in Windows if you can get the Boot Camp drivers installed, which is an open question for PCs).

Hi Joevt,

The new version of the LG UltraFine 5K monitor uses a Titan Ridge chip too. Does it support HBR2 DSC mode? My real question is: can RTX GPUs drive the new version at 5K through that USB-C "VirtualLink" port? Thanks
 
The new version of the LG UltraFine 5K monitor uses a Titan Ridge chip too. Does it support HBR2 DSC mode? My real question is: can RTX GPUs drive the new version at 5K through that USB-C "VirtualLink" port? Thanks

The LG UltraFine 5K supports the following:
  • Single HBR2 (4K) (old LG UltraFine 5K is Thunderbolt only)
  • Dual HBR2 (5K) (Thunderbolt only)
Even though the new LG UltraFine 5K has Titan Ridge, it does not support DisplayPort 1.4 at all (no HBR3 or DSC). The XDR is the only display I know of that supports DSC. There are many displays that support HBR3 (Planar has a full 5K display supporting 8 bpc max with HBR3). Maybe some 8K TVs support DSC? A DisplayPort 1.4 MST hub supports DSC input to allow higher bandwidth for multiple DSC or non-DSC outputs (but macOS doesn't support MST).

To drive the LG UltraFine 5K with an RTX or any other GPU, you need a Thunderbolt 3 add-in card (such as the GC-ALPINE RIDGE or GC-TITAN RIDGE) or a motherboard that has a Thunderbolt controller and two DisplayPort inputs. It is possible to use a single DisplayPort output from the GPU by using a DisplayPort 1.4 MST hub (the MST hub only works in Windows; it might count as two displays; you still need a Thunderbolt controller; it is limited to 8 bpc if you use a GPU older than RTX; I have not tried this with an RTX, so I don't know if 10 bpc will work, which requires DSC input to the MST hub).
 
The LG UltraFine 5K supports the following:
  • Single HBR2 (4K) (old LG UltraFine 5K is Thunderbolt only)
  • Dual HBR2 (5K) (Thunderbolt only)
Even though the new LG UltraFine 5K has Titan Ridge, it does not support DisplayPort 1.4 at all (no HBR3 or DSC). The XDR is the only display I know of that supports DSC. There are many displays that support HBR3 (Planar has a full 5K display supporting 8 bpc max with HBR3). Maybe some 8K TVs support DSC? A DisplayPort 1.4 MST hub supports DSC input to allow higher bandwidth for multiple DSC or non-DSC outputs (but macOS doesn't support MST).

To drive the LG UltraFine 5K with an RTX or any other GPU, you need a Thunderbolt 3 add-in card (such as the GC-ALPINE RIDGE or GC-TITAN RIDGE) or a motherboard that has a Thunderbolt controller and two DisplayPort inputs. It is possible to use a single DisplayPort output from the GPU by using a DisplayPort 1.4 MST hub (the MST hub only works in Windows; it might count as two displays; you still need a Thunderbolt controller; it is limited to 8 bpc if you use a GPU older than RTX; I have not tried this with an RTX, so I don't know if 10 bpc will work, which requires DSC input to the MST hub).

Thanks! Very detailed explanation.
 
Just received a Pro Display XDR. I haven’t had time to fiddle around with it yet.
But I have questions.

1. 1600 nit vs 500 nit profile. Which one am I supposed to use for general desktop usage? (Office, internet etc)

2. Is there no HDR on/off toggle option?
 
Just received a Pro Display XDR. I haven’t had time to fiddle around with it yet.
But I have questions.

1. 1600 nit vs 500 nit profile. Which one am I supposed to use for general desktop usage? (Office, internet etc)

2. Is there no HDR on/off toggle option?

Use the 500 nit one, but you can use the 1600 if you want--it will max out at 500 for regular stuff anyway. It can only do the 1600 nits when displaying HDR content.

If you're not in one of the HDR modes, it won't display HDR content--so there's no "on/off" for the HDR besides choosing the correct profiles.
 
Use the 500 nit one, but you can use the 1600 if you want--it will max out at 500 for regular stuff anyway. It can only do the 1600 nits when displaying HDR content.

If you're not in one of the HDR modes, it won't display HDR content--so there's no "on/off" for the HDR besides choosing the correct profiles.
I wonder what "HDR modes" means here: if HDR content is only displayed in a window, will only that window show up to 1600 nits while the rest of the screen stays at 500 nits?
 
Use the 500 nit one, but you can use the 1600 if you want--it will max out at 500 for regular stuff anyway. It can only do the 1600 nits when displaying HDR content.

If you're not in one of the HDR modes, it won't display HDR content--so there's no "on/off" for the HDR besides choosing the correct profiles.
Ok. It sounds like there's no reason to use the 500 nit mode if the 1600 nit mode automatically falls back to 500 nits for regular SDR content. Am I missing something?
 
Ok. It sounds like there's no reason to use the 500 nit mode if the 1600 nit mode automatically falls back to 500 nits for regular SDR content. Am I missing something?

That would seem to be the case, yes. I believe others would argue that some of the other preset and recalibrated profiles are more appropriate for other tasks, but IMO for general desktop use you're fine leaving it on the 1600 mode.
 
That would seem to be the case, yes. I believe others would argue that some of the other preset and recalibrated profiles are more appropriate for other tasks, but IMO for general desktop use you're fine leaving it on the 1600 mode.
I just wonder why there’s a separate 500 nit mode when 1600 nit mode is inclusive of 500 nit mode.
 
I just wonder why there’s a separate 500 nit mode when 1600 nit mode is inclusive of 500 nit mode.

That's because they are not identical. There are subtle differences in the settings. Whether those matter to you or anyone else in daily use is another thing. The primary differences are the color gamut and SDR transfer functions. See here:
 

[Attached screenshots: Screen Shot 2020-05-13 at 8.33.54 AM.png, Screen Shot 2020-05-13 at 8.32.57 AM.png]
That's because they are not identical. There are subtle differences in the settings. Whether those matter to you or anyone else in daily use is another thing. The primary differences are the color gamut and SDR transfer functions. See here:
Not sure why you posted comparison screenshots between the 1600 nit mode and the sRGB mode. I was talking about the 1600 nit mode and the 500 nit mode.
 
Not sure why you posted comparison screenshots between the 1600 nit mode and the sRGB mode. I was talking about the 1600 nit mode and the 500 nit mode.

Sorry, my bad, I confused which modes were being discussed. Yeah...the regular Apple 500 is pretty much identical to the 1600, the primary difference being only the possible nits level. Either the 500 is there just for folks to "match" a typical Apple monitor, or there's some reason behind the scenes that it's different; who knows?
 
Sorry, my bad, I confused which modes were being discussed. Yeah...the regular Apple 500 is pretty much identical to the 1600, the primary difference being only the possible nits level. Either the 500 is there just for folks to "match" a typical Apple monitor, or there's some reason behind the scenes that it's different; who knows?
Good point. Maybe only Apple knows. Or maybe even Apple doesn't know.

Maybe the real reason is that the two modes were created by two individuals who were equally aggressive, so Apple shipped both modes. XD
 