Mac Mini HDR?

WMU328

macrumors newbie
Original poster
Oct 27, 2017
26
9
I know that HDMI 2.0a is needed for HDR. The Mac mini has HDMI 2.0, so we can't use HDR out of the box. But what if an eGPU is used? Would HDR be possible then? Or is there another catch? I want to use this for editing HDR videos, and also for watching HDR content.

Thanks
 

benzslrpee

macrumors 6502
Jan 1, 2007
406
26
I know that HDMI 2.0a is needed for HDR. The Mac mini has HDMI 2.0, so we can't use HDR out of the box. But what if an eGPU is used? Would HDR be possible then? Or is there another catch? I want to use this for editing HDR videos, and also for watching HDR content.

Thanks
If the eGPU can support HDR (e.g. the Blackmagic eGPU Pro), then you are good to go in terms of graphics. The other catch is the monitor. I was searching for an HDR-capable, wide-color-gamut display... the pickings were a bit sparse.
 

SpacemanSpiffed

macrumors regular
Mar 27, 2013
183
265
Pacific NW
Are you sure HDMI 2.0 can't handle HDR? Is the Mac mini HDMI 2.0 or 2.0a?

I called Apple and they confirmed it's not compatible. That sucks.
That sucks. All they needed to do was include an LSPCON (Level Shifter and Protocol Converter) on the motherboard. The GPU has had all the needed 10-bit support since Kaby Lake. I hope the person you spoke with was wrong... it seems like something they wouldn't have missed.

(And you don't need the LSPCON for 10-bit/HDR over DisplayPort - that should be possible no matter what.)
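
Worth noting that 2.0a has the same 18 Gbit/s bandwidth as 2.0 - it adds the HDR static-metadata signaling. For the bandwidth side, here's a quick Swift sketch of the math (back-of-envelope only, assuming the standard CTA-861 4K60 timings - not a statement about the mini's actual hardware):

Code:
// HDMI 2.0's 18 Gbit/s link carries ~14.4 Gbit/s of video payload after
// 8b/10b encoding. Check which 10-bit 4K60 chroma formats fit under that.
let totalWidth  = 4400.0   // 3840 active + blanking
let totalHeight = 2250.0   // 2160 active + blanking
let refreshHz   = 60.0
let pixelClock  = totalWidth * totalHeight * refreshHz  // 594 MHz

let payloadLimit = 18.0e9 * 8.0 / 10.0  // 14.4 Gbit/s usable

// Bits per pixel: 10-bit RGB 4:4:4 = 30; 10-bit 4:2:0 averages 15.
for (label, bpp) in [("10-bit 4:4:4", 30.0), ("10-bit 4:2:0", 15.0)] {
    let needed = pixelClock * bpp
    print("\(label): \(needed / 1e9) Gbit/s, fits: \(needed <= payloadLimit)")
}
// 4:4:4 needs ~17.8 Gbit/s (doesn't fit); 4:2:0 needs ~8.9 Gbit/s (fits),
// which is why 4K60 HDR on HDMI 2.0-class links is typically 4:2:0.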
 
  • Like
Reactions: ElectronGuru

ElectronGuru

macrumors 68000
Sep 5, 2013
1,520
370
Oregon, USA
tl;dr - UHD 630 graphics supports HDR internally (hardware for 10-bit video, etc.), but needs an extra chip on the motherboard to send an HDR signal out over HDMI.
Is the LSPCON function such that it can only be done by a dedicated chip, or is it possible for Apple to emulate LSPCON functionality with the T2 or another chip already on board?
 

SpacemanSpiffed

macrumors regular
Mar 27, 2013
183
265
Pacific NW
Is the LSPCON function such that it can only be done by a dedicated chip, or is it possible for Apple to emulate LSPCON functionality with the T2 or another chip already on board?
No idea. I think we are in 'wait and see' mode to see whether by HDMI 2.0 they meant "all of HDMI 2" or "2.0 specifically".

Intel's Kaby Lake NUCs (not even the latest) have 2.0a and HDR support. Seems crazy that Apple would overlook that.
 
  • Like
Reactions: ElectronGuru

WMU328

macrumors newbie
Original poster
Oct 27, 2017
26
9
No idea. I think we are in 'wait and see' mode to see whether by HDMI 2.0 they meant "all of HDMI 2" or "2.0 specifically".

Intel's Kaby Lake NUCs (not even the latest) have 2.0a and HDR support. Seems crazy that Apple would overlook that.
I agree. Wait-and-see mode. Looking forward to some answers!
 
  • Like
Reactions: SpacemanSpiffed

Chancha

macrumors 6502a
Mar 19, 2014
941
830
So macOS itself is HDR-ready? Meaning that as long as the GPU and the output path support HDR, the OS can show the entire interface respecting HDR?
 

Chancha

macrumors 6502a
Mar 19, 2014
941
830
This really sucks. It essentially diminishes the mini's potential as a modern HTPC. Some HDR-capable 4K TVs cost less than a MacBook Pro nowadays.
 
  • Like
Reactions: trifid

EugW

macrumors G3
Jun 18, 2017
9,037
6,418
Good question. We shall see soon enough, but I'd be surprised if it did not support 10-bit colour. That's a pretty basic feature for some of its target audience - video and multimedia content creators.
 

SpacemanSpiffed

macrumors regular
Mar 27, 2013
183
265
Pacific NW
So macOS itself is HDR-ready? Meaning that as long as the GPU and the output path support HDR, the OS can show the entire interface respecting HDR?
No it's not. See the answer from SpacemanSpiffed.
Actually, I'm unsure if the OS itself needs to be "aware" of HDR. It's really a byproduct of the application doing the image processing and of the data being processed.

Hardware-wise, HDR seems to mean support for 10-bit color components and getting them to the display, so you have 1 billion possible colors instead of 16 million. All recent GPUs support 10-10-10-2 pixel formats (10 bits for each of R, G, and B, and 2 bits for alpha). You've probably seen monitor specs quoting either 6-bit + FRC (a form of dithering, and meh) or 8-bit color components, as well as the whole "how much of which color gamut" they cover. The newest high-end pro monitors support 10-bit uncompressed components with 16-bit internal LUTs for the color curves, but you still have the issue of brightest vs. darkest, which is impacted by panel type, backlighting, etc. (OLED panels have much "blacker" blacks, for example).

I guess I'm saying "it's complicated", and I don't know that we have an HDR "standard" yet.
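
To make the 10-bit arithmetic concrete, here's a tiny Swift sketch (purely illustrative - the packing function is hypothetical, not any particular GPU API):

Code:
// 8-bit vs 10-bit: number of representable RGB colors.
let colors8bit  = 1 << 24  // 2^8 per channel x 3 channels = 16,777,216
let colors10bit = 1 << 30  // 2^10 per channel x 3 channels = 1,073,741,824
print("8-bit: \(colors8bit), 10-bit: \(colors10bit)")

// Packing one pixel in a 10-10-10-2 layout: 10 bits each for R, G, B,
// and 2 bits for alpha, as described above.
func packRGB10A2(r: UInt32, g: UInt32, b: UInt32, a: UInt32) -> UInt32 {
    precondition(r < 1024 && g < 1024 && b < 1024 && a < 4)
    return r | (g << 10) | (b << 20) | (a << 30)
}

let pixel = packRGB10A2(r: 1023, g: 512, b: 0, a: 3)  // full red, half green, opaque
print(String(pixel, radix: 16))  // c00803ff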
 
  • Like
Reactions: ElectronGuru

Chancha

macrumors 6502a
Mar 19, 2014
941
830
I think we can stick to the DisplayHDR certification as an "HDR standard" reference, since future displays are likely to be certified against it, at least on the consumer end, which is what matters for the mini used as an HTPC.

But my question above about the OS itself is more about the editing / color-grading prospects. I remember back in the cheese-grater Mac Pro days, still on OS X 10.6.8 and FCP 7: since the OS itself did not support 10-bit, we had to install a Blackmagic Intensity Pro, a 2-lane PCIe card that handles 10-bit 4:2:2 HDMI, and we connected a Sony PVM OLED to it. The monitor could only take input from "supported" applications; it could not act as a regular display recognized by the OS and be used to mirror / extend the desktop, for instance, so the usefulness of such a setup was limited.

If the mini's limitation is just in its built-in I/O, then we can hope a TB3/USB-C external video card can deal with this. If it is an OS- or firmware-level limit, then it is a dead end, for now.
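
One way to at least probe what macOS itself claims is via AppKit - a rough Swift sketch (it reports what the OS believes per display, not whether the port/cable chain can actually carry HDR; canRepresent(_:) needs macOS 10.12+):

Code:
import AppKit

// For each attached display, ask the OS about P3 wide-gamut support and
// the maximum EDR color component value (> 1.0 means headroom beyond SDR).
for (i, screen) in NSScreen.screens.enumerated() {
    let p3  = screen.canRepresent(.p3)
    let edr = screen.maximumExtendedDynamicRangeColorComponentValue
    print("Display \(i): P3 = \(p3), max EDR component = \(edr)")
}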
 


WMU328

macrumors newbie
Original poster
Oct 27, 2017
26
9
I believe the S2719DC is 1440p, so even at 10-bit it needs much less HDMI bandwidth than 4K HDR would. But this is still reassuring, since HDR seems to be respected at the OS level.
I wonder if an eGPU would help. If I have two 4K HDR monitors, what are my options? I think it would have to be an eGPU. I hope a review will cover this soon.
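
To put rough numbers on the bandwidth difference (active pixels only, ignoring blanking and link overhead - a sketch, not exact link rates):

Code:
// 10-bit-per-channel data rates for the active pixels alone.
let bitsPerPixel = 30.0  // 10 bits each for R, G, B
let qhd = 2560.0 * 1440.0 * 60.0 * bitsPerPixel / 1e9  // ~6.6 Gbit/s
let uhd = 3840.0 * 2160.0 * 60.0 * bitsPerPixel / 1e9  // ~14.9 Gbit/s
print("1440p60 10-bit: \(qhd) Gbit/s; 4K60 10-bit: \(uhd) Gbit/s")
// 4K HDR needs well over twice the data rate, so the HDMI/DisplayPort
// version on the output (built-in or eGPU) matters much more at 4K.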
 

archer75

macrumors 68030
Jan 26, 2005
2,924
1,472
Oregon
Thanks for the explanations.

The alternative would be to use a NAS with Apple TV 4K & Infuse apparently.
An ATV 4K is wonderful! I use one with a DAS on my desktop (though you don't need a DAS or NAS, depending on your setup), primarily with Plex, though I use Infuse when needed. I also have Rokus, Chromecasts, an Nvidia Shield, and more, and I use Plex to stream outside the home to friends and family.
 