Mac Mini HDR?

Discussion in 'Mac mini' started by WMU328, Nov 3, 2018.

  1. WMU328 macrumors newbie

    Joined:
    Oct 27, 2017
    #1
    I know that HDMI 2.0a is needed for HDR. The Mac mini has HDMI 2.0, so we can't use HDR out of the box. But what if an eGPU is used? Would HDR be possible then, or is there another catch? I want to use this for editing HDR videos, and also for watching HDR content.

    Thanks
     
  2. benzslrpee macrumors 6502

    benzslrpee

    Joined:
    Jan 1, 2007
    #2
    If the eGPU can support HDR (e.g. the Blackmagic eGPU Pro), then you are good to go in terms of graphics. The other catch is the monitor. I was searching for an HDR-capable, wide color gamut display... the pickings were a bit sparse.

     
  3. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
    #3
    Hello there

    Are you sure HDMI 2.0 can’t handle HDR? Is the Mac mini HDMI 2.0 or 2.0a?
    --- Post Merged, Nov 3, 2018 ---
    I called Apple and they confirmed it’s not compatible. That sucks.
     
  4. SpacemanSpiffed macrumors regular

    SpacemanSpiffed

    Joined:
    Mar 27, 2013
    Location:
    Pacific NW
    #4
    That sucks. All they needed to do was include an LSPCON (Level Shifter and Protocol Converter) on the motherboard. The GPU has had all the needed 10-bit support since Kaby Lake. I hope the person you spoke with was wrong... it seems like something they wouldn't have missed.

    (And you don't need the LSPCON for 10-bit/HDR over DisplayPort - that should be possible no matter what.)
     
  5. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
  6. SpacemanSpiffed macrumors regular

    SpacemanSpiffed

    Joined:
    Mar 27, 2013
    Location:
    Pacific NW
    #6
  7. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
    #7
    Thanks for the explanations.

    The alternative would be to use a NAS with Apple TV 4K & Infuse apparently.
     
  8. ElectronGuru macrumors 65816

    Joined:
    Sep 5, 2013
    Location:
    Oregon, USA
    #8
    Is the LSPCON function such that it can only be done by a dedicated chip, or is it possible for Apple to emulate LSPCON functionality with the T2 or another chip already on board?
     
  9. SpacemanSpiffed macrumors regular

    SpacemanSpiffed

    Joined:
    Mar 27, 2013
    Location:
    Pacific NW
    #9
    No idea. I think we are in 'wait and see' mode, to see if by HDMI 2.0 they meant "all of HDMI 2" or "2.0 specifically".

    Intel's Kaby Lake NUCs (not even the latest) have 2.0a and HDR support. Seems crazy that Apple would overlook that.
     
  10. WMU328 thread starter macrumors newbie

    Joined:
    Oct 27, 2017
    #10
    I agree. Wait and see mode. Looking forward to some answers!!
     
  11. SpacemanSpiffed macrumors regular

    SpacemanSpiffed

    Joined:
    Mar 27, 2013
    Location:
    Pacific NW
    #11
    We will get them this week, which you have to admit is way better than it has been. :)
     
  12. WMU328 thread starter macrumors newbie

    Joined:
    Oct 27, 2017
    #12
    AGREED!!!
     
  13. Chancha macrumors 6502a

    Joined:
    Mar 19, 2014
    #13
    So macOS itself is HDR-ready? Meaning that as long as the GPU and the output path support HDR, the OS can show the entire interface respecting HDR?
     
  14. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
    #14
    No, it's not. See the answer from SpacemanSpiffed.
     
  15. Chancha macrumors 6502a

    Joined:
    Mar 19, 2014
    #15
    This really sucks. It essentially diminishes the mini's potential as a modern HTPC. Some of the HDR-capable 4K TVs cost less than an MBP nowadays.
     
  16. EugW macrumors 603

    EugW

    Joined:
    Jun 18, 2017
    #16
    Good question. We shall see soon enough but I’d be surprised if it did not support 10-bit colour. That’s a pretty basic feature for some of its target audience - video and multimedia content creators.
     
  17. SpacemanSpiffed macrumors regular

    SpacemanSpiffed

    Joined:
    Mar 27, 2013
    Location:
    Pacific NW
    #17
    Actually, I'm unsure if the OS itself needs to be "aware" of HDR. It's really a byproduct of the application doing the image processing and the data being processed.

    Hardware-wise, HDR seems to mean support for 10-bit color components and getting them to the display, so you have about 1 billion possible colors instead of 16 million. All recent GPUs support 10-10-10-2 pixel formats (10 bits for each of R, G, and B, plus 2 bits for alpha). You've probably seen monitor specs listed as either 6-bit + FRC (a form of dithering, and meh) or 8-bit color component support, as well as the whole 'how much of which color gamut' they cover. The newest high-end pro monitors support 10-bit uncompressed components with 16-bit internal LUTs for the color curves, but you still have the issue of brightest vs. darkest, which is impacted by panel type, backlighting, etc. (OLED panels having much 'blacker' blacks, for example).

    I guess I'm saying "it is complicated", and I don't know that we have an HDR "standard" yet.
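    For anyone who wants to poke at this from the application side, here is a minimal sketch (my own, just to illustrate the 10-bit / EDR point above, not anything specific to the new mini) that asks AppKit what each attached display reports for extended dynamic range and requests the 10-10-10-2 pixel format on a Metal layer:

    ```swift
    import AppKit
    import Metal
    import QuartzCore

    // Ask each attached display what extended-dynamic-range headroom it reports.
    // A plain SDR panel typically reports 1.0 here.
    for screen in NSScreen.screens {
        let edr = screen.maximumExtendedDynamicRangeColorComponentValue
        print("Max EDR color component value: \(edr)")
    }

    // Request the 10-10-10-2 pixel format discussed above for a Metal layer.
    let layer = CAMetalLayer()
    layer.device = MTLCreateSystemDefaultDevice()
    layer.pixelFormat = .bgr10a2Unorm   // 10 bits per RGB component, 2 bits alpha
    layer.wantsExtendedDynamicRangeContent = true
    ```

    Whether those 10 bits actually survive all the way to the panel still depends on the output path (HDMI/DP version, cable, monitor), which is exactly the part in question on the mini.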
     
  18. Chancha macrumors 6502a

    Joined:
    Mar 19, 2014
    #18
    I think we can stick to the DisplayHDR certification as an "HDR standard" reference, since future displays are likely going to be certified with it, at least on the consumer end, which is what matters for the mini used as an HTPC.

    But my question above about the OS itself is more about the editing / color grading prospects. I remember back in the Cheese Grater Mac Pro days, still on OS X 10.6.8 and FCP 7: since the OS itself did not support 10-bit, we had to install a Blackmagic Intensity Pro, which is just a 2-lane PCIe card that handles 10-bit 4:2:2 HDMI, to which we connected a Sony PVM OLED. The monitor could only get input from "supported" applications; it could not act as a regular display recognized by the OS and be used to mirror / extend the desktop, for instance, so the usefulness of such a setup was not that great.

    If the mini's limitation is just in its built-in I/O, then we can hope a TB3/USB-C external video card can deal with this. If it is an OS- or firmware-level limit, then it is a dead end, for now.
     
  19. ladodger5 macrumors newbie

    ladodger5

    Joined:
    Sep 12, 2016
    Location:
    Los Angeles
    #19
  20. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
  21. Chancha macrumors 6502a

    Joined:
    Mar 19, 2014
    #21
    I believe the S2719DC is 1440p, so even at 10-bit it needs much less HDMI bandwidth than 4K HDR would. But this is still reassuring, since it seems HDR is respected at the OS level.
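    Rough numbers, for anyone curious why the 1440p case is easier. This is just back-of-the-envelope arithmetic (assuming 60 Hz, full RGB 4:4:4, and ignoring blanking overhead):

    ```swift
    // Back-of-the-envelope uncompressed video data rates.
    // Assumes 60 Hz, RGB 4:4:4, and ignores blanking overhead.
    func dataRateGbps(width: Int, height: Int, hz: Int, bitsPerComponent: Int) -> Double {
        let bitsPerPixel = bitsPerComponent * 3              // R, G, B
        return Double(width * height * hz * bitsPerPixel) / 1_000_000_000
    }

    let qhd10 = dataRateGbps(width: 2560, height: 1440, hz: 60, bitsPerComponent: 10)
    let uhd10 = dataRateGbps(width: 3840, height: 2160, hz: 60, bitsPerComponent: 10)

    print("1440p60 @ 10-bit ≈ \(qhd10) Gbps")   // ~6.6 Gbps
    print("4K60 @ 10-bit ≈ \(uhd10) Gbps")      // ~14.9 Gbps
    ```

    HDMI 2.0's usable payload is roughly 14.4 Gbps, so 1440p 10-bit fits easily, while 4K60 10-bit RGB is right at the limit, which is why 4K60 HDR typically falls back to 4:2:2 or 4:2:0 chroma subsampling.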
     
  22. WMU328 thread starter macrumors newbie

    Joined:
    Oct 27, 2017
    #22
    I wonder if an eGPU would help. If I have two 4K HDR monitors, what are my options? I think it would have to be an eGPU. I am waiting for a review that will cover this soon.
     
  23. archer75 macrumors 68020

    Joined:
    Jan 26, 2005
    Location:
    Oregon
    #23
    An ATV 4K is wonderful! I use it with a DAS on my desktop (though you don't need a DAS or NAS, depending on your setup), primarily with Plex, though I use Infuse when needed. I also have Rokus, Chromecasts, an Nvidia Shield and more. And I use Plex to stream outside the home to friends and family.
     
  24. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
    #24
    If someone has the time and an OLED TV that supports HDR, would you do us a favor & try one of these files to see if HDR is correctly output?

    https://4kmedia.org/

    Thanks!
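    If anyone does try, it may also be worth confirming the downloaded clip itself is tagged as HDR before judging the mini's output. Here is a rough Swift sketch using AVFoundation (the file path is just a placeholder; this only inspects the file's colour metadata, not what the HDMI port actually sends):

    ```swift
    import AVFoundation
    import CoreMedia

    // Placeholder path to one of the downloaded 4kmedia.org sample clips.
    let url = URL(fileURLWithPath: "/Users/me/Downloads/hdr-sample.mp4")
    let asset = AVURLAsset(url: url)

    for track in asset.tracks(withMediaType: .video) {
        for case let desc as CMFormatDescription in track.formatDescriptions {
            // A PQ (SMPTE ST 2084) or HLG transfer function here means the clip is tagged as HDR.
            if let transfer = CMFormatDescriptionGetExtension(
                desc, extensionKey: kCMFormatDescriptionExtension_TransferFunction) {
                print("Transfer function: \(transfer)")
            } else {
                print("No transfer function tag found")
            }
        }
    }
    ```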
     
  25. gigatoaster macrumors 6502

    gigatoaster

    Joined:
    Jul 22, 2018
    Location:
    Singapore
    #25
    Could anyone with a new Mac mini try, please?
     
