Has anyone been able to try this?

This is my first Mac computer; I have always had Windows. When I would enable HDR on my Windows PC I would get a green screen, but on the Mac mini when I enable HDR I do get a picture and 4K video looks beautiful, although I'm not completely sure I am getting true HDR. I am using a Samsung 40" 4K HDR monitor as my display. I am not very tech savvy, but here is my graphics/display info.

Intel UHD Graphics 630:

Chipset Model: Intel UHD Graphics 630

Type: GPU

Bus: Built-In

VRAM (Dynamic, Max): 1536 MB

Vendor: Intel

Device ID: 0x3e9b

Revision ID: 0x0000

Metal: Supported, feature set macOS GPUFamily2 v1

Displays:

SAMSUNG:

Resolution: 3840 x 2160 (2160p 4K UHD - Ultra High Definition)

UI Looks like: 1920 x 1080 (1080p FHD - Full High Definition)

Framebuffer Depth: 30-Bit Color (ARGB2101010)

Main Display: Yes

Mirror: Off

Online: Yes

Rotation: Supported

Adapter Type: DVI or HDMI

Automatically Adjust Brightness: No

Adapter Firmware Version: 2.08

Television: Yes
 
To see HDR, both the video card and the display must support HDR (one of the versions: HDR10, HDR10+, HLG, Dolby Vision), and then to see it you have to be playing content that supports HDR.
If you're watching a movie that is HDR, the TV will typically display an HDR icon when the movie starts. If I play an Xbox game with HDR, the same thing happens when the game launches. If the game doesn't support HDR, nothing happens.
In some cases you can force a sort of HDR on, but the content just won't look right. A lot of people think they're getting HDR when they really aren't.
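As a rough sanity check of what macOS itself reports for the attached display, a small Swift sketch along these lines can be run from Xcode or a playground (the EDR property assumes macOS 10.15 or later; on the Mojave-era mini in this thread only the first two lines apply):

import AppKit

// Minimal sketch: print what macOS reports for each attached display.
// 10 bits per sample corresponds to the "30-Bit Color (ARGB2101010)" framebuffer above;
// an EDR value above 1.0 means the screen can be driven brighter than SDR white.
for screen in NSScreen.screens {
    print("Bits per sample:", NSBitsPerSampleFromDepth(screen.depth))
    print("Color space:", screen.colorSpace?.localizedName ?? "unknown")
    if #available(macOS 10.15, *) {   // not available on Mojave
        print("Potential EDR headroom:",
              screen.maximumPotentialExtendedDynamicRangeColorComponentValue)
    }
}

Even a 10-bit framebuffer only shows the pipe is wide enough; it doesn't prove HDR metadata is actually being sent to the TV.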
 
I just got a Mac mini.

When I play HDR content, I have access to HDR settings and HDR is displayed on my Panasonic TV:

[attached photo]


When I play the demo files listed, my TV is not showing HDR, but when I play the same file on the Apple TV 4K, it shows in HDR via Infuse Pro.

I have many settings on my TV like HDMI mode, Rec. 2020, etc., and I don't know what to try.

Before buying the Mac mini, I called Apple and they told me it can't do HDR.

Here is the display information:

[attached screenshot of the display information]
 

Thanks for trying! I wonder if an eGPU could send HDR? That may be the next option; if not, hopefully the Mac Pro!
 
Actually, looks like it does support HDR.

I activated the following settings on my TV:

Menu > Picture > Option Settings > HDMI EOTF Type > PQ and HDMI Colorimetry Type > Rec. 2020, and now I have access to HDR settings on the Mac mini:

[attached screenshot]


However, OPTIONS is slightly different:

[attached screenshots]


I'll ask Panasonic.

So in order to activate HDR, you might need to play with the options on your TV, depending on the brand....
 
That's not really HDR. Yes, your TV has HDR settings, but the mini isn't sending HDR.
An ATV 4K is going to be a lot cheaper than an eGPU!
 

An Apple TV would work for just streaming; I am looking at editing HDR video. But for most people an Apple TV would be great.
 
Could you clarify? I thought it was either “HDR” or “no HDR”, but I don’t get the “not really HDR”.
You can force an HDR mode on the TV. It's just another preset like game, sports, calibrated, vivid, etc. It just gives it a certain look, but it's not HDR.
All you really do is make sure the TV has ultra wide color (or whatever it's called) enabled for the input the ATV 4K is plugged into. Leave your TV on whatever preset you normally watch on. On my LG it's calibrated bright or some such thing.

You set the ATV 4K to SDR and turn the two match settings (Match Dynamic Range and Match Frame Rate) on. Don't leave the ATV in HDR mode all the time. It's pointless, it messes up SDR content, and it makes for unnecessary switching.

Then when you watch a movie with HDR/Dolby Vision, your TV and ATV will automatically switch modes. Most TVs will have some sort of badge or icon pop up at that point saying HDR or Dolby Vision.

A lot of people leave their TV in some HDR mode and the ATV GUI in HDR, watch a movie without HDR, and think they're getting HDR when they're not.
 
Thanks for your message. I followed your instructions on the Apple TV 4K as someone else told me the same.

I just don't get the "fake HDR mode"...
 
Will an eGPU solve this?
I don't think the GPU is the limit here; the mini's built-in HDMI port not being HDMI 2.0a is.
You can use just the iGPU (the UHD 630 supports 10-bit 4:4:4) with a USB-C dongle or TB3 dock whose HDMI or DisplayPort version is high enough to support HDR.
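One rough way to confirm what mode the mini is actually driving over whichever port or dongle you end up using is to ask CoreGraphics. HDMI 2.0's bandwidth forces trade-offs between 4K60, 10-bit color, and chroma subsampling, so the refresh rate alone can hint whether the link is already maxed out. A minimal Swift sketch:

import CoreGraphics

// Minimal sketch: report the mode the main display is currently being driven at.
// A 30 Hz mode at 4K would already suggest the link is bandwidth-limited.
// (refreshRate can read 0 on some panels.)
if let mode = CGDisplayCopyDisplayMode(CGMainDisplayID()) {
    print("\(mode.width) x \(mode.height) @ \(mode.refreshRate) Hz")
}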
 
So an advisor was supposed to call and he didn’t. Must have forgotten.

I called back and a new advisor was trying to blame either the cable or the TV. Three times I had to tell him it was a brand new TV, not some ****** brand, and yes, the cables were fine, because HDR works from the Apple TV 4K.

Then he read the specs of the mini, which don't say whether it supports HDR or not.

It seems no one at Apple is able to answer that question. I just need confirmation. I’ll ask Tim.
 
Did you get a chance to try it? Curious about the results!

I got it all set up and running, and it looks damn nice. I didn't have much time, but when I get home from work I'll have more time and let you know.
 
I don't think I was able to get HDR out of my mini. The monitor supports it, but it did not look like HDR to me.
 

How would you distinguish 10 bit from 8 bit? The eye can distinguish about 10 million colors, fewer than the 16 million offered by 8 bits. Using the sRGB gamut, you simply can't tell any difference between 8 and 10 bits.

Bit depth is only relevant in wider color gamuts, and when processing data for intermediary purposes. RAW images are 14 bit for that reason.

For 10 bit monitors to be useful, certain things need to apply: 1) the content must be 10+ bits; 2) the cables must support 10 bits; 3) the device must transmit 10 bits properly; 4) the monitor must have wide gamut. Only then can you perceive some colors that regular screens can't, but you would only know by direct comparison.
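For concreteness, the color counts above come from three channels at the stated per-channel bit depth:

8 bits per channel: 2^(3×8) = 2^24 = 16,777,216 colors
10 bits per channel: 2^(3×10) = 2^30 = 1,073,741,824 colors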
 

The content is good, it was:

https://4kmedia.org/sony-camping-in-nature-4k-demo/

The monitor is 10-bit (or so LG claims; it's 8-bit + A-FRC) and supports HDR10:

https://www.lg.com/us/monitors/lg-27UK600-W-4k-uhd-led-monitor

I used the included HDMI cable that came with the monitor. The content ran smoothly with no issues.

It's true that I had no comparison, so who really knows if it was showing HDR.
 

This gradient map will tell you instantly if your 10 bit chain is proper: https://github.com/jursonovicst/gradient

You would have to use Photoshop or another application with actual support for 10 bits. You probably have to zoom in to see the actual difference between banding and continuous tones. It is most obvious in the darker gray.

When it comes to real content using 10 bits, banding will occur because of compression. Unless you have very large files, the advantage of 10 bits is wiped away immediately in compression.
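If you want to generate a test ramp yourself rather than download one, a Swift sketch like the following writes a 16-bit grayscale gradient to a TIFF that you can open in Preview or Photoshop and inspect for banding (dimensions and output path are arbitrary placeholders):

import AppKit

// Minimal sketch: write a horizontal 16-bit grayscale ramp to a TIFF.
// On a proper 10-bit chain the dark end should band noticeably less than on 8-bit.
let width = 1024, height = 256
let rep = NSBitmapImageRep(bitmapDataPlanes: nil, pixelsWide: width, pixelsHigh: height,
                           bitsPerSample: 16, samplesPerPixel: 1, hasAlpha: false,
                           isPlanar: false, colorSpaceName: .deviceWhite,
                           bytesPerRow: 0, bitsPerPixel: 0)!
rep.bitmapData!.withMemoryRebound(to: UInt16.self, capacity: rep.bytesPerRow / 2 * height) { px in
    let rowStride = rep.bytesPerRow / 2
    for y in 0..<height {
        for x in 0..<width {
            px[y * rowStride + x] = UInt16(x * 65535 / (width - 1))
        }
    }
}
try? rep.tiffRepresentation?.write(to: URL(fileURLWithPath: "gradient16.tiff"))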
 
This thread is hilarious. The mini doesn’t support HDR. NO Macs support HDR. If they did, Apple would be touting it to the ends of the earth.

The Mac Mini in 2018 is a terrible, terrible HTPC option for so many reasons. There’s a reason Apple removed the IR port.
 