Anyone with a new Mac mini could try, please??
Has anyone been able to try this?
To see HDR, both the video card and the display must support HDR (one of the versions: HDR10, HDR10+, HLG, or Dolby Vision). And then to see it you have to be playing content that supports HDR.
This is my first Mac computer; I have always had Windows. When I would enable HDR on my Windows PC I would get a green screen, but on the Mac mini when I enable HDR I do get a picture, and 4K video looks beautiful. Although I'm not completely sure if I am getting true HDR. I am using a Samsung 40" 4K HDR monitor as my display. I am not very tech savvy, but here is my graphics/display info.
Intel UHD Graphics 630:
Chipset Model: Intel UHD Graphics 630
Type: GPU
Bus: Built-In
VRAM (Dynamic, Max): 1536 MB
Vendor: Intel
Device ID: 0x3e9b
Revision ID: 0x0000
Metal: Supported, feature set macOS GPUFamily2 v1
Displays:
SAMSUNG:
Resolution: 3840 x 2160 (2160p 4K UHD - Ultra High Definition)
UI Looks like: 1920 x 1080 (1080p FHD - Full High Definition)
Framebuffer Depth: 30-Bit Color (ARGB2101010)
Main Display: Yes
Mirror: Off
Online: Yes
Rotation: Supported
Adapter Type: DVI or HDMI
Automatically Adjust Brightness: No
Adapter Firmware Version: 2.08
Television: Yes
I just got a Mac mini.
When I play HDR content, I have access to the HDR settings displayed on my Panasonic TV:
[screenshot attached]
When I play the demo files listed, my TV does not show HDR, but when I play the same file on an Apple TV 4K, it shows in HDR via Infuse Pro.
I have many settings on my TV like HDMI mode, Rec. 2020 etc and I don't know what to try.
Before buying the Mac mini, I called Apple and they told me it can't do HDR.
Here is the display information:
[screenshot attached]
That's not really HDR. Yes, your TV has HDR settings, but the mini isn't sending HDR.
Actually, it looks like it does support HDR.
I activated on my TV the following settings:
Menu > Picture > Option Settings > HDMI EOTF Type > PQ, and HDMI Colorimetry Type > Rec. 2020, and now I have access to HDR settings on the Mac mini:
[screenshot attached]
However, OPTIONS is slightly different:
[screenshots attached]
I'll ask Panasonic.
So in order to activate HDR, you might need to play with the options on your TV, depending on the brand...
An ATV4k is going to be a lot cheaper!
Thanks for trying! I wonder if an eGPU could send HDR? That may be the next option; if not, hopefully the Mac Pro!
That's not really HDR. Yes, your TV has HDR settings, but the mini isn't sending HDR.
An ATV4k is going to be a lot cheaper!
That's not really HDR. Yes, your TV has HDR settings, but the mini isn't sending HDR.
You can force an HDR mode on the TV. It's just another preset like game, sports, calibrated, vivid, etc. It just gives it a look, but it's not HDR.
Could you clarify? I thought it was "HDR" or "no HDR", but I don't get the "not really HDR".
I don't think the GPU is the limit here; the mini's built-in HDMI port not being HDMI 2.0a is.
Will an eGPU solve this?
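As an aside, even with HDMI 2.0-class bandwidth, 4K60 HDR is tight. A back-of-the-envelope check (assuming the standard CTA-861 4K60 timing of 4400 x 2250 total pixels including blanking, and HDMI 2.0's roughly 14.4 Gbit/s effective data rate after 8b/10b encoding) shows why 4K60 at 10-bit usually requires chroma subsampling:

```python
# Back-of-the-envelope HDMI bandwidth check for 4K60 signals.
# Assumes CTA-861 4K60 timing: 4400 x 2250 total pixels at 60 Hz,
# i.e. a 594 MHz pixel clock.
PIXEL_CLOCK_HZ = 4400 * 2250 * 60        # 594,000,000 Hz

HDMI20_EFFECTIVE_GBPS = 14.4             # 18 Gbit/s raw, 8b/10b encoded

def data_rate_gbps(bits_per_pixel: float) -> float:
    """Uncompressed video data rate in Gbit/s for a given pixel format."""
    return PIXEL_CLOCK_HZ * bits_per_pixel / 1e9

print(data_rate_gbps(24))   # 8-bit RGB 4:4:4  -> 14.256 Gbit/s (fits)
print(data_rate_gbps(30))   # 10-bit RGB 4:4:4 -> 17.82 Gbit/s (does not fit)
print(data_rate_gbps(15))   # 10-bit 4:2:0     -> 8.91 Gbit/s (fits)
```

So even a port with full HDMI 2.0 bandwidth has to drop to 4:2:2 or 4:2:0 for 4K60 10-bit; the 2.0a question is separate and is about whether the port sends the HDR static metadata at all.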
I don't think the GPU is the limit here, the mini's built-in HDMI port not being HDMI 2.0a is.
I picked up this monitor to use with my mini:
https://www.bestbuy.com/site/lg-27u...nc-monitor-gray-white/6204328.p?skuId=6204328
It's 4K and has HDR. I will set it up later today and see what the deal is. I would be at least slightly disheartened if the mini did not support HDR.
But my Windows box with a 2080 Ti will be connected to it as well, so I'll get HDR one way or the other!
Did you get a chance to try it? Curious on the results!
I don't think I was able to get HDR out of my mini. The monitor supports it, but it did not look like HDR to me.
How would you distinguish 10-bit from 8-bit? The eye can distinguish around 10 million colors, fewer than the roughly 16.7 million offered by 8 bits per channel. Using the sRGB gamut, you simply can't tell any difference between 8 and 10 bits.
Bit depth is only relevant in wider color gamuts, and when processing data for intermediate purposes. RAW images are 14-bit for that reason.
For 10 bit monitors to be useful, certain things need to apply: 1) the content must be 10+ bits; 2) the cables must support 10 bits; 3) the device must transmit 10 bits properly; 4) the monitor must have wide gamut. Only then can you perceive some colors that regular screens can't, but you would only know by direct comparison.
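The color counts mentioned above come straight from the bit depth per channel (a quick arithmetic check):

```python
# Total representable colors for a given per-channel bit depth
# (three channels: R, G, B).
def colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(colors(8))    # 16,777,216    (~16.7 million, "8-bit")
print(colors(10))   # 1,073,741,824 (~1.07 billion, "10-bit")
```

The jump from ~16.7 million to ~1.07 billion values only buys smoother gradients and extra headroom; as noted above, within sRGB the extra steps are below what the eye can separate.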
The content is good, it was:
https://4kmedia.org/sony-camping-in-nature-4k-demo/
The monitor is 10-bit (or so LG claims; 8-bit + A-FRC) and supports HDR10:
https://www.lg.com/us/monitors/lg-27UK600-W-4k-uhd-led-monitor
I used the HDMI cable included with the monitor. The content ran smoothly with no issues.
It's true that I had no comparison, so who really knows if it was showing HDR.