Discussion in 'Mac Pro' started by high heaven, Mar 24, 2018.
I'm using a GTX 1060 6GB, but do I need a workstation GPU like the Quadro P2000 to get 10-bit color support?
10-bit colour may work in some cases, but Nvidia has said it is currently not supported in their macOS driver.
For your info, I am using a 1080 Ti.
So how to use a 10-bit color monitor?
Some of the old Mac Quadro cards will give out 10-bit (I think).
Or maybe the RX 560/580 will give 10-bit out in macOS?
Or boot Windows.
Or use a specialized PCI card for video editing, like those from blackmagicdesign.com, etc.
You don't have driver support with Nvidia, and only the latest AMD cards support it in the drivers: the Mac Pro 2013's GPUs, Polaris, and Vega.
Creatives and the print industry have been using 8-bit and 16-bit images on Macs without 10-bit support for many years. You will have an 8-bit output with dithering. Professional-level monitors have up to 14-bit internal processing and calibration settings to help you tune the display.
In 99% of cases it won't make a difference whether you display an 8-bit image (as most are) on an 8-bit or a 10-bit screen. It mainly makes a difference when images have long gradients and deep shadows.
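To make the gradient point concrete, here is a small sketch (my own toy example, not anything from a vendor) that quantizes a smooth ramp at 8-bit and 10-bit depth and counts the distinct output levels. Fewer levels means coarser steps across a long gradient, which is exactly the banding being discussed:

```python
# Quantize a smooth 0..1 ramp at a given bit depth and count the
# distinct output levels. Coarser steps across a long gradient are
# what show up visually as banding.
def quantize(values, bits):
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return [round(v * levels) / levels for v in values]

gradient = [i / 9999 for i in range(10000)]   # smooth ramp, 10k samples

steps_8 = len(set(quantize(gradient, 8)))     # 256 distinct levels
steps_10 = len(set(quantize(gradient, 10)))   # 1024 distinct levels
print(steps_8, steps_10)
```

So a 10-bit pipeline has four times as many steps to spread across the same gradient, which is why the difference only really shows on long, smooth ramps and dark shadow regions.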
On Windows, the AMD and Nvidia drivers only show the setting if a genuine 10-bit monitor is attached. If it is one of the 8-bit + dithering monitors that the manufacturer markets as '10-bit', the setting doesn't appear.
In 2-3 years only real 10-bit HDR panels will be on the market, and the driver situation will have improved by then.
So an RX 560 is a budget option to get 10-bit video out, or something like a DeckLink Mini Monitor: https://www.blackmagicdesign.com/products/decklink/techspecs/W-DLK-05
I used the Blackmagic Intensity on a 3,1 for FCP briefly; IIRC it could handle 10-bit. The trick is that the connected monitor does not act as an extended display handled by macOS itself (which would require a graphics driver). Their DeckLink lineup uses a similar approach and supports more modern applications, I think.
Blackmagic Intensity cards act the same way as the DeckLink. The UltraStudio Mini Monitor also does 10-bit, but it is only available with a Thunderbolt connection.
AJA Kona cards were handled in a similar way. There were add-ons or plugins for Photoshop that allowed images to be output to the monitor for precise color. FCP7 and the Adobe apps both worked with AJA as an output device as well. AJA did have a desktop-extension function built into their driver; I believe it needed to be enabled through System Preferences.
@high heaven, just wondering what the intended use of the display is. All my replies have been under the assumption that you want to use it for professional work in a graded environment. If it's just for watching films etc. (in an uncalibrated environment), then the display will work fine being sent 8-bit or 10-bit signals.
It will still work.
PS: some of the AJA Kona cards are super cheap on eBay. I just looked quickly and haven't looked into it in detail, but I was surprised to see some going so cheap.
Anyway, the use case of the display will matter a lot for what you want or need. As long as it's calibrated, most work like this will be fine with 8-bit; 10-bit is valuable in fewer cases (nice to have, but not needed for lots of things).
The rabbit hole of 10-bit displays, 10-bit GPU output, apps that correctly work with a 10-bit workflow, calibration, and different color spaces is fun to look into. But since most media is consumed on 8-bit displays in less high-end color spaces, having a second, cheaper display to test/preview content on can help a lot, depending on what you want to do.
Notably, most "10-bit" displays are 8-bit panels using FRC, which does simulate more hues to a degree, but I don't know of any sub-$1,000-ish 4K displays sporting true 10-bit. I just put in for a GTX 1060, so fingers crossed we'll get 10-bit support, but most of us probably don't have true 10-bit panels, so we're not missing as much.
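For anyone curious what 8-bit + FRC actually does: the panel can only show 8-bit codes, but by alternating two neighbouring codes over a cycle of frames, the time-averaged level approximates an in-between 10-bit value. A minimal toy model (my own sketch, not how any specific panel controller is implemented):

```python
# Toy model of 8-bit + FRC ("frame rate control"): a 10-bit target code is
# approximated by flashing two adjacent 8-bit codes over a 4-frame cycle so
# that their time average lands on the in-between level.
# (Toy model only -- it ignores the clamping a real controller needs at the
# very top of the range, where base + 1 would exceed 255.)
def frc_frames(target_10bit, cycle=4):
    base = target_10bit // 4       # nearest lower 8-bit code
    remainder = target_10bit % 4   # leftover quarters of a code
    # Show the higher code in `remainder` of the `cycle` frames.
    return [base + 1] * remainder + [base] * (cycle - remainder)

frames = frc_frames(514)           # 10-bit 514 sits between 8-bit 128 and 129
print(frames)                      # [129, 129, 128, 128]
print(sum(frames) / len(frames))   # 128.5, i.e. 10-bit 514 / 4
```

The eye averages the flicker, so you perceive roughly the intermediate shade; a true 10-bit panel just drives that level directly, with no temporal dithering.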
I think the RX 580 supports 10-bit displays.
At least my displays are reported as 30-bit color.
I was looking at the NEC PA272W monitor in my searches for an affordable color-critical display. That one is 10-bit emulation via 8-bit + FRC. But as I began searching again today, the new NEC PA271Q-BK, which is meant to replace the PA272W, claims to have true 10-bit and costs about $1300. If this is true, then pairing it with something like a Blackmagic DeckLink will give me the accuracy I need for DaVinci Resolve, Nuke, and Photoshop.
I'm going to look into it a little more.
I'm going to use a monitor like this at the office I work at... one connected to a Windows machine and another to an iMac Pro. Does anyone here know if the iMac Pro can send a 10-bit video signal via Thunderbolt/DisplayPort, given that it has a Vega card that would support it?
Would be great to not have to get another Decklink if I can avoid it.