Now that OS X supports 10-bit color... will cMP with Nvidia drivers do it?

PowerMike G5

macrumors 6502
Original poster
Oct 22, 2005
405
121
New York, NY
Now that it seems OS X El Capitan supports 10-bit color output, is there anyone running a cMP with the latest Nvidia web drivers (and a Maxwell-based card) who has a true 10-bit monitor connected to it? I want to know if the pixel depth shows up as 10-bit output in system preferences.

I currently have the Apple 27" LED Cinema Display, which is only an 8-bit panel.

I am considering one of these 4K 10-bit panels now as the main computer monitor, but any confirmation of the above would help me a lot before trying to just buy and see if it works.
 

dmylrea

macrumors 68030
Sep 27, 2005
2,695
3,094
Now that it seems OS X El Capitan supports 10-bit color output, is there anyone running a cMP with the latest Nvidia web drivers (and a Maxwell-based card) who has a true 10-bit monitor connected to it? I want to know if the pixel depth shows up as 10-bit output in system preferences.

I currently have the Apple 27" LED Cinema Display, which is only an 8-bit panel.

I am considering one of these 4K 10-bit panels now as the main computer monitor, but any confirmation of the above would help me a lot before trying to just buy and see if it works.
Where would I see that? I have a Dell P2715Q, and in System Report it says my pixel depth is "Pixel Depth: 32-Bit Color (ARGB8888)"
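To answer "where would I see that": System Report shows the framebuffer format string in parentheses on the Pixel Depth line, and that string is what distinguishes an 8-bit from a 10-bit output. Reportedly, an 8-bit pipeline shows `32-Bit Color (ARGB8888)` while El Capitan's 10-bit mode shows `30-Bit Color (ARGB2101010)` — treat the 10-bit string as an assumption rather than confirmed here. A minimal sketch of decoding that line (the output of `system_profiler SPDisplaysDataType`):

```python
import re

def framebuffer_bits(pixel_depth_line):
    """Return bits per channel implied by a system_profiler
    'Pixel Depth' line, or None if the format is unrecognized.
    ARGB8888 = 8 bits/channel, ARGB2101010 = 10 bits/channel
    (the 10-bit string is an assumption about El Capitan)."""
    m = re.search(r"\((ARGB\d+)\)", pixel_depth_line)
    if not m:
        return None
    return {"ARGB8888": 8, "ARGB2101010": 10}.get(m.group(1))

# The line quoted above indicates an 8-bit-per-channel framebuffer:
print(framebuffer_bits("Pixel Depth: 32-Bit Color (ARGB8888)"))     # 8
print(framebuffer_bits("Pixel Depth: 30-Bit Color (ARGB2101010)"))  # 10
```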
 

PowerMike G5

macrumors 6502
Original poster
Oct 22, 2005
405
121
New York, NY
Well, your monitor is 10-bit, but your system report is showing it is operating at 8-bit.

Hmmm... I read somewhere that you need an Nvidia Quadro for 10-bit output, but I was hoping the GTX 960-980 line would do it with this latest update.
 
Jul 4, 2015
4,491
2,507
Paris
The GTX 980 sends 10-bit to my Eizo under Windows. I have it set in the Nvidia display properties and enabled the 30-bit option in Photoshop. On OS X, however, there are no options anywhere for choosing between 8- and 10-bit, so I assume it sends the default 8-bit signal. Fortunately my monitor has internal processing up to 14-bit, so as long as it is calibrated I can't tell the difference between the Mac's default and Windows' 10-bit setting.
 

PowerMike G5

macrumors 6502
Original poster
Oct 22, 2005
405
121
New York, NY
Well, just an update: I decided to get the LG full 4K DCI monitor, and it runs flawlessly (so far) from the GTX 980 in the cMP in full 4K DCI/60Hz mode. I used SwitchResX to enable 10-bit and it switched over just fine. So far, so good.
 

netkas

macrumors 65816
Oct 2, 2007
1,122
292
Yeah, works with a GTX 980 and a MacPro3,1 here as well, but it needs app support, as was said.
So far the situation is the same as with Metal.
 

hockeyamd

macrumors member
Jun 22, 2006
73
12
Well, just an update: I decided to get the LG full 4K DCI monitor, and it runs flawlessly (so far) from the GTX 980 in the cMP in full 4K DCI/60Hz mode. I used SwitchResX to enable 10-bit and it switched over just fine. So far, so good.
How did you enable 10 bit in switchresx?
Thanks
 

VAGDesign

macrumors 6502
Feb 1, 2014
344
187
Greece
How did you enable 10 bit in switchresx?
Thanks
After you install it, the SwitchResX icon in the menu bar will show a selection for Colors. Select Billions instead of Millions. You will then see the new profile in the graphics info in System Profiler.
 
Jul 4, 2015
4,491
2,507
Paris
If you really do need 10-bit colour in your workflow and need a 10-bit monitor, bear in mind that many of the cheapish monitors that claim a 'billions of colours' display are not true 10-bit. There was discussion of this issue on pro and PC forums last year, and the upshot is that some of those monitors are 8-bit panels with a bit of dithering-style trickery to look richer.