
PowerMike G5 · macrumors 6502a · Original poster
Now that OS X El Capitan seems to support 10-bit color output, is anyone running a cMP on the latest Nvidia web drivers (with a Maxwell-based card) who has a true 10-bit monitor connected to it? I want to know whether the pixel depth shows up as 10-bit output in System Preferences.

I currently have the Apple 27" LED Cinema Display, which is only an 8-bit panel.

I am now considering one of these 4K 10-bit panels as my main monitor, but confirmation of the above would help a lot before I buy one just to see whether it works.
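If you want to check programmatically rather than eyeballing System Preferences, here is a minimal AppKit sketch (my own, not anything from Apple's 10-bit announcement) that prints the bits per sample the window server reports for each screen; whether it actually reports 10 in a 30-bit mode will depend on the driver:

```swift
import AppKit

// Print the depth the window server reports for each attached screen.
// NSBitsPerSampleFromDepth gives bits per color channel: 8 for "Millions
// of Colors", 10 when a 30-bit ("Billions") mode is active.
for (index, screen) in NSScreen.screens.enumerated() {
    let depth = screen.depth
    print("Screen \(index): \(NSBitsPerSampleFromDepth(depth)) bits per sample, "
        + "\(NSBitsPerPixelFromDepth(depth)) bits per pixel")
}
```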
 

Where would I see that? I have a Dell P2715Q, and System Report says my pixel depth is "Pixel Depth: 32-Bit Color (ARGB8888)".
 
Well, your monitor is 10-bit, but your System Report is showing it operating at 8-bit: ARGB8888 means 8 bits per channel, while a true 10-bit mode shows up as 30-bit color (ARGB2101010).

Hmmm... I read somewhere that you need an Nvidia Quadro for 10-bit output, but I was hoping the GTX 960–980 line would do it with this latest update.
 
The GTX 980 sends 10-bit to my Eizo under Windows: I have it set in the Nvidia display properties and enabled the 30-bit option in Photoshop. On OS X, however, there are no options anywhere for choosing between 8- and 10-bit, so I assume it sends the default 8-bit signal. Fortunately my monitor has internal processing up to 14-bit, so as long as it is calibrated I can't tell the difference between the Mac's default and Windows' 10-bit setting.
 
Well, just an update: I decided to get the LG full 4K DCI monitor, and it runs flawlessly (so far) from the GTX 980 in the cMP in full 4K DCI / 60 Hz mode. I used SwitchResX to enable 10-bit and it switched over just fine. So far, so good.
 
Yeah, works with a 980 and a MacPro3,1 here as well, but it needs app support, as was said.
So far the situation is the same as with Metal.
 

How did you enable 10-bit in SwitchResX?
Thanks
 
After you install it, click its icon in the menu bar and you will see a selection for Colors. Select Billions instead of Millions. You will then see the new mode in the Graphics info in System Profiler.
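If you'd rather verify the switch from code than from System Profiler, a small CoreGraphics sketch like this one (mine, and hedged: pixelEncoding has been deprecated since 10.11, though it still worked in the El Capitan era) lists the modes the main display offers along with their pixel encodings; a 30-bit mode shows a 10-bits-per-channel encoding string rather than ARGB8888:

```swift
import CoreGraphics

// Enumerate the main display's modes and print each pixel encoding.
let mainDisplay = CGMainDisplayID()
if let modes = CGDisplayCopyAllDisplayModes(mainDisplay, nil) as? [CGDisplayMode] {
    for mode in modes {
        // pixelEncoding is deprecated (10.11+) but still returned a string here
        if let encoding = mode.pixelEncoding {
            print("\(mode.width) x \(mode.height) @ \(mode.refreshRate) Hz: \(encoding)")
        }
    }
}
```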
 
If you really do need 10-bit colour in your workflow and need a 10-bit monitor, bear in mind that many of the cheapish monitors that claim a 'billions of colours' display are not true 10-bit. There was discussion about this issue on pro and PC forums last year, and the upshot is that some of those monitors are 8-bit panels with FRC-style dithering trickery to look richer.
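One rough way to eyeball whether you're getting true 10-bit rather than dithering is a shallow gradient test. The hypothetical AppKit view below (RampView is my own name, not from anything in this thread) draws a narrow gray ramp: on a genuine 10-bit pipeline it looks smooth, while on an 8-bit path you can usually count distinct bands. Bear in mind the window itself also needs a deep (Billions) backing mode, otherwise the ramp will band no matter what the panel can do:

```swift
import AppKit

// Hypothetical banding test: draw a shallow gray ramp across the view.
// 1024 steps over a 0.40-0.50 luminance range exceeds what 8 bits per
// channel can represent, so an 8-bit path shows visible vertical bands.
final class RampView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        let steps = 1024
        let stepWidth = bounds.width / CGFloat(steps)
        for i in 0..<steps {
            let gray = 0.40 + 0.10 * CGFloat(i) / CGFloat(steps - 1)
            NSColor(white: gray, alpha: 1).setFill()
            NSBezierPath(rect: NSRect(x: CGFloat(i) * stepWidth, y: 0,
                                      width: stepWidth + 1,
                                      height: bounds.height)).fill()
        }
    }
}
```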
 