
PowerMike G5
Original poster
Can anyone chime in on this? I'm curious... with a new P3 gamut display, is the panel showing up as 10-bit color? Or is it still being used at 8-bit?

I can't find any information on this at the moment.
 

I'm pretty sure Apple would have mentioned it if it were 10-bit. Besides, AFAIK OS X still doesn't support 10-bit color.
 

Thanks. That's what I suspected, but wanted to hear from others. It's too bad, considering all these 10-bit video camera systems with wide gamut recording options now.
 
[Attachment: Screen Shot 4.png – system report listing CGSThirtyBitColor for the display]


Not quite sure what CGSThirtyBitColor actually means in the real world. The S211H is, by most measures, a pretty crappy display, and I'd be surprised if it were 10 bits per channel. Does DVI support more than 8 bits?
 

Thanks a lot for the screen grab!

Really interesting... according to that, it is indeed 10-bit color. Every other Retina screen on Macs until this one was reported as 32-bit, which breaks down to 8 bits per channel plus an alpha channel. Your screenshot seems to indicate actual 10 bits per channel, since the only way to arrive at thirty-bit color is 10 bits per RGB channel.
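
For anyone who wants to poke at this themselves, here's a quick Swift sketch using the generic CoreGraphics calls (nothing specific to the new iMac; the encoding strings are my reading of the IOKit headers, so treat them as an assumption). It asks the main display's current mode for its pixel encoding, which should differ between an 8-bit and a 10-bit framebuffer:

[CODE]
import Foundation
import CoreGraphics

// Sketch only: query the main display's current mode and print its pixel encoding.
// My assumption from the IOKit headers: a 10-bit framebuffer reports something like
// "--RRRRRRRRRRGGGGGGGGGGBBBBBBBBBB" (kIO30BitDirectPixels), while a normal 8-bit
// panel reports "--------RRRRRRRRGGGGGGGGBBBBBBBB" (IO32BitDirectPixels).
let displayID = CGMainDisplayID()

if let mode = CGDisplayCopyDisplayMode(displayID),
   let encoding = CGDisplayModeCopyPixelEncoding(mode) {   // deprecated in 10.11, but still answers
    let encodingString = encoding as String
    print("Pixel encoding: \(encodingString)")
    // Ten consecutive "R"s only appear in the 30-bit encoding.
    print(encodingString.contains("RRRRRRRRRR") ? "Looks like 10 bits per channel"
                                                : "Looks like 8 bits per channel")
}
[/CODE]

That would at least tell us what the OS is actually driving the panel with, independent of what System Information labels it.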
 
The thing is-- I'm running a first-generation Retina iMac. So maybe it's an El Capitan thing.
 
Thanks. So it seems the hardware is capable, but the OS and/or application needs to send it a 10-bit signal?

I get 10-bit color depth out of my AJA video I/O PCIe card to a 10-bit monitor. But it would be lovely if we could also get it from the GPU onto the computer monitor itself. I wouldn't have to turn my head back and forth so much when color grading!
 
Great, thank you!
I'm also in the final decision for the 5K iMac, and your BTO (M390, i7) is one of the options I'm looking at.
 
I'm not familiar with monitors; why is 10-bit better than 8-bit? I understand bits, but what does 10-bit give me?
 
More colors.
On an 8-bit display, the number of possible colors per pixel is 2^8 * 2^8 * 2^8 (ignoring alpha at the moment) = 2^24 = 16,777,216 (16.7 million). On a proper 10-bit display, the number of possible colors is 2^10 * 2^10 * 2^10 (again, ignoring alpha atm) = 2^30 = 1,073,741,824 (just over a billion), or 64 times as many possibilities as the 8 bit panel. More practically, it means 1024 shades of a color (or shades of grey), as opposed to 256, which helps prevent the weird banding you see in gradients of grey.

The reality is, very little supports full 10 bit, and most 10 bit displays don't really support that full billion possibilities. Including this iMac, which doesn't claim to be a proper 10 bit display.
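
Just to spell out the arithmetic above in code form (nothing Mac-specific here, purely illustrative):

[CODE]
// Shades per channel and total colors for 8-bit vs 10-bit, ignoring alpha.
let shades8  = 1 << 8            // 256 shades per channel
let shades10 = 1 << 10           // 1,024 shades per channel

let colors8  = shades8 * shades8 * shades8      // 2^24 = 16,777,216
let colors10 = shades10 * shades10 * shades10   // 2^30 = 1,073,741,824

print("8-bit:  \(shades8) shades/channel, \(colors8) colors")
print("10-bit: \(shades10) shades/channel, \(colors10) colors")
print("That's \(colors10 / colors8)x as many possibilities")   // 64x
[/CODE]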
 

Hasn't Apple been saying the iMac has 16.7M colors?
 
The German IT magazine "Mac & i" got confirmation from Apple that we do indeed have 10-bit (LCD panel and OS X)!!!
The article is unfortunately in German: http://www.heise.de/mac-and-i/meldung/iMacs-lernen-10-Bit-Farbwiedergabe-2854496.html

Yup, it's the real thing.
Currently, you can apparently only view 10-bit content in Preview and in the "Edit" mode of Photos.
(Which is what the article says)

I don't have a new iMac. But I trust heise.de ;-)

Any bets how long it will take before Adobe picks it up in one of their apps?
;-)

That said, there seems to be no mention of it in the developer documentation (at least none that heise could find).
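
Since the developer documentation is silent, the only quick check I can think of is what the system report itself says. Here's a rough Swift sketch that just shells out to system_profiler and prints any depth-related lines; the exact label ("Pixel Depth" vs "Framebuffer Depth") is my guess and seems to vary by OS version:

[CODE]
import Foundation

// Sketch: run `system_profiler SPDisplaysDataType` and print any line mentioning "Depth".
let task = Process()
task.launchPath = "/usr/sbin/system_profiler"
task.arguments = ["SPDisplaysDataType"]

let pipe = Pipe()
task.standardOutput = pipe
task.launch()
task.waitUntilExit()

let data = pipe.fileHandleForReading.readDataToEndOfFile()
if let output = String(data: data, encoding: .utf8) {
    for line in output.components(separatedBy: "\n") where line.contains("Depth") {
        print(line.trimmingCharacters(in: .whitespaces))
    }
}
[/CODE]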
 

This is excellent news! I think for viewing, 8-bit is fine for the majority of folks. For someone like me who works in media with 10-bit (and 12-bit) video sources, it's really nice to have this. Even though I output in true 10-bit to a 10-bit monitor via an AJA I/O card, being able to grade within the UI of my programs on the Mac monitor is something I've been looking forward to for a while.
 
I'm thinking that if you have manually calibrated your displays in the past, now might be a good time to recalibrate them. (Remember to hold down the option key while pressing the "Calibrate..." button.)
One advantage of 10-bit color is that adjusting the gamma on the computer will result in less banding.
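
To make the banding point concrete, here's a toy model (my own illustration, not how the OS actually applies calibration): push a 256-step gray ramp through a small gamma adjustment and count how many distinct steps survive when the corrected values are rounded back to 8 bits versus 10 bits.

[CODE]
import Foundation

// Count how many distinct output steps remain after a gamma adjustment,
// when the corrected ramp has to be quantized to the given output depth.
func distinctOutputLevels(outputBits: Int, gamma: Double) -> Int {
    let outputMax = Double((1 << outputBits) - 1)
    var levels = Set<Int>()
    for input in 0...255 {
        let corrected = pow(Double(input) / 255.0, gamma)        // gamma-adjusted gray value
        levels.insert(Int((corrected * outputMax).rounded()))    // quantize to output depth
    }
    return levels.count
}

let adjustment = 2.2 / 1.8   // example: re-targeting a 1.8 calibration to 2.2
print("8-bit output:  \(distinctOutputLevels(outputBits: 8,  gamma: adjustment)) of 256 steps stay distinct")
print("10-bit output: \(distinctOutputLevels(outputBits: 10, gamma: adjustment)) of 256 steps stay distinct")
// Expect some steps to merge at 8 bits (visible as banding in gradients),
// while essentially all 256 survive at 10 bits.
[/CODE]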
 