Thank you for all the responses. I would be curious to see the results of a test using a 10-bit gradient on a 2017 15", and also a comparison with the 5K iMac, which is 10-bit, to see whether it's just dithering or true 10-bit.
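
For anyone wanting to run that test, here's a rough sketch (mine, not something from this thread) that generates a shallow grayscale ramp at 16 bits per sample with AppKit. Displayed full screen, it should band into visibly coarser steps on an 8-bit panel than on a true 10-bit one, though dithering can mask the difference. The dimensions, luminance range, and output path are arbitrary choices:

import AppKit

// Render a shallow grayscale ramp at 16 bits per sample and save it as a PNG.
let width = 1024, height = 256
let rep = NSBitmapImageRep(bitmapDataPlanes: nil,
                           pixelsWide: width, pixelsHigh: height,
                           bitsPerSample: 16, samplesPerPixel: 1,
                           hasAlpha: false, isPlanar: false,
                           colorSpaceName: .calibratedWhite,
                           bytesPerRow: 0, bitsPerPixel: 0)!
for x in 0..<width {
    // Sweep a narrow luminance range so individual quantization steps are visible.
    let white = 0.25 + 0.1 * CGFloat(x) / CGFloat(width)
    for y in 0..<height {
        rep.setColor(NSColor(calibratedWhite: white, alpha: 1), atX: x, y: y)
    }
}
let data = rep.representation(using: .png, properties: [:])
try? data?.write(to: URL(fileURLWithPath: "/tmp/gradient16.png"))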

I just tested multiple external monitors on the 2017 15" MBP, and all of them are showing up as 10-bit, which can't be right.

It looks like there is a bug, and it would not surprise me if it's reporting all displays as 10-bit when they're actually 8-bit.
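
For anyone who wants to reproduce the check, a minimal Playground sketch along these lines (written against current Swift, where NSScreen.screens is a property) prints the depth macOS reports for every attached display:

import AppKit

// Print the reported depth for every attached display.
for (index, screen) in NSScreen.screens.enumerated() {
    let depth = screen.depth
    print("Display \(index): \(NSBitsPerPixelFromDepth(depth)) bits per pixel, " +
          "\(NSBitsPerSampleFromDepth(depth)) bits per sample")
}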
 

Interesting. It could very well be a bug if it's reporting everything as 10-bit. I would assume the displays you tested with were not 10-bit.
 

I have a 2017 15" MBP, and I was curious about this too. I opened a Swift Playground in Xcode and typed this:

import AppKit
NSBitsPerPixelFromDepth((NSScreen.main()?.depth)!)

The line returned "24", which seems to indicate an 8-bit screen. Can anyone with an iMac 5K or another 10-bit display confirm whether this returns "30" on a 10-bit screen? Anyway, I concur that it seems more like a software bug now. Even Apple's tech specs for the MBP say "millions of colors", instead of the "billions of colors" they advertise for the iMac 5K.
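
As a follow-up check (my own sketch, written against current Swift, where NSScreen.main is a property rather than a method): NSBitsPerPixelFromDepth counts all three channels, so 24 means 3 x 8 bits, and a true 10-bit framebuffer should report 30. NSBitsPerSampleFromDepth gives the per-channel value directly:

import AppKit

// 24 bits per pixel = 3 channels x 8 bits; a 10-bit framebuffer would report 30.
if let depth = NSScreen.main?.depth {
    print(NSBitsPerPixelFromDepth(depth))   // total bits per pixel
    print(NSBitsPerSampleFromDepth(depth))  // bits per channel: 8 vs 10
}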

It's a little disappointing, I guess, but maybe not too surprising. It would be weird if they quietly made this upgrade (and only on the 15") and told no one about it.
 