
neekon
macrumors member · Original poster · May 30, 2008
Hey, so I was at the Apple Store looking at the new 2017 15" MacBook Pro, and I noticed that under Graphics/Displays in System Information, the Pixel Depth showed (ARGB2101010), as opposed to the 2015 and my 13" 2016, which say (ARGB8888). Has anyone else noticed this?
A 10-bit display in the 15" would be a huge improvement for color work for video and stills.
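For anyone wanting to check their own machine, the value described above can be read from Terminal with the standard `system_profiler` command (the exact label wording can vary slightly between macOS versions):

```shell
# Print the reported pixel depth for each connected display (macOS only).
# ARGB2101010 indicates a 10-bit-per-channel pipeline; ARGB8888 indicates 8-bit.
system_profiler SPDisplaysDataType | grep -i "pixel depth"
```

Note this reports what the GPU is sending, not necessarily what the panel can physically display.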
 
I don't recall them announcing anything, but I do seem to remember something about a bug affecting the store builds of macOS that are loaded onto the display units. The 2016 is 8-bit, and I'm 99% positive they haven't changed the screen without anyone noticing.
 
That's what I thought, but it was there clear as day under the pixel depth. I checked a few models: the 13" 2017, 12" 2017, and 15" 2015 all had ARGB8888.
The 2017 15", both iMacs, and the 5K LG display all had ARGB2101010.
I wish I had taken a photo.
 

Best thing to do in future is ask a member of staff. The iMacs and 5K displays are 10-bit as far as I know. But as I said, I believe there is a bug; whether it affects only certain builds (GPUs) I don't know. I also don't know if the staff members would even be aware of it, but you've usually got at least one knowledgeable person at the stores.

https://www.macrumors.com/2015/10/30/4k-5k-imacs-10-bit-color-depth-osx-el-capitan/
 
Saw that link.
I will look into this bug you mention; I have not heard of it before.
I wonder if someone with a 15" 2017 MBP would post a screenshot of the Pixel Depth in Graphics/Displays to help figure this out.
 

I tried to find a reference to this bug for you, as I definitely remember talking about it when the 2016s came out. All I could find was plenty of people discussing the functionality of 10-bit. Maybe somebody will come along and post that 'proof' you need though :cool:
 
I am trying to find a reference to the bug as well, but I haven't turned up anything yet. My concern is whether other people have noticed it. I haven't seen anything on the forums about it and no reviews have mentioned it, which makes me think it may be a bug. If the new displays really are 10-bit, though, that's a pretty significant upgrade that wasn't mentioned.
I will reach out to Apple as well and see if they can answer it definitively.
 
It's not as big of an upgrade as you might think without driver support and public APIs, which are probably lacking. I wouldn't get too hung up on bit depth alone. It generally solves problems with banding and shadow detail if the rest of the support is there, but that is the extent of it.
 

Very few people will go looking at System Information on the display units, and fewer will report their findings here. Also, as I said, the display units use a different build of macOS, so you would really need to work in an Apple store to have any say on it. You probably won't find much, or it will be very hard to find.

And as has been said, 10-bit really isn't that big of an upgrade. It would be a huge upgrade if every computer had 10-bit support, but as they don't, it's a really limited field. Most people are still using 1280x900 screens, for instance; it's very disheartening making crisp Retina graphics that are unlikely to be seen (and frankly just a pain making Retina graphics for those that will see them). 10-bit is kind of the same thing.
 

That is true; there are not really many people that would bother to do that. I was just looking around and saw it, and thought I would report it in case anyone else has noticed. I thought it was cool and possibly a big deal; guess it isn't as exciting to you guys as it is to me.
 
10 bit ...
[Screenshot attached: Screen Shot 2017-06-21 at 5.31.18 PM.png]
 
Apple's official spec page for the 2017 15" implies 8-bit:

https://www.apple.com/macbook-pro/specs/

[spec screenshot attached]

Compare to the 10-bit iMacs:

[spec screenshot attached]

So, what the hell?
If I had to guess ... Apple has multiple suppliers of screens for the 15" MacBook Pro, and some will get 10-bit displays while others get 8-bit displays.

Cue the next "gate".

I'll also add that this screen is absolutely beautiful, the color accuracy is great, and it achieves a large portion of DCI-P3.

The color area is showing the color that the screen can output. The wireframe is showing DCI-P3 ... almost complete saturation in all colors, and this screen actually does better than DCI-P3 in the blues and violets.

This is definitely the best computer display I currently own.
[Screenshots attached: Screen Shot 2017-06-21 at 5.57.17 PM.png, 5.57.10 PM.png, 5.57.00 PM.png]
 

Yeah, the 15" screen is definitely stunning; a bit nicer IMHO than my 13".

I would hope that isn't the case, that some people are getting 10-bit and some 8-bit. That would be pretty bad.
 

Cool, thank you.

Hmm, thanks for providing this. It's raising more questions than answers though.

Why is your machine showing that under the internal GPU? What does it say under the dGPU? And why is it 30-bit and not 32-bit?

I'm thinking the newer Kaby Lake might have added support for 10-bit (it might have always been there; I really don't pay much attention) and maybe that's what's making it show that? My internal GPU doesn't have anything for displays, and my dGPU isn't in use. So I'm rather confused how yours is reporting that.

[Screenshots attached: Screen Shot 2017-06-21 at 23.01.28.jpg, 23.01.35.jpg]

Either way, the screen itself is not 10-bit. A screenshot of the system info is useful, but there are way too many factors that can interfere there. Even software compatibility.

The only real test is to load a gradient map into something like Photoshop and check for banding.

[Screenshot attached: Screen Shot 2017-06-21 at 22.59.12.jpg]
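For anyone wanting to run that gradient test without hunting for a test image, here is a minimal sketch that generates a 16-bit grayscale ramp using only the Python standard library. The 16-bit PGM format and the output filename are my choices here, assumed only because Photoshop and most viewers open PGM as a true 16-bit image:

```python
import struct

WIDTH, HEIGHT = 1024, 256
MAXVAL = 65535  # 16-bit samples

# One 10-bit step per column, scaled up to the 16-bit range (1023 << 6 = 65472).
# An 8-bit pipeline collapses this to 256 levels (visible ~4-pixel-wide bands);
# a true 10-bit path should render it as a smooth ramp.
row = [((x * 1023) // (WIDTH - 1)) << 6 for x in range(WIDTH)]

with open("gradient_10bit.pgm", "wb") as f:
    # Binary PGM header: magic number, dimensions, max sample value.
    f.write(b"P5\n%d %d\n%d\n" % (WIDTH, HEIGHT, MAXVAL))
    # Samples above 255 are stored as 2 bytes each, most significant byte first.
    packed_row = struct.pack(">%dH" % WIDTH, *row)
    f.write(packed_row * HEIGHT)
```

Open the file at 100% zoom and look for banding; zooming out lets display scaling hide the steps.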
 
A gradient map would be the best way to show it.

@winterny, how did you get those wireframes showing? I would be interested in checking them out on my end.

It would be awesome to get to the bottom of this.
 
Hmm thanks for providing this. It's raising more questions than answers though.

Why is your machine showing that under the internal GPU? What does it say under the dGPU? And why is it 30-bit and not 32-bit?
If the dGPU is active, the screen shows up under the dGPU. If the Intel GPU is active, it shows up under that.

It shows up as 30-bit because there are 10 bits each for red, green, and blue: 3 × 10 = 30.
An 8-bit screen is 3 × 8 = 24.

The reason an 8-bit display is sometimes called "32-bit" is that the alpha channel is counted too (8 × 4 = 32). By that same math this would be a "40-bit" display, though in practice ARGB2101010 packs a 2-bit alpha (the "2" in the name), so the framebuffer is still 32 bits.
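The arithmetic above can be summarized in a few lines (the helper name is just for illustration):

```python
def display_depths(alpha_bits, r_bits, g_bits, b_bits):
    """Color depth counts only R+G+B; framebuffer depth adds the alpha bits."""
    color = r_bits + g_bits + b_bits
    return color, color + alpha_bits

# ARGB8888: 24-bit color in a 32-bit framebuffer.
print(display_depths(8, 8, 8, 8))      # (24, 32)
# ARGB2101010: 30-bit color, still a 32-bit framebuffer (2-bit alpha).
print(display_depths(2, 10, 10, 10))   # (30, 32)
```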

I'm thinking the newer Kaby Lake might have added support for 10-bit (it might have always been there; I really don't pay much attention) and maybe that's what's making it show that? My internal GPU doesn't have anything for displays, and my dGPU isn't in use. So I'm rather confused how yours is reporting that.
I can't explain your display not showing up in System Info; it does on all of my Macs, and always has.

No idea if this is something new to the internal GPU built into the Kaby Lake CPU or not.
View attachment 705116 View attachment 705117

Either way, the screen itself is not 10-bit. A screenshot of the system info is useful, but there are way too many factors that can interfere there. Even software compatibility.

The only real test is to load a gradient map into something like Photoshop and check for banding.

View attachment 705115
This System Info output is very clearly telling us that the signal from the GPU (regardless of Intel or dGPU) is being sent to the screen (at least in my case) as 10-bit. There is of course still the question of whether the screen itself is actually displaying 10 bits of color, or whether it is dithering or dropping the extra bits.

Will check some gradients in a bit.
 
The only real test is to load a gradient map into something like Photoshop and check for banding.

Problem: Photoshop (or maybe even macOS?) now does dithering for 10-bit images on 8-bit displays if 30-bit display support is enabled, and on a Retina display that dithering can be extremely hard to discern by eye. On my 2016 MBP with 30-bit support enabled in Photoshop, a 10-bit gradient test ramp looks seamless to me.
 
@winterny how did you get those wireframes showing, would be interested in checking them out on my end.
It is ColorSync Utility. In order to get the mapping of what the actual screen is, I used a colorimeter (an X-Rite i1Display Pro).

I just played the Spears & Munsil Quantization Test... The screen is displaying 10 bits. As I said earlier, there is still a question of whether it is 'native' 10 bits or dithered 10 bits (and a further question of whether that actually matters), but either way, playing back a 10-bit mp4 has smooth gradients.

If you'd like to check for yourself, this is the best way of testing whether the screen can display 10-bit video:
https://drive.google.com/file/d/0B68jIlCvW85gMW5aX21CbU1IOGs/view

Use VLC to play the .ts file.
 
From a cursory search, 10-bit color support appears to have been added in El Capitan. I would say it depends more on the GPU than the CPU. One of my computers (iGPU, not a MBP) running Sierra, which is not Kaby Lake, shows "30-Bit Color (ARGB2101010)" in the System Report. However, looking at the tech specs for my monitor, it's clearly a 24-bit monitor. As mentioned, doing a visual gradient test should confirm it or not. I think the System Report reports the capabilities of the GPU properly as far as 24-bit vs. 30+ goes (can't say about 30 vs. 32), but it's clearly not reporting the correct monitor capabilities in all cases.
 
If the dGPU is active, the screen shows up under the dGPU. If the Intel GPU is active, it shows up under that.

Thanks for that. I did a restart and disconnected the charger to force it.

[Screenshot attached: Screen Shot 2017-06-21 at 23.15.32.jpg]

What does your dGPU show as? I'm just thinking this is something to do with Kaby Lake. I believe the MBP 2016 does have the capability of outputting 10-bit though, so assuming you connected it to a 10-bit display it would work. Maybe that's causing the confusion?

Also, to rule it out, I'm running:
[Screenshot attached: Screen Shot 2017-06-21 at 23.19.06.jpg]
 
If you'd like to check for yourself, this is the best way of testing whether the screen can display 10-bit video:
https://drive.google.com/file/d/0B68jIlCvW85gMW5aX21CbU1IOGs/view

Use VLC to play the .ts file.

On my 2016 MBP, there is a slight degree of banding on both frames, but much more so on the "8-bit" one.
 
Thank you for all the responses. I would be curious to see the results of a test using a 10-bit gradient on a 2017 15". Also a comparison with the 5K iMac, which is also 10-bit, to see if it's just dithering or true 10-bit.
 