
CodeJingle (original poster) · Oct 23, 2009 · Greater Seattle, WA

So I noticed the 5K display is not 30-bit color depth. 30-bit color depth is the new standard; I'm really surprised ALL Apple displays are still 24-bit color. The whole world doesn't seem to notice. All 4K displays are 30-bit color depth. Any feedback?
 
Well, it's not really a problem, since most Mac OS versions of applications like Photoshop or Lightroom don't support 30-bit monitors.
 
Well, it's not really a problem, since most Mac OS versions of applications like Photoshop or Lightroom don't support 30-bit monitors.

Mac Photoshop does support 30-bit color display.

----------

because the whole world (your average consumers) won't notice or care.

Any techie who cares about high resolution should be caring just as much about high color depth.
 
It does not; there is no 30-bit option in the Performance settings within Photoshop, because Mac OS X does not have 30-bit support at all.
 
It does not; there is no 30-bit option in the Performance settings within Photoshop, because Mac OS X does not have 30-bit support at all.

I am running OS X in 30 bit color depth right now. Photoshop supports 30 bit color depth. The 4k standard for video is 30 bit color depth.
 
Is there any official language out there stating support? Screenshots?

The DisplayPort 1.2 standard states it supports color depths greater than 24-bit. In SwitchResX I switch the colors from Millions to Billions; after doing that, only certain apps work and others crash, but only in Billions color mode. If the OS didn't support 30-bit color, then the apps would work the same regardless of the color mode of the display.
 
The DisplayPort 1.2 standard states it supports color depths greater than 24-bit. In SwitchResX I switch the colors from Millions to Billions; after doing that, only certain apps work and others crash, but only in Billions color mode. If the OS didn't support 30-bit color, then the apps would work the same regardless of the color mode of the display.

OSX does not support 30 bit.

SwitchResX isn't going to tell you the true support.

Just because DP 1.2 supports it doesn't mean the OS does.
 
OSX does not support 30 bit.

SwitchResX isn't going to tell you the true support.

Just because DP 1.2 supports it doesn't mean the OS does.

That still doesn't explain why certain apps crash when I use SwitchResX to switch the display to billion color mode.

At some point there isn't any way to make a better display; then the only thing left to do is increase the color depth. If my graphics card, my display, and the connection between my display and graphics card all support 30-bit color depth, then there is no reason for the OS to be that dumb.
 
That still doesn't explain why certain apps crash when I use SwitchResX to switch the display to billion color mode.

At some point there isn't any way to make a better display; then the only thing left to do is increase the color depth. If my graphics card, my display, and the connection between my display and graphics card all support 30-bit color depth, then there is no reason for the OS to be that dumb.

Apple will most definitely support 30-bit (basically 10 bits per channel), but like you said, very few displays support it.
 
Apple will most definitely support 30-bit (basically 10 bits per channel), but like you said, very few displays support it.

I never said 'very few displays support 30-bit color depth' (except for Apple). Pretty much all new displays coming out that are larger than 1600x1200 are 30-bit color. Almost all 4K displays are 30-bit color. Apple is one of the few making new displays that aren't 30-bit color depth. My 3840x2160 ($1k new) and my 2560x1440 ($350 used) displays both support 30-bit color. I think in Japan regular TV is streamed in 4K with 30-bit color. It is 10 bits per channel; the total number of bits per pixel is 30. If you have ever programmed graphics hardware, you would understand that we tend to describe color depth in bits per pixel, not bits per color channel.
 
I am running OS X in 30 bit color depth right now. Photoshop supports 30 bit color depth. The 4k standard for video is 30 bit color depth.

That still doesn't explain why certain apps crash when I use SwitchResX to switch the display to billion color mode.

At some point there isn't any way to make a better display; then the only thing left to do is increase the color depth. If my graphics card, my display, and the connection between my display and graphics card all support 30-bit color depth, then there is no reason for the OS to be that dumb.

Mac OS doesn't magically support 30-bit just because all the other links in the chain do. You claimed you were running Mac OS in 30-bit - another poster correctly pointed out that you're not, because Mac OS doesn't support 30-bit; it only supports up to 24-bit (Windows 7 and 8 do support 30-bit). Your response is to claim Mac OS is 30-bit because a third-party app crashes when you put it into 30-bit mode (huh? Maybe it crashes because the OS doesn't support it?) and then complain that it's "dumb" that Mac OS isn't 30-bit because everything else you own is. All of that is irrelevant - you are not running Mac OS in 30-bit. Run a Google search for "Mac OS 30-bit" and see what you get.

And Photoshop for Mac does not support 30-bit color either. From Adobe's Photoshop FAQ:

30-bit Display (Windows only) Allows Photoshop to display 30-bit data directly to screen on video cards that support it

I never said 'very few displays support 30-bit color depth' (except for Apple). Pretty much all new displays coming out that are larger than 1600x1200 are 30-bit color. Almost all 4K displays are 30-bit color. Apple is one of the few making new displays that aren't 30-bit color depth. My 3840x2160 ($1k new) and my 2560x1440 ($350 used) displays both support 30-bit color. I think in Japan regular TV is streamed in 4K with 30-bit color. It is 10 bits per channel; the total number of bits per pixel is 30. If you have ever programmed graphics hardware, you would understand that we tend to describe color depth in bits per pixel, not bits per color channel.

10-bit displays are not the norm - 8-bit panels with FRC dithering to approximate 10-bit color are what's commonly sold. That goes for most 4K monitors and televisions; they get advertised as "10-bit" panels, but if you press the manufacturers or dig a little, you discover most of them aren't true 10-bit panels. See this link to a review of the Asus PB287Q 28-inch 4K monitor that lists its color depth support as "10-bit (8-bit with FRC)." Your professional "30-bit" workflow has been run with a photo application and OS that don't support it, displayed on monitors that likely used dithering to approximate it.

Finally, nobody in the world broadcasts 4K television, Japan included. Japan did a trial run of 4K broadcasting that was beamed only to commercial retailers this past summer, and it wasn't in Rec. 2020. The UHD and 4K standards support Rec. 2020 with 10- and 12-bit color depth, but there is no programming, including the Netflix 4K streams, that uses it. The Sony 4K movies on its UHD media player are also 24-bit. Even if you had a genuine 10-bit UHD television (and there are a couple out there), the 4K program you're watching is coded to the regular 8-bit-per-channel/24-bit Rec. 709 standard.
 
At some point there isn't any way to make a better display; then the only thing left to do is increase the color depth. If my graphics card, my display, and the connection between my display and graphics card all support 30-bit color depth, then there is no reason for the OS to be that dumb.

Apple's OpenGL implementation does not support 30-bit pixel formats. That's all there is to it. Even if you had a display that supports 30-bit color output, the graphics stack on OS X would not support it.
 
Just please show me, and let's compare the display quality of this 5K iMac with your 30-bit display.
 
Here is a link to an Apple video codec supporting 10 and even 12 bits per color channel: https://www.apple.com/final-cut-pro/docs/Apple_ProRes_White_Paper.pdf

This Apple support page also states that Final Cut Pro X supports 10 bits per color channel: http://support.apple.com/kb/PH18028?viewlocale=en_EG&locale=en_EG

See also here http://support.apple.com/kb/PH12754 and here http://support.apple.com/kb/DL1719

And obviously there is no way to compare 10+ bit content to 8-bit content unless you have a way to view the 10+ bit content natively. It is like asking someone to show you how great the Oculus Rift is without having a unit on hand to demo.

I still say the 5K display is fantastic. I'm just rather confused that Apple put so much effort into supporting 10-bit content over DisplayPort 1.2 and into adding 10-bit support to Final Cut Pro X, yet it is alleged that 10-bit content cannot be natively viewed within their operating system.
 
Apple's OpenGL implementation does not support 30-bit pixel formats. That's all there is to it. Even if you had a display that supports 30-bit color output, the graphics stack on OS X would not support it.
If that is true, that must be why the 3D apps I have tried crash when I force the display into 30-bit mode. The pipeline between the 3D hardware and the display must be aware that the display is in 30-bit mode even if the operating system otherwise ignores it.
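For anyone who wants to test this claim directly, here is a minimal sketch (assuming only the standard CGL calls in OS X's OpenGL framework - CGLChoosePixelFormat and CGLDescribePixelFormat - nothing beyond what those headers ship) that asks for a 30-bit color pixel format and prints what the OS actually grants. If the posts above are right, the granted size should come back as 24 rather than 30.

```c
/* Minimal sketch: ask the OS X OpenGL stack (CGL) for a 30-bit color
 * pixel format and print what it actually grants.
 * Build (hypothetically): clang probe30.c -framework OpenGL -o probe30
 */
#include <stdio.h>
#include <OpenGL/gl.h>
#include <OpenGL/OpenGL.h>

int main(void) {
    CGLPixelFormatAttribute attrs[] = {
        kCGLPFAColorSize, (CGLPixelFormatAttribute)30,  /* request 10 bits per channel */
        kCGLPFADoubleBuffer,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;

    if (CGLChoosePixelFormat(attrs, &pix, &npix) != kCGLNoError || pix == NULL) {
        printf("No pixel format matched the 30-bit request.\n");
        return 1;
    }

    GLint granted = 0;
    CGLDescribePixelFormat(pix, 0, kCGLPFAColorSize, &granted);  /* what did we really get? */
    printf("Requested 30-bit color, got %d-bit.\n", (int)granted);

    CGLDestroyPixelFormat(pix);
    return 0;
}
```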
 
Here is a link to an Apple video codec supporting 10 and even 12 bits per color channel: https://www.apple.com/final-cut-pro/docs/Apple_ProRes_White_Paper.pdf

This Apple support page also states that Final Cut Pro X supports 10 bits per color channel: http://support.apple.com/kb/PH18028?viewlocale=en_EG&locale=en_EG

See also here http://support.apple.com/kb/PH12754 and here http://support.apple.com/kb/DL1719

And obviously there is no way to compare 10+ bit content to 8-bit content unless you have a way to view the 10+ bit content natively. It is like asking someone to show you how great the Oculus Rift is without having a unit on hand to demo.

You don't seem to understand the distinction between Mac OS and programs being able to display 30-bit color versus being able to calculate, manipulate, and export 30-bit color data. THEY ARE NOT THE SAME. All your links show is that Final Cut Pro allows you to export and save files with 10-bit channels. Here's a thread from the Blackmagic forums that discusses whether you can output 10-bit color data over Thunderbolt even though Mac OS doesn't support 10-bit color display:

Well I am puzzled ... a senior MacPro advisor from Apple confirmed today that what is coming out of the Thunderbolt 2 port is information that will only support 8 bit color: no more and no less. He said, you can't turn an 8 bit color profile in the current OS that's driving the GPUs into a 10 bit color profile if the software (Mavericks) doesn't have a 10 bit look up table to deliver that information to the Thunderbolt port. I argued with him and he said WolfCrow was right.

Be nice if someone from BMD [Blackmagic Design] would chime in and find a way to get this clear with Apple, because Apple is telling its customers that trying to squeeze 10-bit out of a Mac Pro will be impossible with a BMD Thunderbolt device until Apple flips the OS switch to support a 10-bit table; and that NO BMD box will facilitate a 10-bit output the way the current Mac Pro sits.

You can absolutely get 10 bit output from pro video apps through a BMD Thunderbolt interface. It's true that Mavericks can't natively drive a monitor at 10 bits, but as far as the OS is concerned a BMD device isn't a monitor; it's just a Thunderbolt device to which data can be sent. The OS doesn't know or care that that data might be a representation of a 10 bit image. It's just 1s and 0s on the bus as far as the OS is concerned.

When outputting video via a BMD interface from a pro video app, you're not actually using the operating system's display stack or the graphics card's video signal. You are (depending on the app) possibly using the GPU as a compute engine to process images (possibly at a lot more than 10 bits, i.e. FCP X and Resolve process in 32 bit float), but that's just math; the GPU doesn't care whether the numbers it's crunching represent image data or nodes in a neural network or what.

Once that high bit depth image data exists, nothing prevents it from being sent to a Thunderbolt device. Saying you can't send data representing a 10 bit image to a BMD video interface because the operating system's display stack doesn't support 10 bit display is like saying you can't save 10 bit image data to a hard drive because the operating system's display stack doesn't support 10 bit display. It makes absolutely no sense. 10 bit image data is just so many 1s and 0s, which can be transmitted over Thunderbolt like any other data.

Consider, as additional support, that an UltraStudio 4K can do 4K output even from a system that can't drive a 4K GUI monitor. How is this possible? Because the UltraStudio is generating its own video signal from the data it's sent over Thunderbolt. It's not using the video signal generated by the graphics card and OS, so the limitations those have are irrelevant.

http://forum.blackmagicdesign.com/viewtopic.php?f=3&t=22946
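To make the quoted distinction concrete - that high-bit-depth image data is just bytes once an application has produced it - here is a hypothetical sketch (the function names are invented for illustration; this is not BMD's or Apple's API) that quantizes 32-bit float samples down to 10-bit integers, which could then be written to disk or handed to a Thunderbolt device regardless of what the display stack supports:

```c
/* Hypothetical sketch: quantize 32-bit float samples (0.0 to 1.0) into
 * 10-bit integers stored one per uint16_t.  Once the data is in this form
 * it is just bytes; whether the display stack can show 10-bit color has
 * no bearing on saving it or sending it to a Thunderbolt device.
 */
#include <stdint.h>
#include <stddef.h>

static uint16_t quantize10(float v) {
    if (v < 0.0f) v = 0.0f;                 /* clamp to the legal range      */
    if (v > 1.0f) v = 1.0f;
    return (uint16_t)(v * 1023.0f + 0.5f);  /* 0..1023 = 10 bits per sample  */
}

/* Convert an interleaved float RGB frame into 10-bit samples. */
void frame_to_10bit(const float *rgb, uint16_t *out, size_t samples) {
    for (size_t i = 0; i < samples; i++)
        out[i] = quantize10(rgb[i]);
}
```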
 
No, it does not support 30-bit color depth. Most consumer displays do not even come close to 30-bit color depth. No consumer 4K displays support it. Zero. Nor is it common. The reason is that each panel would require a more expensive panel system board, a more expensive polarizer, etc. Higher-PPI panels are great, but a higher PPI alone does not make them better. It is good for marketing purposes. That is about it. A higher sub-pixel bit count is more important - not how many pixels are in front of the panel, but what is beneath it.

Quite the contrary. Contrast, black level, and color accuracy are more important than a higher PPI. High PPI is needed, but if you have that without the other factors that make a great display, then that higher PPI is a waste. This is not the case with the iMac 5K Retina display. It is a great display. Fantastic.

But it can't touch a professional panel. Nor is it meant to. It is a consumer product. Nor does it need 30-bit color like a professional display. And 30-bit displays are not common at all. 8-bit with FRC is not a 30-bit panel. Nor is it a 10-bit panel.

If it can do billions or trillions of colors without an asterisk next to the specs, then it is most likely a 10-bit panel. Millions of colors is not a 10-bit panel; it is 8-bit. If the panel says "billions of colors" but has a low color gamut, like 72 percent, that is an 8-bit panel with FRC, not a true 10-bit panel. An NEC professional panel can display 100 percent color gamut in any range. An HP DreamColor display tests out at over 115 percent color gamut while being a true 10-bit panel with FRC to take it to 14-bit! These 4K panels in the $500 range cannot compare. None of them are true 10-bit panels. If they were, they would not cost $500-800.
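As a rough illustration of what "8-bit with FRC" means (a hypothetical sketch, not any vendor's actual algorithm): the panel approximates each 10-bit level by alternating between the two nearest 8-bit levels over successive frames, and the eye averages them into an in-between shade.

```c
/* Hypothetical sketch of FRC (temporal dithering): show a 10-bit level on an
 * 8-bit panel by alternating between the two nearest 8-bit levels; over four
 * frames the eye averages them into an in-between shade.
 */
#include <stdint.h>

uint8_t frc_output(uint16_t level10, unsigned frame) {
    uint8_t  base = (uint8_t)(level10 >> 2);           /* nearest lower 8-bit level */
    unsigned frac = level10 & 0x3;                     /* leftover 2 bits (0..3)    */
    uint8_t  high = (base < 255) ? (uint8_t)(base + 1) : base;
    /* Show the higher level on 'frac' out of every 4 frames. */
    return ((frame & 0x3) < frac) ? high : base;
}
```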

Most professional panels use higher sub-pixel bit-precision to achieve a higher color depth than consumer panels. Most of the best panels are 10-bit + FRC, for 30 bits per pixel. They have very powerful panel system boards to do calibrations on the fly in real time. Self-calibration is built into the panel. The polarizer is usually a higher-quality single pane to reduce loss of color depth and accuracy.

The 5K iMac is great. Just bought one. But it cannot touch any of the professional displays I have at my business, nor was it meant to.

The bit-precision of the display determines how many steps of brightness are possible. A display which supports 6 bits per sub-pixel will provide 64 (2^6) steps from darkest to brightest; a display which supports 8 bits will provide 256 (2^8) steps. A display that supports 10 bits per sub-pixel gives 1024 (2^10) steps. The bit-precision is a result of the design of the electronics which control the liquid crystal cells in the panel.

Since there are three sub-pixels, the maximum number of colors that a pixel can present is 2^n x 2^n x 2^n, where n is the bit-precision of a sub-pixel. Therefore, an 8-bit design gives 2^8 x 2^8 x 2^8 = 16.7 million colors.

A display with a 10-bit design gives a palette of 2^10 x 2^10 x 2^10 = 1.07 billion colors.
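That counting, spelled out in a few lines of C (nothing panel-specific is assumed; it is just the 2^n arithmetic described above):

```c
/* Steps per sub-pixel and total colors per pixel for n-bit panel designs. */
#include <stdio.h>

int main(void) {
    for (int n = 6; n <= 10; n += 2) {
        unsigned long long steps  = 1ULL << n;             /* 2^n levels per channel */
        unsigned long long colors = steps * steps * steps; /* three sub-pixels       */
        printf("%2d bits/sub-pixel: %4llu steps, %llu colors\n", n, steps, colors);
    }
    return 0;   /* 6-bit: 64 / 262144   8-bit: 256 / 16777216   10-bit: 1024 / 1073741824 */
}
```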

By the way, while many people talk about “an 8-bit panel” or “a 10-bit panel”, it’s also common to refer to the total number of bits needed to define a red-green-blue pixel. Therefore, it’s valid (and preferred) to refer to a 10 bit display’s panel as “a 30-bit panel.” 30 is 10 + 10 + 10, which takes account of the 10 bits for each sub-pixel.
Many consumer displays have low-cost 18-bit panels. Some high-end consumer high-PPI displays (such as the iMac 5K and the Dell 5K) have 24-bit panels. No, 8-bit plus FRC does not count. But a high-end professional display is one of a small number of displays with a true 30-bit panel. The higher the bit-precision of a display, the better able it is to represent colors accurately.

An 8-bit-per-sub-pixel (24-bit) panel, which offers 16.7 million colors, is good enough for most consumers' needs. However, there are cases where 8 bits per sub-pixel is not enough. Consider a grayscale image. Gray (including white and black) is produced when the three sub-pixels (red, green, and blue) are equally bright.

This means that the values for the three sub-pixels are the same: 35/35/35, for example. With 8 bits per sub-pixel, gray can go from 0/0/0 (black) to 255/255/255 (white). Therefore, there are only 256 levels of gray possible. This can lead to "banding," an effect that arises because the step between adjacent levels of gray is big enough for the eye to detect. It can be a problem in certain kinds of visualization, such as 3D rendering for automotive styling. With a 30-bit panel, there are 1024 levels of gray, and it's almost impossible for the eye to detect the step between adjacent levels.
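The same arithmetic applied to the banding point - the step between adjacent gray levels is roughly four times smaller on a 10-bit panel (an illustrative calculation, not a measurement of any particular display):

```c
/* Step between adjacent gray levels, as a percentage of full scale: the
 * bigger the step, the more visible the banding in a smooth gradient.
 */
#include <stdio.h>

int main(void) {
    int bits[] = {8, 10};
    for (int i = 0; i < 2; i++) {
        int    levels = 1 << bits[i];             /* gray levels in the ramp      */
        double step   = 100.0 / (levels - 1);     /* size of one step, in percent */
        printf("%2d-bit: %4d gray levels, %.3f%% per step\n", bits[i], levels, step);
    }
    return 0;   /* 8-bit: 256 levels, ~0.392%;  10-bit: 1024 levels, ~0.098% */
}
```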

Also, there are cases where images can have greater bit-precision than 24 bits, especially where subtle detail is important. Examples are satellite imagery for intelligence agencies, medical imagery such as mammography, or color-critical work for TV or movie production.

You will not see Pixar or DreamWorks getting rid of their DreamColor or NEC displays to get a new 5K iMac anytime soon. The iMac with the Retina 5K display is the best "consumer" display on the market, hands down. But it is not a professional display, nor is it meant to be.
 
Thanks, iamthedudeman, for all the info.

If it can do billions or trillions of colors without an asterisk next to the specs, then it is most likely a 10-bit panel. Millions of colors is not a 10-bit panel; it is 8-bit. If the panel says "billions of colors" but has a low color gamut, like 72 percent, that is an 8-bit panel with FRC, not a true 10-bit panel. An NEC professional panel can display 100 percent color gamut in any range.

Do you mean 72 percent of Adobe RGB space? Do you have an iMac 5k ICC profile to share?
 
OSX does not support 30 bit.

SwitchResX isn't going to tell you the true support.

Just because DP 1.2 supports it doesn't mean the OS does.

Does it mean that Windows supports up to 48-bit color (including 30-bit) and OS X does not even support 30-bit? If this holds true, how can OS X be more suitable for professional work?
 
Does it mean that Windows supports up to 48-bit color (including 30-bit) and OS X does not even support 30-bit? If this holds true, how can OS X be more suitable for professional work?

There's a difference between 'suited' and 'popular'.
 
Somehow today, 'maximum everything' has become the norm for tech enthusiasts who visit forums like these. Nothing but the latest and most expensive graphics cards will do, 24 and 32GB of RAM is "normal", 1600MHz DDR is "slow", you can't do video editing without an i7, SSDs are already standard for everything and everyone, etc. Oh, and also, 30-bit is required for professional work. Because, you know, if you see banding in grayscale gradients, then you just can't work. Professionally, that is.

Everything else - you're an average consumer who doesn't care for that, you're a casual user, etc.

Wrong.

The truth is - most professionals out there actually don't work on the latest and greatest tech. Some of them, like the artists I follow, are not into tech; they don't have a clue whether their panel is 24- or 30-bit, or whether their RAM is faster, or whether their CPU has hyperthreading. Others know all of that stuff, but since they are in business, they will often save money where they can - by not buying things only forum-goers notice. You don't 'need' any of that.

Now, I don't know why Apple doesn't support 30-bit. Would I like to have something "better" if I could? Sure. But am I 'shocked' they don't support it? No, I didn't even know about it. And I am a professional - and by that I mean that for the past 10 years, I've been living as an illustrator. I've been creating things that paid my bills on computers you wouldn't even look at. I managed to get to the point where I can afford good equipment - high-end Wacom tablets, an iMac 5K, etc. - but it's not something I *need*. It's just something I like.

So, I think it's safe to say - no, it's not a big deal that the iMac 5K doesn't have a 30-bit panel. And yes, it is a great professional device, because the screen looks great and makes working on it a joy. Not professional enough for you? Too bad. I'm sorry, but that's all it takes, even for professionals. Half of the best visual artists I know don't even know what computers they have. In fact, the only people I know who think such things are important are tech geeks. That doesn't mean Apple shouldn't put 30-bit support in OS X. Let them - it is better. Let's just not make a big deal out of it.
 