iMac 5k not 30 bit color

Discussion in 'iMac' started by CodeJingle, Oct 17, 2014.

  1. CodeJingle macrumors regular

    Oct 23, 2009
So I noticed the 5k display is not 30 bit color depth. 30 bit color depth is the new standard, and I'm really surprised ALL Apple displays are still 24 bit color. The whole world doesn't seem to notice. All 4k displays are 30 bit color depth. Any feedback?
  2. Outrigger macrumors 68000


    Dec 22, 2008
    because the whole world (your average consumers) won't notice or care.
  3. librarian macrumors regular

    Sep 24, 2011
well it's not really a problem, since most Mac OS versions of applications like Photoshop or Lightroom don't support 30bit monitors
  4. CodeJingle thread starter macrumors regular

    Oct 23, 2009
    Mac Photoshop does support 30 bit color display


    Any techie who cares about high resolution should be caring just as much about high color depth.
  5. librarian macrumors regular

    Sep 24, 2011
It does not; there is no 30bit option in the Performance settings within Photoshop, because Mac OS X does not have 30bit support at all.
  6. CodeJingle, Oct 17, 2014
    Last edited: Oct 17, 2014

    CodeJingle thread starter macrumors regular

    Oct 23, 2009
    I am running OS X in 30 bit color depth right now. Photoshop supports 30 bit color depth. The 4k standard for video is 30 bit color depth.
  7. Deasnutz macrumors 6502

    Jun 9, 2011
    Is there any official language out there stating support? Screenshots?
  8. CodeJingle thread starter macrumors regular

    Oct 23, 2009
The DisplayPort 1.2 standard states it supports color depths greater than 24 bit. In SwitchResX I switch the colors from Millions to Billions. After doing that, only certain apps work; others crash, but only in Billions mode. If the OS didn't support 30 bit color, the apps would behave the same regardless of the display's color mode.
  9. EnderTW macrumors 6502

    Jun 30, 2007
    OSX does not support 30 bit.

SwitchResX isn't going to tell you the true level of support.

    Just because DP 1.2 supports it doesn't mean the OS does.
  10. CodeJingle, Oct 17, 2014
    Last edited: Oct 17, 2014

    CodeJingle thread starter macrumors regular

    Oct 23, 2009
    That still doesn't explain why certain apps crash when I use SwitchResX to switch the display to billion color mode.

    At some point there isn't any way to make a better display, then the only thing left to do is increase the color depth. If my graphics card, my display, and the connection between my display and graphics card, all support 30 bit color depth, then there is no reason for the OS to be that dumb.
  11. EnderTW macrumors 6502

    Jun 30, 2007
    Apple will most definitely support 30 bit (10 bit basically), but like you said, very few displays support it.
  12. CodeJingle, Oct 17, 2014
    Last edited: Oct 17, 2014

    CodeJingle thread starter macrumors regular

    Oct 23, 2009
I never said 'very few displays support 30 bit color depth' (except for Apple displays). Pretty much all new displays coming out that are larger than 1600 x 1200 are 30 bit color. Almost all 4k displays are 30 bit color. Apple is one of the few making new displays that aren't 30 bit color depth. My 3840x2160 ($1k new) and my 2560x1440 ($350 used) displays both support 30 bit color. I think in Japan regular TV is streamed in 4k with 30 bit color. It is 10 bits per channel; the total number of bits per pixel is 30. If you have ever programmed graphics hardware, you would understand that we tend to describe color depth in bits per pixel, not bits per color channel.
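The per-channel versus per-pixel naming is just arithmetic, and can be sketched in a few lines of Python. The packed X2R10G10B10-style layout below (two unused bits plus three 10-bit channels in a 32-bit word) is a common storage convention used here purely as an illustration, not a claim about any particular display:

```python
# "30 bit" (per pixel) and "10 bit" (per channel) describe the same depth:
BITS_PER_CHANNEL = 10
CHANNELS = 3  # red, green, blue
bits_per_pixel = BITS_PER_CHANNEL * CHANNELS  # 30

def pack_x2r10g10b10(r: int, g: int, b: int) -> int:
    """Pack three 10-bit channel values (0-1023) into one 32-bit word,
    leaving the top 2 bits unused (an illustrative packed layout)."""
    for v in (r, g, b):
        if not 0 <= v <= 1023:
            raise ValueError("channel value out of 10-bit range")
    return (r << 20) | (g << 10) | b

assert bits_per_pixel == 30
print(hex(pack_x2r10g10b10(1023, 0, 512)))  # 0x3ff00200
```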
  13. ddarko, Oct 18, 2014
    Last edited: Oct 18, 2014

    ddarko macrumors 6502

    May 7, 2007
    Mac OS doesn't magically support 30-bit because all the other links in the chain do. You claimed you were running Mac OS in 30-bit - another poster correctly pointed out you're not because Mac OS doesn't support 30-bit, it only supports up to 24-bit (Windows 7 and 8 do support 30-bit). Your response is to claim Mac OS is 30-bit because a third party app crashes when you put it into 30-bit mode (huh? Maybe it crashes because the OS doesn't support it?) and then bitch that it's "dumb" Mac OS isn't 30-bit because everything else you own is. All of that is irrelevant - you are not running Mac OS in 30-bit. Run a google search - "Mac OS 30-bit" and see what you get.

    And Photoshop for Mac does not support 30-bit color either. From Adobe's Photoshop FAQ:

    10-bit displays are not the norm - 8-bit panels with FRC dithering to approximate 10-bit color is what's commonly sold. That goes for most 4k monitors and televisions; they get advertised as "10-bit" panels but if you press the manufacturers or dig a little, you discover most of them aren't true 10-bit panels. See this link to a review of the Asus PB287Q 28-Inch 4K monitor that lists its color depth support as "10-bit (8-bit with FRC)." Your professional "30-bit" workflow has been run with a photo application and OS that don't support it, displayed on monitors that likely used dithering to approximate it.

Finally, nobody in the world broadcasts 4k television, Japan included. Japan did a trial run of 4k broadcasting that was beamed only to commercial retailers this past summer. And it wasn't in Rec 2020. UHD and 4k standards support Rec 2020 with 10- and 12-bit color depth, but there is no programming, including the Netflix 4k streams, that uses it. The Sony 4k movies on its UHD media player are also 24-bit. Even if you had a genuine 10-bit UHD television (and there are a couple out there), the 4k program you're watching is coded to the regular 8-bit per channel/24-bit Rec 709 standard.
  14. Talarspeed macrumors member

    Dec 12, 2009
  15. leman macrumors 604

    Oct 14, 2008
Apple's OpenGL implementation does not support 30bit pixel formats. That's all there is to it. Even if you had a display that supports 30bit color output, the graphics stack on OS X would not support it.
  16. Serban Suspended

    Jan 8, 2013
just please show me, and let's compare the display quality of this 5k iMac with your 30 bit display
  17. CodeJingle, Oct 18, 2014
    Last edited: Oct 18, 2014

    CodeJingle thread starter macrumors regular

    Oct 23, 2009
    Here is link to Apple video codec supporting 10 and even 12 bits per color channel

    This Apple support page also states how Final Cut Pro X supports 10 bits per color channel

    See also here and here

    And obviously there is no way to compare 10+ bit content to 8 bit content unless you have a way to view the 10+ bit content natively. It is like asking show me how great Oculus Rift is without having a unit on-hand to demo.

I still say the 5k display is fantastic. I'm just rather confused that Apple put so much effort into supporting 10 bit content in the DisplayPort 1.2 standard and added support for it in Final Cut Pro X, yet it is alleged that 10 bit content cannot be natively viewed within their operating system.
  18. CodeJingle thread starter macrumors regular

    Oct 23, 2009
If that is true, that must be why the 3d apps I have tried crash when I force the display into 30 bit mode. The pipeline between the 3d hardware and the display must be aware the display is in 30 bit mode even if the operating system otherwise ignores it.
  19. ddarko, Oct 18, 2014
    Last edited: Oct 18, 2014

    ddarko macrumors 6502

    May 7, 2007
    You don't seem to understand the distinction between Mac OS and programs not displaying 30-bit color versus being able to calculate, manipulate and export 30-bit color data. THEY ARE NOT THE SAME. All your links show is that Final Cut Pro allows you to export and save files with 10-bit channels. Here's a thread from the Blackmagic forums that discusses whether you can output 10-bit color data over Thunderbolt even though Mac OS doesn't support 10-bit color display:
  20. iamthedudeman, Oct 18, 2014
    Last edited: Oct 18, 2014

    iamthedudeman macrumors 65816

    Jul 7, 2007
No, it does not support 30 bit color depth. Most consumer displays do not even come close to 30 bit color depth. No consumer 4k displays support it. Zero. Nor is it common. The reason is that each panel would require a more expensive panel system board, a more expensive polarizer, etc. Higher-PPI panels are great, but a higher PPI alone does not make them better. It is good for marketing purposes; that is about it. A higher sub-pixel bit count is more important. Not how many are in front of the panel, but beneath it.

Quite the contrary: contrast, black level, and color accuracy are more important than a higher PPI. High PPI is needed, but if you have that without the other factors that make a great display, then the higher PPI is a waste. That is not the case with the iMac 5k retina display. It is a great display. Fantastic.

But it can't touch a professional panel. Nor is it meant to. It is a consumer product. Nor does it need 30 bit color like a professional display. And 30 bit displays are not common, at all. 8 bit with FRC is not a 30 bit panel. Nor is it a 10 bit panel.

If it can do billions or trillions of colors without an asterisk next to the specs, then it is most likely a 10 bit panel. Millions of colors is not a 10 bit panel; it is 8 bit. If the panel says "billions of colors" but has a low color gamut, like 72 percent, that is an 8 bit panel with FRC, not a true 10 bit panel. An NEC professional panel can display 100 percent color gamut in any range. An HP DreamColor display tests out at over 115 percent color gamut while being a true 10 bit panel with FRC to take it to 14 bit! These 4k panels in the $500 range cannot compare. None of them are true 10 bit panels. If they were, they would not cost 500-800 dollars.

Most professional panels use higher-precision sub-pixels to achieve a higher color depth than consumer panels. Most of the best panels are 10 bit + FRC, 30 bits per pixel. They have very powerful panel system boards to do calibrations on the fly in realtime; self-calibration is built into the panel. The polarizer is usually a higher-quality single pane to reduce loss of color depth and accuracy.

The 5k iMac is great. Just bought one. But it cannot touch any of the professional displays I have at my business, nor was it meant to.

The bit-precision of the display determines how many steps of brightness are possible. A display which supports 6 bits per sub-pixel will provide 64 (2^6) steps from darkest to brightest; a display which supports 8 bits will provide 256 (2^8) steps. A display that supports 10 bits per sub-pixel gives 1024 (2^10) steps. The bit-precision is a result of the design of the electronics which control the liquid crystal cells in the panel.

Since there are three sub-pixels, the maximum number of colors that a pixel can present is 2^n x 2^n x 2^n, where n is the bit-precision of a sub-pixel. Therefore, an 8-bit design gives 2^8 x 2^8 x 2^8 = 16.7 million colors.

A display with a 10-bit design gives a palette of 2^10 x 2^10 x 2^10 = 1.07 billion colors.

    By the way, while many people talk about “an 8-bit panel” or “a 10-bit panel”, it’s also common to refer to the total number of bits needed to define a red-green-blue pixel. Therefore, it’s valid (and preferred) to refer to a 10 bit display’s panel as “a 30-bit panel.” 30 is 10 + 10 + 10, which takes account of the 10 bits for each sub-pixel.
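The color counts quoted above follow from the 2^n-per-channel arithmetic, which can be checked with a short Python sketch (illustrative math only, nothing display-specific):

```python
# Distinct colors for a panel with n bits of precision per sub-pixel:
# 2^n levels per channel, three channels -> (2^n)^3 total colors.
def colors_for_bit_depth(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(colors_for_bit_depth(6))   # 262,144        ("18-bit" panel)
print(colors_for_bit_depth(8))   # 16,777,216     (~16.7 million, "24-bit")
print(colors_for_bit_depth(10))  # 1,073,741,824  (~1.07 billion, "30-bit")
```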
Many consumer displays have low-cost 18-bit panels. Some high-end consumer high-PPI displays (such as the iMac 5k and the Dell 5k) have 24-bit panels. No, 8 bit plus FRC does not count. But a high-end professional display is one of a small number of displays with a true 30-bit panel. The higher the bit-precision of a display, the better able it is to represent colors accurately.

An 8-bit-per-channel (24-bit) panel, which offers 16.7 million colors, would be good for most consumers' needs. However, there are cases where 8 bits per sub-pixel is not enough. Consider a grayscale image. Gray (including white and black) is produced when the three sub-pixels (red, green, and blue) are equally bright.

This means that the values for the three sub-pixels are the same: 35/35/35, for example. With 8 bits per sub-pixel, gray can go from 0/0/0 (black) to 255/255/255 (white). Therefore, there are only 256 levels of gray possible. This can lead to "banding," an effect that arises because the step between adjacent levels of gray is big enough for the eye to detect. It can be a problem in certain kinds of visualization, such as 3D rendering for automotive styling. With a 30-bit panel, there are 1024 levels of gray, and it's almost impossible for the eye to detect the step between adjacent levels.
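As a rough sketch of why banding shows up, the gray-level counts and the relative step between adjacent grays can be computed directly (illustrative arithmetic only):

```python
# Gray levels available at a given per-channel bit depth, and the
# relative step size between adjacent grays (smaller = less banding).
def gray_levels(bits_per_channel: int) -> int:
    return 2 ** bits_per_channel

for bits in (8, 10):
    levels = gray_levels(bits)
    step = 1 / (levels - 1)  # fraction of full brightness per step
    print(f"{bits}-bit: {levels} gray levels, step ~ {step:.5f} of full range")
```

With 8 bits the step is roughly 0.4% of full brightness, which the eye can pick out in smooth gradients; at 10 bits it drops to roughly 0.1%.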

Also, there are cases where images can have greater bit-precision than 24 bits, especially where subtle detail is important. Examples are satellite imagery for intelligence agencies, medical imagery (say, mammography), or color accuracy for TV or movie production.

You will not see Pixar or DreamWorks getting rid of their DreamColor or NEC displays to get a new 5k iMac anytime soon. The iMac retina with the 5k display is the best "consumer" display on the market, hands down. But it is not a professional display, nor is it meant to be.
  21. Tenoxor macrumors newbie

    Oct 22, 2014
Thanks iamthedudeman for all the info.

    Do you mean 72 percent of Adobe RGB space? Do you have an iMac 5k ICC profile to share?
  22. skaertus macrumors 68030


    Feb 23, 2009
    Does it mean that Windows supports up to 48-bit color (including 30-bit) and OS X does not even support 30-bit? If this holds true, how can OS X be more suitable for professional work?
  23. shaocaholica macrumors member


    May 26, 2010
    There's a difference between 'suited' and 'popular'.
  24. aevan macrumors 68000


    Feb 5, 2015
Somehow today, 'maximum everything' has become the norm for tech enthusiasts who visit forums like these. Nothing but the latest and most expensive graphics cards will do, 24 and 32GB of RAM is "normal", 1600MHz DDR is "slow", you can't do video editing without an i7, SSDs are already standard for everything and everyone, etc. Oh, and also, 30-bit is required for professional work. Because, you know, if you see gradation in grayscale gradients, then you just can't work. Professionally, that is.

    Everything else - you're an average consumer who doesn't care for that, you're a casual user, etc.


    The truth is - most professionals out there actually don't work on latest and greatest tech. Some of them, like the artists I follow, are not into tech, they don't have a clue if their panel is 24 or 30-bit or if their RAM is faster or if their CPU has hyperthreading. Others know all of that stuff but since they are in business, they will often save money where they can - by not buying things only forum-goers notice. You don't 'need' any of that.

Now, I don't know why Apple doesn't support 30-bit. Would I like to have something "better" if I could? Sure. But am I 'shocked' they don't support it? No, I didn't even know about it. And I am a professional - and by that I mean that for the past 10 years, I've been living as an illustrator. I've been creating things that paid my bills on computers you wouldn't even look at. I managed to get to the point where I can afford good equipment, high-end Wacom tablets, an iMac 5K, etc., but it's not something I *need*. It's just something I like.

So, I think it's safe to say - no, it's not a big deal that the iMac 5K doesn't have a 30-bit panel. And yes, it is a great professional device, because the screen looks great and makes working on it a joy. That's not professional enough for you? Too bad. I'm sorry, but that's all it takes, even for professionals. Half of the best visual artists I know don't even know what computers they have. In fact, the only people I know who think such things are important are tech geeks. That doesn't mean Apple shouldn't put 30-bit support in OSX. Let them - it is better. Just let's not make a big deal out of it.