The UHD Blu-ray spec includes 10-bit colour depth and Rec.2020 amongst its improvements, in addition to the resolution increase. And the TV manufacturers appear to be pushing all of those standards with their announced HDR sets, due for release in time for the launch of UHD Blu-ray (end of this year).

We may well see monitors improving to the same standards, as they generally share the same panel manufacturers, so I hope OS X will support them in 10.11.
 
The UHD Blu-ray spec includes 10-bit colour depth and Rec.2020 amongst its improvements, in addition to the resolution increase. And the TV manufacturers appear to be pushing all of those standards with their announced HDR sets, due for release in time for the launch of UHD Blu-ray (end of this year).

But just because the specs are there does not mean the manufacturers won't just pull the sleight of hand they always seem to pull. Just like computer monitors now, they will accept 10-bit colour and render it down to 8-bit output.
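As a rough illustration of what that "render it down" step costs (a minimal Python sketch, not any vendor's actual pipeline): truncating a 10-bit ramp to 8 bits collapses every four adjacent levels into one, which is exactly where the visible bands come from.

```python
# Illustration only: how many distinct levels survive a 10-bit -> 8-bit cut.
ten_bit_ramp = list(range(1024))            # 1024 distinct 10-bit code values
eight_bit = [v >> 2 for v in ten_bit_ramp]  # drop the two low bits

print(len(set(ten_bit_ramp)))   # 1024 levels in the source signal
print(len(set(eight_bit)))      # 256 levels actually shown -> coarser steps, banding
```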

I am old enough to remember the revolution when we made the huge jump from 4-bit to 8-bit colour. Back then they were talking about all the bits, not just one colour channel. The awe of seeing 256 different colours on the screen at one time. The wow of colour GIF files.

To be totally honest, I would bet that most people would not be able to tell the difference between the old 15/16-bit high colour (32,768 or 65,536 possible colours) and the more modern 24-bit (16.7 million possible colours).
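For anyone checking those figures, the totals are just powers of two per pixel (high colour was usually 15 or 16 bits per pixel); a throwaway Python snippet, purely illustrative:

```python
# Total displayable colours at a few whole-pixel depths.
for bits in (4, 8, 15, 16, 24, 30):
    print(f"{bits}-bit: {2 ** bits:,} colours")
# 15-bit: 32,768   16-bit: 65,536   24-bit: 16,777,216   30-bit: 1,073,741,824
```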
 
Somehow today, 'maximum everything' has become the norm for tech enthusiasts who visit forums like these. Nothing but the latest and most expensive graphics cards will do, 24 and 32GB of RAM is "normal", 1600MHz DDR is "slow", you can't do video editing without an i7, SSDs are already standard for everything and everyone, etc. Oh, and also, 30-bit is required for professional work. Because, you know, if you can see banding in grayscale gradients, then you just can't work. Professionally, that is.

Anything less - and you're an average consumer who doesn't care about that stuff, a casual user, etc.

Wrong.

The truth is, most professionals out there don't actually work on the latest and greatest tech. Some of them, like the artists I follow, are not into tech; they don't have a clue whether their panel is 24-bit or 30-bit, whether their RAM is faster, or whether their CPU has hyper-threading. Others know all of that stuff, but since they are in business, they will often save money where they can - by not buying things only forum-goers notice. You don't 'need' any of that.

Now, I don't know why Apple doesn't support 30-bit. Would I like to have something "better" if I could? Sure. But am I 'shocked' they don't support it? No, I didn't even know about it. And I am a professional - and by that I mean that for the past 10 years, I've been making my living as an illustrator. I've been creating things that paid my bills on computers you wouldn't even look at. I've managed to get to the point where I can afford good equipment, high-end Wacom tablets, an iMac 5K, etc., but it's not something I *need*. It's just something I like.

So, I think it's safe to say - no, it's not a big deal that the iMac 5K doesn't have a 30-bit panel. And yes, it is a great professional device, because the screen looks great and makes working on it a joy. That's not professional enough for you? Too bad. I'm sorry, but that's all it takes, even for professionals. Half of the best visual artists I know don't even know what computers they have. In fact, the only people I know who think such things are important are tech geeks. That doesn't mean Apple shouldn't put 30-bit support in OS X. Let them - it is better. Let's just not make a big deal out of it.

Hmm. The question was asking for feedback on why Apple doesn't use 30-bit. This just sounds like you're trying to find an excuse for why the question shouldn't even be asked. Lol, "Let's not make a big deal out of it"?

I don't do anything remotely art/design-related on my iMac. However, working in IT, we supply systems for those who do. And bit depth is quite often a spec requirement for those in design.
 
But just because the specs are there does not mean the manufacturers won't just pull the sleight of hand they always seem to pull. Just like computer monitors now, they will accept 10-bit colour and render it down to 8-bit output.

I am old enough to remember the revolution when we made the huge jump from 4-bit to 8-bit colour. Back then they were talking about all the bits, not just one colour channel. The awe of seeing 256 different colours on the screen at one time. The wow of colour GIF files.

To be totally honest, I would bet that most people would not be able to tell the difference between the old 15/16-bit high colour (32,768 or 65,536 possible colours) and the more modern 24-bit (16.7 million possible colours).

I'm more hopeful. The manufacturers seem to realise that the increase in resolution will not be enough to get most people to upgrade to UHD, so they were pushing the other aspects of image quality when announcing their upcoming TV ranges. In particular, they were talking about the effects of quantum dots, which allow brighter images and a wider colour gamut. It was mentioned several times that this would make the banding of 8-bit colour more noticeable, but that it wouldn't be apparent thanks to the move to 10-bit.
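The arithmetic behind that point, as a back-of-envelope Python sketch (the gamut coverage figures below are rough assumptions, not numbers from the announcements): spreading the same 256 levels per channel across a much wider gamut roughly doubles the colour distance each step has to cover, while 10-bit brings the step size back below where it started.

```python
# Back-of-envelope only: approximate shares of visible chromaticities,
# assumed for illustration rather than measured.
rec709 = 0.36
rec2020 = 0.76

baseline = rec709 / 255                     # 8-bit step on a Rec.709-ish panel
print(f"8-bit on Rec.2020:  step ~{(rec2020 / 255) / baseline:.1f}x the old size")
print(f"10-bit on Rec.2020: step ~{(rec2020 / 1023) / baseline:.2f}x the old size")
# ~2.1x coarser at 8-bit on the wide gamut; ~0.53x (finer) at 10-bit.
```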
 
What's the spec of the new iMacs with DCI-P3?
The new iMac is reported as 30-bit color, as are displays that are capable of 30-bit color.

However, I can still see banding on a test PSD in Preview. I'm not quite sure how to go about testing this.
 

Attachments

  • Screen Shot 2015-10-27 at 20.20.01.png (390.6 KB)
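On the "not sure how to test this" point, one common approach (a sketch only, assuming Python with NumPy and Pillow installed; the file name and dimensions are arbitrary) is to generate a 16-bit grayscale ramp that spans only a small slice of the range, so the output bit depth becomes the limiting factor: an 8-bit path can only show about 16 bands across it, while a true 10-bit path shows about 64 finer ones.

```python
# Sketch of a banding test image: a 16-bit ramp over the darkest ~6% of range.
import numpy as np
from PIL import Image

width, height = 2560, 256
ramp = np.linspace(0, 4095, width).astype(np.int32)   # values 0..4095 of 65535
pixels = np.tile(ramp, (height, 1))

img = Image.fromarray(pixels, mode="I")   # 32-bit int mode, saved as 16-bit PNG
img.save("gradient_16bit.png")

# Viewed 1:1 in an app that keeps the full precision: ~16 visible bands
# suggests an 8-bit output path, ~64 finer bands suggests 10-bit.
```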
Well, my 2014 RiMac with the R9 M290X also says it's 30-bit if I look in that spot. Now does that mean that El Cap has suddenly added a new mode, and the old displays could do 30-bit colour before but were being restrained??
 
Haha this thread...

Pretty sure iMacs will NEVER support 30-bit color. iMacs, while used by many, aren't professional machines by Apple's standards. Which is why they have the Mac Pro(fessional)... for connecting a pro machine to a pro monitor.

Who knows... maybe that long-rumored 5K Thunderbolt Display refresh (along with an OS X update) will bring a 30-bit color mode with it.

But on an iMac...forget it lol.
 
Somehow today, 'maximum everything' has become the norm for tech enthusiasts who visit forums like these. Nothing but the latest and most expensive graphics cards will do, 24 and 32GB of RAM is "normal", 1600MHz DDR is "slow", you can't do video editing without an i7, SSDs are already standard for everything and everyone, etc. Oh, and also, 30-bit is required for professional work. Because, you know, if you can see banding in grayscale gradients, then you just can't work. Professionally, that is.

THIS! LOL. I was wondering the same thing about most of these guys.

Personally, I'm upgrading from a Hackintosh with an i7-950, 24GB of 800MHz RAM (OMG, so slow, right???), 6TB of storage, and a GTX 280, and my machine still puts out a lot of professional deliverables for my clients without breaking a sweat lol.

FYI, I'm a UI/UX designer, high-end retoucher, and motion/visual FX artist, so I'm totally justified in my purchase. But these guys and their spreadsheets, yo... :p
 