because the whole world (your average consumers) won't notice or care.
Fair enough. But call it "Retina2" and the world will wonder how they ever lived without it.
because the whole world (your average consumers) won't notice or care.
The UHD Blu-ray spec includes 10-bit colour depth and Rec.2020 among the improvements, in addition to the resolution bump. And the TV manufacturers appear to be pushing all of those standards with their announced HDR sets, due for release in time for the launch of UHD Blu-ray (end of this year).
Somehow today, 'maximum everything' has become the norm for tech enthusiasts who visit forums like these. Nothing but the latest and most expensive graphics cards will do, 24 and 32GB of RAM is "normal", 1600MHz DDR is "slow", you can't do video editing without an i7, SSDs are already standard for everything and everyone, etc. Oh, and also, 30-bit is required for professional work. Because, you know, if you can see banding in grayscale gradients, then you just can't work. Professionally, that is.
Anything less, and you're an average consumer who doesn't care about that, a casual user, etc.
Wrong.
The truth is - most professionals out there don't actually work on the latest and greatest tech. Some of them, like the artists I follow, are not into tech; they don't have a clue whether their panel is 24-bit or 30-bit, whether their RAM is faster, or whether their CPU has hyperthreading. Others know all of that stuff, but since they are in business, they will often save money where they can - by not buying things only forum-goers notice. You don't 'need' any of that.
Now, I don't know why Apple doesn't support 30-bit. Would I like to have something "better" if I could? Sure. But am I 'shocked' they don't support it? No, I didn't even know about it. And I am a professional - and by that I mean that for the past 10 years, I've been making my living as an illustrator. I've been creating things that paid my bills on computers you wouldn't even look at. I've managed to get to the point where I can afford good equipment, high-end Wacom tablets, an iMac 5K, etc., but it's not something I *need*. It's just something I like.
So, I think it's safe to say - no, it's not a big deal that the iMac 5K doesn't have a 30-bit panel. And yes, it is a great professional device, because the screen looks great and makes working on it a joy. Not professional enough for you? Too bad. I'm sorry, but that's all it takes, even for professionals. Half of the best visual artists I know don't even know what computers they have. In fact, the only people I know who think such things are important are tech geeks. That doesn't mean Apple shouldn't put 30-bit support in OS X. Let them - it is better. Let's just not make a big deal out of it.
But just because the specs are there does not mean that the manufacturers won't pull the sleight of hand they always seem to pull. Just like many computer monitors now, they will accept a 10-bit colour signal and render it down to 8-bit output.
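For illustration only, here is a minimal sketch in Python (made-up function names, not any real monitor's firmware) of what that reduction amounts to: the display accepts 10-bit values, throws away the low two bits, and optionally adds a little dither first so the resulting banding is harder to see.

import random

def truncate_10_to_8(value_10bit: int) -> int:
    """Drop the two least-significant bits: 0-1023 becomes 0-255."""
    return value_10bit >> 2

def dither_10_to_8(value_10bit: int) -> int:
    """Same reduction, but with a little random noise added first,
    so the quantisation steps are smeared out instead of showing as bands."""
    noise = random.randint(0, 3)
    return min(1023, value_10bit + noise) >> 2

if __name__ == "__main__":
    # Neighbouring 10-bit values collapse onto the same 8-bit level.
    for v in (512, 513, 514, 515, 516):
        print(v, "->", truncate_10_to_8(v), "or dithered:", dither_10_to_8(v))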
I am old enough to remember the revolution when we made the huge jump from 4-bit to 8-bit colour. Back then they were talking about all the bits, not just one colour channel. The awe of seeing 256 different colours on the screen at one time. The wow of colour GIF files.
To be totally honest, I would bet that most people would not be able to tell the difference between the old 15/16-bit high colour (32,768 or 65,536 possible colours) and the more modern 24-bit (16.7 million possible colours).
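Those counts are just powers of two - total colours = 2^(bits per pixel). A quick back-of-the-envelope check, purely illustrative, in Python:

for label, bits in [("4-bit", 4), ("8-bit", 8), ("15-bit high colour", 15),
                    ("16-bit high colour", 16), ("24-bit true colour", 24),
                    ("30-bit deep colour", 30)]:
    print(f"{label}: {2 ** bits:,} colours")
# 4-bit: 16, 8-bit: 256, 15-bit: 32,768, 16-bit: 65,536,
# 24-bit: 16,777,216, 30-bit: 1,073,741,824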
The new iMac is displayed as 30-bit color, as are displays that are capable of 30-bit color. What's the spec of the new iMacs with DCI-P3?