You've probably seen this marketing line from a number of monitor and television makers. Now Apple has followed suit, claiming that the iMac can show 1 billion colours on its display.

There are around 14.7 million pixels on the 5K display. A typical, very detailed and colourful photo contains several thousand individual colours.

Yes, it helps to have graphics cards and monitors that can pick from a very wide gamut, especially for people who work in print. But...

Are these companies really trying to dupe the public with big numbers, or do they really not know the difference between a colour palette and the number of colours that can be shown at one time?
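For anyone who wants the arithmetic behind these figures, here is a minimal Python sketch (my own illustration, not from Apple's marketing) comparing the 5K panel's pixel count with the size of an 8-bit and a 10-bit palette:

# Rough numbers behind the "1 billion colours" claim (illustrative only).
pixels = 5120 * 2880            # 5K panel resolution, about 14.7 million pixels
palette_8bit = (2 ** 8) ** 3    # 8 bits per RGB channel, about 16.7 million colours
palette_10bit = (2 ** 10) ** 3  # 10 bits per RGB channel, about 1.07 billion colours

print(f"pixels on screen: {pixels:,}")        # 14,745,600
print(f"8-bit palette:    {palette_8bit:,}")  # 16,777,216
print(f"10-bit palette:   {palette_10bit:,}") # 1,073,741,824

In other words, the "1 billion" figure describes the palette the panel can select from, not the number of colours that could ever appear on screen at once, which is capped by the pixel count.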

 

Attachments

  • Capture.JPG (29.4 KB)
Worse, it's dithering, so not really a 10-bit panel, I guess. An 8-bit panel will not display that many colors.
 
What's even more confusing about this is how it was announced as a new feature on the Mid 2017 Retina iMacs. Both the Late 2015 and Late 2014 5K iMacs have 10-bit panels for content that supports it (but certainly not "every photo" supports it).
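For readers wondering what "dithered 10-bit" means in practice, here is a minimal, hypothetical sketch of temporal dithering (one common technique; Apple hasn't said exactly which method the panel uses): an 8-bit panel alternates between the two nearest 8-bit levels so that the average over a few frames approximates the requested 10-bit level.

# Illustrative temporal dithering: approximate a 10-bit level on an 8-bit panel
# by alternating the two nearest 8-bit levels across successive frames.
def dither_10bit_on_8bit(level_10bit, frames=4):
    exact = level_10bit / 4.0               # ideal 8-bit value (10-bit has 4x the steps)
    low = int(exact)
    high = min(low + 1, 255)
    n_high = round((exact - low) * frames)  # how many frames show the higher level
    return [high] * n_high + [low] * (frames - n_high)

sequence = dither_10bit_on_8bit(514)        # a 10-bit level between 8-bit codes 128 and 129
print(sequence, "average:", sum(sequence) / len(sequence))  # [129, 129, 128, 128] average 128.5

The eye averages the flicker, so the panel appears to render intermediate steps it cannot physically produce, which is how a panel can advertise a billion colours without being a true 10-bit part.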
 
They did not claim the display could show 1 billion colours at the same time.
Worse, it's dithering, so not really a 10-bit panel, I guess. An 8-bit panel will not display that many colors.
Could you elaborate? I thought the first 5K iMac already had a true 10-bit display.
 

They very specifically stated it was 10-bit dithering. This is extremely common, but a little surprising given how much emphasis Apple puts on their displays. I'd bet the iMac Pro will have a true 10-bit display, but I could be wrong.

Honestly, it doesn't make *that* much difference in the end for 90% of users, even content creators. But if you do anything that requires extremely accurate colors, you'll want to use the display-out capability with an external monitor. I don't recall if the LG UltraFine 5K is 10-bit or not.
 

With native support for the P3 profile, it's aimed at video. That gamut isn't ideal for print, but it's fine for nearly all digital purposes. The portion of people who really need 10-bit is minuscule. Macs didn't support such a bit depth until El Capitan, and yet for two decades digital print houses and photographers did well with just 8-bit.

Then there are the cameras themselves. Nearly all professional photography has been, and still is, 8-bit. You can shoot 14-bit stills or 10-bit video, but it's only necessary in the very rare circumstances where banding occurs in gradients.

Marketing material isn't written by professionals. If it were, can you imagine the result? Honest marketing, realistic expectations, fewer slogans? ;)
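To make the banding point above concrete, here is a small sketch (my own illustration) that quantises a smooth horizontal gradient at 8 and 10 bits and counts how many distinct steps survive:

# Count the distinct steps left in a smooth gradient after quantisation
# to a given bit depth; fewer, wider steps show up as visible banding.
def gradient_levels(width, bits):
    max_code = (1 << bits) - 1
    return len({round(x / (width - 1) * max_code) for x in range(width)})

width = 5120  # one row of a 5K display
for bits in (8, 10):
    levels = gradient_levels(width, bits)
    print(f"{bits}-bit: {levels} levels, bands about {width // levels} px wide")
# 8-bit: 256 levels (~20 px bands); 10-bit: 1024 levels (~5 px bands)

On most images the 20-pixel bands are invisible, which is why 8-bit has served so long; it's the smooth, slow gradients where the extra two bits actually show.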
They did not claim the display could show 1 billion colours at the same time.

YES, they have implied it in the image attached to the first post.
Worse, it's dithering, so not really a 10-bit panel, I guess. An 8-bit panel will not display that many colors.

You also need 10-bit support in the graphics drivers if you need that bit depth at all. Only a few of the most expensive Macs have had that support, and only in the last two years. Before that, never.
 
Other makers probably advertised "16 million colours on your monitor" back when said monitor had a maximum resolution of 640×480. People are not that stupid and know that 14M pixels won't show a billion colours. If you want to sue Apple for deceptive advertising, be my guest.
 
Then there are the cameras themselves. Nearly all professional photography has been, and still is, 8-bit. You can shoot 14-bit stills or 10-bit video, but it's only necessary in the very rare circumstances where banding occurs in gradients.

Professional photographers need more than eight bits constantly. That's one of the main reasons editing is done on 10/12/14-bit RAW or TIFF files, not 8-bit JPEGs. Eight bits simply doesn't give enough exposure latitude to fix shadows and highlights. Even a basic, feature-restricted entry-level camera like the D3300 has 12-bit RAW because it's necessary.
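A rough way to see why the extra RAW bits matter for shadow recovery (my own back-of-the-envelope sketch, not anything from the posts above): RAW data is roughly linear, so each stop down from clipping has only half as many code values available, and the deep shadows run out of codes first at low bit depths.

# Linear-encoded RAW: each stop below clipping gets half the remaining codes.
# Count the code values available in the 5th stop down at various bit depths.
for bits in (8, 12, 14):
    total = 1 << bits
    fifth_stop = total // 16 - total // 32   # codes between 5 and 4 stops below clipping
    print(f"{bits}-bit: {fifth_stop} code values in the 5th stop down")
# 8-bit: 8 (posterises badly when pushed), 12-bit: 128, 14-bit: 512

Pushing those shadows up a couple of stops in post stretches the spacing between the surviving codes, which is where the banding and posterisation come from at 8 bits.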
 
Professional photographers need more than eight bits constantly.

Thanks for your opinion. I've worked in this field for 30+ years. I know we are anonymous internet people, but it helps not to underestimate who you are speaking with before you make such comments. No further debate. All the essentials have been outlined in the posts above in a very straightforward manner.
 
It'd be better if you refuted what amb said instead of resorting to an argument from authority, but apparently it's "no further debate".
I take it that 10-12-bit photography does more than just the banding elimination you seem to imply it's limited to.
 
Thanks for your opinion. I've worked in this field for 30+ years. I know we are anonymous internet people, but it helps not to underestimate who you are speaking with before you make such comments. No further debate. All the essentials have been outlined in the posts above in a very straightforward manner.

Needing a monitor that displays more than 8 bits is quite rare, but you specifically said that shooting at 14 bits is "very rare". There's no defensible argument for that when every full-frame camera on the market, and essentially every interchangeable-lens camera of any sensor size, supports >10-bit RAW. Some professionals shoot JPEG sometimes, but you're essentially asserting that it's "very rare" for a professional photographer to shoot in RAW, and that's simply not true.
 
Needing a monitor that displays more than 8 bits is quite rare, but you specifically said that shooting at 14 bits is "very rare".

14-bit (or as high as you can go) is essential for things like architectural photography, where linear light and shadows (real or created) need to avoid banding effects. Most photography in the world, though, is fashion, street, and journalism, where such issues don't often arise (and when they do, you can change mode as needed).

What's upsetting is that you just invented the following assertion out of thin air:

"you're essentially asserting that it's "very rare" for a professional photographer to shoot in RAW"

That's really either an insult, or you just don't know that 8-bit RAW is the most common shooting format.

I'm sure that in person, in a professional environment with your credibility on the line, you wouldn't say such a thing. I'd ask you to apologise for making such a comment, out of politeness and so that you learn to communicate properly with elders in the creative industry. That's why I really wanted to end our discussion in my last response. I always know which direction these "debates" take online, and they are really a waste of the precious life we have.
 
Layman question: does this "10-bit dithering" claim (which I guess is not "real" 10-bit? Isn't dithering a software thing? The wording sounds confusing) help in any way with displaying true 10-bit TV series and anime more faithfully?
 
That's really either an insult, or you just don't know that 8-bit RAW is the most common shooting format.

Nikon cameras only do 12- or 14-bit RAW, from the lowliest D3000 series all the way up to the D5. It's even listed explicitly on their website.

D5: "NEF (RAW): 12 or 14 bit"
D3300: "12-bit NEF (RAW)"

8-bit RAW isn't even an option.

The situation is the same for Canon.

EOS 1D X Mark II: "Still Image: JPEG, RAW (14 bit Canon Original)"

I'm not sure where you came up with the idea of 8-bit RAW; that's just not a thing. Even the 2 MP Nikon D1 from 1999 shot 12-bit RAW.
 
I started shooting 12-bit RAW over 10 years ago, and 14-bit for the last 5 years, since I got a Nikon D800 that could do it. Checking that the file I'm working on in Photoshop is 16-bit per channel is practically the first thing I do every day. Banding issues can easily occur with even mild color manipulation of a smooth gradient. It is of course unacceptably visible in print, but done badly it can easily be spotted even at an 8-bit digital destination such as a website, where most viewers are in sRGB.

There is a misconception here that 8-bit is enough for professional work. I have some experience in a print house preparing pre-press material, and it seems novice Photoshop users are at fault here. You can shoot in 8-bit, or have a collage of 8-bit visual materials, but if your workflow in PS et al. is at 16-bit you can at least manipulate smooth colors across those sources. Yet I would say 90% of the time we got files from folks who simply did not know or care about the issue.

I am glad the issue is semi-gone now that even consumer devices can handle wide gamut and higher-than-8-bit color depth.
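To illustrate the 16-bit-working-space point, here is a tiny sketch (a generic gamma-style adjustment standing in for a Curves move, nothing specific to Photoshop): two opposing tone adjustments applied to a smooth ramp, once with 8-bit rounding in between and once keeping full precision until the end.

# Darken a smooth ramp, then brighten it back. Rounding to 8-bit between the
# two steps (an 8-bit working space) merges shadow levels; keeping full
# precision until the end (standing in for a 16-bit working space) does not.
darken = lambda v: v ** 2.2
brighten = lambda v: v ** (1 / 2.2)

ramp = [x / 255 for x in range(256)]

low_precision = {round(brighten(round(darken(v) * 255) / 255) * 255) for v in ramp}
high_precision = {round(brighten(darken(v)) * 255) for v in ramp}

print("unique output levels, 8-bit workflow: ", len(low_precision))   # fewer levels: gaps/banding in shadows
print("unique output levels, 16-bit workflow:", len(high_precision))  # all 256 levels preserved

The 8-bit pipeline throws away shadow levels at the intermediate rounding step and can never get them back, which is exactly the banding the posts above describe.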
I'm not sure where you came up with the idea of 8-bit RAW; that's just not a thing.
I actually don't think sensors that could only output 8-bit data ever existed. The 8-bit limit probably came from the camera DSP doing real-time JPEG encoding, and worse still from telling your camera to use sRGB. This probably explains why we saw 8-bit as some kind of "norm" in digital photography until lately.
That's really either an insult, or you just don't know that 8-bit RAW is the most common shooting format.
8-bit delivery from unaware "designers" is indeed common. 8-bit RAW, on the other hand, has been almost unheard of for at least a decade, if it ever existed.
 
Then there are the cameras themselves. Nearly all professional photography has been, and still is, 8-bit. You can shoot 14-bit stills or 10-bit video, but it's only necessary in the very rare circumstances where banding occurs in gradients.
I don't know what professional photographers you've dealt with over the past 30 years, but if someone sends me 8-bit files these days, they're getting a phone call, and not a pleasant one.
 
Could you elaborate? I thought the first 5K iMac already had a true 10-bit display.

I don't think the first 5K iMac had even a fake 10-bit display; it was 8-bit. The second one, launched at the same time as the 4K iMac, got a P3 display, which is the same as 10-bit(?)
 
I'm finding it hard to understand the difference between the new iMac display and the 2015 one. I thought the 2015 one was already 10-bit? What's the difference in the number of colours the 2015 one can show versus the new one?
 
Didn't they talk about 10-bit dithered output?
I seem to recall they mentioned dithering.
 
The 2017 one is still "dithered 10-bit", just like the previous model. Judging by the increase to 500 nits, though, I'm guessing the LCD panel stays the same while the backlight is a newer component.
 
Very few panels are real 10-bit; they usually offer even 12-bit processing (dithering), but in reality they mostly display 8-bit. In my cinema world, 12-bit is a reality on DCI (cinema) projectors: we usually shoot 12-bit RAW and grade on a 10-bit panel or directly on a 12-bit projector.

Here's a popular video monitor brand; take a peek and see that only a handful of their monitors are real 10-bit ones: http://www.flandersscientific.com
 