1 billion colour display. This marketing needs to die

Discussion in 'iMac' started by SoyCapitanSoyCapitan, Jun 6, 2017.

  1. SoyCapitanSoyCapitan macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #1
    You've probably seen this marketing line from a number of monitor and television makers. Now Apple has followed suit, claiming that the iMac can show 1 billion colours on its display.

    There are around 14.7 million pixels on the 5K display. A typical very detailed and colourful photo has several thousand individual colours.

    Yes, it helps to have graphics cards and monitors that can pick from a very wide gamut, especially for people who work in print. But...

    Are these companies really trying to dupe the public with big numbers or do they really not know the difference between a colour palette and the number of colours that can be shown at one time?
     

    Attached Files:
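    For anyone checking the arithmetic behind the numbers traded in this thread: 8 bits per channel gives roughly 16.7 million possible colours, 10 bits per channel gives roughly 1.07 billion (the "1 billion" of the marketing), and the 5K panel's 5120 × 2880 grid caps how many of those can appear on screen at once. A quick back-of-the-envelope check in Python:

```python
# Palette size is set by bit depth per channel; the pixel count caps
# how many distinct colours can be shown simultaneously.
pixels_5k = 5120 * 2880          # ~14.7 million pixels on the 5K panel
palette_8bit = (2 ** 8) ** 3     # 16,777,216 colours ("16.7 million")
palette_10bit = (2 ** 10) ** 3   # 1,073,741,824 colours ("1 billion")

print(f"5K pixels:       {pixels_5k:,}")
print(f"8-bit palette:   {palette_8bit:,}")
print(f"10-bit palette:  {palette_10bit:,}")
print(f"Max colours on screen at once: {min(pixels_5k, palette_10bit):,}")
```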

  2. zoomp macrumors member

    Joined:
    Aug 20, 2010
    #2
    Worse, it's dithering, so not really a 10-bit panel, I guess. An 8-bit panel will not display that many colors.
     
  3. redheeler macrumors 603

    redheeler

    Joined:
    Oct 17, 2014
    #3
    What's even more confusing about this is how it was announced as a new feature on the Mid 2017 Retina iMacs. Both the Late 2015 and Late 2014 5K iMacs have 10-bit panels for content that supports it (though certainly not "every photo" does).
     
  4. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #4
    They did not claim the display could show 1 billion colours at the same time.
    --- Post Merged, Jun 6, 2017 ---
    Could you elaborate? I thought the first 5K iMac already had a true 10-bit display.
     
  5. torquer macrumors regular

    Joined:
    Oct 16, 2014
    #5
    They very specifically stated it was 10-bit dithering. This is extremely common, but a little surprising given how much emphasis Apple puts on their displays. I'd bet the iMac Pro will have a true 10-bit display, but I could be wrong.

    Honestly, it doesn't make *that* much difference in the end for 90% of users, even content creators. But if you do anything that requires extremely accurate colors, you'll want to use the display output to drive an external monitor. I don't recall if the 5K UltraFine LG is 10-bit or not.
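    For readers wondering what "10-bit with dithering" means in practice, here is a simplified sketch of the general temporal-dithering (FRC-style) idea, not Apple's actual pipeline: a panel that can only show 8-bit steps approximates an in-between 10-bit level by alternating the two nearest 8-bit values across successive frames, so the time-averaged output lands close to the target.

```python
# Simplified temporal-dithering (FRC-style) sketch: approximate one 10-bit
# level on an 8-bit panel by alternating its two nearest 8-bit neighbours.
# Illustrative only; real panels dither spatially and temporally in hardware.

def dither_sequence(level_10bit: int, frames: int = 8) -> list[int]:
    target = level_10bit / 4.0       # ideal value on the 8-bit scale
    lo = int(target)                 # nearest 8-bit value below the target
    hi = min(lo + 1, 255)            # nearest 8-bit value above the target
    frac = target - lo               # fraction of frames that should use `hi`
    out, err = [], 0.0
    for _ in range(frames):
        err += frac
        if err >= 0.5:               # emit the higher value often enough
            out.append(hi)
            err -= 1.0
        else:
            out.append(lo)
    return out

seq = dither_sequence(514)           # a 10-bit level between 8-bit 128 and 129
print(seq, "average:", sum(seq) / len(seq), "target:", 514 / 4)
```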
     
  6. SoyCapitanSoyCapitan thread starter macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #6
    With native support for the P3 profile it's aimed at video. That gamut isn't ideal for print, but it's fine for nearly all digital purposes. The portion of people who really need 10-bit is minuscule. Macs didn't support such a bit depth until El Capitan, and yet for two decades digital print houses and photographers did well with just 8-bit.

    Then there are the cameras themselves. Nearly all professional photography has been, and still is, 8-bit. You can shoot 14-bit stills or 10-bit video, but it's only necessary in the rare circumstances where banding occurs in gradients.

    Marketing material isn't written by professionals. If it were, can you imagine the result? Honest marketing and realistic expectations? Fewer slogans? ;)
    --- Post Merged, Jun 6, 2017 ---
    YES, they have implied it in the image attached to the first post.
    --- Post Merged, Jun 6, 2017 ---
    And you also need 10-bit support in the graphics drivers if you need that bit depth at all. Only a few of the most expensive Macs have gained that support in the last two years. Before that, never.
     
  7. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #7
    Others probably referred to "16 million colours on your monitor" when said monitor had a max resolution of 640×480. People are not that stupid and know that 14M pixels won't show a billion colours. If you want to sue Apple for deceptive advertising, be my guest.
     
  8. amb* macrumors newbie

    amb*

    Joined:
    Sep 25, 2015
    Location:
    USA
    #8
    Professional photographers need more than eight bits constantly. That's one of the main reasons that editing is done on 10/12/14-bit RAW or TIFF files, not 8-bit JPEGs. 8 bits simply doesn't give enough exposure latitude to fix shadows and highlights. Even a basic, feature-restricted entry-level camera like the D3300 has 12-bit RAW because it's necessary.
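    A rough way to see why the extra bits buy latitude, under the usual simplifying assumption that RAW data is linearly encoded (JPEG is gamma-encoded, so this understates its shadow precision, but the scale of the gap is the point): each stop below clipping gets half the remaining code values, so deep shadows are where bits run out first.

```python
# How many code values land in each stop below clipping, assuming a
# linear encoding (a simplification: JPEGs are gamma-encoded, but this
# shows the scale of the difference that extra RAW bits make).

def levels_in_stop(bit_depth: int, stops_below_clipping: int) -> int:
    total = 2 ** bit_depth
    upper = total // (2 ** stops_below_clipping)        # top of this stop
    lower = total // (2 ** (stops_below_clipping + 1))  # bottom of this stop
    return upper - lower

for bits in (8, 12, 14):
    per_stop = [levels_in_stop(bits, s) for s in range(6)]
    print(f"{bits:2d}-bit linear, levels per stop (brightest first): {per_stop}")
```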
     
  9. SoyCapitanSoyCapitan thread starter macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #9
    Thanks for your opinion. I've worked in this field for 30+ years. I know we are anonymous internet people, but it helps not to underestimate who you are speaking with before you make such comments. No further debate. All the essentials have been outlined in the posts above in a very straightforward manner.
     
  10. jeanlain, Jun 6, 2017
    Last edited: Jun 6, 2017

    jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #10
    It'd be better if you refuted what amb said instead of resorting to an argument from authority, but apparently it's "No further debate".
    I take it that 10-12 bit photography does more than just eliminate banding, which is what you seem to imply.
     
  11. amb* macrumors newbie

    amb*

    Joined:
    Sep 25, 2015
    Location:
    USA
    #11
    Needing a monitor to display more than 8 bits is quite rare, but you specifically stated that shooting at 14 bits is "very rare". There's no defensible argument for that when every full-frame camera on the market, and essentially every interchangeable-lens camera of any sensor size, supports >10-bit RAW. Some professionals shoot JPEG sometimes, but you're essentially asserting that it's "very rare" for a professional photographer to shoot in RAW, and that's simply not true.
     
  12. SoyCapitanSoyCapitan thread starter macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #12
    14-bit (or as high as you can go) is essential for things like architectural photography, where linear light and shadows (real or created) need to avoid banding effects. Most photography in the world, though, is fashion, street, and journalism, where such issues don't often arise (and when they do, you can change mode as needed).

    What's upsetting is you just invented the following assertion out of thin air:

    "you're essentially asserting that it's "very rare" for a professional photographer to shoot in RAW"

    That's really either an insult or you just don't know that 8 bit RAW is the most common shooting format.

    I'm sure that in person, in a professional environment with your cred on the line, you wouldn't say such a thing. I'd ask you to apologise for making such a comment, just out of politeness and so that you learn to communicate properly with elders in the creative industry. That's why I really wanted to terminate our discussion in my last response. I always know which direction these "debates" take online, and it is really a waste of the precious life we have.
     
  13. jerwin macrumors 65816

    Joined:
    Jun 13, 2015
    #13
    My Nikon allows me to choose between 12 bit and 14 bit raw.
     
  14. giggles macrumors 6502

    Joined:
    Dec 15, 2012
    #14
    Layman question: does this "10-bit dithering" claim (which I guess is not "real" 10-bit? isn't dithering a software thing? the wording sounds confusing) help in any way with displaying genuinely 10-bit TV series and anime more faithfully?
     
  15. jeanlain macrumors 65816

    Joined:
    Mar 14, 2009
    #15
    Yeah, Nikon DSLRs have been shooting 12-bit raw for quite some time.
     
  16. amb* macrumors newbie

    amb*

    Joined:
    Sep 25, 2015
    Location:
    USA
    #16
    Nikon cameras only do 12- or 14-bit RAW, from the lowliest D3000-series all the way up to the D5. It's even listed explicitly on their website.

    D5: "NEF (RAW): 12 or 14 bit"
    D3300: "12-bit NEF (RAW)"

    8 bit RAW isn't even an option.

    The situation is the same for Canon.

    EOS 1D X Mark II: "Still Image: JPEG, RAW (14 bit Canon Original)"

    I'm not sure where you came up with the idea of 8-bit RAW; that's just not a thing. Even the 2 MP Nikon D1 from 1999 shot 12-bit RAW.
     
  17. Chancha macrumors 6502a

    Joined:
    Mar 19, 2014
    #17
    I started shooting 12-bit RAW over 10 years ago, and 14-bit for the last 5 years, since I got a Nikon D800 that could do it. Checking whether the file I'm working on in Photoshop is 16-bit per channel is practically the first thing I do every day. Banding issues can easily occur even with fairly mild color manipulation of a smooth gradient. It is of course unacceptably visible in print, but done badly it can easily be recognized even in an 8-bit digital destination such as a website, where most viewers are in sRGB.

    There is a misconception here that 8-bit is enough for professional work. I have some experience in a print house preparing pre-press material, and it seems novice Photoshop users are at fault here. You can shoot in 8-bit, or have a collage of 8-bit visual materials, but if your workflow in PS et al. is at 16-bit you can at least manipulate smooth colors over these sources. I would say that 90% of the time we got files from folks who simply do not know or care about the issue.

    I am glad that the issue is semi-gone now that even consumer devices can handle wide gamut and higher than 8bit color depth.
    --- Post Merged, Jun 6, 2017 ---
    I actually don't think sensors that could only handle 8-bit data ever existed. The 8-bit limit probably came from the camera DSP doing real-time JPEG encoding, made worse by telling your camera to use sRGB. This probably explains why we saw 8-bit as some "norm" in digital photography until recently.
    --- Post Merged, Jun 6, 2017 ---
    8-bit delivery from unaware "designers" is indeed common. 8-bit RAW, on the other hand, has been almost unheard of for at least a decade, if it ever existed.
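    A small sketch of the workflow point above, with a made-up gradient and tone curve purely for illustration: push a strong adjustment through data that was quantised to 8-bit first versus data kept at high precision until export, then count how many distinct 8-bit output levels survive. Fewer surviving levels means coarser steps, which is exactly the banding being described.

```python
# Count the distinct 8-bit levels that survive a strong tone adjustment
# when editing in an 8-bit working space vs. a high-precision one.
# The gradient and the curve are made up purely for illustration.

def curve(x: float) -> float:
    # aggressive contrast boost on a narrow shadow range
    return min(1.0, max(0.0, (x - 0.10) * 4.0))

gradient = [i / 4095 * 0.35 for i in range(4096)]  # smooth dark gradient

# 8-bit workflow: quantise to 8-bit *before* editing
edited_8bit = [curve(round(v * 255) / 255) for v in gradient]
# high-precision workflow: edit first, quantise only on final export
edited_high = [curve(v) for v in gradient]

levels_8bit = {round(v * 255) for v in edited_8bit}
levels_high = {round(v * 255) for v in edited_high}
print("distinct output levels, 8-bit workflow:          ", len(levels_8bit))
print("distinct output levels, high-precision workflow: ", len(levels_high))
```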
     
  18. Larry-K macrumors 68000

    Joined:
    Jun 28, 2011
    #18
    I don't know what professional photographers you've dealt with over the past 30 years, but if someone sends me 8 bit files these days, they're getting a phone call, and not a pleasant one.
     
  19. potatis macrumors 6502a

    potatis

    Joined:
    Dec 9, 2006
    #19
    I think the first 5K iMac didn't even have a false 10-bit display; it was 8-bit. The second one, launched at the same time as the 4K iMac, got a P3 display, which is the same as 10-bit(?)
     
  20. AdamSeen macrumors member

    Joined:
    Jun 5, 2013
    #20
    I'm finding it hard to understand the difference between the new iMac display and the 2015 one. I thought the 2015 one was already 10-bit? What's the difference in the number of colours the 2015 one can show vs the new one?
     
  21. pmau macrumors 65816

    Joined:
    Nov 9, 2010
    #21
    Didn't they talk about 10-bit dithered output?
    I seem to recall they mentioned dithered.
     
  22. Chancha macrumors 6502a

    Joined:
    Mar 19, 2014
    #22
    The 2017 one is still "dithered 10-bit", just like the previous model. Judging by the increase to 500 nits, though, I am guessing the LCD panel stays the same while the backlight is a newer component.
     
  23. zoomp macrumors member

    Joined:
    Aug 20, 2010
    #23
    Very few panels are real 10-bit; they usually offer even 12-bit processing (dithering), but in reality they will mostly display 8-bit. In my cinema world, 12-bit is a reality on DCI projectors: we usually shoot in 12-bit RAW and grade on a 10-bit panel or directly on a 12-bit projector.

    Take a peek at a popular video monitor brand and see that only a handful of their monitors are real 10-bit: http://www.flandersscientific.com
     
