10bit panels

Discussion in 'iMac' started by yellowscreen, Jun 10, 2017.

  1. yellowscreen macrumors regular

    yellowscreen

    Joined:
    Nov 11, 2015
    #1
    so it appears Apple isn't using real 10-bit panels in iMacs, just 8-bit + dithering. Am I getting this right? That would be a bummer.
     
  2. redheeler macrumors 603

    redheeler

    Joined:
    Oct 17, 2014
    #2
    All 5K Retina iMacs since the Late 2014 model have had 10-bit panels. Apple simply never chose to advertise it until now.

    From my Late 2015 5K iMac:
     
  3. yellowscreen thread starter macrumors regular

    yellowscreen

    Joined:
    Nov 11, 2015
    #3
    Why would they advertise dithering, then?
     
  4. torquer macrumors regular

    Joined:
    Oct 16, 2014
    #4
    Incorrect. The monitor can accept a 10-bit signal but uses FRC dithering on an 8-bit native panel. This has been covered in almost all of the reviews as well. It's still a great display, but it is not native 10-bit. That may be because no one makes a panel at that size and resolution in that native color depth, or due to cost, or who knows what, but it is definitely not native 10-bit.
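To illustrate what FRC (frame rate control) does in general terms, here is a rough, hypothetical sketch, not Apple's or any vendor's actual algorithm: an 8-bit panel approximates a 10-bit level by alternating between the two nearest 8-bit values across frames, so the time-average matches the requested level.

```python
# Hypothetical sketch of temporal dithering (FRC), for illustration
# only: an 8-bit panel approximates a 10-bit level by alternating
# between adjacent 8-bit values so the average over a few frames
# matches the requested level.

def frc_frames(level_10bit, num_frames=4):
    """Map a 10-bit level (0-1023) onto a sequence of 8-bit frame values."""
    base = level_10bit // 4        # nearest lower 8-bit level
    remainder = level_10bit % 4    # leftover quarters (0-3)
    # Show the next-higher level on `remainder` out of every 4 frames.
    return [min(base + 1, 255) if f < remainder else base
            for f in range(num_frames)]

frames = frc_frames(513)
print(frames)                      # [129, 128, 128, 128]
print(sum(frames) / len(frames))   # 128.25, i.e. 513 / 4
```

The eye averages the alternation over time, which is why an 8-bit + FRC panel can pass for 10-bit in casual viewing.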
     
  5. redheeler macrumors 603

    redheeler

    Joined:
    Oct 17, 2014
    #5
    Can you link to one of those reviews here? This is the article I was thinking of, having seen it back in 2015, but it doesn't go into detail on whether the LCD panel itself is true 10-bit or not.

    Interestingly enough, Preview in macOS Sierra appears to be software-dithering 10-bit images even on my 2012 Retina MacBook Pro, which I don't recall earlier versions doing.
     
  6. yellowscreen thread starter macrumors regular

    yellowscreen

    Joined:
    Nov 11, 2015
    #6
    now I'm wondering what the difference is between the 2015 5K and the 2017 5K, considering they are only now mentioning 10-bit dithering.

    and I really thought those were real 10-bit displays. bummer

    I'm still waiting for the AnandTech review
     
  7. dengkai macrumors newbie

    dengkai

    Joined:
    Jun 5, 2017
    #7
  8. yellowscreen thread starter macrumors regular

    yellowscreen

    Joined:
    Nov 11, 2015
    #8
    yeah, it's 100% 10-bit, because they just advertised 10-bit dithering.

    oh, and I'm also definitely waiting, at least for HDR, like the new iPad Pro
     
  9. SoyCapitanSoyCapitan macrumors 68040

    SoyCapitanSoyCapitan

    Joined:
    Jul 4, 2015
    #9
    Won't matter to 99.9% of users. Research what 10-bit is and how very few people really benefit from it. Those who do benefit would be crazy to use P3 profiles and high-contrast glass screens. Eizos should be used when working with that kind of depth and range.
     
  10. torquer macrumors regular

    Joined:
    Oct 16, 2014
    #10
    CNET, Ars Technica, Digital Trends, AppleInsider... Since there are no non-specialty sources above 10-bit, there's really no such thing as a native 10-bit panel doing dithering. So, anytime you see "10-bit dithering," it by default refers to dithering a 10-bit source to an 8-bit panel. This kind of thing is extremely common, as there really aren't *that* many native 10-bit panels out there, and certainly even fewer 5K 10-bit panels.

    That said, there's really nothing wrong with it. The people who need a 10-bit panel likely already have one, and the rest of us won't notice any difference.
     
  11. jukkhop macrumors newbie

    Joined:
    Dec 27, 2016
    #11
    Could you elaborate on this? Why would it be crazy to use the iMac display? Not trying to be cheeky, just trying to understand.
     
  12. Cosmosent macrumors regular

    Cosmosent

    Joined:
    Apr 20, 2016
    Location:
    La Jolla, CA
    #12
    IMO, that's simply Apple's way of tipping their hand that 10-bit Display P3 capture will indeed be included with the iPhone 8 ... nobody picked up on it at WWDC 2017 ... and to this day, very few have.

    The dithering is needed for those 10-bit captures (i.e., photos) when rendered to 8-bit RGB displays, that's all.

    It's one of those tech topics that's off the radar of most, but should not be.

    Not a single professional analyst who covers Apple has figured it out.
     
  13. Bghead8che macrumors member

    Joined:
    Jan 15, 2015
    #13
    There are several reasons the iMac monitor is not suitable for many "professionals" and "professional" work.

    For one, the panels have poor uniformity in terms of backlight bleed, color consistency from edge to edge, and brightness variation across the screen.

    Also, the P3 color space can be problematic. Most of the world works in the sRGB color space. There is no sRGB setting on the iMac, so when you export or save an image and display it on a monitor not using the P3 color space, the colors will most likely look different (likely oversaturated).

    I openly wonder how professional photographers, and in particular web designers, are using iMac P3 displays. If there is no sRGB mode or emulation, how do you know what the images will look like on someone else's monitor (most likely viewing in sRGB)? And, no, choosing the sRGB color space on the iMac does not help, as the native monitor is still P3.

    Going with an Eizo or NEC will give you much more accurate colors and panel quality, not to mention a color space preferred by most photographers and designers.
     
  14. guibo macrumors member

    Joined:
    Jun 8, 2017
    #14
    I don't think you understand colour management at all.

    Why would an iMac with a P3 colour gamut, which is larger than and completely contains sRGB, not be good for professionals, but an Eizo or NEC, which are 10-bit Adobe RGB, be good? The Eizo and NEC are not native 8-bit sRGB either.

    Any professional application supports colour management, i.e., you can work in the limited sRGB gamut while outputting visually to a wide-gamut display. You will see the same colours as on an sRGB display, save for some slight transformation rounding errors. Errors which are surely much less than the calibration variation expected across end-user displays.

    All a wide-gamut display like the iMac's does is allow you to see more saturated colours beyond sRGB. You don't have to use this ability if you want to restrict yourself to sRGB. With a proper colour-managed workflow, when you save your sRGB-limited image on an iMac and display it on an sRGB monitor, it will look essentially the same.

    I think what you are thinking of is someone working in an extended gamut and then saving the image as-is with no colour management. This image displayed on an sRGB monitor will look completely off. However, this scenario is a user mistake stemming from poor understanding of colour management.

    Or maybe you are talking about someone finalizing an image which uses the extended gamut's saturated colours and then converting it to sRGB at export without proofing. Even with a proper colour-managed workflow, the result on an sRGB monitor will show clipping at the most saturated colours. This is just the result of a lack of proofing for the target medium. The same thing happens when printing from a true sRGB computer system to photo paper, which covers a limited subset of sRGB (since ink and paper have obvious limitations). What you are suggesting in this case is that photographers editing for print should work with monitors that are native to the ink-and-paper colour space? (Hint: those monitors don't exist.)
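The colour-managed path described above can be sketched numerically. This is a rough illustration using approximate, commonly published D65 matrices (assumed values for illustration, not exact ICC profile data): a colour-managed pipeline re-expresses an sRGB colour in the display's native P3 coordinates via XYZ, so it renders as the same colour rather than being stretched to the wider gamut.

```python
# Rough sketch of colour-managed display on a wide-gamut panel.
# Matrix values are approximate textbook D65 primaries
# (assumptions for illustration, not exact ICC profile data).
import numpy as np

# Linear-light RGB -> CIE XYZ matrices (D65 white point)
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                      [0.2290, 0.6917, 0.0793],
                      [0.0000, 0.0451, 1.0439]])

def srgb_to_p3_linear(rgb):
    """Re-express a linear sRGB colour in linear Display-P3 coordinates."""
    xyz = SRGB_TO_XYZ @ np.asarray(rgb, dtype=float)
    return np.linalg.inv(P3_TO_XYZ) @ xyz

# Pure sRGB red lies inside the P3 gamut, so its P3 coordinates come
# out in range and less than fully saturated: the display reproduces
# the same colour instead of pushing it out to the P3 red primary.
print(srgb_to_p3_linear([1.0, 0.0, 0.0]))
```

An unmanaged path would instead send (1, 0, 0) straight to the panel, lighting the more saturated P3 red primary, which is the oversaturation mistake described in the post above.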
     
  15. dlewis23 macrumors 6502a

    Joined:
    Oct 23, 2007
    #15
    Thank you...
     
