I thought I read somewhere you cannot calibrate this display and it comes "pre-calibrated" from Apple.
Hello,
About color, the iMac 5K seems to cover nearly 100% of sRGB, while the NEC should cover 100% of Adobe RGB, which is a big difference. But of course, as you said, it only matters to those who care about color space... and some people will prefer sRGB over Adobe RGB. That's fine.
My question is about resolution.
When you say it will not match resolution, is it because the iMac is 5K and the NEC is 4K, or is it more about the Retina technology?
I would like to compare the iMac Retina 5K to another "classic" 4K screen, but obviously my local dealer will not let me do whatever I want with $3k products.
As I don't really understand the advantage of Retina technology, I'm interested in this point. If you could explain it a little, I could imagine the difference.
Thanks.
Colin
Adobe RGB (1998) can help in some cases; it's often easier to spot when something is grossly over-saturated. But don't make the common mistake of buying into the idea that a bigger gamut means better accuracy. First, be aware that these measurements don't exactly plot spectral alignment. They rely on a quasi-perceptual model, which assumes a narrow field of view and a low level of ambient lighting, since that can influence cone response. From there it theoretically depicts the deviation between the perceived color on the display and the desired perception. That is not the same as requiring the same composition of light scaled to the appropriate brightness level.
I wanted to include that to assist with this point: the error on screen relates to the actual color displayed, which may be larger or smaller on the wider-gamut display.
Just to include this: the calibration software has no real direct path to the underlying hardware itself. It first measures the display's maximum primaries and greyscale, and remember we're limited by the tolerance of the measurement device. It calls for (0xFF,0,0), (0,0xFF,0), and (0,0,0xFF) and measures the chromaticity of each, then measures neutral points (a,b,c) such that a=b=c. Based on those measurements it builds an ICC profile that describes the basic response of the hardware. The patches that are then measured and reported as Delta E values refer approximately to the ****coordinate distance from the perceived display color to the desired color within the reference color model, plus or minus the tolerance of the measurement device, which can vary between units.
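As a rough sketch of that last step: in its simplest CIE76 form, Delta E is just the straight-line distance between the measured color and the target color in Lab coordinates. (Calibration packages often use newer variants like CIEDE2000, and the specific Lab values below are made up for illustration.)

```python
import math

def delta_e_cie76(lab_measured, lab_target):
    """CIE76 Delta E: Euclidean distance between two CIELAB colors."""
    return math.sqrt(sum((m - t) ** 2 for m, t in zip(lab_measured, lab_target)))

# Hypothetical patch: target is a mid red, and the measured value drifted slightly.
target = (53.2, 80.1, 67.2)    # requested L*, a*, b*
measured = (52.8, 78.9, 68.0)  # L*, a*, b* read by the colorimeter

print(round(delta_e_cie76(measured, target), 2))  # → 1.5
```

A Delta E around 1 is roughly the threshold of a just-noticeable difference for most observers, which is why calibration reports usually aim for average values below that.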
Now you may wonder how displays that claim hardware calibration differ. Regardless of factory testing, all displays drift over time due to shifts in both the backlight and the LCD response. Maintaining a specific target over time is achieved by a linear mapping between the hardware gamut and a subset of that gamut that falls within the desired target range; there is an ISO specification for how that should be applied, which ICC's documentation also references. You can only map to values within range, so you are mapping fewer total values this way. Hardware calibration does something similar post-framebuffer, using the display's internal processing. It may have a greater amount of low-level access, but it likewise attempts to meet the desired target and provide the most desirable hardware response.
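To make the "linear mapping into a subset of the gamut" idea concrete, here is a minimal sketch: a 3x3 matrix converts requested target-space RGB into the panel's native drive values. The matrix below is entirely hypothetical (a real one is derived from the measured primaries), but it shows why you end up addressing fewer total values: anything that lands out of range must be clipped.

```python
def apply_calibration(rgb, matrix):
    """Map a target-space RGB triple (0..1) through a 3x3 matrix, clipping to [0, 1]."""
    out = []
    for row in matrix:
        v = sum(c * m for c, m in zip(rgb, row))
        out.append(min(1.0, max(0.0, v)))  # clip: only in-range values can be mapped
    return tuple(out)

# Hypothetical matrix for a panel whose native gamut is wider than the target:
# full target red drives the panel's red primary to only ~92%.
M = [
    [0.92, 0.06, 0.02],
    [0.03, 0.94, 0.03],
    [0.01, 0.05, 0.90],
]

print(apply_calibration((1.0, 0.0, 0.0), M))  # → (0.92, 0.03, 0.01)
```

Software calibration applies this kind of correction before the framebuffer (often via the video card's lookup tables), while hardware calibration applies it inside the display, usually at higher internal bit depth, which is why it tends to lose less precision.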
In my own experience they tend to sacrifice shadow detail before other things. NEC, Eizo, and some of the others also use a form of local dimming via panel blocking to improve the brightness uniformity of the displays. NEC allows that feature to be toggled on or off, and it does reduce overall brightness somewhat. I think it's fine because you shouldn't run them at max brightness anyway.
TLDR it's complicated, and this was a very shallow explanation that probably contained some errors. It definitely contains some intentional omissions.
**** Correction: I didn't mean coordinate distance. Delta E refers to the magnitude of the vector depicting the coordinate difference between the two points, i.e. the Euclidean distance between them.