Edit: You know, I haven't compared iPad models. I don't own one, as I wouldn't use it. I noted that I didn't respond on that point, but I'm not sure what is causing the eye strain on the new one. My issue is with the idea of treating contrast ratio as a linear thing where more is always better, regardless of other compromises in the optimization of the device.
Edit 2: I can also kind of understand the desire for a somewhat high contrast ratio if you are viewing at full brightness, so that the blacks don't drift to a point that feels more like dark grey when combined with ambient reflections. But even then, a contrast ratio written on a piece of paper won't tell you what to expect in terms of viewing experience. If they accomplished this via aggressive sharpening (yes, LCDs use something of a sharpening algorithm; I think it's due to the issue of pixel pitch), that can have many adverse effects.
Hi, Kev, thanks for the very technical reply. I was indeed enlightened to read your comments.
What's a colorimeter device? I suppose it's what one uses to measure the contrast of a screen, what AnandTech used, for example.
Yeah, it's a kind of measurement device. A spectrophotometer would be another type. They're not perfect. Typically you're talking about $100-300 devices, and they can drift over time or vary moderately due to things like humidity, ambient temperature, etc. The documentation will usually give a suggested operating range for these things. With most displays you can measure the display to rebuild a more accurate profile describing the hardware response. These profiles describe the estimated behavior of the display within a device-independent reference space (such as LAB, even though it has some issues), and the OS and graphics card use this information to help refine the instructions that they feed to the display. Some very expensive displays have something more like a LUT (lookup table) system, where the device tries to compensate internally. This is somewhat better when it comes to stabilizing behavior. They still drift, so the usable gamut can seem to shrink over time, but the goal is to maintain a neutral greyscale and an accurate response per channel.
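If it helps to see it in concrete terms, here's a rough Python sketch of what a per-channel 1D LUT correction boils down to. This isn't any vendor's actual code; the measured values and the build_lut helper are made up purely for illustration.

import numpy as np

# Hypothetical measured luminance (normalized 0..1) for a few input
# levels on one channel -- a real profile uses many more patches.
inputs   = np.array([0.00, 0.25, 0.50, 0.75, 1.00])
measured = np.array([0.00, 0.03, 0.18, 0.52, 1.00])  # actual response

def build_lut(inputs, measured, target_gamma=2.2, size=256):
    # For each code value, find the input level whose measured
    # luminance matches what the target gamma curve asks for.
    levels = np.linspace(0.0, 1.0, size)
    desired = levels ** target_gamma          # luminance we want
    return np.interp(desired, measured, inputs)

lut = build_lut(inputs, measured)
print(lut[128])  # corrected value to send to the panel for mid-grey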
Note wiki here.
http://en.wikipedia.org/wiki/Gamma_correction
The response isn't linear. Gamma encoding in itself isn't really evil. It's used as a method of displaying imagery in a pleasing manner over a limited dynamic range or contrast ratio. Gamma 2.2 in this regard is basically the norm in "almost" every usage case.
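To make the gamma point concrete, here's a tiny Python sketch of plain gamma 2.2 encoding and decoding (it ignores the sRGB piecewise curve for simplicity, and the numbers are only illustrative).

def encode(linear, gamma=2.2):
    return linear ** (1.0 / gamma)   # linear light -> stored code value

def decode(encoded, gamma=2.2):
    return encoded ** gamma          # stored code value -> linear light

# A mid-grey of ~0.218 in linear light stores as roughly 0.5, which is
# why gamma encoding spends more of the limited code values on the
# darker tones our eyes are more sensitive to.
print(encode(0.218), decode(0.5))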
It has been reported that the rMBP has about 20% less brightness than last year's pro, which would make sense, since the iPad with Retina also has lower brightness, since the Retina panels are hard to illuminate despite the added LEDs.
I am not sure I understand how contrast ratio will change with brightness; it's a ratio, how could it change? What I know is that if, say, a monitor has 200 cd/m2 brightness, with 0.5 cd/m2 for 0 black and 200 cd/m2 for 255 white, that means it has a 400:1 contrast ratio. I am told that with IPS panels, which have worse contrast ratios than TN or VA (MPVA, PVA, S-PVA, C-PVA, etc.) panels, they raise the brightness to report higher contrasts.
You're not raising the brightness level of the LCD itself. You are changing the intensity of something that is shone through it. If you look at a stained glass window, it's a similar concept. 0.5 is definitely a bit high, but I find 200 cd/m2 gives me a headache. I calibrate around 90 on my Eizo (I tweaked it slightly to get a good match to a wide-format Epson under the RIP I was using at the time, even though my printer broke down recently). If I set the calibration for best greyscale tracking, it came out to something like 450:1. This is on a modern display. The tech specs say 800:1 or so. I don't personally care. It looks great. Usually when things look like crap, it's because the brightness gains are inconsistent across the range in one or more color channels (note the gamma wiki). The values are spaced well, although the shadow details are slightly inferior to the older models. These were made for 10-bit DisplayPort connections, and Apple doesn't support 10-bit framebuffers on any of their cards. Regarding greyscale tracking, you've probably seen the numbers D65 and 6500K. 6500K is a black body temperature, and in the context of display profiling, it's just a typical standard, as sRGB is based upon it. Most of the time you won't get it perfect. Apple has used white points as high as 7000K because they were closer to the native white point of the hardware. It doesn't present any real problems.
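For what it's worth, the contrast ratio figures being thrown around are just white luminance divided by black luminance, as in this quick Python sketch (the black levels here are assumed for illustration, not measured).

def contrast_ratio(white_cd_m2, black_cd_m2):
    return white_cd_m2 / black_cd_m2

print(contrast_ratio(200.0, 0.5))  # 400:1, the example from your question
print(contrast_ratio(90.0, 0.2))   # 450:1, roughly what my calibration gave
                                   # (0.2 cd/m2 black level is an assumption)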
I am not sure I understand what you mean by values tracking well. Surely a contrast ratio of 350:1 is very low indeed. Wouldn't that induce more fatigue, since you'd have to keep it at a higher brightness level?
Why do you think the brightness of the old Cinema Displays gave you fatigue, then? Low contrast was the culprit, I suppose, in that you couldn't get the screen low enough in brightness to avoid it? I am getting eye fatigue from the new iPad and I can't pinpoint the problem; it seems everything is more washed out. I tried colour profiles via Cydia, which didn't really work, and I read somewhere there's a problem with the display grade.
I can't stand really bright lights. They give me a headache, and those displays were extremely bright. If I turned them down, I was fine as it was just a brightness thing. The 30" display maintained a more visually pleasing image when turned down in brightness relative to the 23" or the other (20 or 21" or something like that). Overall they weren't really optimized to be used at those brightness levels. Note the stained glass window reference from before. It's also important to note that the backlights have a specific color/temperature of their own which can change based on brightness and physical temperature of the backlight (actually this was an earlier problem with LEDs). The way these things work only looks like simple math.
You should note that no display is 100% consistent across the board, which is why the Cydia profiles may not have worked so well. Realistic manufacturing tolerances are well within what the human eye can perceive, and colors can change during warm-up and drift over time. It's just that with a well-manufactured display, you shouldn't notice a significant difference unless two are viewed side by side, as your mind compensates for minor differences. Some of the high-end software products have correlation features so that you can attempt to match displays rather than optimize for the best possible use of their gamut. This can be helpful in environments that use many of the same display. Even Eizo only tests to within 3 Delta E, which is a great enough difference to be noticeable side by side.
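In case the Delta E reference isn't clear, here's a quick Python sketch of the simplest version of it (CIE76), which is just the Euclidean distance between two colors in Lab; the sample Lab values below are made up.

import math

def delta_e_76(lab1, lab2):
    # Euclidean distance between two (L, a, b) triples.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

patch_on_display_a = (52.0, 10.0, -4.0)  # hypothetical measurement
patch_on_display_b = (50.5, 12.0, -3.0)  # same patch on a second unit
print(delta_e_76(patch_on_display_a, patch_on_display_b))  # ~2.7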
At the store I did try out the rMBP, but it's sometimes hard to tell at stores with all that strange lighting around. What I did notice, and was disappointed by, was that I wasn't satisfied with max brightness.
I am really having a hard time deciding on a new Mac/monitor setup. Retina displays are not here yet on the desktops, so it's hard to pull the trigger, since HiDPI mode can't be enabled otherwise (and thus UI element scaling settings). And I am really confused as to what causes my eye strain; I know too much brightness is implicated somehow, and I am wondering if the extra LEDs used by the Retinas will make the problem worse for me, not better. So I've ended up looking at VA panels (Eizo, NEC) with good contrasts for a solution there... Then there's the consideration of a light AG coating, good dot pitch, etc.
To tell you the truth, I am more confused than ever. I have the new iPad, and despite the Retina display it's giving me more eye strain than my old iPad...
I'd probably end up trying a lower brightness setting on the iPad. Sometimes it feels weird or flat at first if you turn down the brightness on a device. Give it some time and see how it feels the next time you pick it up. The biggest problem is typically when turning something down in brightness crushes the shadows, making the dark values indiscernible, or hoses the contrast entirely. If you look at any printed media, including magazines, paintings, printed photographs, etc. (yes, I know they've become less common), those typically don't exceed 300:1 contrast ratios. I understand the difference between reflective and backlit/translucent media, yet more contrast for contrast's sake doesn't necessarily improve the device if it just takes the same number of addressable values and moves them further apart. If you look at a 3D gamut plot, each color combination that can be displayed can be plotted into it from a finite pool. Even if they're advertising something like "16 bit internal processing," it's more a way of ensuring against rounding errors in the internal math. The display still has a certain number of addressable hardware values. The ability to display greater saturation or higher contrast ratios pushes these apart, hence the use of dithering.
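Here's a back-of-the-envelope Python sketch of that "same number of addressable values, just pushed further apart" point; the luminance numbers are made up and it ignores gamma, so it's only illustrative.

def average_step(white_cd_m2, black_cd_m2, levels=256):
    # Average luminance jump per 8-bit code value (ignoring gamma,
    # which actually spaces the steps non-uniformly).
    return (white_cd_m2 - black_cd_m2) / (levels - 1)

print(average_step(200.0, 0.5))   # modest contrast: smaller average steps
print(average_step(400.0, 0.08))  # higher contrast: bigger average steps
# Dithering blends neighboring code values spatially or temporally to
# approximate the in-between tones those larger steps skip over.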
Bleh I'm going on here. I just wanted to make the point that manufacturer spec data does not always hold up in actual use, and the contrast ratio thing is probably more important at very low contrast ratios.
Here is my last point about manufacturers and their marketing kool-aid.
OMG 5,000,000:1!!!!
I couldn't resist that. I'd post a manufacturer spec link rather than Newegg if I could find one, but listing "dynamic contrast ratio", as in comparing the blackest black it can produce at minimum settings to the brightest white at full brightness, is just so silly.