Discussion in 'MacBook Pro' started by blow45, Jul 14, 2012.
Might have missed them, but I've been looking for them to no avail...
As far as I know nothing official from Apple, but there is this analysis from Anandtech:
"29 percent higher contrast ratio than a standard MacBook Pro display"
But honestly, does this matter?
Of course it does; contrast is a very important parameter for a display. If the contrast is high enough, you can reduce the screen brightness further and fatigue your eyes much less (and, in the long term, preserve your eyesight). All the more so since this model uses double the LEDs to illuminate the display (similar to the iPad), and since LED backlighting has been implicated time and again in eye-fatigue symptoms, the lower the brightness you can work at, the better.
I don't understand why Apple doesn't provide contrast figures for their (LG's) screen. Surely, if the selling point of this computer is the display, it has to be judged on its other merits and not solely on its pixel count. I'm beginning to suspect they don't report the contrast because they don't have enough QC to standardize it above a certain level.
Apple only focuses on certain tech specs and ignores the others. For example, they always talk about the number of megapixels and the number of cores in iOS devices, but not the amount of RAM or the CPU frequency.
Now the focus is on Retina displays and their resolution/PPI. They will also mention "IPS", but other common measurable specs like brightness, contrast, response time and color gamut are never mentioned.
I guess Apple doesn't want to confuse people with lots of technical terms, but they could at least make the full specs accessible somewhere less prominent, even if it's not in their marketing and keynotes.
I think it's unacceptable that such basic, industry-wide monitor specifications as brightness, contrast, response time and color gamut are not mentioned in any user manual or specification sheet. This is supposed to be a pro machine, and pros like video editors, creatives, photographers etc. are supposed to know all of the above.
And I don't think it's because they don't want to confuse the user. Nothing confuses the user if it appears somewhere in a user manual. I think they don't want their screen to be compared on any aspect other than the number of pixels it has.
Which is very dishonest, and it's also an unacceptable business practice. All of these factors are very important in making an informed purchasing choice, and they are basic computer display specifications. All the more so since the selling point here is the display.
Edit: As long as we're sticking to gamma 2.2 device responses and 8-bit panels (or "10-bit" panels that behave virtually the same and still use dithering), we've most likely reached the limit of higher contrast for contrast's sake. Past a certain point it just crushes your shadow detail.
Contrast ratio does change with brightness. It doesn't sound like it should, but if you profile a display at different luminance levels, it's readily measurable. The absolute contrast ratio really doesn't tell you as much as you'd think. You can get an excellent viewing experience at 350:1 if your values track really well, yet it sounds low, so consumers would shun it. Some of the older NEC and Eizo displays were around that post-calibration, and they were excellent even by current standards. The eye fatigue precedes the use of LED backlights; the brightness levels of the old aluminum Cinema Displays always gave me a headache. If you have an Apple Store nearby, I would suggest checking one out to see if you can find a comfortable setting, but never believe published contrast-ratio data. It's always complete marketing drivel, and it never matches the measured specs even allowing for the error margin of a colorimeter device.
You don't know how wrong you are here. Gamut percentages are silly: adding a wider range of colors doesn't improve detail, and it doesn't necessarily hit the color combinations that benefit real work. It's something you have to evaluate on a model-by-model basis, as the engineering isn't as one-dimensional on any given spec as these marketing sheets want to make it. Response time can be a more significant issue, but contrast ratio is just silly when they measure at maximum brightness: it doesn't tell you how the display will look at your viewing brightness, or whether it will even perform at the desired brightness level. Beyond that, it's not definitively a pro machine. Apple makes mass-market devices; they are trying to appeal to anyone who can afford them. By the way, take a look at Eizo or NEC. Their displays often have lower contrast ratios than many of these laptop displays, yet they are easier to calibrate for predictable results. If you're buying based on specs alone (whatever the manufacturer), you're doing yourself a disservice.
It's not dishonest, because the numbers don't tell you whether it's a quality display. These things never match up in actual use, and you need to get over it. You should absolutely never purchase a display based on specs like maximum brightness and contrast ratio. If you have special requirements, these figures won't tell you anything. Do solid colors swim? Is there noticeable dithering? How well does it reproduce non-primary colors? How is the shadow detail? These things actually matter. 500:1 or 1000:1 doesn't give you any information capable of aiding an informed choice. They publish this stuff for marketing purposes only.
Hi, Kev, thanks for the very technical reply. I was indeed enlightened to read your comments.
What's a colorimeter device? I suppose it's what one uses to measure the contrast of a screen, what AnandTech used, for example.
It has been reported that the rMBP has about 20% lower brightness than last year's Pro, which would make sense, since the iPad with Retina also has lower brightness; the Retina panels are hard to illuminate despite the added LEDs.
I am not sure I understand how contrast ratio can change with brightness; it's a ratio, how could it change? What I know is that if a monitor puts out 0.5 cd/m2 for 0 black and 200 cd/m2 for 255 white, that means it has a 400:1 contrast ratio. I am told that with IPS panels, which have worse contrast ratios than TN or VA (P-MVA, PVA, S-PVA, C-PVA etc.) panels, they raise the brightness to report higher contrasts.
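The arithmetic behind that 400:1 figure is simple enough to sketch (a toy calculation of static contrast ratio, not a measurement procedure):

```python
def contrast_ratio(white_cd_m2: float, black_cd_m2: float) -> float:
    """Static contrast ratio: white luminance divided by black luminance."""
    if black_cd_m2 <= 0:
        raise ValueError("black luminance must be positive")
    return white_cd_m2 / black_cd_m2

# The example from the post: 200 cd/m2 white, 0.5 cd/m2 black
print(contrast_ratio(200.0, 0.5))  # 400.0, i.e. 400:1
```

Note that both numbers change when you adjust the backlight, and they don't change in perfect proportion on real hardware, which is why the ratio itself varies with brightness.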
I am not sure I understand what you mean by values tracking well. Surely a contrast ratio of 350:1 is very low indeed. Wouldn't that induce more fatigue, since you'd have to keep it at a higher brightness level?
Why do you think the brightness of the old Cinema Displays gave you fatigue, then? Low contrast was the culprit, I suppose, in that you couldn't get the screen low enough in brightness to avoid it? I am getting eye fatigue from the new iPad and I can't pinpoint the problem; everything seems more washed out. I tried colour profiles via Cydia, which didn't really work, and I read somewhere there's a problem with the display grade.
At the store I did try out the rMBP, but it's sometimes hard to tell at stores with all that strange lighting around. What I did notice, and was disappointed by, was that I wasn't satisfied with the maximum brightness.
I am really having a hard time deciding on a new Mac/monitor setup. Retina displays aren't here yet on the desktop, so it's hard to pull the trigger, since HiDPI mode can't be enabled otherwise (and thus the UI element scaling settings). And I am really confused as to what causes my eye strain; I know too much brightness is implicated somehow, and I am wondering whether the extra LEDs used by the Retinas will make the problem worse for me, not better. So I've ended up looking at VA panels (Eizo, NEC) with good contrast for a solution there... Then there's the consideration of a light AG coating, good dot pitch, etc.
To tell you the truth I am more confused than ever; I have the new iPad, and despite the Retina it's giving me more eye strain than my old iPad...
Edit: You know, I haven't compared iPad models. I don't own one, as I wouldn't use it. I noted that I didn't respond on that point, but I'm not sure what is causing the eye strain on the new one. My issue is with the idea of treating contrast ratio as a linear thing where more is always better, regardless of other compromises in the optimization of the device.
Edit: I can also somewhat understand the desire for a relatively high contrast ratio if you are viewing at full brightness, so that the blacks don't drift to a point that feels more like dark grey when combined with ambient reflections. But even then, writing a contrast ratio on a piece of paper won't tell you what to expect in terms of viewing experience. If they accomplished this via aggressive sharpening (yes, LCDs use something of a sharpening algorithm; I think it's due to the issue of pixel pitch), that can have many adverse effects.
Yeah, it's a kind of measurement device. A spectrophotometer would be another type. They're not perfect: typically you're talking about $100-300 devices, and they can drift over time or vary moderately due to things like humidity, ambient temperature, etc. The documentation will usually give a suggested operating range for these things. With most displays you can measure the screen to build a more accurate profile describing the hardware response. These profiles describe the estimated behavior of the display within a device-independent reference space (such as LAB, even though it has some issues), and the OS and graphics card use this information to help refine the instructions they feed to the display. Some very expensive displays have something more like an LUT (lookup table) system, where the device tries to compensate internally. This is somewhat better when it comes to stabilizing behavior. They still drift, so the usable gamut can seem to shrink over time, but the goal is to maintain a neutral greyscale and an accurate response per channel.
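The per-channel correction idea can be sketched in miniature. This is a hypothetical example, not how any particular profiling package works: a colorimeter measures a handful of patches, and the correction is interpolated between them.

```python
import bisect

def apply_1d_lut(value: float, lut: list[tuple[float, float]]) -> float:
    """Linearly interpolate a (target, measured) lookup table.

    `lut` is a list of (target, measured) pairs sorted by target;
    a profiling tool would build a table like this per color channel
    from colorimeter readings. Values outside the table are clamped.
    """
    xs = [p[0] for p in lut]
    ys = [p[1] for p in lut]
    if value <= xs[0]:
        return ys[0]
    if value >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, value)
    x0, x1 = xs[i - 1], xs[i]
    y0, y1 = ys[i - 1], ys[i]
    t = (value - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

# Hypothetical measured curve for one channel of one display
lut = [(0.0, 0.0), (0.25, 0.22), (0.5, 0.47), (0.75, 0.78), (1.0, 1.0)]
print(apply_1d_lut(0.5, lut))  # 0.47
```

Displays with internal LUTs do essentially this inside the monitor's own electronics, which is why they stay more stable than corrections applied by the graphics card.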
Note wiki here.
The response isn't linear. Gamma encoding in itself isn't really evil; it's a method of displaying imagery in a pleasing manner over a limited dynamic range or contrast ratio. Gamma 2.2 in this regard is basically the norm in "almost" every use case.
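The non-linearity is easy to see with a simple power-law model (real sRGB uses a piecewise curve that is close to, but not exactly, gamma 2.2):

```python
def gamma_encode(linear: float, gamma: float = 2.2) -> float:
    """Map linear light intensity in [0, 1] to a gamma-encoded signal."""
    return linear ** (1.0 / gamma)

def gamma_decode(encoded: float, gamma: float = 2.2) -> float:
    """Invert the encoding back to linear light."""
    return encoded ** gamma

# A mid-grey signal of 0.5 corresponds to only ~22% of peak luminance,
# which is why the response "isn't linear":
print(round(gamma_decode(0.5), 3))  # ~0.218
```

The practical upshot is that the encoded values are spaced to match perception, spending more of the 8-bit code range on the shadows where the eye is most sensitive.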
You're not raising the brightness level of the LCD itself; you are changing the intensity of something shown through it. If you look at a stained-glass window, it's a similar concept. 0.5 is definitely a bit high, but I find 200 cd/m2 gives me a headache. I calibrate around 90 on my Eizo (I tweaked it slightly to get a good match to a wide-format Epson under the RIP I was using at the time, even though my printer broke down recently). If I set the calibration for best greyscale tracking, it came out to something like 450:1. This is on a modern display; the tech specs say 800:1 or so. I don't personally care, it looks great. Usually things look like crap when the brightness gains are inconsistent across the range in one or more color channels (note the gamma wiki). The values are spaced well, although the shadow detail is slightly inferior to the older models. These were made for 10-bit DisplayPort connections, and Apple doesn't support 10-bit framebuffers on any of their cards. Regarding greyscale tracking, you've probably seen the numbers D65 and 6500K. 6500K is a black-body temperature, and in the context of display profiling it's just a typical standard, as sRGB is based upon it. Most of the time you won't get it perfect. Apple has used white points as high as 7000K because they were closer to the native white point of the hardware. It doesn't present any real problems.
I can't stand really bright lights. They give me a headache, and those displays were extremely bright. If I turned them down, I was fine, so it was just a brightness thing. The 30" display maintained a more visually pleasing image when turned down in brightness relative to the 23" or the other one (20" or 21", something like that). Overall they weren't really optimized for those brightness levels; note the stained-glass-window reference from before. It's also important to note that backlights have a specific color temperature of their own, which can change with brightness and the physical temperature of the backlight (this was actually an early problem with LEDs). The way these things work only looks like simple math.
You should note that no display is 100% consistent across the board, which is why the Cydia profiles may not have worked so well. Realistic manufacturing tolerances are well within what the human eye can perceive, and colors can change during warm-up and drift over time. It's just that with a well-manufactured display you shouldn't notice a significant difference unless two are viewed side by side, as your mind compensates for minor differences. Some of the high-end software products have correlation features so that you can attempt to match displays rather than optimize for the best possible use of their gamut. This can be helpful in environments that use many of the same display. Even Eizo only tests to within 3 Delta E, which is a great enough difference to be noticeable side by side.
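For anyone curious what that "3 Delta E" tolerance actually measures: in its original (CIE76) form it's just the Euclidean distance between two colors in L*a*b* space. A sketch, with hypothetical patch readings:

```python
import math

def delta_e_76(lab1: tuple[float, float, float],
               lab2: tuple[float, float, float]) -> float:
    """CIE76 colour difference: Euclidean distance in L*a*b* space.

    A Delta E around 1 is roughly a just-noticeable difference, so a
    tolerance of ~3 can be visible in a side-by-side comparison.
    Newer formulas (CIE94, CIEDE2000) weight the terms differently.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Two hypothetical grey patches measured on two "matching" displays
print(delta_e_76((50.0, 0.0, 0.0), (52.0, 1.0, -1.0)))  # ~2.449
```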
I'd probably try a lower brightness setting on the iPad. Sometimes it feels weird or flat at first when you turn down the brightness on a device; give it some time and see how it feels the next time you pick it up. The biggest problem is typically when turning something down in brightness crushes the shadows, making the dark values indiscernible, or hoses the contrast entirely. If you look at any printed media, including magazines, paintings, printed photographs, etc. (yes, I know they've become less common), those typically don't exceed 300:1 contrast ratios. I understand the difference between reflective and backlit/translucent media, yet more contrast for contrast's sake doesn't necessarily improve the device if it just takes the same number of addressable values and moves them further apart. If you look at a 3D gamut plot, each color combination that can be displayed is plotted from a finite pool. Even if they're advertising something like "16-bit internal processing", that's more a way of guarding against rounding errors in the internal math; the display still has a certain number of addressable hardware values. The ability to display greater saturation or higher contrast ratios pushes these apart, hence the use of dithering.
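The "same number of values pushed further apart" point can be illustrated with a toy model (an assumption for illustration, not a measured device): a gamma-encoded 8-bit panel where a deeper black makes the first step out of black a much larger relative jump.

```python
def relative_shadow_step(white: float, black: float,
                         gamma: float = 2.2, bits: int = 8) -> float:
    """Relative luminance jump between code values 0 and 1 near black.

    Toy display model (an assumption, not any real panel):
        L(v) = black + (white - black) * (v / vmax) ** gamma
    With the same number of addressable levels, a deeper black
    (i.e. a higher contrast ratio) makes the first step a bigger
    *relative* jump, which is one way shadow detail gets crushed
    and why dithering gets used.
    """
    vmax = 2 ** bits - 1
    step = (white - black) * (1.0 / vmax) ** gamma
    return step / black

# 200 cd/m2 white; compare a 400:1 black (0.5) to a 4000:1 black (0.05).
# The deeper black makes the first step roughly 10x larger relative
# to the black level.
print(relative_shadow_step(200.0, 0.5))
print(relative_shadow_step(200.0, 0.05))
```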
Bleh, I'm going on here. I just wanted to make the point that manufacturer spec data doesn't always hold up in actual use, and that contrast ratio probably matters more at very low contrast ratios than high ones.
Here is my last point about manufacturers and their marketing kool-aid. OMG 5,000,000:1!!!!.
I couldn't resist. I'd post a manufacturer spec link rather than Newegg if I could find one, but listing a "dynamic contrast ratio", i.e. comparing the blackest black the panel can produce at minimum settings to the brightest white at full brightness, is just so silly.
Kev, wow man, great post; wish I could give you a reputation rating for it (that's why I've been saying, can we please have upvotes and downvotes but also reputation ratings for posters...). I haven't given it a full read yet, give me some time and I'll get back to you on it, but I've gotta run now. I just saw that dynamic contrast ratio link; yep, utter rubbish for sure.