Hey everyone.
I'm about to pull the trigger on a new Mac Pro for my graphics work, and my 20" ACD is getting a bit dated. I'm starting to notice some color and luminance inconsistencies across the panel, and I think it's time to upgrade the monitor too.
I've been doing a bit of reading, and have come up with some interesting info.
First of all, I've been looking at a few different displays: the Eizo CG 241W and CG 221W, as well as the NEC 2690-WUXI-SV (and the 2490-WUXI-SV, though it has a smaller gamut than the 26) and the 2180-WG-LED-SV.
The smaller displays from each manufacturer are substantially pricier, and a bit out of my range. However, the larger displays have lower pixel density (larger panel, but the same pixel dimensions) and smaller color gamuts.
So, my question is...
Just how useful is a wide-gamut, high-bit-depth monitor? I came across this post by Karl Lang. Although slightly dated (2006), it discusses how these high-bit-depth, wide-gamut displays are really technology ahead of its time, since the channel from Photoshop to the monitor is not yet 10-bit across the board. Basically, we're still stuck at an 8-bit bottleneck. (The exception is that the more expensive, smaller Eizo and NEC models actually have 10-bit DVI capability, but none of the computer-side components support it yet.)
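To get a feel for what that bottleneck actually means, here's a quick Python sketch I put together — pure arithmetic, nothing specific to any of these monitors — comparing how many distinct levels an 8-, 10-, or 12-bit path can carry:

```python
# A rough sketch of the 8-bit bottleneck: quantize a smooth ramp at each
# bit depth and count how many distinct levels survive.
import numpy as np

def quantize(signal, bits):
    """Quantize a 0..1 float signal to the given bit depth and back."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

ramp = np.linspace(0.0, 1.0, 100_000)  # a smooth gradient, like a sky

for bits in (8, 10, 12):
    steps = len(np.unique(quantize(ramp, bits)))
    print(f"{bits:2d}-bit path: {steps:4d} distinct levels, "
          f"each step ≈ {1 / (2 ** bits - 1):.5f} of the full range")
```

Spreading only 256 steps across a wider gamut means each step covers more color distance, which is where the banding worry comes from.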
However, what I don't understand is how people are able to profile these displays to show a large gamut if the software can't push 10-bit data to the monitor. Then again, I guess the gamut only really describes the color spectrum, not the subtleties in tonality.
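For what it's worth, I convinced myself that the gamut itself has nothing to do with bit depth — it's set entirely by the primaries. A little sketch (using the published sRGB and Adobe RGB 1998 primary chromaticities) comparing the gamut triangle areas:

```python
# Gamut size comes from the primaries alone -- bit depth never enters into it.
# Triangle areas in CIE xy chromaticity space, using the published
# sRGB and Adobe RGB (1998) primary coordinates.

def triangle_area(p1, p2, p3):
    """Shoelace formula for the area of the gamut triangle."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

srgb  = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]   # R, G, B primaries
adobe = [(0.64, 0.33), (0.21, 0.71), (0.15, 0.06)]   # only green differs

a_srgb, a_adobe = triangle_area(*srgb), triangle_area(*adobe)
print(f"sRGB gamut area:      {a_srgb:.4f}")
print(f"Adobe RGB gamut area: {a_adobe:.4f} ({a_adobe / a_srgb:.2f}x larger)")
```

So the profiling software can measure and report a big gamut even though the tonal steps within it are still 8-bit.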
So... would it be money wasted to invest in the high-gamut technology, or even in the less expensive (but not COMPLETELY Adobe 1998) monitors? I do all of my editing in ProPhoto regardless, but since I'm still outputting to devices with gamuts close to or smaller than Adobe 1998, there is still color data I could potentially be missing in soft proof.
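Here's the kind of thing I mean, as a toy soft-proof check: take the pure Adobe RGB green primary and express it in linear sRGB. I'm using the commonly published D65 reference matrices (so treat the exact numbers as approximations); a negative component means the color simply doesn't exist in the smaller space and would get clipped on an sRGB-class display:

```python
# Toy soft-proof check: Adobe RGB (1998) green expressed in linear sRGB.
# Matrices are the widely published D65 reference values -- approximations
# for illustration, not taken from any particular monitor profile.
import numpy as np

ADOBE_RGB_TO_XYZ = np.array([
    [0.5767309, 0.1855540, 0.1881852],
    [0.2973769, 0.6273491, 0.0752741],
    [0.0270343, 0.0706872, 0.9911085],
])

XYZ_TO_SRGB = np.array([
    [ 3.2404542, -1.5371385, -0.4985314],
    [-0.9692660,  1.8760108,  0.0415560],
    [ 0.0556434, -0.2040259,  1.0572252],
])

adobe_green = np.array([0.0, 1.0, 0.0])                # linear Adobe RGB
srgb_linear = XYZ_TO_SRGB @ ADOBE_RGB_TO_XYZ @ adobe_green

print("Adobe RGB green in linear sRGB:", np.round(srgb_linear, 3))
print("Out of sRGB gamut:", bool(np.any((srgb_linear < 0) | (srgb_linear > 1))))
```

On a standard-gamut screen those colors get squeezed into the panel's gamut, which is exactly what I'd be missing in soft proof.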
EDIT: What role do contrast ratio and maximum brightness play in my decision? I'll most likely be calibrating the monitor to 180 cd/m² regardless, but what about the contrast ratio? It seems like all the HIGH-end ones have much smaller ratios in comparison.
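The way I understand it, once the white point is pinned at a calibration target, the quoted contrast ratio mostly just sets the black level (black = white / contrast). Rough numbers — the ratios below are made up for illustration, not specs of any of these panels:

```python
# Black level implied by a quoted contrast ratio once white is calibrated
# to a fixed luminance. Ratios here are hypothetical round numbers.
white = 180.0  # cd/m^2 calibration target

for contrast in (400, 800, 1000):
    print(f"{contrast:4d}:1 -> black ≈ {white / contrast:.2f} cd/m^2 "
          f"at {white:.0f} cd/m^2 white")
```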
EDIT: Upon further reading, I've concluded that the higher-bit-depth LUTs are still an advantage even if the entire channel is not 10- or 12-bit. They help with calibration by applying the adjustments in the monitor's LUT instead of the graphics card's, preventing posterization of tones.
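A rough sketch of why that matters, assuming a made-up 0.85 blue-channel scale as the calibration correction: if the correction is applied in an 8-bit video-card LUT, some of the 256 incoming codes collapse onto the same output level before they ever leave the card; applied in a higher-bit LUT inside the monitor, all 256 stay distinct:

```python
# Push all 256 8-bit input codes through a correction applied in a LUT of a
# given bit depth and count how many distinct output levels survive.
# The 0.85 scale is a hypothetical white-point tweak, just to have a
# concrete correction to run through the LUT.
import numpy as np

def surviving_tones(lut_bits, scale=0.85):
    out_levels = 2 ** lut_bits - 1
    codes = np.arange(256) / 255.0                     # 8-bit signal in
    corrected = np.round(codes * scale * out_levels)   # correct + quantize
    return len(np.unique(corrected))

for bits in (8, 10, 12, 16):
    print(f"{bits:2d}-bit LUT: {surviving_tones(bits)} of 256 tones survive")
```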