
edry.hilario

macrumors 6502a
Original poster
Aug 1, 2010
Hello everyone,

I need your help finding a monitor under 1,000 USD for the nMac Pro. I mostly work with photography/video, and color accuracy and overall good color are kinda critical for me. I don't care if it's 4K or not, but I'd prefer it if it was 4K. So let me know what monitors you have/recommend.

Thank you.
 
Basically, for color work you want a monitor that supports Adobe RGB (aRGB); lower-end consumer-grade monitors only support the narrower Standard RGB (sRGB).

As for size, I prefer 1440p @ 27" (and so does Apple, cuz the 27" iMac is 1440p) and 4K @ 32"; since the latter is currently outside your price bracket, I would suggest getting the following:

Dell U2713H
 
Basically, for color work you want a monitor that supports Adobe RGB (aRGB); lower-end consumer-grade monitors only support the narrower Standard RGB (sRGB).

As for size, I prefer 1440p @ 27" (and so does Apple, cuz the 27" iMac is 1440p) and 4K @ 32"; since the latter is currently outside your price bracket, I would suggest getting the following:

Dell U2713H

Would this be better than Apple's Thunderbolt Display, since they are around the same price?
 
Hello everyone,

I need your help finding a monitor under 1,000 USD for the nMac Pro. I mostly work with photography/video, and color accuracy and overall good color are kinda critical for me. I don't care if it's 4K or not, but I'd prefer it if it was 4K. So let me know what monitors you have/recommend.

Thank you.
I would suggest being a little more specific about how "critical" it is for you. I would think someone whose work depended on it would be able to spell out exactly what they need.

If you just want a really nice 2nd gen 4K display, check out the Dell P2715Q... not without some minor issues, and maybe not up to snuff for your needs, but probably one of the best 4K values out there.

Basically, for color work you want a monitor that supports Adobe RGB (aRGB); lower-end consumer-grade monitors only support the narrower Standard RGB (sRGB).

As for size, I prefer 1440p @ 27" (and so does Apple, cuz the 27" iMac is 1440p) and 4K @ 32"; since the latter is currently outside your price bracket, I would suggest getting the following:

Dell U2713H
I don't pretend to know about that stuff, but it was interesting to google aRGB vs. sRGB.

+1 on the Dell U2713H - a great, solid display.
 
I would suggest being a little more specific about how "critical" it is for you. I would think someone whose work depended on it would be able to spell out exactly what they need.

If you just want a really nice 2nd gen 4K display, check out the Dell P2715Q... not without some minor issues, and maybe not up to snuff for your needs, but probably one of the best 4K values out there.


I don't pretend to know about that stuff, but it was interesting to google aRGB vs. sRGB.

+1 on the Dell U2713H - a great, solid display.

I wouldn't get the P2715Q for two reasons. First, for the OP, it's sRGB, so it's not well suited for color work. Second, I personally think 4K at 27" is going to be too small at native resolution, and HiDPI (aka Retina mode) will make it look like 1080p. So it'll have less screen real estate and everything will look kinda too big. If the OP weren't doing color work, I would recommend the BenQ or Acer 32" 4K monitor (both use the same LCD panel, btw).
 
I wouldn't get the P2715Q for two reasons. First, for the OP, it's sRGB, so it's not well suited for color work. Second, I personally think 4K at 27" is going to be too small at native resolution, and HiDPI (aka Retina mode) will make it look like 1080p. So it'll have less screen real estate and everything will look kinda too big. If the OP weren't doing color work, I would recommend the BenQ or Acer 32" 4K monitor (both use the same LCD panel, btw).
Just because I did a little googling, I'm not claiming to be an expert, but what I got from just about every article I read on the subject is that it depends on what kind of photography you do. sRGB is the standard for web photography and even certain types of dead tree publishing, and has an easier workflow, while aRGB really only shines with specific print photography (portrait and fine art photography commonly being cited, though there's even debate about how much of a practical difference it makes), is useless for web photography, and requires a little more work for the best results. That's why I think it might be beneficial for the OP to be a little more specific. I may not understand all of it, but when I read photography pros talking about this stuff, they generally seem very specific about what they require and their workflows. And again, I'm not trying to start a debate or anything, because I don't know enough to respond.

I guess I don't understand your point about 27" 4K monitors. They're like regular 27" 1920x1080 or 2560x1440 displays, except really extra sharp. High megapixel photos and 4K video look amazing no matter what the HiDPI is set to. You can set it to 3840x2160 or 1280x800, and a photo will have essentially the same on-screen pixel resolution (i.e. 4K). I admittedly don't have a 27" Apple TB Display to compare it to, but I can compare it to a 30" Dell UltraSharp U3014 2560x1600, which is also a fantastic display.

I'm sure 32" 4K displays are great as well, though having used 30" displays for many years, I think 32" might feel just a tad big for me personally. But maybe that's because I've never had one. I have friends that have a 32" tv set and sit across the room from it.
 
Just because I did a little googling, I'm not claiming to be an expert, but what I got from just about every article I read on the subject is that it depends on what kind of photography you do...

I guess I don't understand your point about 27" 4K monitors. They're like regular 27" 1920x1080 or 2560x1440 displays, except really extra sharp...

27" at 1080 and 1440p is a HUGE difference; 1440p is 1.8x, or almost twice, as many pixels as 1080p. So going from 1080p to 1440p increases the ppi. The screen becomes more sharp and you get 1.8x the amount of pixels to work with so the screen real estate sky rockets.

On the flip-side it's possible to increase the ppi too much. 4k at 27" would do just that. Sure it would look extra sharp and it would have a massive amount of screen real estate but everything would be kinda small. It's impractical. There are two solutions. The first is to use HiDPI (retina) mode which will make 4k look like 1080p. They would solve the small text issue BUT it would lose all the new screen real estate; even worse it'd be less space than 1440p. The other solution is to increase the screen size. 32" @ 4k would be that fix.
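
For the curious, the pixel math behind the "1.8x" and "less space than 1440p" claims is easy to check; here's a minimal Python sketch using only the resolutions mentioned above (nothing monitor-specific is assumed):

```python
# Pixel counts for the three resolutions being compared.
res = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in res.items()}

print(pixels)
# {'1080p': 2073600, '1440p': 3686400, '4K': 8294400}

print(pixels["1440p"] / pixels["1080p"])   # ~1.78, i.e. the "almost twice" figure
print(pixels["4K"] / pixels["1440p"])      # 2.25x again going from 1440p to native 4K
```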
 
I do fine art photography/video. I do my own color grading, and lately I'm beginning to print my work, which is why I say color is kind of critical: I want my work to match prints as best as I can within my budget (I have a color calibrator, btw). I also mean that color must trump other features, like refresh rate, extras, etc. (although it MUST be 60Hz). I think a TN panel wouldn't be a good choice for me. I've had iMacs and MacBooks before, so I haven't had to deal with choosing a monitor in a long time; that's why I'm a bit lost. So thanks for the help.

----------

I'm pretty sure the ATD is closer to sRGB than aRGB.

Have you had any experience with the ASUS PB278Q?
 
Consider investigating the NEC PA series. While not cheap, they are excellent monitors for graphics.
 
Have you had any experience with the ASUS PB278Q?
I have one coming from B&H Photo on Tuesday, but my use will be as a secondary Digital Audio Workstation display, with a Dell U3014 as my main display. I did read some good photography user reviews for the ASUS PB278Q at B&H, which helped with my purchasing decision along with the good price. I'd check out those reviews if you haven't already.
 
27" at 1080 and 1440p is a HUGE difference; 1440p is 1.8x, or almost twice, as many pixels as 1080p. So going from 1080p to 1440p increases the ppi. The screen becomes more sharp and you get 1.8x the amount of pixels to work with so the screen real estate sky rockets.

On the flip-side it's possible to increase the ppi too much. 4k at 27" would do just that...
You can never increase the PPI too much (unless Apple didn't have an appropriate HiDPI setting to accommodate it). If you want to set your 4K display to have the equivalent real estate of a 1440p, then you set it to 2560x1440 HiDPI, and you'll see the equivalent of 1440p, except there will actually be 3840x2160 pixels making up that "1440p", hence extra sharp. I don't want to get off track here - there's a bunch of info out there on how this works.
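
For readers trying to follow the scaling disagreement, here's a rough sketch of how OS X's scaled HiDPI modes are generally understood to work: the desktop is rendered into a backing store at twice the requested "looks like" resolution and then resampled to the panel's native grid. The resolutions below are just the ones discussed in this thread; the exact resampling filter isn't modeled.

```python
# Rough model of a scaled HiDPI mode on a 3840x2160 (4K) panel:
# render at 2x the requested "looks like" size, then scale to the native grid.
PANEL = (3840, 2160)

def scaled_mode(looks_like):
    backing = (looks_like[0] * 2, looks_like[1] * 2)  # pixels actually rendered
    scale = PANEL[0] / backing[0]                     # factor applied to reach the panel
    return backing, scale

print(scaled_mode((1920, 1080)))  # ((3840, 2160), 1.0)  -> maps 1:1, the "perfect multiple" case
print(scaled_mode((2560, 1440)))  # ((5120, 2880), 0.75) -> downsampled, the disputed case
```

Both camps are describing the same mechanism; the argument is over how visible that 0.75 downsample is in practice.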

I do fine art photography/video...
IPS (or similar) and 60hz should be a given. I think color calibration is just one aspect of it. For instance, the Dell P2715Q is IPS and 60hz, and it comes from the factory with almost perfect color calibration, and can be further color calibrated (though from what I gather, not as endlessly as some of the really "serious" professional displays). But it "only" has sRGB, same as that Asus you're asking about BTW. That would only mean that the range of colors can't quite be matched on screen compared to the printed world. Again, the practical advantages of this are debatable. I have a family member who is a professional fine art photographer (she doesn't make a living at it, but she's had her work in local galleries, and won various jury awards, etc.), and she "only" has a rMBP (and got by with much less before that) and couldn't be happier.

But I'm not out to convince you to buy the P2715Q (again, there are a few "early adopter" hiccups with all 4K displays), just suggesting that everything you've said indicates you probably don't need a high-end aRGB display, and if you think you do, it might be worth doing a bit more research (and maybe take this discussion beyond the Mac Pro forum, which is mostly focused on how to upgrade the cMP). Best of luck with the photography! :)
 
Basically, for color work you want a monitor that supports Adobe RGB (aRGB); lower-end consumer-grade monitors only support the narrower Standard RGB (sRGB).
Apple has steadfastly ignored the existence of aRGB for years, and plenty of designers happily use their sRGB cinema/thunderbolt displays. I wouldn't call the ACD/ATD "lower end consumer." Pretty much any of the 27" 2560x1440 panels that are so common now have IPS screens that support ~100% sRGB. Which is absolutely fine for the majority of professional work.

...aRGB really only shines with specific print photography, is useless for web photography, and requires a little more work for the best results.
This. aRGB is irrelevant for web or video. In fact, if your pipeline isn't designed for aRGB, it will make things worse. A non-calibrated aRGB display will make colors unnaturally intense and over-saturated. So if you are designing and trying to color-correct to that, and then view the content on an sRGB display, everything you do is going to end up looking desaturated.
_________________________________________________________

As far as 4k, I'd agree that on a 27" monitor it is a bit odd. Yes you can scale 4k to 2560x1440, but ultimately you are running at a non-native resolution (or multiplier thereof) and thus things aren't really as clear as they could be. The 27" 5k iMac works because it is exactly twice that, so you can run true hiDPI. I've got a 15" retina MBPro (2880x1800) that I run at 1920x1200, and yes it looks pretty good, but I don't love it - I would rather have the native 1920x1200 on my old (dead) 17" MBPro. A 4k 27" would be a similar situation. A 32" 4k isn't much better - at native res things would still be pretty tiny, and if you scale down you have the same issues as the 27". You'd need to get up to like a 40" 4k to have similar PPI to a 27" at 1440.
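
Since the size question keeps coming up, the PPI figures behind that last sentence are simple diagonal math; a quick sketch, using only the size/resolution combinations mentioned in the thread:

```python
# Pixels-per-inch for the size/resolution combinations discussed above.
import math

def ppi(width, height, diagonal_inches):
    return math.hypot(width, height) / diagonal_inches

print(round(ppi(2560, 1440, 27)))  # ~109  (27" 1440p)
print(round(ppi(3840, 2160, 27)))  # ~163  (27" 4K)
print(round(ppi(3840, 2160, 32)))  # ~138  (32" 4K)
print(round(ppi(3840, 2160, 40)))  # ~110  (40" 4K, roughly matching 27" 1440p)
```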

Have you had any experience with the ASUS PB278Q?
I think that would be absolutely fine for your use. The BenQ GW2765HT is another lower-cost one you could look at from a reputable manufacturer with a 3-yr warranty. I'd let the 4K thing mature for a while, and IMHO, unless you really have a specific need for aRGB, it is likely going to cause more headaches than anything.
 
Well, I think I opened a big can of worms here. This is going to be a hard choice. I don't really need 4K; I mean, it'd be nice, but I think color reproduction is more important to me. I think I need to narrow things down within the budget. We'll see. I like where the conversation is going, so thank you guys so much.
 
Well, I think I opened a big can of worms here. This is going to be a hard choice. I don't really need 4K; I mean, it'd be nice, but I think color reproduction is more important to me. I think I need to narrow things down within the budget. We'll see. I like where the conversation is going, so thank you guys so much.

If you don't care about Adobe RGB, then get the Acer or BenQ 32" 4K monitor. They retail for $1,000 and go on sale for $800 from time to time.
 
Well, I think I opened a big can of worms here. This is going to be a hard choice. I don't really need 4K; I mean, it'd be nice, but I think color reproduction is more important to me. I think I need to narrow things down within the budget. We'll see. I like where the conversation is going, so thank you guys so much.

The concept of color accuracy is a lot greater than what can be described in a few lines here. I'm not going to try, but the entire issue of sRGB vs Adobe RGB is mostly crap. It describes a boundary condition, not the deviation from the desired output. The simplest way I can put it is that most really good displays made recently have gamuts close to Adobe RGB, but having an Adobe RGB gamut isn't a sign that it's a good display. Make sense?

The requirements for video differ quite a bit from still photography, assuming this is to be printed for any purpose. If you're dealing with something to be printed, reference prints are essential and should be viewed under "correct" (there are lighting devices conforming to ISO specifications) lighting when making critical decisions. You should also be aware that the gamut of these printing devices and the metameric characteristics of the inks in a typical wide-format printer differ from those of a commercial press. There is still a case for corrected lighting and printing RIPs if you want a really good match. The same holds for color grading tools. Good ones are extremely expensive, but you can often get away with less as long as you rigorously test your workflow for consistency.

TLDR don't take an Adobe RGB display in itself as a sign of anything being accurate relative to your requirements.
 
The concept of color accuracy is a lot greater than what can be described in a few lines here. I'm not going to try, but the entire issue of sRGB vs Adobe RGB is mostly crap. It describes a boundary condition, not the deviation from the desired output...

I agree with you. On the whole aRGB vs sRGB thing, the iMacs are not 100% aRGB and I liked them. I printed some of my work and it was fairly accurate after color calibration, not super far off. Also, not everyone has super monitors to view all the colors anyway. I just need something with reasonable enough color (probably IPS) and overall good value.
 
I agree with you. On the whole aRGB vs sRGB thing, the iMacs are not 100% aRGB and I liked them. I printed some of my work and it was fairly accurate after color calibration, not super far off. Also, not everyone has super monitors to view all the colors anyway. I just need something with reasonable enough color (probably IPS) and overall good value.

Just holding something up next to the display is a very very rudimentary way of determining a match between the two, given the possible range of ambient lighting being reflected by your print. There's also the issue that gamut constraints are quite different if this is sent off to be printed. You should just be aware of these things so that you don't interpret any deviation as someone else's mistake. That's kind of the simple version.

That not everyone has super monitors is a common misunderstanding. Viewing devices can be off in any number of directions. A reference device should ideally be as close as possible so that in the worst possible case, the errors are not additive. If you want predictability and good uniformity, something along the lines of NEC + SpectraView kit isn't a bad place to start as a generic solution. It at least gives you some ability to track hardware drift. Overall, though, if the iMac display quality was sufficient for you and your clients, that isn't an extremely high bar and you could probably save a little money. Do note that most displays use anti-glare coatings which differ from those used by Apple. Some are stronger than others, and the overly aggressive ones create an irritating sparkle effect.
 
Just holding something up next to the display is a very very rudimentary way of determining a match between the two, given the possible range of ambient lighting being reflected by your print...

...If you want predictability and good uniformity, something along the lines of NEC + SpectraView kit isn't a bad place to start as a generic solution...

I see what you're saying... There are many variables... I just need something that, when you look at a print, is not completely off, like one looks completely green and the other looks yellow.

I do want one of those NEC or EIZO monitors, but they do seem to get expensive rather quickly... maybe down the line I'll pick one of those up. Right now I just need something to get me through. I highly appreciate your help, though; it is really helpful.
 
I see what you're saying... There are many variables... I just need something that, when you look at a print, is not completely off, like one looks completely green and the other looks yellow.

I do want one of those NEC or EIZO monitors, but they do seem to get expensive rather quickly... maybe down the line I'll pick one of those up. Right now I just need something to get me through. I highly appreciate your help, though; it is really helpful.

I'm not suggesting you need to purchase one of those. In fact to have a really robust solution, that's a small part of the overall cost of setup. There are some advantages to a quality display on its own, and you should consider a few factors when determining a display purchase. Here's my list.

Warmup time, uniformity, viewing angles, measured color temperature and its deviation from D65 (don't try to tweak this via profile creation), shadow detail.

What I was saying before about sRGB vs Adobe RGB is that the results of a patch test for a display and accompanying matrix profile may not strongly correlate with display gamut. There are quite a few colors that fall either inside sRGB or outside Adobe RGB where the thing of importance is the deviation between the color actually measured and the intended color, relative to the coordinate basis used. I had to be very specific about that, because Adobe RGB isn't exactly a scaled superset of sRGB. Edit: rewriting this part; by "different topologies" I meant that if you look at a geometric representation of their gamuts in a profile viewer, you will see different shapes, rather than the same shape at different sizes.
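
To make the "deviation" idea concrete: what a patch test actually reports is a color difference (delta E) between the intended color and the color measured off the screen, both in CIELAB. Here's a minimal sketch using the simple CIE76 formula; the Lab values are made up purely for illustration:

```python
# CIE76 delta E between an intended patch color and the measured result.
import math

def delta_e_76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

target   = (65.0, 18.0, -5.0)   # hypothetical intended Lab value for a test patch
measured = (63.5, 20.0, -3.0)   # hypothetical value read back by a colorimeter

print(round(delta_e_76(target, measured), 1))  # 3.2
```

The gamut label (sRGB vs Adobe RGB) says nothing about how small these deviations actually are, which is the point being made above.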

I'm going a little further with this than I intended, but the criteria I mentioned above are things that do differ between displays, and they relate to things where there is a measurable benefit without having a completely fleshed out system in place.
 
I see what you're saying... There are many variables... I just need something that, when you look at a print, is not completely off, like one looks completely green and the other looks yellow.

I do want one of those NEC or EIZO monitors, but they do seem to get expensive rather quickly... maybe down the line I'll pick one of those up. Right now I just need something to get me through. I highly appreciate your help, though; it is really helpful.

Like I said, go with Acer B326HK or BenQ BL3201PH (same panel).

You can never increase the PPI too much (unless Apple didn't have an appropriate HiDPI setting to accommodate it). If you want to set your 4K display to have the equivalent real estate of a 1440p, then you set it to 2560x1440 HiDPI, and you'll see the equivalent of 1440p, except there will actually be 3840x2160 pixels making up that "1440p", hence extra sharp. I don't want to get off track here - there's a bunch of info out there on how this works.

The issue is 4k is 4x 1080p and 5k is 4x 1440p. So setting a 4k display to 1440p HiDPI causes some interpolation which ruins sharpness.

As far as 4k, I'd agree that on a 27" monitor it is a bit odd. Yes you can scale 4k to 2560x1440, but ultimately you are running at a non-native resolution (or multiplier thereof) and thus things aren't really as clear as they could be. The 27" 5k iMac works because it is exactly twice that, so you can run true hiDPI. I've got a 15" retina MBPro (2880x1800) that I run at 1920x1200, and yes it looks pretty good, but I don't love it - I would rather have the native 1920x1200 on my old (dead) 17" MBPro. A 4k 27" would be a similar situation. A 32" 4k isn't much better - at native res things would still be pretty tiny, and if you scale down you have the same issues as the 27". You'd need to get up to like a 40" 4k to have similar PPI to a 27" at 1440.

Same, I've got an MBP that I run at 1920x1080. I can instantly notice that it's less sharp, but the screen real estate is worth it.

4K @ 32" actually looks really good. Sure the ppi is higher but it's not so high that it overwhelms the ability of the OS to scale. I would say at 37-42" I would want a legit 5k display.
 
The issue is 4k is 4x 1080p and 5k is 4x 1440p. So setting a 4k display to 1440p HiDPI causes some interpolation which ruins sharpness.

Same, I've got an MBP that I run at 1920x1080. I can instantly notice that it's less sharp, but the screen real estate is worth it.

4K @ 32" actually looks really good. Sure the ppi is higher but it's not so high that it overwhelms the ability of the OS to scale. I would say at 37-42" I would want a legit 5k display.
If you (and others) don't want to understand how 4K and OS X HiDPI scaling works, that's on you.
 
Like I said, go with Acer B326HK or BenQ BL3201PH (same panel).

For the OP's purposes, same panel != same display. Some can be close, but there isn't much in the way of panel difference at this point across the majority of the price spectrum. Perhaps if you get into specialized equipment that costs several thousand dollars. Otherwise you will see the same panel show up, yet not all performance characteristics are equivalent.
 
If you (and others) don't want to understand how 4K and OS X HiDPI scaling works, that's on you.

I don't think you understand how HiDPI works. 5k is exactly 4 times the pixels of 1440p. So what Apple does for HiDPI is they subdivide 1 old pixel into 4 new pixels. Then they add native 5k elements. So you get the effective appearance [screen real estate] of 1440p but with the sharpness of 5k. This only works for resolutions that are perfect multiples; so 4K:1080p and even 1440p:720p (although no one would care about doing that). Anything else is just regular scaling which results in ugliness.

For the OP's purposes, same panel != same display. Some can be close, but there isn't much in the way of panel difference at this point across the majority of the price spectrum. Perhaps if you get into specialized equipment that costs several thousand dollars. Otherwise you will see the same panel show up, yet not all performance characteristics are equivalent.

Ya, that's true, but talking about the differences in response time caused by the software or other technical points is really outside the scope of this discussion; I would advise the OP to buy either, depending on which is cheaper. If they were 600-700 dollars and had G-Sync, I would buy one as well.
 
4K @ 32" actually looks really good. Sure the ppi is higher but it's not so high that it overwhelms the ability of the OS to scale. I would say at 37-42" I would want a legit 5k display.
Aside from the first sentence, this is just a nonsensical statement.

I don't think you understand how HiDPI works. 5k is exactly 4 times the pixels of 1440p. So what Apple does for HiDPI is they subdivide 1 old pixel into 4 new pixels. Then they add native 5k elements. So you get the effective appearance [screen real estate] of 1440p but with the sharpness of 5k.
LOL, well, whether you understand it or not, you seem to be drawing the wrong conclusion. I'd suggest you check out this Apple Developer article on HiDPI explained. Also, it's actually 1 point in user-space that represents the 4 pixels, which normally I would say is being pedantic, but in this case, maybe it would help you to understand the concepts better.
This only works for resolutions that are perfect multiples; so 4K:1080p and even 1440p:720p (although no one would care about doing that). Anything else is just regular scaling which results in ugliness.
You are simply wrong about this. Have you actually used a 4K display and tried the various HiDPI settings? If you had, you'd know that isn't true. Because points in HiDPI can be expressed as floating-point values, it doesn't have to be "perfect multiples".

Think of it this way...
  1. Take a screenshot of a desktop on a 27" 5K iMac (5120x2880) using HiDPI 1440p.
  2. Make two copies and, using Photoshop, scale one copy down to 3840x2160 and the other down to 2560x1440.
  3. Display the original screenshot on the 5K iMac, the 4K reduction on a 27" 4K display and the 1440 reduction on a 27" native 1440p display respectively.
  4. The results in order of best looking is: 5K iMac, 4K display, 1440p display.
If you want to insist that the native 1440p looks better than the non-"perfectly" scaled 4K, you're entitled to see it that way, but that's kind of like sticking your head in the sand.
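
If anyone wants to try a rough version of that comparison themselves, something like the following approximates steps 1 and 2 with Pillow; the filename is a placeholder, and Lanczos is only an approximation of whatever resampling the window server actually uses:

```python
# Take a 5120x2880 screenshot and produce the 4K and 1440p reductions
# described in the steps above, for side-by-side viewing.
from PIL import Image

shot = Image.open("imac5k_screenshot.png")   # hypothetical 5120x2880 capture
assert shot.size == (5120, 2880)

shot.resize((3840, 2160), Image.LANCZOS).save("reduced_4k.png")
shot.resize((2560, 1440), Image.LANCZOS).save("reduced_1440p.png")
```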

I really wouldn't care all that much to carry this on, but there are so many users just finding out about 4K, and they read this stuff, and misinformation just gets spread over and over again.

Starting with the aRGB, you've been offering a lot of suspect advice in this thread and being rather pushy about it. :rolleyes:
 