My iMac with 5K Retina display arrived the other day. I've had a chance to put it through some initial tests as far as Lightroom is concerned and thought it might be of some interest to people here. If you've got any questions or anything you'd like me to try (preferably from a photographer's perspective) do let me know.

http://www.tony-hart.com/blog/essays/2014/10/imac-with-retina-5k/
I don't understand the difference between 'Best for display' and 'Scaled' with 'Best (Retina)' selected. Can somebody explain this?

[Screenshot: Displays preference pane]
 

I believe (but I may be wrong) that there is no difference. I think the 'Best for Display' option is basically there for those who don't want to mess around with settings. The 'Scaled' option gives the user a degree of control over how the interface is rendered - bigger or smaller. The 'Best (Retina)' option is, I believe, identical to 'Best for Display'.
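
For what it's worth, here's the arithmetic as I understand it: for HiDPI modes, macOS renders the interface into a backing buffer at twice the selected 'looks like' resolution, then scales that buffer to the panel's native 5120x2880 pixels. A minimal sketch (the mode list is illustrative, not exhaustive):

```python
# Rough arithmetic behind macOS HiDPI ("Retina") scaled modes on the 5K iMac.
# The "looks like" resolutions below are examples, not the complete list.

NATIVE = (5120, 2880)  # physical pixel resolution of the 27" iMac 5K panel

def backing_store(looks_like):
    """macOS renders HiDPI modes at 2x the logical size, then scales to the panel."""
    w, h = looks_like
    return (w * 2, h * 2)

for looks_like in [(2560, 1440), (2880, 1620), (3200, 1800)]:
    buf = backing_store(looks_like)
    note = "matches the panel 1:1" if buf == NATIVE else "downsampled to 5120x2880"
    print(f"looks like {looks_like}: rendered at {buf} - {note}")
```

The default 2560x1440 mode is the special case where the 2x buffer exactly matches the panel, so no resampling is needed - which would explain why 'Best for Display' and the default 'Scaled' setting look identical.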
 
Thanks.
 
Many thanks to the OP. I service a travel crowd who live in Lightroom on Macs, and almost all shoot RAW on a Mk3. Lightroom performance on this machine will be a hot topic.

Expected is the very interesting crop size of almost 100% on the display. Is this available in normal mode?
Also expected is a typical 7-9% increase with the updated i7, which is where most of the performance gain in Adobe's software can happen - if it's there at all.

Best
 
Interesting insights Fred. Some thoughts. The calibration comments are made in the following light:

- Firstly, when I say that the colour accuracy of the 5K iMac is similar to the 2011 iMac, what I mean is they both have very similar colour response: similar saturation and similar white points.

- Second, my printing experience involves printing my wedding work (and some personal stuff) on my Epson 3880 using Hahnemuhle Photo Rag 308. Although the prints don't represent exactly what I see on screen - that's never possible when comparing transmissive vs reflective media, and matt fine art papers have a relatively limited D-Max anyway - overall I feel like they match up pretty well.

As for the Handbrake test, extremely interesting. What are you using to keep an eye on processor speed etc.? I will try and find time to run the test under Yosemite on the old machine, but it's being sold on Saturday so I may not have time. Feel free to send me your own Test Mule scores if you like!
Right you are about never getting an exact representation in print of what you see on screen. I use the same printer as you, and have been generally happy that what comes off it is close to what I see on screen. I'm going to try to find a program that will let me profile the new iMac with my Eye-one puck.

I use the Intel Power Gadget program to watch the processor speed, and iStat Menus for fan speed and temperature.

https://software.intel.com/en-us/articles/intel-power-gadget-20
http://bjango.com/mac/istatmenus/
 
Dan, why do you say the iMac is so accurate? I'm not disputing you at all, just wondering on what you base that statement. I've been using an NEC Multisync monitor for several years, which I calibrate using their software and an Eye-one puck. The new iMac is very different in character to that monitor, being much whiter and less saturated. That's not to say that the NEC is more or less accurate, but that's what the calibration gives me. Having used it for so long, the difference side by side is really apparent. I'll have to recalibrate the NEC and see if I can't find a program that would let me calibrate the iMac with the Eye-one. Color aside, I can't believe how amazing it is to look at my photos on this screen!

It should take a bit of time to adjust to a new monitor, especially since the one you were using is excellent.

When I said "accurate" in this case, I was just comparing the Retina with default profile to my calibrated 2012 iMac. The neutral colors, shadow and highlight detail were all very comparable. That's quite a different experience than what I've had with iMacs before, where the default profile needed IMMEDIATE calibration.

I'm not sure what steps Apple used to take with prior iMacs, in terms of factory calibration. But I like what I see so far from the Retina.

----------

To speed things up and virtually eliminate the 1-2 second full-screen delay Tony mentions, you can render 1:1 previews ahead of time: in the Library module, Library > Previews > Build 1:1 Previews. The render process takes a while (depending on the number of photos), and the previews take up more storage space, so use it when you need it.

I generally only do it in heavy editing situations when I'm making selects. There's always a choice - which image works best? Lots of A/B/C etc. comparisons, so it helps if the preview is instantaneous.

I also do 1:1 previews of all images that make it to my final portfolio. It works great.
 
Hi Misho,

My first question would be: which GPU do you have? I can only speak to the upgraded M295X. The numbers relate largely to exports, and those don't actually show anything tangible on screen while they are occurring.

In terms of day to day editing use, here are my so-far half-baked findings. Compared to my mid-2011 iMac, I can't tell any difference in how long it takes to render a file for working on in the Develop module. Adjustments seem to be made extremely quickly, pretty much immediately, and in such a way that I can easily make (and see) fine adjustments. If I was pushed, I might say that I *think* I perceive an ever so slight wait between commanding the adjustment and seeing the result. It's utterly marginal, however, and I might be imagining it. In practice it doesn't pose any issue as far as I can see.

What I have noticed is that when you preview an image in full screen mode (using the 'F' key) the full screen image does take a second or two to load. Again, this doesn't pose any real world issue for me because I only use this mode occasionally and then to simply show a single image. Hit fullscreen mode, wait a max of 1.5 seconds. Blammo. Enjoy ridiculously hi-def image presentation!

I've also noticed an occasional and barely perceptible stutter when using Mission Control with lots of windows open. I'll be honest, although it can be spotted, again, it's of no consequence. If that's the trade-off for such a high-res screen, I'll take it - the resolution is just breathtaking.

Thanks for the response.
The 4GB GPU I was referring to is the M295X, because the M290X is only 2GB.
Actually, I figured out what was going wrong - it's all in the settings.
On the previous model (late 2013), I set Lightroom to build 1:1 and smart previews in the import module, so previews are always rendered 1:1 right after import, which makes Lightroom work flawlessly. I forgot to change this setting when I installed Lightroom on the Retina and left it at Standard, which builds the previews on the fly and makes things slower.

Now I am more relieved.
Things still stutter a little bit, but as you mentioned it's of no consequence as a trade-off for this beautiful screen. I am sure these things will be ironed out in the coming OS X updates.
 
Yes, I figured this out a few hours ago. I forgot to switch the 1:1 preview option on in the import module (it was on in the previous model) - that's why I noticed the big difference.
 
I don't do much video editing at all. The only thing I do with video is cut footage to shorten it - pretty much things like that. I mostly use Lightroom. Would the base M290 be sufficient for my needs? I don't game, and I'm not sure I can justify the expense of the M295.

Can the iMac 5K power an external LCD? If possible, I would like to output to two additional 1440p LCDs. So the iMac 5K will be the primary display and then 2 other 27" 1440p displays.
 
If you are running additional displays I think you shouldn't think twice about the 295.
 

I don't want to spend the money if I don't need to. No one here is doubting that the 295 is faster. I find people on MacRumors to be the speculating type, probably more so than on any other hardware forum I've been to. I can't make a good decision based on opinions. I think it's time that we start substantiating our claims. Opinions are not facts.
 
The iMac 5K will power an external display - that I know for sure. I also know that the iMac 5K will not power an external 5K display, due to limitations of DisplayPort 1.2. Nor can the iMac 5K be used in Target Display Mode.

What I do not know is how well external monitors work, as I haven't tested this. There are however other threads where users report success.
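
For anyone wondering why DisplayPort 1.2 is the bottleneck, the back-of-envelope bandwidth arithmetic makes it clear. A rough sketch with nominal figures (ignoring blanking intervals, which only add overhead):

```python
# Back-of-envelope check: can a single DisplayPort 1.2 link carry a 5K stream?
# Figures are nominal; real video timings include blanking, which needs more.

DP12_EFFECTIVE_GBPS = 17.28  # DP 1.2 HBR2: 21.6 Gbit/s raw minus 8b/10b encoding

def required_gbps(width, height, hz, bits_per_pixel=24):
    """Payload bandwidth for an uncompressed video stream, in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

for name, mode in [("5K @ 60 Hz", (5120, 2880, 60)),
                   ("4K @ 60 Hz", (3840, 2160, 60)),
                   ("1440p @ 60 Hz", (2560, 1440, 60))]:
    need = required_gbps(*mode)
    verdict = "fits" if need < DP12_EFFECTIVE_GBPS else "exceeds"
    print(f"{name}: ~{need:.1f} Gbit/s needed - {verdict} DP 1.2")
```

5K at 60 Hz needs roughly 21 Gbit/s of payload alone, more than a DP 1.2 link can deliver - presumably why Apple needed a custom timing controller for the internal panel, and why external 4K or 1440p displays are fine while an external 5K one is not.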
 
Yes, you'll be just fine. I'm in the same boat as you. It's your money - and you'll be perfectly happy with the M290 for Lightroom. Lightroom doesn't even use the GPU much, if at all.
 
dandrewk said:
Yes, it's true, it might not match the $3k NEC (which doesn't match Retina resolution)
Hello,
About color: the iMac 5K seems to cover nearly 100% of sRGB, while the NEC should cover 100% of Adobe RGB, which is a huge difference. But of course, as you said, it only matters for those who care about color space... And some people will prefer sRGB over Adobe RGB. That's OK.

My question is about resolution.
When you say it will not match the resolution, is that because the iMac is 5K and the NEC is 4K, or is it more about the Retina technology?
I would like to compare the iMac Retina 5K to another 'classic' 4K screen, but obviously my local dealer will not let me do whatever I want with $3k products.
As I don't really understand the advantage of Retina technology, I'm interested in this point. If you could explain it to me a little, I could imagine the difference.

Thanks.
Colin
 
As you mentioned, color space is a big issue, but only if your application demands it. That applies to only a very tiny subset of users, all of whom are quite aware of their requirements.
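
To make the gamut difference concrete: a fully saturated Adobe RGB green has no sRGB equivalent at all. Here's a rough sketch of how you'd check, using the commonly published D65 conversion matrices (rounded values quoted from memory, so treat them as approximate):

```python
# Rough gamut check: is a given Adobe RGB (1998) color representable in sRGB?
# Matrices are the commonly published, rounded D65 values - approximate.

ADOBE_TO_XYZ = [[0.5767, 0.1856, 0.1882],
                [0.2974, 0.6273, 0.0753],
                [0.0270, 0.0707, 0.9911]]

XYZ_TO_SRGB = [[ 3.2406, -1.5372, -0.4986],
               [-0.9689,  1.8758,  0.0415],
               [ 0.0557, -0.2040,  1.0570]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def to_srgb_linear(adobe_rgb_linear):
    """Convert a linear Adobe RGB color to linear sRGB via XYZ."""
    return mat_vec(XYZ_TO_SRGB, mat_vec(ADOBE_TO_XYZ, adobe_rgb_linear))

for color in [(0.0, 1.0, 0.0),   # fully saturated Adobe RGB green
              (0.5, 0.5, 0.5)]:  # neutral grey
    srgb = to_srgb_linear(color)
    ok = all(0.0 <= c <= 1.0 for c in srgb)
    print(color, "->", [f"{c:.3f}" for c in srgb],
          "(in sRGB gamut)" if ok else "(OUT of sRGB gamut)")
```

The saturated green comes out with a negative red component in sRGB - it simply lies outside the sRGB triangle, which is exactly the headroom the wider-gamut NEC buys you.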

So, comparing the NEC to the iMac is usually just a matter of what you require. Assuming the iMac's excellent color accuracy (default or with user software calibration) meets your needs, the main difference is resolution - 4k vs 5k.

To answer your question, "Retina" is just a marketing term, which in the case of a 27" iMac means 5K resolution. If you were to compare the 5K iMac side by side with the NEC, you'd be hard pressed to spot a difference. The real advantage of the iMac (besides the price, and the fact you get a great computer built in) is screen space: e.g. if you work with 4K video, you can view it 1:1 while editing and still have room for all the menus and panels - no zooming/scaling required.
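
The arithmetic behind that 1:1 claim is simple enough to sketch:

```python
# Spare screen real estate with a UHD 4K frame shown pixel-for-pixel on the 5K panel.
panel = (5120, 2880)
uhd_4k = (3840, 2160)

spare_w = panel[0] - uhd_4k[0]  # width left over for side panels
spare_h = panel[1] - uhd_4k[1]  # height left over for timeline/toolbars
print(f"4K at 1:1 leaves {spare_w} x {spare_h} pixels for the UI")  # 1280 x 720
```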
 
Tony,

What have your experiences been so far running Lightroom on your iMac? On mine I'm finding a very pronounced lag in almost all operations in the Develop module: move a slider and wait a second or so to see the result. Scrolling in Grid view is also choppy - I think that's probably just Lightroom, but it may look a bit worse on the Retina screen.
 
I thought I read somewhere you cannot calibrate this display and it comes "pre-calibrated" from Apple.

Adobe RGB (1998) can help in some cases. It's often easier to spot if something is grossly over-saturated. But don't make the common mistake of buying into the idea that bigger gamut = better accuracy. First, be aware that these measurements don't exactly plot spectral alignment. It's a quasi-perceptual model: it assumes a narrow field of view and a low level of ambient lighting, as that can influence cone response. From there it theoretically depicts the deviation between the perception of a color seen on the display and the desired perception. This is not the same as requiring the same composition of light scaled to the appropriate brightness level.

I wanted to include that to set up the following point: the error on screen relates to the actual color displayed, and that error may be larger or smaller on the wider-gamut display.

Just to include this: the software you use to calibrate has no real direct path to the underlying hardware itself. It first measures the maximum primaries and greyscale of the display - and remember, we're limited by the tolerance of the measurement device. It calls for (0xFF,0,0), (0,0xFF,0), and (0,0,0xFF) and measures the chromaticity of each. It then measures neutral points (a,b,c) such that a=b=c. Based on those, it builds an ICC profile that describes the basic response of the hardware. The patches that are then measured and reported as Delta E values refer approximately to the distance - the magnitude of the vector difference - between the perceived display color and the desired color within the reference color model, plus or minus the tolerance of the measurement device, which may vary between units.
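
To put a number on that, the classic CIE76 flavor of Delta E is just the Euclidean distance between two points in Lab space. A minimal sketch (newer formulas like CIE94 and CIEDE2000 add perceptual weighting, but the idea is the same; the sample values are hypothetical):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 Delta E: Euclidean distance between two CIELAB colors.

    Rule of thumb: Delta E < 1 is imperceptible, 2-3 is just noticeable,
    though the threshold varies by hue and observer.
    """
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical patch: the profile asked for one color, the puck read another.
requested = (50.0, 20.0, -10.0)  # target L*, a*, b*
measured  = (50.5, 21.0, -9.0)   # what the colorimeter actually measured

print(f"Delta E = {delta_e_76(requested, measured):.2f}")  # 1.50
```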

Now you may wonder what the difference is with displays that claim hardware calibration. Regardless of testing at the factory, all displays drift over time due to shifts in both the backlight and the LCD response. If you are trying to maintain a specific target over time, that is achieved by a linear mapping between the hardware gamut and a subset of that gamut that falls within the desired target range; there is an ISO specification for how that should be applied, which is also referred to in the ICC's documentation. You can only map to values within range, so you end up mapping fewer total values this way. Hardware calibration does something similar post-framebuffer, using the display's internal processing. It may have a greater amount of low-level access, but either way the goal is to meet the desired target while providing the most desirable hardware response.
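
As a toy illustration of that mapping, software calibration ends up loading per-channel lookup curves into the video card - conceptually something like the sketch below. This is deliberately simplified: real curves are built from measured data, not an analytic formula, and the gamma/white-level numbers here are made up.

```python
# Toy software-calibration LUT: remap one 8-bit channel so the display hits a
# reduced white level and a target gamma. Simplified on purpose - real
# calibration derives these curves from measurements, not from a formula.

def build_lut(max_level=235, native_gamma=2.4, target_gamma=2.2):
    lut = []
    for code in range(256):
        x = code / 255.0
        # undo the display's assumed native response, apply the target
        # response, then scale into the reduced (in-target) output range
        y = (x ** (target_gamma / native_gamma)) * (max_level / 255.0)
        lut.append(round(y * 255))
    return lut

lut = build_lut()
print(lut[0], lut[128], lut[255])  # 0 125 235 - the output range is compressed
```

Note how 256 input codes now land on only 236 output levels: you're mapping fewer total values, which is one reason shadow detail is often the first casualty, as mentioned below.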

In my own experience they tend to sacrifice shadow detail before other things. NEC, Eizo, and some of the others also use a form of local dimming via panel blocking to improve the brightness uniformity of their displays. NEC allows that feature to be toggled on or off, and it does reduce overall brightness somewhat. I think that's fine, because you shouldn't run them at max brightness anyway.

TLDR it's complicated, and this was a very shallow explanation that probably contained some errors. It definitely contains some intentional omissions.


 
^^^

I did the first calibration using Spyder 4 Elite. The factory profile was quite good, but just a tad greenish on my RiMac. Easy to fix.

No matter what iMac or monitor you use, a good colorimeter and calibration software are a must if color accuracy is important.
 
Fascinating! Thank you very much. But in the end, how much value is there in calibrating one of these screens?
 
LR5 is a bit slower when flipping through images, especially in full frame mode. This goes away with 1:1 previews. Adjustments are every bit as instantaneous as on my late 2012 iMac. My RiMac is maxed out, though, so YMMV.

LR5 has never been a speed demon, and now it has to move around 4x the pixels. Naturally that is going to slow things down.

Good news is not far away, though. I've been keeping tabs on the latest LR rumors. The next generation has been announced by Adobe for March 2015; betas will probably precede that by a month or two. There has been repeated mention of "70% faster", which can only mean that LR will finally make use of the GPU and perhaps multi-threading. That will speed things up dramatically.

What remains to be seen (besides added features) is whether it will be LR6 or LR Cloud.
 
So basically you don't see a problem, correct? That's good news and bad news: good that there's not something fundamentally wrong with the machines, bad in that I still have to figure out what is going on with mine. Thanks for the reply.
 
What configuration iMac do you have? Having the base GPU will probably make things a bit slower.
 
Why? Lightroom doesn't make any OpenGL calls, and the GPU isn't used in any of the Lightroom filters. What would make it run slower? I don't doubt that it lags. I just don't think it has anything to do with the difference between GPU options.

----------

No problem. My past reading list on this stuff was enormous. The post could contain some errors, because I didn't exactly fact-check myself while typing.

I can only tell you from my own anecdotal experience. NEC used to have a lot of software bugs, but their more recent generations seem really great. The early 90-series and 80-series ones from around 2004-2006 were nowhere near as good as what you have available today, but in terms of actual value, I had far fewer headaches with them than I did with Apple displays of similar eras. As I mentioned, they tend to have some compensation for display uniformity, which makes it easier if you have to judge versions of an image side by side. It was definitely easier to get a satisfying greyscale, and I noticed less inconsistency from measurement to measurement with a well-engineered solution than with one where the display being measured is completely unknown to the software OEM. I realize that may not be a satisfying answer, but I certainly wouldn't base a decision solely on advertised gamut, when for a given set of colors the wider-gamut display is not guaranteed to produce something closer to the desired response.

The only remaining thing I kind of dislike about NEC is that their displays often don't perform well at low brightness settings, much like Apple's. For me it's easier to maintain consistency in image editing using a low, standardized brightness. Prepress standards often dictate 160 cd/m^2, but those standards also dictate a very bright and very diffuse illuminant. Viewing booths/tables that conform to that specification are extremely expensive (upwards of $5k) and require more space than I currently have available.
 
I have the upgraded CPU and GPU, but I don't think Lightroom makes much, if any, use of the GPU anyway. There must be something fundamentally off somewhere in this machine, but I just can't figure out what it is. I don't think it's hardware. I may try reinstalling the OS.
 
Yes, totally meant to type "CPU". As I mentioned in the prior post, LR does not use the GPU.
 