Yes, totally meant to type "CPU". As I mentioned in the prior post, LR does not use the GPU.

I just read the prior posts. I wrote the other post in chunks, so when I responded before, I only saw that one after hitting submit on my very lengthy post. I hadn't heard that claim of 70% faster, but I've noticed with other Adobe applications that in actual use you typically need a very significant difference between GPUs to see a real performance difference; the big gap is between something running entirely on the CPU and the same function making extensive OpenCL calls.

In that sense the big question comes down to whether a GPU is supported at all. I think with the current generation everything, with the possible exception of the MacBook Air, would make that cut, although dedicated VRAM is nice. I'm still skeptical, because you can never tell with Adobe, but it could be more significant for Lightroom than it was for Photoshop. Lightroom has more in the way of highly parallel floating point computation. It uses a gamma 1.0 version of the original ProPhoto RGB for raw images, presumably transformed from camera-specific input profiles, and I'm inclined to assume the values are stored in floating point given the extended range of the data.
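
To illustrate the gamma 1.0 point, here's a toy numpy sketch (not Adobe's actual pipeline; the values are invented). In linear light an exposure change is a bare per-pixel multiply, the kind of uniform, independent arithmetic that maps well to a GPU:

[CODE=python]
import numpy as np

# Toy illustration of why a gamma 1.0 (linear) working space suits
# simple, massively parallel per-pixel math. Values are made up.
rng = np.random.default_rng(0)
linear = rng.random((4, 4, 3)).astype(np.float32)  # fake scene-referred RGB

# +1 stop of exposure in linear light is just a multiply per pixel.
plus_one_stop = linear * 2.0

# In a gamma-encoded space (say gamma 2.2) the same edit needs
# decode -> adjust -> re-encode to remain correct.
gamma = 2.2
encoded = linear ** (1.0 / gamma)
adjusted_encoded = ((encoded ** gamma) * 2.0) ** (1.0 / gamma)

# Both routes agree once decoded back to linear light.
assert np.allclose(adjusted_encoded ** gamma, plus_one_stop, atol=1e-4)
[/CODE]

Every pixel is independent, so whether this runs on the CPU or as an OpenCL kernel is purely a dispatch decision.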

I don't know the details, as Adobe hasn't published them and I haven't attempted to reverse engineer it. Either way it would seem well aligned with GPU-based computation. I think on Adobe's end they probably don't want all of this determined at runtime. OpenCL has had some stability issues, and obviously they don't want slightly different output depending on whether color computation ran on the GPU as opposed to the CPU, so they may have been waiting for the lowest system they intend to support to be capable of leveraging the GPU for certain functions. This is assuming they don't limit it to things such as filters, where the level of user-controlled precision is a bit coarser.
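
On the "slightly different results" point, here's a contrived but runnable demo of the kind of drift I mean. The same chain of arithmetic run in float32 (the typical GPU precision) and float64 quietly parts ways:

[CODE=python]
import numpy as np

# Contrived example: 20 stacked adjustments, standing in for a long
# color pipeline, evaluated at two precisions.
x32 = np.float32(0.1)
x64 = np.float64(0.1)
for _ in range(20):
    x32 = x32 * np.float32(1.7) % np.float32(1.0)
    x64 = x64 * 1.7 % 1.0

# The two results differ; bit-identical CPU/GPU output is not free.
print(x32, x64, abs(float(x32) - x64))
[/CODE]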

My responses are longer than intended today.
 
I guess there must be something wrong with the machine, though I don't know what it could be. I erased the main drive and did a fresh install of OS and Lightroom, nothing else, and the problem persists. Big bummer.
 

No worries about the responses; they are as clear and succinct as they need to be.

I can't see Adobe not supporting GPUs this time around. With the advent of 4K/5K displays, it has to be a requirement for any version update.

edit: fixed a real bad autocorrect

----------


You might try the Lightroom forum on Adobe.com; somebody there may be having similar issues with the RiMac.
 
thekev said:
Adobe 1998 can help in some cases...

Thanks for this answer.
I think an ICC "virtual" calibration is actually enough for me, but I want it to be as close as possible to the prints I will make, not just "oh yeah, it looks OK".
I know that you can't get identical results between a screen and a fine art matte print, but the glossy and luster paper prints I've seen at a professional lab were very, very close to the screen display. OK, it was probably $10k equipment (or even more).

Anyway, I will be very busy in the coming months, so I will probably buy my new computer in February/March 2015.
The new NEC 4K 32" will have been tested by then, and I will have to make a choice.

A Mac Pro + NEC PA322UHD could be a nice solution. More expensive, of course, but with a 32" screen and a more powerful computer, it should still be capable in a few years, handling the standard software of the future well.

Thanks.
 
Yeah, the autocorrect made me laugh. Adobe could technically do the things necessary for Lightroom to look good at 4K or 5K without rewriting their adjustment tools in OpenCL.
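
What I mean is that they could lean on the classic proxy trick without touching OpenCL: run adjustments on a downsampled preview while a slider is moving and only re-render at full size on mouse-up. A hand-wavy sketch; apply_adjustment and the sizes here are invented for illustration:

[CODE=python]
import numpy as np

def apply_adjustment(img, amount):
    # Stand-in for a real develop adjustment (purely illustrative).
    return np.clip(img * (1.0 + amount), 0.0, 1.0)

def render(img, amount, dragging):
    if dragging:
        # While the slider moves, work on a 1/4-scale proxy:
        # 1/16th of the pixels, so feedback stays fluid.
        return apply_adjustment(img[::4, ::4], amount)
    # Full-quality render once the user lets go.
    return apply_adjustment(img, amount)

full = np.zeros((2880, 5120, 3), dtype=np.float32)  # a 5K-sized preview
live = render(full, 0.3, dragging=True)    # 720x1280 while dragging
final = render(full, 0.3, dragging=False)  # full size on mouse-up
[/CODE]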

It's not exactly virtual. The color engine used by the OS stores a hardware profile that describes the device and provides any pre-framebuffer color transformations that are to be applied. When you use a colorimeter such as the Spyder along with its accompanying software, you are simply writing a new profile for your specific display based on the measurements taken by that device and the interpretations of that software package. This means the final profile can differ between software packages and hardware devices, but in each case it is unique to that display.
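
For the curious, a matrix/TRC profile of the kind these packages write is conceptually just per-channel tone curves plus a 3x3 matrix. A rough numpy sketch; the gamma values and matrix below are invented, not any real monitor's measurements:

[CODE=python]
import numpy as np

# Invented stand-ins for what a profiling package might derive from
# colorimeter readings of a particular panel.
gamma = np.array([2.20, 2.18, 2.25])        # per-channel tone response
to_xyz = np.array([[0.436, 0.385, 0.143],   # panel RGB -> XYZ matrix
                   [0.222, 0.717, 0.061],
                   [0.014, 0.097, 0.714]])

def display_rgb_to_xyz(rgb):
    """Roughly what a matrix/TRC ICC profile encodes: linearize each
    channel, then matrix into a device-independent space."""
    linear = np.power(rgb, gamma)
    return linear @ to_xyz.T

print(display_rgb_to_xyz(np.array([1.0, 1.0, 1.0])))  # the panel's white
[/CODE]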

The setup at a given lab is quite different. Depending on your market, they may have decent viewing equipment. Ideal viewing conditions would be very low ambient lighting and a print viewer of some kind, such as a Normlicht, that provides a very diffuse source of illumination matching both the color temperature and brightness of your display. Ideally, when comparing, that viewer and your display should be the only light sources. After that, as long as the ink balance is good, it should look good under other light sources too: while the reflected color will appear different, your eyes are highly contextual, and everything else will also be seen under that lighting. This of course assumes your printer uses a good ink combination; the various inks have differing transmissive characteristics, which can be visually non-uniform relative to the amount of ink applied to the page.

Anyway, NEC is good. If you keep it calibrated and have a good viewing environment, you can maintain decent consistency. The highest standard of accuracy would be if the color reflected from a given point on the printed page were identical in brightness and composition to what is emitted from the display. That would be referred to as spectral alignment, but it isn't feasible with available technology.
 
LR5 is a bit slower when flipping through images, especially in full frame mode. This goes away with 1:1 previews. Adjustments are every bit as instantaneous as on my late 2012 iMac. My RiMac is maxed out though, so YMMV.

Back one more time to make sure I understand. If you do something like set a gradient filter and then move an adjustment slider, such as saturation, back and forth, are the visual changes on screen smooth or abrupt? On my machine they are very abrupt, and from my current understanding they should be on all of these iMacs. Would you look here to see what I'm talking about?

https://forums.macrumors.com/posts/20328918/
 
I've tried all the things you listed in this thread and elsewhere. I've used gradient filters, adjustment filter brushes, I've zoomed in/out, cropped and rotated. I've gone into lens corrections and done manual scaling, perspective shifts, etc.

I have no issues. It seems to be as smooth as it was on my late 2012 iMac. All slider corrections are fast and dynamically displayed. This probably isn't what you wanted to hear, and I totally understand the frustration you and others are experiencing. The issues, as described, are not unique to you but do not seem to be universal.

I did notice one thing: when displaying full-screen, full-rez images, scrolling is a lot slower in the Develop module, even with 1:1 previews. Switch to the Library module and things quickly improve.

If you want, you can link to a full res RAW image that you know you have had difficulty with, and give me the exact steps that have shown the sluggishness. I'll see if I can repeat it.

I would suggest checking permissions on your library folders and make sure all are "read/write". Also do a "repair permissions" with disk utility. All standard advice, I know, but (ESPECIALLY with Yosemite) this has fixed issues I have had with other applications. It doesn't take long and there is only upside to doing the repairs.
 
You've got that right! Thanks for the offer to try a test image, but there isn't one type of problem image; it's essentially everything. I have the cache and library on SSD (I assume you're using SSD?) and I've tried the catalog both on SSD and on an external drive. I went ahead and repaired permissions and checked read/write permissions as you suggested. Last evening I even did a complete fresh (wiped SSD) install of the OS and then installed just Lightroom: no difference.

I have to say that it's performing better than when I first raised the issue, and maybe by now I'm picking nits. Most adjustments are OK. I keep using the graduated filter as a measure because it most clearly demonstrates the lag. Example: my highest-resolution images are 5472x3648. If I set a graduated filter with the image at FIT, there is a lag before the effect is seen when I move the saturation slider very rapidly from one side to the other, and moving the slider more slowly shows the changes as a series of steps. If I then zoom to 2:1, therefore displaying fewer source pixels, the same slider movements are smooth. Similarly, if I switch the display to the 2880x1440 low-resolution mode, slider movements are smooth even with the image at FIT.

Well, there's not much more for me to do, but thanks for taking the time to give me feedback.
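
Doing a rough pixel count on each case makes the pattern plausible (back-of-envelope only; I don't know what Lightroom actually recomputes internally):

[CODE=python]
# Rough per-update workloads implied by the tests above; this just
# counts the pixels involved in each configuration.
src_w, src_h = 5472, 3648                     # my largest raw files

cases = [
    ("source image", src_w * src_h),
    ("FIT view on the 5K panel", 5120 * 2880),
    ("FIT view at 2880x1440", 2880 * 1440),
    ("source pixels visible at 2:1", (5120 // 2) * (2880 // 2)),
]
for name, n in cases:
    print(f"{name:30s} {n / 1e6:5.1f} MP")
[/CODE]

The two smooth cases both land around 4 MP per update; the laggy one is 15-20 MP, so a factor of four or five more work per slider tick.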
 

When you say you are setting the resolution, is the default "Best (Retina)"? Technically speaking, that lays the UI out at 2560x1440, with each logical pixel drawn using a 2x2 block of physical pixels; otherwise all your text and icons would get really tiny. If you <option>-click "Scaled", it should still show 2560 as being "best".
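
In other words, the point-vs-pixel math looks like this (quick sketch; "points" is Apple's term for the logical units):

[CODE=python]
# "Best (Retina)" on the 5K iMac: UI laid out in 2560x1440 points,
# each point backed by a 2x2 block of physical pixels.
points = (2560, 1440)
scale = 2                                  # HiDPI backing scale factor
backing = (points[0] * scale, points[1] * scale)
print(backing)                             # (5120, 2880): the full panel
[/CODE]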
 
Yes, default is Best for Retina.
 
I am having the same problem.
A staff member of the Adobe community, in this thread https://forums.adobe.com/thread/1630959, acknowledged the problem:

"We are investigating. Reducing the size of the window is most likely to help at this point, as the challenge here is the number of pixels being served up on the 5K screen."
 
Can you explain?

I meant that it could be an issue of the amount of information that needs to be sent to the GPU there. With Lightroom it could be bottlenecked at a number of points; some of the other applications are different in that regard.
 
I see, thanks. Whatever Lightroom is doing in Develop, it certainly performs differently from some other photo editing programs. I've given iPhoto, Aperture, and Pixelmator a workout, and they all seem to run more smoothly, though Aperture scrolls thumbnails in grid view just as jerkily as Lightroom. Not so iPhoto.
 
I can't tell you what each is computing before it draws something to screen or how much information is being pushed through. Lightroom may push quite a lot. It can deal with raw files, although it wouldn't be able to re-rasterize each one all the time; that would take too long. Overall I'm not entirely sure.
 
Today I've upgraded to LR 5.7 (released today) and OS X 10.10.1. I didn't test in between updates (I have work to do!), but I am finding that the crop tool slowdown seems to have been reduced significantly. It's not gone entirely, but the beachballing plus 3-5 second wait that I (and others) encountered occasionally with this tool seems to have improved a lot. There's the occasional delay now, but more on the order of a second or under, and sometimes it's gone entirely.

Still think that Adobe needs to leverage the GPU more in LR 6. I think it's pretty obvious that on something like a crop/rotate, which is essentially an image transform operation, a GPU should be able to have a role in accelerating the process.

Nice to see an improvement though.
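
For illustration, here's the shape of that work (a minimal nearest-neighbour rotate in numpy, nothing to do with Adobe's actual code). Every output pixel is an independent inverse-map and sample, which is exactly what a GPU parallelizes well:

[CODE=python]
import numpy as np

def rotate(img, degrees):
    """Inverse-mapping rotation about the image center. Each output
    pixel is computed independently, so this parallelizes trivially."""
    h, w = img.shape[:2]
    t = np.radians(degrees)
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    # For each output pixel, find the source location it maps back to.
    sx = np.cos(t) * (xs - cx) + np.sin(t) * (ys - cy) + cx
    sy = -np.sin(t) * (xs - cx) + np.cos(t) * (ys - cy) + cy
    sx = np.clip(np.round(sx).astype(int), 0, w - 1)  # nearest neighbour
    sy = np.clip(np.round(sy).astype(int), 0, h - 1)
    return img[sy, sx]

img = np.random.rand(480, 640, 3).astype(np.float32)
out = rotate(img, 5.0)  # a small straighten, as with LR's crop tool
[/CODE]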
 
I too notice an improvement in the crop tool. Not much change, if any, to other adjustments, however. Overall I find LR to be perfectly useable, though not as smooth as I'm used to.
 
Or use the GPU at all, for that matter! Everything I've read says LR currently doesn't use the GPU for much of anything. Even on my old Mac Pro, PS crop/rotate/zoom is smooth and silky, but in LR it's pokey at best, choppy with screen tearing at its worst.

Also thanks for all the tests and updates.
 
I have no idea why you found any improvements in the crop tool. I found exactly zero improvement on my end.
 
To be honest, Lightroom was and remains a GREAT tool. I'm using it (at this very moment) to edit professional work (weddings) and it's very useable, even for a 'power user' such as myself. I'm sure some people have better experiences, and some less good, but I don't think there is ANY cause for panic.

As the 5K matures, software will improve to cater for it and it will become a better and better tool.
 
I am a heavy user of Lightroom; we shoot a bunch of weddings with Nikon D750 RAW files. I use the sliders via the keyboard, so I am extremely fast, and on my Mac Mini there is usually a delay while the computer catches up to me. I can tell you Lightroom is about the same speed as my 2012 Mac Mini (2.6 i7, 16GB) at 4K resolution (3880x2???).

After reading this post I switched it over to 2560x1??? (or whatever it is, I don't remember the exact resolutions) and it sped up Lightroom dramatically. It is almost an immediate response to my keyboard inputs. I hope that whatever the issue is, it gets resolved via software and not hardware. It would be a shame if I bought a $3,500 iMac and were not able to use it at 4K in Lightroom.
 