No, but Apple is promoting performance, so it's natural to expect a brand new computer to be able to handle an existing app. It's not Adobe's fault, per se. When I buy a new computer, I expect my apps to run faster, not slower.


Agreed, but people are throwing mud at Adobe as if Apple gave them sufficient time to update their applications. I know Apple promoted performance for its own software like Final Cut Pro X, but I don't recall them making any claims about Lightroom. If you want to sling mud, then sling it at Apple, not Adobe.
 

I'm afraid that doesn't make much sense. The iMac, particularly tricked out, is an extremely powerful computer. Sure, it has a 5K screen attached, but by any measure it's a powerful beast. Yes, you could build a more powerful machine, but it's fairly top end. 4K screens have been around for a while, and 5K was always going to happen, as will higher-res screens in time. Adobe have rested on their laurels as far as performance is concerned. There's nothing vastly different or unpredictable about the 5K iMac; it simply has a much higher resolution. Clearly that requires more power to run it, but it's not like Apple suddenly decided to launch some strange new architecture or Mac OS Y. Apple have provided a high-res screen and plenty of hardware power to run it. Of course, a higher-end GPU would be nice, but Lightroom isn't even using the GPU.

Slinging mud in any direction is a pointless exercise, but, in my view, Apple have met their manifesto pledges by releasing a powerful machine and a high-res display. Adobe are playing catchup. It's for that reason that there's a slightly negative sentiment towards them in this thread.
 


Alchemist, I'm a bit confused about how you can assume that the new architecture and OS aren't going to require a lot of recoding on Adobe's end to optimize LR. Unless you're a software engineer who is privy to what goes on behind the scenes at Adobe, you can't assume it isn't a big change just because you don't think it is. I disagree that the release of the Retina iMac isn't a big change. It's a 5K display and a new OS. That's a big change. Apple doesn't even have all the bugs worked out of Yosemite, and Adobe was expected to have all their apps working perfectly from the get-go? Working in tech myself, I've seen simple OS-level changes on a Unix OS cause complete havoc on software, and I've had to listen to clients with no tech background tell me it wasn't a big change.

Obviously Adobe are playing catchup. How could they not, unless Apple was working with them well in advance? I'm just asking everyone to put themselves in the developers' shoes for once and not make assumptions like "since 4K is out, they should have had their apps already optimized for 5K and Yosemite". Analogies like that don't hold up in real life.
 

First off, I think you may have misread my post. I said that Apple have not launched a new architecture or a whole new OS. The iMac 5K has the same Haswell chips as many other machines, and while the screen has more resolution, that was an entirely foreshadowed event. As for the OS, Apple now releases annual updates, and Yosemite doesn't represent some sort of paradigm shift. Yes, it's new, but frankly I, a non-developer, was using the Public Beta for months before its launch, so any suggestion that Adobe couldn't get hands-on with Yosemite before the official launch is bunk.

I'm happy to put myself in the developers' shoes and, as I said in my previous post, I agree that slinging mud isn't particularly helpful, but the issue here is that Adobe hasn't planned for the future. Many people were clamouring for GPU leveraging when we were all on 2560x1440 screens, not because it was needed per se, but because the hardware was there and it seemed appropriate to use it. Adobe didn't pursue this (in fact they suggested it wasn't achievable or wasn't worthwhile, a claim I dispute) and now that we do need it they find themselves behind the curve.

You're talking as if the 5K iMac is some sort of strange unpredicted beast. It's really not. It's a computer with a higher pixel pitch than previous displays and a higher max resolution. Sure, it's a large leap resolution wise, but in many ways it's no different from other increases in available resolution. There was a time we were all on 640x480 screens; the jump to 800x600 was predicted, and while the hardware sometimes wasn't up to it, the software invariably was. In this case, the hardware is up to it, but Lightroom, specifically Lightroom, struggles. That's why I point a (fairly non-aggressive) finger at Adobe.

I expect to see improvements in LR 6, but it annoys me that as a company they are sometimes more reactive than proactive. Their main client base is made up of working pros for whom time is quite literally money. I wish they'd put as much time into optimisation and performance as they did into trying to get me to buy CC.
 
Very choppy

The new iMac 5K is very choppy running Lightroom, which was a bad surprise.

i7, 4GHz
16GB memory
AMD Radeon R9 M295X, 4GB
Fusion Drive

Get the beachball a lot just making minor adjustments. Very frustrating.

The display is great but the processing is in the toilet.
 

The few people I know who are interested in the riMac are wealthy travelers.
They all use Lightroom and couldn't care less about the cost of CC.
They want speed just as much as a pro does. They consider time more valuable than money and will spend a bunch to speed up any asset.

Can't recommend the new machine until it's an actual performance upgrade.
For the moment, I'm passing the word along to wait. The screen is sweet, but I can hear the howls when they hold down the right arrow key in Develop and watch beach balls.
 
....the 5K iMac....it's a large leap resolution wise, but in many ways it's no different from other increases in available resolution....

I haven't seen anybody complaining about Lightroom performance at 4k on the nMP running Yosemite. Is that correct, or did I miss that?

There's not that much difference between 4K and 5K regarding data management. The four- and six-core nMP doesn't have appreciably more CPU performance than a 4GHz retina iMac. LR makes little to no use of the GPU, so the nMP GPUs can't be helping.

I wonder what exactly the difference is. Why is Lightroom running perfectly OK at 4K on the nMP but not at 5K on the Retina iMac?
 

The 5K iMac was a foreshadowed event? When? I seriously didn't hear about the 5K iMac until the very end of September. I heard rumors over the summer that Apple had a 4K display in the works, but nothing close to concrete. Even then, I think most were surprised that 5K was shipping right after the announcement.

Correct, Haswell has remained the same, and from all accounts the issue points to the resolution. I understand Yosemite was in beta and I'm sure Adobe had their hands on it, but it's still a new OS, which makes the equation a little bit bigger (it requires resources no matter what).

Yes, I agree the hardware is definitely up to the task of 5K. Funnily enough, Apple didn't get the memo themselves, as this forum has been littered with complaints about Apple's own OS being 'laggy' and 'slow' for certain tasks. I'd venture to say you're simplifying the resolution issue a bit, considering there are tons of reported issues just with Apple's own software, let alone a third-party app like LR that has to play catchup.

You said the 5K iMac is not some sort of strange unpredicted beast. I don't know how to answer that, since I'm not a software engineer at Adobe and at the end of the day probably nobody on these forums is in a position to answer that question technically, but someone reported that the Adobe engineers said "the new 5K screen is a beast for LR to push all those pixels". Is this Adobe's fault for not using the GPU in their software? Perhaps. There's a good chance they were saving that for LR 6 and probably didn't consider that Apple would realistically have a 5K display out to the masses in 2014. (To be fair, a 5K iMac in 2014 was definitely a surprise to many and a major move by Apple.)

I think the real issue is the expectations Apple lures users into. They are so secretive about their projects that companies get no real notice to realistically update their software in time, and people who rely on particular software lose their minds when they are first in line to buy the new machines and not all of their software works without glitches. I'm not an Adobe apologist and not a fan of the cloud, but the only way there's a chance things will work from the get-go is for Apple to actively work with companies like Adobe in advance.
 

The difference between 4K and 5K is almost double the megapixel count.

Don't let the seemingly small 1K difference fool you.
[Attached chart: monitor_5Kchart.jpg]
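For anyone who wants to sanity-check that, here's a quick back-of-the-envelope calculation (assuming UHD 3840x2160 for "4K", which is what most 4K monitors use):

```python
# Pixel-count comparison between a UHD "4K" panel and the 5K iMac's display.
pixels_4k_uhd = 3840 * 2160   # ~8.3 megapixels
pixels_5k = 5120 * 2880       # ~14.7 megapixels

ratio = pixels_5k / pixels_4k_uhd
print(f"4K UHD:  {pixels_4k_uhd / 1e6:.1f} MP")
print(f"5K iMac: {pixels_5k / 1e6:.1f} MP")
print(f"5K has {ratio:.2f}x the pixels, i.e. about {100 * (ratio - 1):.0f}% more")
```

That works out to about 1.78x, or nearly 80% more pixels, so "almost double" is in the right ballpark.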
 
The difference between 4K and 5K is almost double the megapixel count... Don't let the seemingly small 1K difference fool you...

Thanks for posting that. Still, it's only about 2x more data. The i7-4790K has a transfer bandwidth of 27.2 GB/sec, and in theory could rewrite a 5K frame buffer about 500 times per second. The i7-4790K can do 117 gigaflops, which is over 1000x faster than a Cray-1 supercomputer from the late 1970s.
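As a rough sanity check of that frame-buffer claim (assuming 32-bit colour, i.e. 4 bytes per pixel, and counting writes only):

```python
# How often could 27.2 GB/s of bandwidth theoretically rewrite a full 5K frame buffer?
bandwidth = 27.2e9                 # bytes per second, the figure quoted above
frame_buffer = 5120 * 2880 * 4     # 32-bit colour: ~59 MB per 5K frame

print(f"Frame buffer: {frame_buffer / 1e6:.0f} MB")
print(f"Theoretical rewrites per second: {bandwidth / frame_buffer:.0f}")
```

That comes out to roughly 460 full-frame rewrites per second, in the same ballpark as the 500/sec figure above.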

The question is who squandered that performance, and how?

Have there been reports of Aperture, Photoshop CC, or Premiere Pro on the retina iMac also having major performance problems? Or is it mainly just Lightroom?
 
Well, considering Aperture has no problem adjusting and cropping my D800 files… and that program has been EOL'ed…

I'd have to say it's Adobe. :rolleyes:
 
Hey Guys,
I am a professional user of LR and PS, and am lucky enough to be able to see and talk with some of the Adobe engineers on the LR and PS teams. I have been considering a 5K iMac and asked them about the issues in the LR Develop module. They can never say much about future releases, so they are always careful with their words. However, they did acknowledge that the new 5K screen is a beast for LR to push all those pixels. LR does slow down with certain Develop actions; you can see on the Adobe forums that they admit it there. They did say they are working very hard on LR 6 and are confident we will all be happy with how it performs on the new 5K screen. They couldn't give a release date for LR 6 but did say it was for sure NOT this year. I have a lot of confidence in the Adobe team. They know the issue and are working hard to fix it. Hopefully we don't have to wait too long.....

Cheers,
Thanks very much for that information! We have something to look forward to.
 

See my post above. I have tested other applications that don't seem to have the same problems with images.

BTW, Preview also has issues.
 

Check the Spotify app in full screen :D
 

My guess would be that Lightroom doesn't utilize the GPU at all. If Adobe would finally add GPU support, that would speed up rotate, crop, adjustments, etc. Here's hoping LR 6 shows up sooner rather than later.
 

While it's true that LR doesn't appear to use the GPU, I don't see Photoshop using it much either. When I monitor the GPU activity level of my 2013 iMac 27 with iStat Menus 5.03, Photoshop CC 14.2.1 hardly touches the GPU during crop and rotate. It does use the GPU a lot for Smart Sharpen and a few other things.

Comparing that to LR 5.7, I likewise see little GPU activity during crop & rotate or any other activity.

For the simple, common LR actions people are complaining about on the retina iMac, I don't see Photoshop using the GPU for those either. Hence I don't see how simply using the GPU will fix that for LR. The problem must be deeper than that.

A full 5K image is almost 80% bigger than a 4K (UHD) image on the nMP. However, the 4GHz retina iMac's CPU is at least 14% faster than a quad-core nMP's, so that lessens the difference somewhat. Neither the iMac nor the nMP is using the GPU in Lightroom. However, nobody is complaining about LR performance on the nMP at 4K.

The behavior is that basic LR operations like crop/rotate are *not* just a little slower at 5K vs 4K -- they are vastly slower. I can't see how a roughly 80% larger image on a 14% faster machine (call the net difference about 60%) accounts for such a huge difference. There must be more involved.
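For what it's worth, here's a minimal sketch of that arithmetic, assuming a UHD 3840x2160 panel on the nMP and taking the ~14% CPU advantage at face value:

```python
# Net per-frame workload increase for the 5K retina iMac versus a 4K nMP,
# after crediting the iMac's (assumed) ~14% faster CPU.
pixels_5k = 5120 * 2880
pixels_4k = 3840 * 2160        # assumed UHD panel on the nMP
cpu_advantage = 1.14           # assumed ~14% faster CPU in the 4GHz iMac

pixel_ratio = pixels_5k / pixels_4k        # ~1.78x more pixels
net_ratio = pixel_ratio / cpu_advantage    # ~1.56x net extra work

print(f"Extra pixels at 5K: {pixel_ratio:.2f}x")
print(f"Net extra work after CPU adjustment: {net_ratio:.2f}x")
```

Roughly 1.5-1.6x more net work per frame is significant, but it shouldn't by itself turn smooth crop/rotate into constant beachballs.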
 
I'm not disputing your experience, but when PS started using GPU acceleration, rotate and zoom features became much faster and smoother. My hope would be for CPU/GPU acceleration in LR 6 that would help, especially when pushing 5K displays like the iMac's.

 
Other than the fact that you haven't seen complaints from nMP 4K users, what is your basis for stating it is vastly slower? I don't find it to be vastly slower when compared to using it on even 2K displays. I think the speed differences are getting way overblown.
 

Try to edit 500+ photos in a single session and use the crop tool on each one of them on the 5K. And good luck.
 

https://www.youtube.com/watch?v=sczrtlZKZQM

To clarify, I'm talking about basic LR operations at 5k on the top-spec retina iMac on Yosemite, vs similar operations at 4k on quad-core nMP.
 