
Paulshaqz

macrumors 6502a
Original poster
Sep 10, 2009
905
174
Sure, there are more threads on this, but I haven't seen any true answer. Anyway, I was curious: I'm planning on buying a new 5K iMac with the 4GHz CPU and 1TB Fusion Drive, but I'm not sure about the graphics card. I'm not really a gamer when it comes to a Mac, but I wanted to know: if I upgrade to the M295X, would I see better speeds and graphics for movies and apps, or is it just for games? Not sure if it's worth the $250 upgrade. Thanks in advance.
 
I asked a "retina iMac tech" from Apple the very same question when I ordered my retina model today. I mostly just do 2D photo editing and didn't see the need for the premium GPU.

He explained that some photo editing software vendors are beginning to program their apps to use the GPU, and he expects that will continue. In these cases the premium GPU will perform better even when simply refreshing 2D images. He also made the point that as a computer ages, the single most important factor that may prevent the latest version of OS X from running on legacy hardware is the GPU. Basically, he said if you're planning on keeping the computer a long time, the premium GPU may provide "upgrade insurance".

Anyway, he convinced me it was worth the upgrade as I'm planning on keeping my new riMac for a long, long time.

I'd suggest calling Apple and asking specifically for a retina iMac tech to see what he/she says about your needs.
 
I had a very difficult time reading the original question in this thread due to the lack of punctuation and the run-on sentences. I nearly gave up, but I pressed on and finally made it through.

What Gary has shared with you is very good advice. In my opinion, an upgraded GPU is almost more important than an upgraded CPU. Several years down the road, the first thing to bring a computer to its knees will be the GPU. I think it is best to upgrade it now, especially in an all-in-one computer such as the iMac.

Cheers,
Bryan
 
I asked a "retina iMac tech" from Apple the very same question when I ordered my retina model today. ... I'd suggest calling Apple and asking specifically for a retina iMac tech to see what he/she says about your needs.
I honestly was thinking about that too. I figured if it was just for gaming I wouldn't need to invest, but if photos and apps make a difference, then I probably will, for the long run.
I had a very difficult time reading the original question in this thread due to the lack of punctuation. ... I think it is best to upgrade it now, especially in an all-in-one computer such as the iMac.
I apologize for my typing :(
 
I asked a "retina iMac tech" from Apple the very same question when I ordered my retina model today. ... Anyway, he convinced me it was worth the upgrade as I'm planning on keeping my new riMac for a long, long time.
Sounds like he's just selling you the higher-specced one, to be honest. It's extremely unlikely that you will see a difference in 2D performance between the M290X and M295X. OS X does use some compositing in its various UI elements, but nowhere near enough that you won't be able to upgrade to future versions of OS X.

The M295X is only needed if you're gaming, doing 3D work, or editing video.
 
+1 on speccing in the 295 GPU for the RMac; anyone who spends this amount of cash needs to complete the job with the decent GPU. The 'standard' 290 GPU is old tech; the 295 is new tech, in fact brand new.

Think about it logically: a huge screen and resolution with a mobile GPU running it. You need the horses under the bonnet, otherwise you are going to be disappointed in the future when you start to push the display, which you will.
 
It's a 5K display with a huge number of pixels; the 295 with 4 GB of VRAM would definitely benefit you compared to the 290's 2 GB.

After all, when the warranty expires, you can replace almost ANYTHING except the GPU. The CPU, hard drive, and RAM are all user-upgradable after you take the screen off; only the GPU isn't. So if you are planning to keep it for a long time, the GPU should be the first thing to upgrade.
 
Consider this:

http://www.barefeats.com/imac5k5.html

"...the M290X exceeds the M295X on core clock speed, boost speed, and pixel fillrate."

One must also consider HEAT. Does the 295 generate more heat than the 290? If so, the iMac's compact frame may mean a shorter life for the 295 in the long run, a big concern for those of you looking for a "future proof" machine you can use for the next 5 years. This is a big issue for me, as I had to have the high-end graphics card in my late 2009 iMac replaced after it burned out from heat.
 
Consider this: http://www.barefeats.com/imac5k5.html "...the M290X exceeds the M295X on core clock speed, boost speed, and pixel fillrate." ...

Thanks for that link; it kinda puts my mind at rest, having ordered the base system and read many comments on here.
My use is photography, general web, rarely a game, and some Parallels use for Windows-only software.

The base spec, according to that article, should cover that no problem. Phew!
 
Consider this: http://www.barefeats.com/imac5k5.html ... One must also consider HEAT. ...

I have a related concern about power consumption:

https://forums.macrumors.com/threads/1812439/

But the maximum potential heat difference between base and max specs is also significant:

http://support.apple.com/kb/HT3559
 
M290X vs. M295X, side-by-side spec comparison:

http://gpuboss.com/gpus/Radeon-R9-M295X-vs-Radeon-R9-M290X

I eyed 2 specs at the end:

M295X ••• M290X
Tonga ••• Neptune
800 MHz ••• 850 MHz
250 W ••• 100 W

That power spec sounds fishy to me, seeing that both are "mobile" GPUs. I would also expect a higher-clocked chip to consume more power, unless we're talking about different process technologies.

I myself only use FCPX once in a while, but when I use it I want speed. So I am curious if there is a "discernible" speed difference between the 2 GPUs in FCPX. Of course, only someone who has had first-hand access to both GPUs on the 5K iMac could answer that. But obviously if there's no noticeable difference in FCPX, I would opt for the M290X.
 
+1 on speccing in the 295 GPU for the RMac... you need the horses under the bonnet otherwise you are going to be disappointed in the future when you start to push the display, which you will.

This is what I did. It was not an easy decision, as I am always very aware of heat. I think it sucks that engineers have not really solved the heat issue in general. I HATE running at the edge of the heat envelope, because I value reliability very highly.

But in this case, I erred on the side of the GPU upgrade for possible future proofing and also because it is so many pixels that I figured the extra power would come in handy at some point for something.

It arrives tomorrow, weather permitting, and that will be very interesting, for sure, as I am really going to pay attention and see what is up with this machine.
 
Both the CPU and GPU upgrades are no-brainers in this iMac. If you can't afford both, definitely get the GPU upgrade. I plan on this machine lasting me at least 4-5 years, and you'll regret not getting the top-of-the-line BTO upgrades that can't be changed later on.
 
M290X vs. M295X, side-by-side spec comparison: http://gpuboss.com/gpus/Radeon-R9-M295X-vs-Radeon-R9-M290X ... 250W ••• 100W. That power spec sounds fishy to me, seeing that both are "mobile" GPUs. ...
250W is wrong. Most likely it is 125W. Push the GPU hard and the total watts is about 175, and that includes the screen and CPU.
 
Both the CPU and GPU upgrades are no-brainers in this iMac. If you can’t afford both...

While you are right about the CPU (i7 all the way!), the GPU is in fact "a brainer." We must use our brains to ponder it deeply, not merely because it costs more, but because of HEAT. Read previous posts and specs on that point. Note that the M290X has a higher clock speed too, which is most likely why it beat the M295X in at least one Bare Feats test.

I speak from experience having maxed out my late 2009 iMac i7 with the latest GPU. Guess what failed after two years of moderate to light use? Yes, the GPU. In fact, Apple swapped it out twice because the first swap didn't resolve the problem. Why did the GPU die? Heat. And I am not a gamer who would otherwise kill a GPU quickly.

I am still using that iMac to this day, typing this post on it, in fact. But I now use smcFanControl to keep the fans running faster than normal, so as to avoid another GPU death prior to my upgrading.

With all that said, I am still not sure if I want to upgrade my iMac YET. Why not? Because as much as that 5K screen is magnetically pulling my wallet out, I stuff that wallet back into its place in the knowledge that perhaps the biggest performance leap (and potentially a "heat-reducer" too) is Broadwell. We'll probably have to wait until June 2015 for that, but for guys like me who don't buy a new Mac but once every 6 years or so, there is merit (and virtue) to patience.
 
What is your concern?

Hi Fred

As you know from my related thread (https://forums.macrumors.com/threads/1812439/ - I should probably just have posted here, but didn't want to hijack the OP's thread), I don't want to run an iMac with a significantly higher base level power consumption, for only the occasional extra practical benefit - IF that's what choosing the M295X would mean.

Besides your kind indicative figures, I haven't seen any other comparative performance data on this front.

----------

250W is wrong. Most likely it is 125W. Push the GPU hard and the total watts is about 175, and that includes the screen and CPU.

Why do you say 250W is wrong, please? You previously said that your iMac ran to 230W with GPU & CPU working hard (again from here: https://forums.macrumors.com/threads/1812439/):

"I don't think there's any way of answering your questions without setting up a trial where one would measure power usage under different scenarios of CPU vs GPU load, but I don't think it really matters. For normal tasks it's not going to be using that much energy. Both models have the same idle rating.

Update: I did some of the testing for you. Here are some measurements I got:

Idle: 35 watts
CPU load: 130 watts
GPU load: 175 watts
CPU+GPU: 230 watts

Just using the computer, running Safari, Mail, playing videos, etc., it stayed mostly in the 50 to 75 watt range. Browsing through full-size images in Lightroom took it to 120 watts."
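For anyone weighing the running-cost angle, those wall-power readings can be turned into a rough annual energy estimate. A minimal sketch; the hours-per-day split and the electricity rate are illustrative assumptions, not figures from this thread:

```python
# Rough annual electricity cost from the Kill A Watt readings quoted above.
# The usage split (hours/day) and $/kWh rate are illustrative assumptions.
def annual_cost_usd(idle_w, active_w, idle_hours=16, active_hours=8,
                    usd_per_kwh=0.15):
    daily_kwh = (idle_w * idle_hours + active_w * active_hours) / 1000.0
    return daily_kwh * 365 * usd_per_kwh

# 35 W idle; ~75 W for typical browsing/mail use (measured above)
print(round(annual_cost_usd(35, 75), 2))  # ~63.51
```

Even plugging in the 175 W GPU-load reading for every active hour, the delta between the two GPU options would be a modest fraction of this total.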
 
...perhaps the biggest performance leap (and potentially a "heat-reducer" too) is Broadwell. We'll probably have to wait until June 2015 for that...

I don't see how Broadwell will be a big performance leap. It is only a die shrink from Haswell, so the micro-architecture will be essentially the same. They'll likely improve power consumption, and the clock rate might be *slightly* faster, but the i7-4790K in the retina iMac is already running at a base clock of 4 GHz, and 4.4 GHz with Turbo Boost.

There will eventually be i7 Broadwell CPUs fabricated at 14 nm, and in theory this could increase the transistor budget enough to allow higher i7 core counts. However, I don't recollect any credible Intel roadmap where this is planned in a packaging and TDP configuration suitable for an iMac.

The next micro-architecture update will be Skylake.
 
While you are right about the CPU (i7 all the way!), the GPU is in fact "a brainer." ... Guess what failed after two years of moderate to light use? Yes, the GPU. ... Why did the GPU die? Heat. ...
I think this might be a good case for getting Apple Care and selling the machine prior to it ending :eek:
 
Why do you say 250W is wrong, please? You previously said that your iMac ran to 230W with GPU & CPU working hard (again from here: https://forums.macrumors.com/threads/1812439/):
The watt numbers I reported were measured externally with a Kill A Watt, so they reflect the total system power use of the iMac. So 175 watts while the GPU is being pushed hard includes the GPU, screen, some CPU, and other components. At least 35-40 watts are being used by those other parts, so the upper limit for the GPU would be 135-140 watts. That assumes the GPU was really pushed to the max, and considering the temperatures and fan speed, I think it was.
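That subtraction can be written out explicitly. A small Python sketch using the wall-socket readings reported in this thread; the 35-40 W allowance for the screen, CPU, and other components is the estimate from the post above, not a measured GPU figure:

```python
# Bound the GPU-only draw from total-system (wall-socket) readings.
# The watt figures come from measurements posted in this thread; the
# "other components" range (screen, CPU, etc.) is the poster's estimate.
def gpu_power_bounds(total_under_gpu_load, other_min=35, other_max=40):
    """Return a (low, high) estimate of GPU-only watts."""
    return (total_under_gpu_load - other_max,
            total_under_gpu_load - other_min)

low, high = gpu_power_bounds(175)  # 175 W measured with GPU pushed hard
print(low, high)                   # 135 140
```

Both bounds sit comfortably near the ~125 W mobile rating mentioned below, and well under the 250 W desktop figure.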

Aside from that, I'm pretty sure the linked article just had its facts wrong. 250 watts is the rating for the desktop equivalent; it would make no sense for the mobile chip to have the same rating. I can't put my hands on the sources at the moment, but I've seen reports that put the M295X at 125 watts. In the end it doesn't really matter: the system is limited by how much cooling can be done. It's pretty clear that the upper limit for the GPU is about 105 °C, and the fan and possible throttling don't let it go higher. AMD obviously thinks that's not too high.
 
I don't see how Broadwell will be a big performance leap. It is only a die shrink from Haswell...

Exactly: a DIE SHRINK. And a pretty big one too. As you know, when big shrinks happen, you have two choices in your engineering design: potentially go fanless (as in laptops), or keep the same heat output and crank the clock speed noticeably higher than was possible before.

Further reading:

http://www.extremetech.com/computin...creases-dramatically-better-power-consumption

http://wccftech.com/intel-14nm-broa...haswell-2nd-generation-fivr-20-compute-units/
 
Exactly — a DIE SHRINK. And a pretty big one too. And as you know, when big shrinks happen...keep the same heat output but crank up the clock speed noticeably higher than was possible before

Despite numerous die shrinks, clock speeds have not significantly increased since about 2006, as can be seen in this graph: http://csgillespie.files.wordpress.com/2011/01/clock_speed3.png

Prior to 45nm lithography, each shrink produced a smaller field-effect transistor with a thinner gate dielectric, which held less charge and could switch faster. This is called Dennard scaling: https://en.wikipedia.org/wiki/Dennard_scaling

However, at 45nm and below (about 2006 onward), the gate dielectric is approximately 0.9nm thick -- about the size of a single silicon-dioxide molecule. It is simply impossible to make it thinner, and at that thickness the dielectric imposes leakage current and heat due to unavoidable physics.

That is a key reason why the many die shrinks and process enhancements since 2006 have not produced dramatically faster CPU clock rates.

The additional transistor budget allows more cores, but they can't be clocked much faster.

The additional transistors can provide micro-architectural enhancements like larger caches, improved pipelines, better branch prediction, etc. However most of the easy gains have already been obtained.

Typically a pure die shrink will not produce significant improvements in instructions per cycle; however, it turns out Broadwell is not a pure die shrink (despite Intel's "Tick-Tock" strategy). Intel is making some architectural enhancements in Broadwell, so it may deliver about a 5% IPC improvement.

The major improvement 14nm brings is the theoretical ability to put 8 cores on an i7 desktop CPU that fits in an iMac power/thermal package.
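The Dennard-scaling point above can be made concrete with the classic dynamic-power relation P ≈ C·V²·f. A purely illustrative Python sketch of why frequency scaling stalled once supply voltage could no longer shrink; all numbers are normalized factors, not real chip specs:

```python
# Illustrative Dennard-scaling arithmetic: dynamic power ~ C * V^2 * f.
# All values are normalized, dimensionless factors for one node shrink.
def dynamic_power(c, v, f):
    return c * v ** 2 * f

k = 1.4  # linear shrink factor for one full process node

# Classic Dennard scaling (pre-~2006): C and V shrink by 1/k, f rises
# by k, and transistor area shrinks by 1/k^2, so power density is flat.
p_dennard = dynamic_power(1.0 / k, 1.0 / k, k)
print(round(p_dennard * k ** 2, 2))  # power per unit area: ~1.0

# Post-Dennard: V is stuck (gate leakage), so the same shrink plus
# clock boost now raises power density by roughly k^2.
p_stuck = dynamic_power(1.0 / k, 1.0, k)
print(round(p_stuck * k ** 2, 2))  # power per unit area: ~1.96
```

This is why the extra transistor budget now goes to cores and caches rather than raw clock speed.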
 
Despite numerous die shrinks, clock speeds have not significantly increased since about 2006. That can be seen by this graph...

According to your graph and your opening line of argument, my late 2009 2.8GHz i7 iMac "should not be significantly slower" in CPU performance than even the latest BTO i7 iMac, which has a base clock of 4.0GHz and Turbo Boost to 4.4GHz. But would one be accurate in believing that? The only way to answer is to hear one's definition of "significant." In my eyes, 2.8 is indeed significantly smaller than 4.4, and benchmarks I've seen show that an owner of a 2009 edition 2.8GHz iMac would indeed see a significant performance boost from a 4GHz Retina iMac. Yes, there have been improvements to the i7 processor above and beyond mere clock speed, and there are GPU differences too, but as you can see, there have been improvements to CPU clock speed since 2009, and much more since 2006.

How then can it be accurately and effectively argued that Broadwell will be somehow insignificant or apply merely to putting 8 cores on the chip?

Furthermore, I simply stated what I myself am pondering and my reasons for that thinking. For me, a wait for Broadwell most likely would have significance. And indeed, a wait for Skylake would likely have even more significance, although I am not sure I wish to wait until 2016 for that. But my words should not be misconstrued as some kind of brick wall or force field that prevents anyone else here from getting a Retina iMac. Do what you feel is best. The 5K iMac looks like a great machine!
 