
johnh57

macrumors regular
Original poster
Jul 6, 2011
130
30
Montana
Starting to think about a 27" Retina iMac. I have one external monitor I plan to use with the iMac: a 32" 2560×1440 BenQ CAD display.

I don't do gaming or video. I do some limited rendering (SketchUp-type work). Primarily it will be AutoCAD work.

Is the $250 upgrade from the 2 GB GPU to the 4 GB GPU something to worry about? Do it just because you can't upgrade it later?

I'm living with a 2010 15" MacBook Pro now with a massive 256 MB GPU. It gets balky trying to pan and zoom in SketchUp. I don't know if that's a software issue, a GPU issue, or a CPU issue. (My MBP has 8 GB RAM, a 1st-gen i5, and a 500 GB SSD.)

What I'm looking at is a 27" Retina display, i7, 500 GB SSD, and 32 GB of memory (probably order with 8 GB and buy a 32 GB kit from Crucial). But I'm still pondering the GPU upgrade.

I'm not in a massive hurry, I'll probably wait for the Skylake release, fwiw.
 

My advice? Worth exactly what you paid for it by the way. :)

You can't upgrade the video card later, so I would always future-proof now if you can afford it. You will get more time and usefulness out of the machine with a higher-end card; that holds true for any upgrade. I was an early adopter this time around with the Retina iMac and I have not been disappointed with its performance (I am a gamer).

Besides, when you are spending this much on a new machine, you may as well make it the best you can afford. If you wanted to save money, you could buy a Windows machine with equal specs for half the money.
 

Agree with this in general, but with the current-gen iMac 5K a lot of people are reporting issues with the M295X specifically (there's a very long thread on here about it if you search for it). That said, the OP says he'll wait for Skylake (or presumably whatever the next-gen iMac 5K turns out to be, Broadwell or Skylake). They'll almost certainly change the card in that to one that's a bit more efficient and runs cooler, which is very important in an enclosure as tight as the iMac's.

Basically, if you're waiting for the next-gen iMac, my advice is to wait until it's been out a couple of weeks, then check in with some of the reliable review sites like AnandTech to see how the various GPU options offered with it perform.
 
Hadn't really thought about the GPU being updated when the CPU is uprated; stands to reason, though.

Will just have to wait and see I guess.
 
If you have no plan for how to use the extra performance of the M295X, then the only benefit is possible future-proofing.

However, old hardware isn't phased out just because of its performance, but also because of its features. For example, an old GPU that doesn't support Metal / DX12 isn't considered future-proof now, regardless of how powerful it is. Those new features can matter even more than raw performance. CPUs are a good example: a more powerful old CPU without Quick Sync can easily be beaten at MP4 encoding by a newer but much slower CPU. In this respect, both the M295X and M290X are 2015-era products, so most likely neither will outlast the other.

Furthermore, hardware is sometimes replaced for better energy efficiency. From that point of view, the M295X doesn't do any better than the M290X for future-proofing either.

I have a six-year-old Mac Pro. In terms of raw processing power (multi-core CPU performance), no current iMac can beat it (not even the 4790K). I use it 24/7: it encodes, runs as a server, and handles a few VMs at the same time. However, is it future-proof? I don't think so. It lacks Thunderbolt, USB 3, SATA III, BT 4.0, 802.11ac Wi-Fi… Luckily I can add some of that with PCIe cards, but not everything has an upgrade available. So far, IMO, the only genuinely future-proof characteristic of my machine is its hardware upgradability (CPU, GPU, RAM, PCIe cards...), not anything else.

For a computer like the iMac, almost nothing is upgradable. Upgrading to a more powerful GPU with no plan to use it most likely won't buy you any extra future-proofing (unless you know your workflow will become GPU-limited in the future).

If the M290X can handle both of your screens now, it should also be able to handle them in the future. If I were you, I would only get the M295X if I knew I needed (or at least could use) that power in the foreseeable future. Otherwise there is no point paying extra for a hotter, more power-hungry GPU; that may only give you more fan noise and nothing else.

Anyway, I never use AutoCAD, so I don't know whether it can utilise the GPU. If it can, the upgrade may still be worth it; if not, it's probably better to save the money.
 
Maybe a simpler question: is 2 GB of GPU memory adequate to drive both a 27" Retina display at nearly 15×10^6 pixels and an additional 3.7×10^6 pixels on a second monitor? In my case only one monitor at a time will be doing anything significant, i.e. I can't see ever needing to render two different structures simultaneously.
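For a rough sanity check of those pixel numbers, here is a quick back-of-the-envelope sketch. The 4 bytes/pixel and double-buffering figures are assumptions; real drivers allocate considerably more for textures and compositing surfaces, so treat the result as a floor, not an actual VRAM budget:

```python
# Rough framebuffer math for a 5K iMac panel plus a 1440p external monitor.
# Assumes 32-bit colour (4 bytes/pixel) and simple double buffering; real
# drivers allocate more (textures, compositing, caches), so this is a floor.

displays = {
    "iMac 5K (5120x2880)": 5120 * 2880,   # ~14.7 million pixels
    "BenQ (2560x1440)": 2560 * 1440,      # ~3.7 million pixels
}

total_pixels = sum(displays.values())
bytes_per_pixel = 4   # assumed 32-bit colour
buffers = 2           # assumed double buffering

framebuffer_bytes = total_pixels * bytes_per_pixel * buffers
print(f"Total pixels: {total_pixels / 1e6:.1f} million")
print(f"Framebuffer estimate: {framebuffer_bytes / 2**20:.0f} MiB")
```

Even with both displays attached, the raw framebuffers come to well under 200 MiB, which supports the point below that 2 GB of VRAM is plenty for simply driving the screens.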
 
Should be more than enough. The D300 in the new Mac Pro only has 2 GB of VRAM, and it can drive up to three 5K monitors.

Yes, I know it has two D300s. However, only one is used to drive the displays; the other is purely for computation. So 2 GB of VRAM is more than enough in your case.

However, more VRAM is always good, even for web browsing. Almost all current web browsers use VRAM (I know what I'm saying: it's VRAM, not RAM) for acceleration. So, purely considering VRAM, the more the better, and I'm sure your machine will be able to use most of the 4 GB if you choose the M295X, even if you're just doing something as simple as browsing the web.
 
Looking at the overall cost of the upgrade, the M295X is less than 10% of the base price. When I specced out my MBPr for work, I went with the 750M 2 GB upgrade, and I'm glad I did. My computing requirements can change at the drop of a hat, though.

I always use PassMark benchmarks as a comparison between models. They can't be considered 100% reliable (like all benchmarks of this kind), but they give a ball-park figure. Below are the ratings for both GPUs (higher is better):

Radeon R9 M290X - 2547
Radeon R9 M295X - 5024

As you can see, there is a considerable difference, with the M295X scoring almost twice the M290X. Again, probably not a direct 2:1 real-world performance difference, but maybe this helps. On the plus side, it also adds value when you come to sell the machine on. OK, not $250 of extra value, but food for thought!
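A quick check of the ratio those quoted PassMark scores imply (these are synthetic benchmark numbers, so a rough guide rather than a prediction of real-world AutoCAD performance):

```python
# Ratio of the PassMark scores quoted above (synthetic, rough guide only).
m290x_score = 2547
m295x_score = 5024

ratio = m295x_score / m290x_score
print(f"M295X / M290X: {ratio:.2f}x")  # just under 2x
```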
 