
karnashuk

macrumors newbie
Original poster
Jan 7, 2011
22
1
Warsaw
Guys. I've got a late 2012 27'' iMac with GeForce GTX 680 MX. Do you think it is possible to upgrade to NVIDIA GeForce GTX 780M with 4GB of GDDR5 (available in the newest late 2013 iMacs)?

Does Apple handle such upgrades?

BTW- I just hate when they do a 'refresh' a year later...

THANKS VERY MUCH for all the replies.
 

0983275

Suspended
Mar 15, 2013
472
56
Nope, you'll have to sell your 2012 model and buy the 2013 model.

But why? The GTX 680MX and GTX 780M are virtually the same (just slightly higher clock/memory speeds and 100W vs 120W TDP); you won't really see the difference unless you're benchmarking.
 

Confusius

macrumors regular
Mar 24, 2012
131
0
New York
Guys. I've got a late 2012 27'' iMac with GeForce GTX 680 MX. Do you think it is possible to upgrade to NVIDIA GeForce GTX 780M with 4GB of GDDR5 (available in the newest late 2013 iMacs)?

Does Apple handle such upgrades?

BTW- I just hate when they do a 'refresh' a year later...

THANKS VERY MUCH for all the replies.
Not an expert here, but even if such an upgrade were possible (which I think it isn't), I doubt it would be worth your time, money and effort; you still have one of the best mobile graphics chips around.
 

jabalczar

macrumors member
Jul 22, 2011
61
0
Apple does not do post-sale upgrades, ever, no matter how much money you offer.

Except on RAM that is not user-upgradeable, such as on the 2012 / 2013 21.5-inch iMac:

"Memory replacement on the iMac (21.5-inch, Late 2012) and iMac (21.5-inch, Late 2013) is not user-removable and must be done by an Apple Retail Store or Apple Authorized Service Provider."
 

sza

macrumors 6502a
Dec 21, 2010
570
869
Guys. I've got a late 2012 27'' iMac with GeForce GTX 680 MX. Do you think it is possible to upgrade to NVIDIA GeForce GTX 780M with 4GB of GDDR5 (available in the newest late 2013 iMacs)?

Does Apple handle such upgrades?

BTW- I just hate when they do a 'refresh' a year later...

THANKS VERY MUCH for all the replies.

If you check the benchmarks, you will see the GTX 680MX is actually slightly better than the GTX 780M.
 

tomwvr

macrumors regular
Jun 12, 2012
213
98
Frederick Maryland
                     GTX 780M         GTX 680MX        GTX 770M
CUDA Cores           1536             1536             960
Base Clock           771 MHz          720 MHz          706 MHz
Boost Clock          797 MHz          -                797 MHz
Memory Clock         1250 MHz         1250 MHz         1002 MHz
Effective Mem Clock  5000 MHz         5000 MHz         4008 MHz
Memory               up to 4GB GDDR5  up to 2GB GDDR5  up to 3GB GDDR5
Interface            256-bit          256-bit          192-bit
Bandwidth            160 GB/s         160 GB/s         96.2 GB/s
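The bandwidth row follows directly from the other memory specs: effective memory clock times bus width, divided by 8 bits per byte. A quick sanity check in Python:

```python
# Sanity-check the bandwidth figures: effective memory clock (MT/s) times
# bus width (bits), divided by 8 bits-per-byte, gives bytes per second.
def bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s (GB = 10**9 bytes, as GPU vendors quote it)."""
    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(bandwidth_gbs(5000, 256))  # GTX 780M / 680MX -> 160.0
print(bandwidth_gbs(4008, 192))  # GTX 770M -> 96.192 (quoted as 96.2)
```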


The only real difference is the Boost Clock, and I'm not sure that does much in 99% of use. And in benchmarks the 680MX actually seems to do better.

Tom
 

leman

macrumors Core
Oct 14, 2008
19,197
19,055
What's with all the "680MX is faster than 780M" claims? There is just no way it can be faster (at stock settings). I will do some benchmarks later today when my iMac arrives.

@elithrar: those benchmarks were run on a laptop with a badly designed cooling solution. The card will be faster than that. See Notebookcheck.
 

MacsRgr8

macrumors G3
Sep 8, 2002
8,284
1,753
The Netherlands
Nope, you'll have to sell your 2012 model and buy the 2013 model.

But why? The GTX 680MX and GTX 780M are virtually the same (just slightly higher clock/memory speeds and 100W vs 120W TDP); you won't really see the difference unless you're benchmarking.

Or if you play X-Plane 10 which eats > 2 GB of VRAM.
 

kaellar

macrumors 6502
Nov 12, 2012
441
17
If you check the benchmarks, you will see the GTX 680MX is actually slightly better than the GTX 780M.

Looks like you don't even understand the nature of the 680M > 780M difference in those benchmarks, do you?

Gaming performance isn't GPU-dependent only; the CPU also contributes. Leaving aside CPU performance itself, under certain circumstances (bad cooling, mostly) CPUs can throttle and drop the performance of the whole system while gaming. That's the only reason the 680M came out ahead of the 780M in that particular review.

A GPU that has more cores and runs at higher clocks simply can't do worse than the lower-spec'd one if both are working in equal (and, importantly, proper) conditions.
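That bottleneck argument can be illustrated with a toy frame-time model. All the millisecond figures below are made up for illustration; only the shape of the result matters:

```python
# Toy model of why a throttled CPU can make a faster GPU look slower.
# A frame can't be delivered until both the CPU work and the GPU work
# for it are done, so frame time is bounded below by the slower of the two.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when each frame needs cpu_ms of CPU time and gpu_ms of GPU time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_680mx_ms = 12.0          # hypothetical per-frame GPU time
gpu_780m_ms = 12.0 / 1.10    # a card ~10% faster

print(fps(cpu_ms=8.0, gpu_ms=gpu_680mx_ms))   # cool CPU: slower GPU limits fps
print(fps(cpu_ms=8.0, gpu_ms=gpu_780m_ms))    # cool CPU: faster GPU wins
print(fps(cpu_ms=15.0, gpu_ms=gpu_680mx_ms))  # throttled CPU: CPU limits fps
print(fps(cpu_ms=15.0, gpu_ms=gpu_780m_ms))   # throttled CPU: same fps, GPU advantage hidden
```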
 

sza

macrumors 6502a
Dec 21, 2010
570
869
Looks like you don't even understand the nature of the 680M > 780M difference in those benchmarks, do you?

Gaming performance isn't GPU-dependent only; the CPU also contributes. Leaving aside CPU performance itself, under certain circumstances (bad cooling, mostly) CPUs can throttle and drop the performance of the whole system while gaming. That's the only reason the 680M came out ahead of the 780M in that particular review.

A GPU that has more cores and runs at higher clocks simply can't do worse than the lower-spec'd one if both are working in equal (and, importantly, proper) conditions.

bs.
 

elithrar

macrumors 6502
May 31, 2007
372
3
So which is it: is the 780M faster than the 680MX?

The 780M is faster, yes. There is no doubt about this. It is not (by my mark) much faster - 10% in most cases. But it also does this at a much lower TDP (100W vs 120W), which means it should run cooler and potentially give a very good overclock.
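That ballpark 10% lines up with the clock ratios in tomwvr's table, since both chips share the same core count and architecture. A back-of-envelope estimate in Python, assuming performance scales linearly with clock (which is only roughly true):

```python
# Rough speedup estimate for two GPUs with identical core counts and
# architecture: performance scales approximately with clock speed.
def clock_speedup(clock_a_mhz: float, clock_b_mhz: float) -> float:
    """Percent advantage of GPU A over GPU B, assuming linear clock scaling."""
    return (clock_a_mhz / clock_b_mhz - 1) * 100

print(round(clock_speedup(771, 720), 1))  # 780M base vs 680MX base: 7.1
print(round(clock_speedup(797, 720), 1))  # 780M boost vs 680MX base: 10.7
```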
 

2Turbo

macrumors 6502
Feb 18, 2011
360
0
The 780M is faster, yes. There is no doubt about this. It is not (by my mark) much faster - 10% in most cases. But it also does this at a much lower TDP (100W vs 120W), which means it should run cooler and potentially give a very good overclock.

Cool. Can't wait for some OC results.
 

MrGimper

macrumors G3
Sep 22, 2012
8,475
11,745
Andover, UK
Looks like you don't even understand the nature of the 680M > 780M difference in those benchmarks, do you?

Gaming performance isn't GPU-dependent only; the CPU also contributes. Leaving aside CPU performance itself, under certain circumstances (bad cooling, mostly) CPUs can throttle and drop the performance of the whole system while gaming. That's the only reason the 680M came out ahead of the 780M in that particular review.

A GPU that has more cores and runs at higher clocks simply can't do worse than the lower-spec'd one if both are working in equal (and, importantly, proper) conditions.

That's a sweeping statement. Architectural changes can make a huge difference even with less cores and clock speed.
 

kaellar

macrumors 6502
Nov 12, 2012
441
17
That's a sweeping statement. Architectural changes can make a huge difference even with less cores and clock speed.

The statement was made about two particular GPUs that share an identical architecture. I thought that was pretty clear.
 

WilliamG

macrumors G3
Mar 29, 2008
9,924
3,800
Seattle
Simply stated:

Expect the 780M to be close to 10% faster than the 680MX
(and if the game needs more than 2 GB of VRAM, like X-Plane 10, then the 780M with 4 GB of VRAM will be hugely faster)

I'd be willing to bet you quite a few pennies (at least three) that X-Plane 10 will not run significantly better on the 4GB 780M than the 2GB 680MX.
 

MacsRgr8

macrumors G3
Sep 8, 2002
8,284
1,753
The Netherlands
I'd be willing to bet you quite a few pennies (at least three) that X-Plane 10 will not run significantly better on the 4GB 780M than the 2GB 680MX.

If you have many detailed extra sceneries loaded, the VRAM consumed by X-Plane 10 easily exceeds 2 GB.
I see that now on my Radeon HD 7950.

Payware EHAM with ZL17 NL Photoscenery pushed my VRAM consumption to around 2.5 GB. Same goes for KJFK, KLGA, the converted Manhattan scenery, and others. It really is no exception in X-Plane 10 if you use custom scenery (which gives the flight sim great eye-candy).

Using these settings on a graphics card with 2 GB of VRAM or less results in disastrous performance. On a graphics card with 3 GB or more, it'll be fine.

I will happily take your pennies! :p
 

WilliamG

macrumors G3
Mar 29, 2008
9,924
3,800
Seattle
If you have many detailed extra sceneries loaded, the VRAM consumed by X-Plane 10 easily exceeds 2 GB.
I see that now on my Radeon HD 7950.

Payware EHAM with ZL17 NL Photoscenery pushed my VRAM consumption to around 2.5 GB. Same goes for KJFK, KLGA, the converted Manhattan scenery, and others. It really is no exception in X-Plane 10 if you use custom scenery (which gives the flight sim great eye-candy).

Using these settings on a graphics card with 2 GB of VRAM or less results in disastrous performance. On a graphics card with 3 GB or more, it'll be fine.

I will happily take your pennies! :p

I would love to see proof of this on the 4GB GPU 2013 iMac vs last year's 2GB GPU iMac. I await benchmarks!
 