
TallManNY

macrumors 601
Original poster
Nov 5, 2007
4,741
1,594
So Apple says: "If you choose the NVIDIA GeForce GTX 680MX on the 27-inch iMac, you get the best graphics performance available in an all-in-one computer."
Is that true? The 2011 iMac that I bought supposedly had the top mobile graphics card, or at least arguably the top.

Anyone want to weigh in? I'm concerned about the slim design being a limiting factor for the inclusion of high-end graphics cards. But maybe I shouldn't be, considering we've always been talking about cards that go into laptops anyway.
 

boto

macrumors 6502
Jun 4, 2012
437
28
Yes, Apple is correct that their top-of-the-line 27" AIO sports the highest-performing graphics card available. It was revealed only recently, via Nvidia's website, that a GTX 680MX is in production and being offered. The second-best AIO machine is Maingear's 24", which offers a GTX 680M GPU. Also, judging solely by the spec sheet on Nvidia's website, it appears the GPU is a full GTX 680 desktop core. If this is true, I'm amazed that Apple can fit what is basically a desktop 680 into such a thin machine and still manage airflow in and out.
 

quagmire

macrumors 604
Apr 19, 2004
6,910
2,335
^^^

Mobile GPUs have long been based on their desktop-class brothers. Take the 7970M referenced in the link posted above. It's basically the desktop 7870, just downclocked.
 

Dirtyharry50

macrumors 68000
May 17, 2012
1,769
183
So Apple says: "If you choose the NVIDIA GeForce GTX 680MX on the 27-inch iMac, you get the best graphics performance available in an all-in-one computer."
Is that true? The 2011 iMac that I bought supposedly had the top mobile graphics card, or at least arguably the top.

Anyone want to weigh in? I'm concerned about the slim design being a limiting factor for the inclusion of high-end graphics cards. But maybe I shouldn't be, considering we've always been talking about cards that go into laptops anyway.

The one you bought did have the most powerful mobile GPU available at its release.

The difference this time around is that you can get the top mobile GPU, but it's going to be BTO and it's going to cost you. How much is unknown at this point; I would not expect it to be cheap.
 

doh123

macrumors 65816
Dec 28, 2009
1,304
2
While a GTX 680MX is the fastest mobile GPU available right now... it's still much slower than the desktop GTX 680... it's lower power and clocked lower... and should run much cooler than the desktop part.
 

TallManNY

macrumors 601
Original poster
Nov 5, 2007
4,741
1,594
The one you bought did have the most powerful mobile GPU available at its release.

The difference this time around is that you can get the top mobile GPU, but it's going to be BTO and it's going to cost you. How much is unknown at this point; I would not expect it to be cheap.

Yep, pretty big difference I think. I'm obviously not in the market for an upgrade so soon, and I can pretty much still run anything that comes out now in Boot Camp at acceptable settings. But I am going to see what the pricing would be to buy the comparable iMac this year versus last year. I bet it costs a few hundred dollars more; the GPU upgrade alone could be as much as another $300.
 

SlickShoes

macrumors 6502a
Jan 24, 2011
640
0
A guy on a game forum I post on wrote this up a couple of weeks ago:

For months, Nvidia has had a gaping hole in their mobile lineup. The GeForce GTX 660M was based on the generation's slowest GPU chip from either vendor. The GeForce GTX 680M was a very nice card, but cost a fortune. Laptop vendors don't publish absolute prices, only the price differences between various cards, but if you got a GTX 680M, you were probably paying about $700 or $800 for the video card alone. That's fine if you were planning on spending $2500 on a laptop, but not if you're looking for something a little more budget-friendly.

In between, there was the GeForce GTX 670M and the GTX 675M. Unfortunately, those were old Fermi rebrands. You could cope with the massive heat output in a desktop, but not a laptop.

So you buy AMD this generation, right? The Radeon HD 7970M certainly looks good in terms of specs and price. But in previous generations, going AMD had typically meant keeping the discrete card running all of the time, as AMD didn't bundle their own drivers with Intel video drivers for integrated graphics, the way that Nvidia did. That's fine for some purposes, but it will kill your battery life.

But no, this generation, Clevo decided to use AMD Enduro switchable graphics, rather than leaving an AMD card running all the time. That meant no driver updates. It also ran into a rather problematic glitch where AMD switchable graphics wasn't able to use the full PCI Express bandwidth, which hurt performance badly. A 7970M was still faster and far more efficient than previous generation cards, but not nearly as fast as it should have been.

For all the promise of 28 nm, the generation is nearly over and neither side had something suitable for $1500-$2000 gaming laptops on the market.

Today, Nvidia fixed that with the launch of the GeForce GTX 675MX and GTX 670MX. The model number hole between the 660M and 680M was already filled by the 675M and 670M. Rather than getting creative by having the third digit be something other than 0 or 5, or calling a card a GTX 665M (and thereby conjuring comparisons to the disastrous GeForce GTX 465, a marketing no-no), Nvidia marketing decided to add an X onto the end. But contrary to the similar names, the GTX 675MX has nothing to do with the GTX 675M, and likewise if you subtract 5 from the model numbers.
The short story is that the GeForce GTX 675MX is a severely cut-down GK104 die, with only 5 of the 8 SMXes active. The GeForce GTX 670MX is a fully functional GK106 die. With the same clock speeds, they should have identical GPU performance. The difference is in video memory, where the GTX 675MX has four memory channels at 900 MHz, while the GTX 670MX has three channels at 700 MHz. Both are more expensive than the GeForce GTX 660M, of course, but not outlandishly so, and they're cheaper than a Radeon HD 7970M.
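A quick way to see how far apart those two memory setups really are (a back-of-the-envelope sketch; the 64-bit-channel and GDDR5 quad-pumping assumptions are mine, not from the spec sheet):

```python
# Rough memory-bandwidth comparison for the GTX 675MX vs. GTX 670MX,
# assuming 64-bit GDDR5 channels and 4 data transfers per command clock.

def bandwidth_gb_s(channels, command_clock_mhz, channel_width_bits=64):
    bus_width_bytes = channels * channel_width_bits / 8
    effective_rate_hz = command_clock_mhz * 1e6 * 4   # GDDR5 quad-pumped data rate
    return bus_width_bytes * effective_rate_hz / 1e9  # bytes/s -> GB/s

gtx_675mx = bandwidth_gb_s(channels=4, command_clock_mhz=900)  # ~115 GB/s
gtx_670mx = bandwidth_gb_s(channels=3, command_clock_mhz=700)  # ~67 GB/s
print(gtx_675mx, gtx_670mx, gtx_675mx / gtx_670mx)             # ratio ~1.7
```

So on paper the 675MX has roughly 70% more memory bandwidth than the 670MX, despite the identical GPU configuration.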

All is not lost on the AMD front, though. AMD recently launched their first laptop drivers for discrete switchable graphics. A hotfix for the PCI Express bus problem is currently being tested and due for public release next week. Those who bought a 7970M early on won't be left out in the cold. Hopefully this will mean that we see some laptops equipped with a Radeon HD 7870M shortly, too.
More generally, all up and down the lineup, Nvidia seems to have bet that more memory capacity and more GPU performance will win, while AMD has bet that more memory bandwidth will win. A GeForce GTX 680M has considerably more GPU power than a Radeon HD 7970M, but only 3/4 of the memory bandwidth. A GeForce GTX 670MX has a little shy of double the GPU performance of a Radeon HD 7870M, but only 5% more memory bandwidth.
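For reference, that "3/4 of the memory bandwidth" figure falls straight out of the commonly listed specs (assumed here, not stated above): both cards use a 256-bit bus, with GDDR5 at an effective 3.6 Gbps on the GTX 680M versus 4.8 Gbps on the HD 7970M.

```python
# Bandwidth in GB/s = bus width in bytes * effective data rate in Gbps (assumed specs).
gtx_680m = 256 / 8 * 3.6    # ~115 GB/s
hd_7970m = 256 / 8 * 4.8    # ~154 GB/s
print(gtx_680m / hd_7970m)  # 0.75
```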

So which side is right here? On memory capacity, from a performance perspective, AMD is right and Nvidia is wrong. It really is that simple. But they probably both knew that a long time ago. Nvidia is betting that customers are stupid and will think that more video memory means a faster card, and from a marketing perspective, they might be right about that. Maybe.

Where it gets more interesting is the GPU performance versus memory bandwidth tradeoffs. Here, it's the same story in desktops, where Nvidia went for more GPU performance while AMD went for more memory bandwidth. And who bet correctly?

For older games that have MSAA (or maybe SSAA through drivers) as their only anti-aliasing options, AMD wins. But as post-processing anti-aliasing effects such as FXAA replace the traditional MSAA, GPU performance matters a lot more and video memory bandwidth less. In that case, Nvidia wins.
Assuming that the transition to post-processing anti-aliasing continues and MSAA dies out (which it should, but then, DirectX 9.0c should have died out by now, too), Nvidia has the more forward-looking architecture here, and there's a good chance that in games launching two or three years from now, an Nvidia card will fare better relative to an AMD card than it does in today's games. Don't expect miracles; this isn't going to magically double the performance of Nvidia cards. But movement of 5% in Nvidia's direction is a realistic possibility.

Furthermore, if you're comparing two cards that are both plenty fast enough in older games today, it doesn't matter which one is faster. All that matters is how they'll perform in future, more demanding games, where neither card may have far more performance than you need anymore.
This only applies if we're comparing Kepler to Southern Islands. In particular, it doesn't apply to Fermi, which still is and always will be a train wreck. But AMD didn't regress here; Nvidia simply got better. Fermi didn't scale well to highly demanding cases. Kepler fixed that.
 

MacGamerHQ

macrumors member
Sep 25, 2012
98
0
Lyon, France
While a GTX 680MX is the fastest mobile GPU available right now... it's still much slower than the desktop GTX 680... it's lower power and clocked lower... and should run much cooler than the desktop part.

Indeed, the mobile version will never be as powerful as the desktop one.

However, I would assume all all-in-one PCs use "mobile" graphics cards, so we can assume Apple is right in saying their new iMacs will have the best graphics among all-in-one PCs... (first assumption to be checked!)
 

cirus

macrumors 6502a
Mar 15, 2011
582
0
I would keep in mind that there is a disclaimer on the Nvidia website, right above the specifications, that the actual card is subject to manufacturer variations, which, the manufacturer here being Apple, implies reduced clocks to me.
 

doh123

macrumors 65816
Dec 28, 2009
1,304
2
I would keep in mind that there is a disclaimer on the Nvidia website, right above the specifications, that the actual card is subject to manufacturer variations, which, the manufacturer here being Apple, implies reduced clocks to me.

Or increased...? Why assume it's only decreased?

It's the same for all GPUs on the website there.

Apple runs them how Apple finds best.

For example, the GT 650M in the 15" rMBP is actually clocked much higher than the Nvidia spec, around the level of the GTX 660M.
 

cirus

macrumors 6502a
Mar 15, 2011
582
0
Or increased...? Why assume it's only decreased?

It's the same for all GPUs on the website there.

Apple runs them how Apple finds best.

For example, the GT 650M in the 15" rMBP is actually clocked much higher than the Nvidia spec, around the level of the GTX 660M.

It's entirely possible, but given the thermal constraints I would say that if it differed from the manufacturer spec in any way, it would be lower clocks.

Aside: I never got why Apple used the GT 650M at higher clocks over the 660M. Sure, they save a couple of bucks (which with their margins really doesn't matter), but they piss off Nvidia and lose out on some marketing appeal.
 

maghemi

macrumors 6502
Aug 7, 2009
317
0
Melbourne Australia
and lose out on some marketing appeal

Not much marketing appeal, and only to a small segment. The vast majority of Apple laptop buyers couldn't give two hoots about video card model numbers/specs/clock speeds.

It's most likely all about margin. If you look at it from Apple's point of view, they can get an acceptable degree of performance for a few dollars less per laptop. Over the number of laptops they sell, that turns into a massive chunk of money.

The desktop is a different story, I'd say, as the people who really care about graphics performance are more likely to be buying iMacs than laptops (I just wish they'd do something about the graphics on the Mac Pro).
 

omenatarhuri

macrumors 6502a
Feb 9, 2010
901
837
I have a hackintosh with an Ivy Bridge 3570K and a desktop GeForce 670. I also have a 2.2 GHz i7 MBP with the 6750M.

I wonder how the new 27" iMac with the 675MX or 680MX performs relative to what I have. My guess is it's closer to the MBP, while lagging far behind the hackintosh (in performance). :confused:
 

cluthz

macrumors 68040
Jun 15, 2004
3,118
4
Norway
I have a hackintosh with an Ivy Bridge 3570K and a desktop GeForce 670. I also have a 2.2 GHz i7 MBP with the 6750M.

I wonder how the new 27" iMac with the 675MX or 680MX performs relative to what I have. My guess is it's closer to the MBP, while lagging far behind the hackintosh (in performance). :confused:

From 3DMark11:
GTX 670: ~9400
GTX 680M: ~6600
6750M: ~2000

The GTX 680MX is an improved GTX 680M, so it should perform better. The chip is not available yet, but the specs are:
The GTX 680MX features 1536 CUDA cores instead of the 680M's 1344, plus higher memory clocks (720/2500 MHz vs. 720/1800 MHz).

How this affects real-world performance is *probably* a 10-20% increase, which would put it in the 7200-8000 range in 3DMark11.

The 680MX will not match the GTX 670, but it should be more like 3-4 times as fast as the 6750M.
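As a rough sanity check of that estimate (an illustrative sketch using only the spec deltas above, not measured numbers):

```python
# Spec deltas between the GTX 680M and GTX 680MX quoted above.
cores_680m, cores_680mx = 1344, 1536
mem_680m_mhz, mem_680mx_mhz = 1800, 2500

shader_gain = cores_680mx / cores_680m - 1      # ~14% more CUDA cores
memory_gain = mem_680mx_mhz / mem_680m_mhz - 1  # ~39% higher memory clock
print(f"shaders +{shader_gain:.0%}, memory clock +{memory_gain:.0%}")

# If games land somewhere between "shader-bound" (~14% gain) and a partial
# benefit from the extra bandwidth, a 10-20% overall uplift is a plausible band.
base_3dmark11 = 6600
print(base_3dmark11 * 1.10, base_3dmark11 * 1.20)  # ~7260 to ~7920
```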
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
So which side is right here? On memory capacity, from a performance perspective, AMD is right and Nvidia is wrong. It really is that simple.

And this is proven to be true by the fact that AMD is beating NVIDIA by 25% in all cases, right? It's really not that simple at all; this is a very naive way of looking at GPU performance. Sure, the AMD parts might have more raw memory bandwidth than the NVIDIA ones. However, raw memory bandwidth is only one factor, and the fact that NVIDIA is actually ahead in many real-world game benchmarks suggests that this view is just flat-out incorrect. If an application is limited by shader horsepower alone, then more shader horsepower translates into higher FPS, and this is why NVIDIA can be competitive with less raw memory bandwidth.
 

pedromartins

Suspended
Sep 7, 2012
93
0
Porto, Portugal
While a GTX 680MX is the fastest mobile GPU available right now... it's still much slower than the desktop GTX 680... it's lower power and clocked lower... and should run much cooler than the desktop part.

It is only 30% slower, since it is underclocked by 30%. This is amazing (if battery isn't a concern... impossible on a laptop).
 

cluthz

macrumors 68040
Jun 15, 2004
3,118
4
Norway
It is only 30% slower, since it is underclocked by 30%. This is amazing (if battery isn't a concern... impossible on a laptop).

30% less core clock speed does not equal 30% less performance.
There are many factors, as the memory is also slower and so on.

However, it could be 30% slower; it's just that a 30% clock decrease doesn't generally mean a 30% speed decrease.

A GPU is a complex part whose speed is a mix of core speed, shader speed, memory bandwidth/speed, the number of ROPs (raster operation pipelines), SPUs (or CUDA cores, as NVIDIA calls them), TAUs (texture address units) and so forth...
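To illustrate why the drop is usually smaller than the clock cut, here is a toy roofline-style model (purely illustrative assumptions; real GPUs are far more complicated):

```python
# Toy model: a frame spends part of its time shader-bound (scales with core clock)
# and the rest bandwidth-bound (scales with memory clock). The 60/40 split is assumed.

def relative_fps(core_scale, mem_scale, shader_share=0.6):
    frame_time = shader_share / core_scale + (1 - shader_share) / mem_scale
    return 1.0 / frame_time

baseline = relative_fps(1.0, 1.0)   # desktop GTX 680 as the reference point
mobile   = relative_fps(0.7, 0.85)  # hypothetical mobile part: core -30%, memory -15%

print(f"{mobile / baseline:.0%} of desktop performance")  # ~75%, i.e. a ~25% drop
```

In a mostly bandwidth-bound game the drop would be even smaller; in a purely shader-bound one it would approach the full 30%.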
 

adder7712

macrumors 68000
Mar 9, 2009
1,923
1
Canada
Mobile GPUs are underclocked compared to their desktop equivalents.

The GTX 675MX won't equal the GTX 670 in performance, but it's still a fast mobile GPU.
 

MacsRgr8

macrumors G3
Sep 8, 2002
8,284
1,753
The Netherlands
The GTX 680MX outperforms the desktop Radeon 5870...

iMac FTW over the Mac Pro for gaming! ;)

I think the maxed-out 27" iMac will be my first purchase that isn't a Mac Pro or Power Mac!
 

barrett14

macrumors regular
Jun 24, 2010
183
44
How will my old PC compare to the new iMac with these specs?

Current PC:
i7 950 processor
GTX 470 video card
6 GB of RAM

The new iMac would have the i7 and probably just the 675MX... unless the 680MX turns out to be a lot cheaper than expected. I'm not sure how much gaming I would be doing, but it would be nice to have the option.
 

omenatarhuri

macrumors 6502a
Feb 9, 2010
901
837
From 3DMark11:
GTX 670: ~9400
GTX 680M: ~6600
6750M: ~2000

The GTX 680MX is an improved GTX 680M, so it should perform better. The chip is not available yet, but the specs are:
The GTX 680MX features 1536 CUDA cores instead of the 680M's 1344, plus higher memory clocks (720/2500 MHz vs. 720/1800 MHz).

How this affects real-world performance is *probably* a 10-20% increase, which would put it in the 7200-8000 range in 3DMark11.

The 680MX will not match the GTX 670, but it should be more like 3-4 times as fast as the 6750M.
Thanks for this, excellent reply. That is stunningly close to a state-of-the-art desktop graphics card. Seems to run circles around the MBP as well....

What most intrigues me is the display, the lamination of it, etc. Currently I have a 30" LG and a 24" Dell. I like the size of the first one and the picture of the latter. Seems like the new 27" iMac could combine the best of both.

And the iMac does look gorgeous with the new design.
 

doh123

macrumors 65816
Dec 28, 2009
1,304
2
GTX 675mx is no good in comparison to any GTX cards you get on PC

You mean desktop GPUs... it has nothing to do with PC or not PC. You can get a GTX 675MX in a 'PC' as well and it works fine. You just mean that desktop versions of GPUs are much better in most ways than the mobile versions.
 

Andrew1001

macrumors newbie
Jan 9, 2013
1
0
So, someone help a noob out...

When I read about the new iMac I was pretty interested in the news on the GPU. But someone help me out...

I'm currently running a late-2009 21.5-inch iMac (3.06 GHz Intel Core 2 Duo, 4 GB 1067 MHz DDR3, ATI Radeon HD 4670 256 MB) and I use it for the usual (internet, Aperture, Photoshop, work stuff), and it is more than adequate. However, where I feel let down is gaming. At weekends I love to boot into Windoze and play MW3 or Black Ops, and these run at acceptable quality and speed. But newer titles like Black Ops II or Battlefield 3 run like treacle in winter (i.e. very slowly!).

So the question is: how much of a step up would a new iMac with the NVIDIA GeForce GTX 680MX (2 GB) be? It seems a pretty big step up to me. What are the opinions on whether this setup will keep me set for another 3-4 years of gaming?

Thanks!

Andrew
 