Go Back   MacRumors Forums > Special Interests > Mac and PC Games

Old Oct 23, 2012, 06:46 PM   #1
TallManNY
macrumors 68020
 
 
Join Date: Nov 2007
GeForce GTX 675MX for Gaming?

So Apple says: "If you choose the NVIDIA GeForce GTX 680MX on the 27-inch iMac, you get the best graphics performance available in an all-in-one computer."
Is that true? The 2011 iMac that I bought supposedly had the top mobile graphics card or at least arguably the top.

Anyone want to weigh in? I'm concerned about the slim design being a limiting factor for the inclusion of high-end graphics cards. But maybe I shouldn't be, considering we have only ever been talking about cards that go into laptops anyway.
__________________
Mid-2011 3.1GHz i5 iMac (6970m); HP Spectre (Win 8.1)
BBRY Q10; iPhone 6; iPad Mini-R
Apple Stockholder (a nice dividend, stock buybacks and cutting edge innovation? yes please!)
Old Oct 23, 2012, 08:27 PM   #2
lewdvig
macrumors 65816
 
Join Date: Jan 2002
Location: South Pole
The 680M is actually a desktop chip slowed way down. I think Apple is making a fair claim.

----------

http://www.notebookcheck.net/Review-...M.77110.0.html
__________________
Zealot without a cause. I run an orchard.

Old Oct 23, 2012, 09:28 PM   #3
boto
macrumors 6502
 
Join Date: Jun 2012
Yes, Apple is correct that their top-of-the-line 27" AIO sports the highest-performance graphics card available. It was only recently revealed, via Nvidia's website, that a GTX 680MX is in production and being offered. The second-best AIO machine is Maingear's 24", which provides a GTX 680M GPU. Also, judging solely by the spec sheet on Nvidia's website, it appears the GPU is a full-core GTX 680 desktop processor. If this is true, I'm amazed that Apple can fit basically a desktop 680 in such a thin machine and still manage airflow in and out.
Old Oct 23, 2012, 10:26 PM   #4
quagmire
macrumors 603
 
 
Join Date: Apr 2004
^^^

Mobile GPUs have long been based on their desktop-class brothers. Take the 7970M referenced in the link posted above: it's basically the desktop 7870, just downclocked.
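That relationship can be sketched with back-of-the-envelope numbers (the shader counts and clocks below are public spec-sheet figures and may vary by vendor):

```python
# Rough peak single-precision throughput: 2 FLOPs (one FMA) per shader per clock.
def gflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz / 1000.0

# Same Pitcairn die in both parts; only the clock differs (spec-sheet values).
desktop_7870 = gflops(1280, 1000)  # Radeon HD 7870
mobile_7970m = gflops(1280, 850)   # Radeon HD 7970M

print(desktop_7870, mobile_7970m)   # 2560.0 2176.0
print(mobile_7970m / desktop_7870)  # 0.85 -> ~15% lower peak throughput
```

Peak FLOPS is only one axis, of course, but it shows how close a "mobile" part built on the same die can sit to its desktop sibling.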
__________________
Crimes against US History:
CV-6 USS Enterprise
Yankee Stadium
Penn Station-New York
Old Oct 24, 2012, 01:11 AM   #5
Dirtyharry50
macrumors 65816
 
Join Date: May 2012
Quote:
Originally Posted by TallManNY View Post
So Apple says: "If you choose the NVIDIA GeForce GTX 680MX on the 27-inch iMac, you get the best graphics performance available in an all-in-one computer."
Is that true? The 2011 iMac that I bought supposedly had the top mobile graphics card or at least arguably the top.

Anyone want to weigh in? I'm concerned about the slim design being a limiting factor for inclusion of high end graphics cards. But maybe I shouldn't be considering we have always only been talking about cards that go into laptops.
The one you bought did have the most powerful mobile GPU available at its release.

The difference this time around is you can get the top mobile GPU but it's going to be BTO and it's going to cost you. How much is unknown at this point. I would not expect it to be cheap.
Old Oct 24, 2012, 07:21 AM   #6
doh123
macrumors 65816
 
Join Date: Dec 2009
While a GTX 680MX is the fastest mobile GPU available right now, it's still much slower than the desktop GTX 680: it draws less power, is clocked lower, and should run much cooler than the desktop part.
Old Oct 24, 2012, 08:27 AM   #7
TallManNY
Thread Starter
macrumors 68020
 
 
Join Date: Nov 2007
Quote:
Originally Posted by Dirtyharry50 View Post
The one you bought did have the most powerful mobile GPU available at its release.

The difference this time around is you can get the top mobile GPU but it's going to be BTO and it's going to cost you. How much is unknown at this point. I would not expect it to be cheap.
Yep, pretty big difference I think. I'm obviously not in the market for an upgrade so soon. And I can pretty much still run anything that comes out now on bootcamp at acceptable settings. But I am going to see how much the pricing would be to buy the comparable iMac this year versus last year. I bet it costs a few hundred dollars more. The GPU upgrade could even be as much as another $300.
__________________
Mid-2011 3.1GHz i5 iMac (6970m); HP Spectre (Win 8.1)
BBRY Q10; iPhone 6; iPad Mini-R
Apple Stockholder (a nice dividend, stock buybacks and cutting edge innovation? yes please!)
Old Oct 24, 2012, 09:32 AM   #8
SlickShoes
macrumors 6502a
 
Join Date: Jan 2011
A guy on a game forum I post on wrote this up a couple of weeks ago:

For months, Nvidia has had a gaping hole in their mobile lineup. The GeForce GTX 660M was based on the generation's slowest GPU chip from either vendor. The GeForce GTX 680M was a very nice card, but it cost a fortune. Laptop vendors don't publish absolute prices, only the price differences between various cards, but if you got a GTX 680M, you were probably paying about $700 or $800 for the video card alone. That's fine if you were planning on spending $2,500 on a laptop, but not if you're looking for something a little more budget-friendly.

In between, there was the GeForce GTX 670M and the GTX 675M. Unfortunately, those were old Fermi rebrands. You could cope with the massive heat output in a desktop, but not a laptop.

So you buy AMD this generation, right? The Radeon HD 7970M certainly looks good in terms of specs and price. But in previous generations, going AMD had typically meant keeping the discrete card running all of the time, as AMD didn't bundle their own drivers with Intel video drivers for integrated graphics, the way that Nvidia did. That's fine for some purposes, but it will kill your battery life.

But no, this generation, Clevo decided to use AMD Enduro switchable graphics, rather than leaving an AMD card running all the time. That meant no driver updates. It also ran into a rather problematic glitch where AMD switchable graphics wasn't able to use the full PCI Express bandwidth, which hurt performance badly. A 7970M was still faster and far more efficient than previous generation cards, but not nearly as fast as it should have been.

For all the promise of 28 nm, the generation is nearly over and neither side had something suitable for $1500-$2000 gaming laptops on the market.

Today, Nvidia fixed that with the launch of the GeForce GTX 675MX and GTX 670MX. The model number hole between the 660M and 680M was already filled by the 675M and 670M. Rather than getting creative by having the third digit be something other than 0 or 5, or calling a card a GTX 665M (and thereby conjuring comparisons to the disastrous GeForce GTX 465, a marketing no-no), Nvidia marketing decided to add an X onto the end. But contrary to the similar names, the GTX 675MX has nothing to do with the GTX 675M, and likewise if you subtract 5 from the model numbers.
The short story is that the GeForce GTX 675MX is a severely cut down GK104 die, with only 5 of the 8 SMXes active. The GeForce GTX 670MX is a fully-functional GK106 die. With the same clock speeds, they should have identical GPU performance. The difference is in video memory, where the GTX 675MX has four memory channels at 900 MHz, while the GTX 670MX has three channels at 700 MHz. Both are more expensive than the GeForce GTX 660M, of course, but not outlandishly so--and they're cheaper than a Radeon HD 7970M.
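Those memory-channel figures map directly onto bandwidth. A minimal sketch, assuming standard 64-bit GDDR5 channels with a 4x data rate (the channel counts and clocks are the ones quoted above):

```python
def bandwidth_gbs(channels, mem_clock_mhz, data_rate=4, channel_bits=64):
    """Peak memory bandwidth in GB/s for a GDDR5 bus."""
    bytes_per_transfer = channels * channel_bits / 8
    return bytes_per_transfer * mem_clock_mhz * data_rate / 1000.0

gtx_675mx = bandwidth_gbs(4, 900)  # four channels at 900 MHz
gtx_670mx = bandwidth_gbs(3, 700)  # three channels at 700 MHz

print(gtx_675mx)  # 115.2 GB/s
print(gtx_670mx)  # 67.2 GB/s
```

So with identical GPU throughput, the 675MX's memory subsystem has roughly 70% more headroom than the 670MX's.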

All is not lost on the AMD front, though. AMD recently launched their first laptop drivers for discrete switchable graphics. A hotfix for the PCI Express bus problem is currently being tested and due for public release next week. Those who bought a 7970M early on won't be left out in the cold. Hopefully this will mean that we see some laptops equipped with a Radeon HD 7870M shortly, too.
More generally, all up and down the lineup, Nvidia seems to have bet that more memory capacity and more GPU performance will win, while AMD has bet that more memory bandwidth will win. A GeForce GTX 680M has considerably more GPU power than a Radeon HD 7970M, but only 3/4 of the memory bandwidth. A GeForce GTX 670MX has a little shy of double the GPU performance of a Radeon HD 7870M, but only 5% more memory bandwidth.

So which side is right here? On memory capacity, from a performance perspective, AMD is right and Nvidia is wrong. It really is that simple. But they probably both knew that a long time ago. Nvidia is betting that customers are stupid and will think that more video memory means a faster card, and from a marketing perspective, they might be right about that. Maybe.

Where it gets more interesting is the GPU performance versus memory bandwidth tradeoffs. Here, it's the same story in desktops, where Nvidia went for more GPU performance while AMD went for more memory bandwidth. And who bet correctly?

For older games that have MSAA (or maybe SSAA through drivers) as their only anti-aliasing options, AMD wins. But as post-processing anti-aliasing effects such as FXAA replace the traditional MSAA, GPU performance matters a lot more and video memory bandwidth less. In that case, Nvidia wins.
Assuming that the transition to post-processing anti-aliasing continues and MSAA dies out (which it should, but then, DirectX 9.0c should have died out by now, too), Nvidia has the more forward-looking architecture here, and there's a good chance that in games that launch two or three years from now, an Nvidia card would do better as compared to an AMD card in today's games. Don't expect miracles; this isn't going to magically double the performance of Nvidia cards. But movement of 5% in Nvidia's direction is a realistic possibility.

Furthermore, if you're comparing two cards that are both plenty fast enough in older games today, it doesn't matter which card is faster. All that matters is how they'll perform in the future, more demanding games, where both cards might not have far more performance than you need anymore.
This only applies if we're comparing Kepler to Southern Islands. In particular, it doesn't apply to Fermi, which still is and always will be a train wreck. But AMD didn't regress here; Nvidia simply got better. Fermi didn't scale well to highly demanding cases. Kepler fixed that.
Old Oct 24, 2012, 10:30 AM   #9
MacGamerHQ
macrumors member
 
Join Date: Sep 2012
Location: Lyon, France
Quote:
Originally Posted by doh123 View Post
while a GTX680MX is the fastest Mobile GPU available right now... its still much slower than the desktop GTX680 ... its lower power, and clocked lower... and should run much cooler than the desktop part.
Indeed, the mobile version will never be as powerful as the desktop.

However, I would assume all all-in-one PCs use "mobile" graphics cards, so we can assume Apple is right in saying their new iMacs will have the best graphics among all-in-one PCs... (first assumption to be checked!)
__________________
Runs MacGamerHQ.com, the only Mac Gaming Blog left alive!
Old Oct 24, 2012, 01:08 PM   #10
cirus
macrumors 6502a
 
Join Date: Mar 2011
I would keep in mind that there is a disclaimer on the Nvidia website, right above the specifications, that the actual card is subject to manufacturer variations, which to me seems aimed at Apple and implies reduced clocks.
Old Oct 24, 2012, 01:57 PM   #11
doh123
macrumors 65816
 
Join Date: Dec 2009
Quote:
Originally Posted by cirus View Post
I would keep in mind that there is a disclaimer on the nvidia website, right above the specifications, that the actual card is subject to manufacturer variations, which to me seems to imply apple, implies reduced clocks.
Or increased? Why assume it's only decreased?

It's the same for all GPUs on the website there.

Apple runs them however Apple finds best.

For example, the GT 650M in the 15" rMBP is actually clocked much higher than the Nvidia spec, around the level of the GTX 660M.
Old Oct 24, 2012, 07:14 PM   #12
cirus
macrumors 6502a
 
Join Date: Mar 2011
Quote:
Originally Posted by doh123 View Post
or increased...? Why assume its only decreased?

Its the same for all GPUs on the website there.

Apple runs them how Apple finds best.

For example, the GT650M in the rMBP 15" is actually clocked much higher than the nvidia spec, and around the level of the GTX660M.
It's entirely possible, but given the thermal constraints I would say that if it differs from the manufacturer spec in any way, it would be lower clocks.

Aside: I never got why Apple used the GT 650M at higher clocks instead of the 660M. Sure, they save a couple of bucks (which with their margins really does not matter), but they piss off Nvidia and lose out on some marketing appeal.
Old Oct 25, 2012, 06:27 PM   #13
maghemi
macrumors 6502
 
Join Date: Aug 2009
Location: Melbourne Australia
Quote:
Originally Posted by cirus View Post
and lose out on some marketing appeal
Not much marketing appeal, and only to a small segment. The vast majority of Apple laptop buyers couldn't give two hoots about video card model numbers, specs, or clock speeds.

It's most likely all about margin. From Apple's point of view, they can get an acceptable degree of performance for a few dollars less per laptop. Over the number of laptops they sell, that turns into a massive chunk of money.

The desktop is a different story, I'd say, as people who really care about graphics performance are more likely to buy iMacs than laptops (I just wish they'd do something with the graphics on the Mac Pro).
__________________
A Mac Computer
Old Nov 20, 2012, 03:02 PM   #14
omenatarhuri
macrumors 6502
 
Join Date: Feb 2010
I have a hackintosh with an Ivy Bridge 3570K and a desktop GeForce 670. I also have a 2.2 GHz i7 MBP with the 6750M.

I wonder how the new 27" iMac with the 675MX or 680MX performs relative to what I have. My guess is it lands closer to the MBP while lagging far behind the hackintosh.
Old Nov 20, 2012, 03:37 PM   #15
cluthz
macrumors 68040
 
 
Join Date: Jun 2004
Location: Norway
Quote:
Originally Posted by omenatarhuri View Post
I have a hackintosh with Ivy Bridge 3570k and desktop version Geforce 670. Also I have a MBP 2,2Ghz i7 with the 6750M.

I wonder how the new iMac 27" with 675MX or 680 performs relative to what I have. My guess is right in closer to the MBP, while lagging far behind the hackintosh (in performance).
From 3Dmark11
GTX670: ~9400
GTX680m:~6600
6750M: ~2000

The GTX 680MX is an improved GTX 680M, so it should perform better. The chip is not available yet, but the specs are out: the GTX 680MX features 1536 CUDA cores instead of the 680M's 1344, plus higher memory clocks (720/2500 MHz vs 720/1800 MHz).

How this affects real-world performance is *probably* a 10-20% increase, which would put it in the 7200-8000 range in 3DMark11.

The 680MX will not match the GTX 670, but it should score more like 3.5-4x the 6750M.
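That estimate is simple arithmetic on the 3DMark11 ballparks above (the 10-20% uplift is a guess based on the extra CUDA cores and faster memory, not a measurement):

```python
gtx_680m = 6600  # 3DMark11 ballpark scores quoted above
hd_6750m = 2000

# Assumed 10-20% uplift for the 680MX over the 680M.
low, high = gtx_680m * 1.10, gtx_680m * 1.20
print(round(low), round(high))   # 7260 7920 -> roughly the 7200-8000 range
print(round(low / hd_6750m, 1))  # 3.6 -> several times the 6750M's score
```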
__________________
-tb
MacBook Air 13" i5 osx10.7.5
HackPro i7-4790k, 16GB RAM, GTX780GHz Edition, 3x SSD , win7+osx10.9.4
Old Nov 20, 2012, 04:17 PM   #16
Asgorath
macrumors 6502a
 
Join Date: Mar 2012
Quote:
Originally Posted by SlickShoes View Post
So which side is right here? On memory capacity, from a performance perspective, AMD is right and Nvidia is wrong. It really is that simple.
And this is proven to be true by the fact AMD is beating NVIDIA by 25% in all cases, right? It's really not that simple at all, and this is a very naive way of looking at GPU performance. Sure, the AMD parts might have more raw memory bandwidth than the NVIDIA ones. However, raw memory bandwidth is only one factor, and the fact that NVIDIA is actually ahead in many real-world game benchmarks suggests that this view is just flat-out incorrect. If an application is limited by shader horsepower alone, then more shader horsepower translates into higher FPS, and this is why NVIDIA can be competitive with less raw memory bandwidth.
Old Nov 20, 2012, 06:05 PM   #17
pedromartins
macrumors member
 
Join Date: Sep 2012
Location: Porto, Portugal
Quote:
Originally Posted by doh123 View Post
while a GTX680MX is the fastest Mobile GPU available right now... its still much slower than the desktop GTX680 ... its lower power, and clocked lower... and should run much cooler than the desktop part.
It is only about 30% slower, since it is underclocked about 30%. That's amazing when battery isn't a concern (which would be impossible on a laptop).
__________________
I'm just a fan of Apple products and the company in itself, as long as they keep following the path of awesomeness.
Old Nov 20, 2012, 06:37 PM   #18
cluthz
macrumors 68040
 
 
Join Date: Jun 2004
Location: Norway
Quote:
Originally Posted by pedromartins View Post
it is only 30% slower, since it is underclocked 30%. This is amazing (if battery isn't a concern... impossible on a laptop)
30% less core clock speed does not equal 30% less performance; there are many factors, as the memory is also slower, and so on.

It could well end up around 30% slower, but generally a 30% clock decrease doesn't mean a 30% speed decrease.

A GPU is a complex chip whose speed is a mix of core speed, shader speed, memory bandwidth and speed, and the number of ROPs (raster operation pipelines), SPUs (or CUDA cores, as NVIDIA calls them), TAUs (texture address units), and so forth.
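As an illustration of that point, compare the desktop GTX 680 with the GTX 680MX using spec-sheet numbers (which Apple's shipping clocks may well deviate from): peak compute drops roughly with the core clock, but memory bandwidth falls by less, so the overall slowdown depends on what a given game is limited by.

```python
def gflops(cores, clock_mhz):
    return 2 * cores * clock_mhz / 1000.0  # 2 FLOPs per CUDA core per clock

# Same 1536-core GK104 die; clocks and bandwidth from the public spec sheets.
desktop_680  = {"gflops": gflops(1536, 1006), "gb_s": 192.2}
mobile_680mx = {"gflops": gflops(1536, 720),  "gb_s": 160.0}

compute_ratio   = mobile_680mx["gflops"] / desktop_680["gflops"]
bandwidth_ratio = mobile_680mx["gb_s"] / desktop_680["gb_s"]
print(round(compute_ratio, 2))    # 0.72 -> ~28% less peak compute
print(round(bandwidth_ratio, 2))  # 0.83 -> only ~17% less bandwidth
```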
__________________
-tb
MacBook Air 13" i5 osx10.7.5
HackPro i7-4790k, 16GB RAM, GTX780GHz Edition, 3x SSD , win7+osx10.9.4
Old Nov 21, 2012, 01:19 AM   #19
adder7712
macrumors 68000
 
 
Join Date: Mar 2009
Location: Anywhere
Mobile GPUs are underclocked compared to their desktop equivalents.

The GTX 675MX won't equal the GTX 670 in performance, but it's still a fast mobile GPU.
__________________
Custom PC (Windows 8.1), MacBook Aluminium (OS X Mavericks), iPad 3, Sony Xperia Z3, Samsung Galaxy Tab S 8.4
Old Nov 21, 2012, 05:42 PM   #20
MacsRgr8
macrumors 604
 
 
Join Date: Sep 2002
Location: The Netherlands
The GTX 680MX outperforms the desktop Radeon 5870....

iMac FTW over the Mac Pro for gaming!

I think the maxed-out 27" iMac will be my first purchase that isn't a Mac Pro or Power Mac!
__________________
Steve Jobs. 1955 - 2011. My Hero.
Old Nov 21, 2012, 08:05 PM   #21
barrett14
macrumors regular
 
Join Date: Jun 2010
How will my old PC compare to the new iMac with these specs?

Current PC:
i7 950 processor
GTX 470 video card
6 GB of RAM

The new iMac would have the i7 and probably just the 675MX, unless the 680MX turns out to be a lot cheaper. I'm not sure how much gaming I would be doing, but it would be nice to have the option.
Old Nov 22, 2012, 02:59 PM   #22
omenatarhuri
macrumors 6502
 
Join Date: Feb 2010
Quote:
Originally Posted by cluthz View Post
From 3Dmark11
GTX670: ~9400
GTX680m:~6600
6750M: ~2000

The GTX680MX is a improved GTX680M, so it should perform better. The chip is not available yet, but specs are:
GTX 680MX features 1536 instead of 1344 CUDA cores (compared to 680m) and higher memory clocks (720/2500MHz vs 720/1800 MHz).

How this affect real world performance is *probably* 10-20% increase, which will put the in the 7200-8000 score in 3DMark11.

The 680MX will not match the GTX670, but be more like 300-400% faster than the 6750M.
Thanks for this, excellent reply. That is stunningly close to a state-of-the-art desktop graphics card, and it seems to run circles around the MBP as well.

What most intrigues me is the display, the lamination of it, etc. Currently I have a 30" LG and a 24" Dell. I like the size of the former and the picture of the latter. Seems like the new 27" iMac could combine those strengths.

And the iMac does look gorgeous with the new design.
Old Jan 7, 2013, 01:35 PM   #23
costa2013
macrumors newbie
 
Join Date: Jan 2013
GTX 675mx is no good in comparison to any GTX cards you get on PC
Old Jan 7, 2013, 07:06 PM   #24
doh123
macrumors 65816
 
Join Date: Dec 2009
Quote:
Originally Posted by costa2013 View Post
GTX 675mx is no good in comparison to any GTX cards you get on PC
You mean desktop GPUs... it has nothing to do with PC or not PC. You can get a GTX 675MX in a 'PC' as well and it works fine. What you mean is that desktop versions of GPUs are much better in most ways than the mobile versions.
Old Jan 9, 2013, 01:48 PM   #25
Andrew1001
macrumors newbie
 
Join Date: Jan 2013
So, someone help a noob out...

When I read about the new iMac I was pretty interested at the news on the GPU. But someone help me out...

I'm currently running a late-2009 21.5inch iMac (3.06 GHz Intel Core 2 Duo, 4 GB 1067 MHz DDR3, ATI Radeon HD 4670 256 MB) and I use it for the usual (internet, aperture, photoshop, work stuff) and it is more than adequate. However, where I'm feeling let down is for gaming. At weekends I love to boot into Windoze and play MW3 or Black Ops - and these run at acceptable quality and speed. But newer titles like Black Ops II or Battlefield 3 run like treacle in winter (i.e. very slowly!).

So the question is: how much of a step up would a new iMac with the NVIDIA GeForce GTX 680MX 2 GB be? It seems a pretty big step up to me. What are the opinions on whether this set-up will keep me set for another 3-4 years of gaming?

Thanks!

Andrew
