I know the 680M is the top-spec mobile Nvidia card and SHOULD be iMac bound (either that or the 7970m from AMD), but why are you all so convinced that it will be the 680M and not the 675M? I was just looking here and am thinking that this (the 675M) will be the top STANDARD card instead for the 27". Now according to this site it was originally released in March, which raises the question: why would it not have been released if it's been available for the past 3-4 months? The answer, I believe, is that the 680M is still going to be made available, but only as a pricey BTO option, and is holding everything up.

The 675M is just a rebadged 580M from last year. Its performance improvement over the 6970m is in the 15-20% range, which doesn't make for good marketing copy (notice how they push graphics performance on the 2011 iMac page). The 680M's performance improvement is in the 80-100% range.

I find it unlikely that Apple would just plug in last year's tech. Surely they'd use a 28nm Kepler card, even if that meant a 670M in the low 27" and a 680M in the top end. Note that all the MBPs are using 28nm Kepler 650Ms.
 
Valid points; please read the rest of my post where I addressed those concerns. The minor performance improvements will probably be offset by making the base top-end card a 2GB card over this generation's 1GB. I agree that you would WANT to move to 28nm, but if Nvidia only offers a handful of cards in the correct price/performance range, there is nothing Apple can do about it but use old tech and make it LOOK like an improvement. Granted, it is better than the current cards, but some good old-fashioned embellishment will be needed; something Apple marketing is very good at.
 
If you're looking at the top video option, you're probably a gamer, and you know when you're being duped. "Twice the video RAM!" sounds a bit pathetic. They did actual frame rate comparisons for 2011; they couldn't do anything like that in 2012 using a 675M and come out looking good.

Here's another angle - why would Apple or Nvidia want to bother writing drivers for Fermi for the iMac? The Kepler cards would use essentially the same driver; the Fermi would be completely different.
 
I'm deciding between the M17x and the iMac. If the new iMac releases with the GTX 680M, I'll be running and screaming to get it. The 27" 2560x1440 alone makes the iMac so worth it, for work and gaming. If it packs a real-deal GPU, I'll pay anything for it. If not, maybe I'll have to go with the Alienware. I just need to game. I don't need a particular one per se, but I would really like 1440p AND the best GPUs.

Nice.. I've been trying to "BTO" an Alienware M17x daily to see if the 680M shows up :)

Now when you say "pay this premium price"... compared to what? I would prefer Apple offer AMD choices, and I would take a 7970m in a heartbeat. If Apple went Nvidia, I will settle for no less than a 680M, even if it's only a BTO option in the top 27". The gulf in performance from those two to the rest of the pack is just too great. I'll silently (or perhaps not so silently) curse Nvidia for gouging and Apple for making that the only choice, but I will get the best GPU Apple allows me to get.

Apple doesn't really have the supply chain pull on these components that they have for their more exclusive stuff like Retina displays. So while I can hope they negotiated a far better price from Nvidia, I'm not counting on it.
 

Really? 80-100% more than the 6970m? Wow, that's terrific. And you said it's around 30% more than the 7970m, do I remember correctly? I agree with you that everything will move to a shrunken process; what process are the 7970m and 6970m on? With a hypothetical 3200x2000 Retina display, how many frames would these cards lose compared to 2560x1440? I fear the GPU would be very limited with a Retina display to drive.
 

Nvidia markets the 680M as 30% faster than the 7970m. The reality is more like 10-15%. Both are 28nm.
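Putting the thread's rough numbers together, here's a quick sanity check; it's just a sketch using the posters' estimates as inputs, not measured benchmarks, taking the midpoint of each quoted range:

```python
# Sanity check on the figures quoted in this thread.
# Both inputs are forum estimates (midpoints), not benchmarks.

gtx680m_vs_6970m = 1.90   # "80-100%" faster than the 6970m, midpoint
gtx680m_vs_7970m = 1.125  # "10-15%" faster than the 7970m, midpoint

# Implied 7970m uplift over the 6970m:
hd7970m_vs_6970m = gtx680m_vs_6970m / gtx680m_vs_7970m
print(f"7970m is ~{(hd7970m_vs_6970m - 1) * 100:.0f}% faster than the 6970m")
# -> roughly 69%, i.e. either 28nm card is a huge leap over a 675M/580M
```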

I believe they could get very good performance on a retina display of this size if they used SLI/Crossfire. I wouldn't trust either card to push around the insane amount of pixels at gaming speeds on its own.

I also believe the bigger limiting factor on Retina in 2012 would be the cost of the display panel.
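For a rough sense of the frame-rate hit, here's a back-of-the-envelope sketch. This is my own arithmetic, assuming frame rate scales inversely with pixel count, which only holds when a game is completely pixel/fill-rate bound, so treat it as a pessimistic bound:

```python
# How much does a hypothetical 3200x2000 panel cost in frame rate,
# if the game is entirely pixel-bound? (Rough bound, not a benchmark.)

current = 2560 * 1440   # 3,686,400 px
retina  = 3200 * 2000   # 6,400,000 px

ratio = retina / current  # ~1.74x more pixels to shade per frame

for fps in (60, 45, 30):
    print(f"{fps} fps at 2560x1440 -> ~{fps / ratio:.0f} fps at 3200x2000")
# 60 -> ~35, 45 -> ~26, 30 -> ~17
```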
 
I agree with you.
Even SLI/Crossfire in an iMac isn't so easy to implement, is it?
No space, temperature problems, etc.
 
Dropping the ODD could make room for dual mobile GPUs with space to breathe. It would add another ~100W (the ODD doesn't draw much), though, which Apple would probably find unacceptable.
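To put a rough number on that swap (the ~100W 680M TDP is the ballpark public figure; the optical drive's draw is my own assumption):

```python
# Rough power-budget sketch for replacing the ODD with a second GPU.
gtx680m_tdp_w = 100  # W, approximate TDP of one GTX 680M (ballpark)
slim_odd_w    = 8    # W, assumed draw of a slot-load drive under load

net_extra_w = gtx680m_tdp_w - slim_odd_w
print(f"Second 680M in place of the ODD: ~{net_extra_w} W extra to cool")
# -> ~92 W, in line with the "~100W" estimate above
```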
 
GTX 680Ms in SLI would be absolutely awesome, and it would also make the iMac a real contender for those people who won't get one due to it having only mobile GPUs.

I highly doubt it will happen but it would be cool nonetheless.
 
Apple produces premium products; they won't care if the 680M is more expensive. It's a superior card, so they'll use it and absorb the relatively small cost increase on bulk-bought wholesale cards.

----------

I should add, it would be excellent if they dropped the optical drive with the Retina iMac, freeing up room for another mobile GPU (2x 680M, oh god!) to power that resolution as well as they can.
 
If it comes with SLI 680Ms, I would probably pay whatever they're charging. :D
 