Nvidia GTX 680M too expensive for iMac.

Discussion in 'iMac' started by Kerel, Jun 18, 2012.

  1. Kerel macrumors member

    May 19, 2011
    Weimar Germany
    I know everybody is hoping for the GTX 680M to be the GPU of choice for the new high-end iMac, and that the update of the iMac line is taking so long because of availability problems caused by 28nm wafer shortages affecting this GPU.

    But before you get too excited and risk disappointment when the new iMacs arrive, consider the price.

    This GPU costs between 400 and 500 dollars/euros in desktop form.

    The current 6970M used in the high-end iMac is based on the desktop Radeon HD 6850, which cost around 150 dollars/euros when the 2011 iMac model was released.

    There is no way Apple is going to use such an expensive GPU in their iMac line.

    My guess is that Apple is waiting for the Nvidia GK106 chip, which hasn't been released yet.
    That would also explain why the 2012 iMac model hasn't been released, even though it's been over a year since the last update.

    The expected release date for this GPU is this summer.
    Rumored performance is around twice as fast as a GK107/GT 650.
  2. Razorhog macrumors 65816


    Sep 16, 2006
    I've thought the same thing and I hope we are both wrong. I'd be fine with the 7970.
  3. dayloon macrumors 6502

    Apr 19, 2005
    Stafford, UK
    Have to admit it's crossed my mind as well. If they did include the 680M, they'd have to add a huge price increase.
  4. alksion macrumors 68000


    Sep 10, 2010
    Los Angeles County
    I really hope not! I don't see the 680m being standard on the highest end 27, but I see it being BTO and possibly a costly one at that, something I wouldn't mind paying extra for.
  5. bogatyr macrumors 65816

    Mar 13, 2012
    I'm fine with the 7970m, just so long as they release a new iMac soon! :)
  6. Mister Bumbo macrumors 6502

    Apr 30, 2012
    Would Apple, as a partner, with different deals and contracts and whatnot, pay full price? I'm not sure the price difference compared to other cards is as big for Apple as it is for consumers.
  7. theSeb macrumors 604


    Aug 10, 2010
    Poole, England
    They certainly would not be paying anywhere near the recommended retail price since they order thousands at a time.
  8. Mister Bumbo macrumors 6502

    Apr 30, 2012
    Precisely. So one could argue that the prices we see as consumers matter very little in trades between companies. It might even be that Nvidia chooses to mark up their product much more than AMD does; the actual production costs and other factors that matter in business-to-business deals might not be reflected in the retail prices known to us.
  9. forty2j macrumors 68030


    Jul 11, 2008
    I can't even find anything that says they'd make a mobile GPU out of the GK106. They are suggesting it would be the desktop GTX 660, which would make it roughly a 670M? But the 660M, 670M, and 675M are already accounted for.

    Apple went top-available AMD last year. To go middle-of-the-road Nvidia this year would be disappointing.
  10. boto macrumors 6502


    Jun 4, 2012
    Regardless of which GPU manufacturer they choose, I will enjoy a 680M or 7970M. Even if I need to pay extra to get the Nvidia chip, I would gladly pay for fewer driver issues, better compatibility, and more features integrated into the Nvidia platform.
  11. Johnf1285 macrumors 6502a


    Dec 25, 2010
    New Jersey
    Correct me if I'm wrong, but isn't it speculated that it's the GT 680M that will appear in the iMac, not the GTX?
  12. boto macrumors 6502


    Jun 4, 2012
    GTX denotes the higher-performance models, whereas GT is for mid-range and below. So a GT 680M does not exist.
  13. DeF46 macrumors regular

    May 9, 2012

    Otherwise, for comparison, we could ask what the price of the 6970 was at release.
  14. jmhart macrumors regular

    Jun 14, 2012
    Too expensive for iMac? I've never seen cost prevent Apple from going ahead with anything. If anything, they would just pass down the cost increase to us in the form of a BTO option. You gotta pay to play.

    There's no way they're going to leave out the top-end mobile GPU because of cost. I can see them not including it if it runs too hot for the form factor, but that's about the only way I see it not being a user-selectable option at a premium price.
  15. Melbourne Park macrumors 6502

    Mar 5, 2012
    It's suitable for the iMac; it runs cooler.


    Nvidia GeForce GTX 680M GPU Adds Kepler Strength to Top-Tier Line-Up
    June 5th, 2012 by Kenneth Butler, LAPTOP Web Producer/Writer
    Today Nvidia announced the GeForce GTX 680M, a version of the GPU maker’s high-end graphics solution that’s infused with the power of Kepler, its latest design process. The 680M marks a new top-performance graphics solution that is powerful enough to facilitate HD gaming but cool enough to keep thin notebooks, like Ultrabooks, from overheating.

    It’s well-known that the Kepler generation of graphics cards packs superlative performance. Thanks to a smaller 28-nanometer design and special built-in features like improved anti-aliasing technology for rendering smoother lines, Kepler outperformed several other chip solutions, including two notebooks with high-end AMD Radeon graphics, in our benchmark tests.

    We have yet to test the 680M GPU, but with 2GB of video RAM and support for an arsenal of killer Nvidia features like Optimus switchable graphics and 3D Vision 2 technology, the new GeForce GPU isn’t likely to disappoint. Look for the 680M in upcoming Alienware M17x and M18x models, the MSI GT70, and new machines from Ava Direct, Maingear, and Origin.

    End of Quote

    There are speed tests out there for this GPU, and also for the 7970M.
  16. Johnf1285 macrumors 6502a


    Dec 25, 2010
    New Jersey
    I understand the different tiers that Nvidia offers. I just misunderstood which speculated mobile chip everyone thinks Apple is waiting for.

    In my opinion, I bet we see a spec bump sometime in July:

    Ivy Bridge
    Cheaper SSD options / Configurations
    HD7xxxm GPUs
  17. Rlnplehshalo macrumors regular

    Jan 28, 2011
    As long as some form of current GPU from Nvidia or AMD with a 2GB+ memory option is available, I'm happy :)