2nd gen iMac Retina gpu?

Discussion in 'iMac' started by belleville, Jun 18, 2015.

  1. belleville macrumors newbie

    Joined:
    Sep 29, 2006
    #1
    Hi everyone!

    I'm in the market for a Retina iMac. Owning a MacBook Air lets me wait for the upcoming Skylake refresh. If I were buying one now, I'd go for the i7, the M295X GPU and an SSD.

    My question is, what do you guys think will happen to the GPU? Will it be upgraded? Is there anything better than the M295X coming soon from AMD? I know that Apple likes to play ping pong between AMD and Nvidia.

    I've used Macs for over 10 years but have never really paid attention to how they update the GPU. I just wish they had gone for the Nvidia 980M, though.

    What are your thoughts?
     
  2. slayerizer macrumors 6502a

    slayerizer

    Joined:
    Nov 9, 2012
    Location:
    Canada
    #2
    AMD just announced new video cards, and they will probably make mobile versions soon after; those could well be included in the next iMac refresh. I prefer Nvidia cards in my PC, as I've always preferred their drivers. Let's wait and see.
     
  3. dagamer34 macrumors 65816

    dagamer34

    Joined:
    May 1, 2007
    Location:
    Houston, TX
    #3
    The interesting 14/16nm GPUs won't be coming until the middle of next year.
     
  4. belleville thread starter macrumors newbie

    Joined:
    Sep 29, 2006
    #4
    I had an Nvidia card in my Hackintosh, but I couldn't get it to stay silent at idle, which bothered me too much while doing light office tasks, so I ended up selling it.

    14/16nm GPU? Is that the mobile version of a GPU?
     
  5. Samuelsan2001 macrumors 603

    Joined:
    Oct 24, 2013
    #5
    No, that's the gate length of the manufacturing process. GPUs have been slow to shrink compared to CPUs, with Intel currently on a 14nm process and planning to move to 10nm for Cannon Lake. These shrinks make chips smaller, faster and more energy efficient.

    The mobile versions of GPUs usually have an M in the name; the M290X, for instance, where the M stands for mobile.
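    The process-shrink point above can be made concrete with a back-of-envelope calculation. The node figures below are illustrative round numbers (28nm was a typical GPU node at the time), not measured die data, and real designs never shrink perfectly with the node:

    ```python
    # Idealized scaling only: feature size roughly sets transistor dimensions,
    # so transistor area scales with the square of the node ratio.

    old_node_nm = 28.0   # typical GPU process node in 2015
    new_node_nm = 14.0   # the upcoming 14/16nm class

    area_ratio = (new_node_nm / old_node_nm) ** 2
    print(f"Same transistor count in ~{area_ratio:.0%} of the die area")
    # (14/28)^2 = 0.25 -> roughly a quarter of the area in the ideal case,
    # which is why a shrink buys either smaller chips or more transistors
    # in the same power envelope.
    ```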
     
  7. Dubadai, Jun 19, 2015
    Last edited: Jun 19, 2015

    Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #7
    I just bought the late 2014 Retina iMac. After about two months of contemplating waiting for the new one coming this year, I decided to buy it now instead.

    The new Skylake isn't going to be major for desktops. The trustworthy benchmarks that have leaked show around a 10% increase. Put that into real use and there won't be much reason to wait, especially with El Capitan's optimizations.

    Regarding GPUs: I really don't think Apple will use Nvidia. If we look at the current "new" AMD cards, we see rebrands except for the top cards. With this in mind I really don't see much of an improvement coming, 10-15% at most. Again, nothing worth the wait. Sure, there are other things such as the new USB etc., but the core performance won't be much to write home about compared to the late 2014 model.

    On the other hand, I do believe that people who opt for the high-end Retina iMac this fall will get much more of an improvement than those who opt for the base model, since the lower models will mostly get rebranded cards.

    One thing to note, though, is that the M290X is based on the 7970M (correct me if I'm wrong). So I could be completely wrong about this fall's update if there is a whole new chipset etc.

    I went a bit off topic, but I hope I somehow answered your question. :)

    Edit:
    Read this:
    http://www.notebookcheck.net/AMD-Radeon-R9-M390X.144432.0.html
     
  9. belleville thread starter macrumors newbie

    Joined:
    Sep 29, 2006
    #9
    Thanks for your explanation.
     
  10. lchlch macrumors 6502a

    Joined:
    Mar 12, 2015
    #10
    The M390X is not the top of the line. The top of the line is the R9 Fury X, which should be about 60% faster than the older-generation R9 290X.

    The Fury line uses HBM, which enables a smaller GPU PCB and up to a 2x reduction in power consumption; if used in the iMac, that would delay throttling.
     
  11. Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #11
    Yes, you're correct that the M390X isn't the top-of-the-line card, but I have a hard time believing that the R9 Fury X will translate into a mobile card in the way we'd expect. The desktop model is extremely power hungry, so I honestly think it'll be a while until we see something much better than the M390X for mobile...

    I guess we'll have to wait and see what mobile cards they do release...

    I do hope they bring the Fury line into the mobile range, but I have a feeling it'll be next year and not for this year...

    But I could be wrong of course :)
     
  12. lchlch macrumors 6502a

    Joined:
    Mar 12, 2015
    #12
    I think the Fury line is better suited to mobile, because it consumes less power and hence produces less heat.
     
  13. Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #13
    "Despite using two 8-pin power connectors, the Fury X's power consumption isn't as high as some feared: the TDP is 275W, just a tad higher than the R9 290X's, although it's worth bearing in mind that in real-world usage, the R9 290X was much closer to 300W. The Fury X supports up to 375W of power for overclocking." - Ars Technica

    Not so much "less power" I'm afraid... We'll see how the mobile cards turn out.
     
  14. Serban Suspended

    Joined:
    Jan 8, 2013
    #14
    So when will AMD release the top-of-the-line dedicated GPU, like an M395X?
     
  15. MikhailT, Jun 21, 2015
    Last edited: Jun 21, 2015

    MikhailT macrumors 601

    Joined:
    Nov 12, 2007
    #15
    What about the Fury Nano? It's a 175-watt card; I wonder if that's usable in an iMac compared to the mobile x90x cards. Ars said 2x perf per watt against the 290X. If they can bring that to the mobile series, it might be a good update.

    The 290X isn't the high end anymore; it's been replaced by the Fury series, which looks interesting with HBM.
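    The "2x perf per watt" claim can be sanity-checked with quick arithmetic. The wattages below are the figures quoted in this thread (175W Nano, ~290W real-world 290X), and the result is only an idealized back-of-envelope number, not a benchmark:

    ```python
    # Back-of-envelope perf-per-watt comparison using the thread's figures.
    # Baseline: desktop R9 290X at ~290 W real-world draw, performance = 1.0.

    baseline_watts = 290.0
    baseline_perf = 1.0
    baseline_ppw = baseline_perf / baseline_watts  # perf per watt

    nano_watts = 175.0
    nano_ppw = 2.0 * baseline_ppw                  # the "2x perf per watt" claim
    nano_perf = nano_ppw * nano_watts              # implied relative performance

    print(f"Implied Fury Nano performance vs 290X: {nano_perf:.2f}x")
    # 2 * (1/290) * 175 = ~1.21x the 290X, at about 60% of the power
    ```

    In other words, if the claim holds, the Nano would beat a 290X outright while drawing far less power, which is what makes it interesting for a thermally constrained all-in-one.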
     
  16. Stacc macrumors 6502a

    Joined:
    Jun 22, 2005
    #16
    One problem here is that the Fury Nano is essentially double the processing power of the M295X. In AMD chip nomenclature, Fiji (Fury) is essentially Tonga (M295X) x2, with HBM instead of GDDR5. Right now it sounds like the cooling in the iMac struggles with the M295X, so it's doubtful whether any increase in thermal load is tolerable. If the enclosure or cooling system doesn't change, either the Fury will have to be significantly downclocked, reducing its performance, or we will be stuck with something similar to the M295X. Of course, the alternative is something from Nvidia, but Apple has certainly been going out of its way to avoid them.
     
  17. Serban Suspended

    Joined:
    Jan 8, 2013
    #17
    So what GPU will the next 27" iMac probably get?
     
  19. Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #19
    In other words, we're stuck with something very much the same as what we have today?
     
  20. Serban Suspended

    Joined:
    Jan 8, 2013
    #20
    Yes, until next year, when we hope we'll get a 1080M from Nvidia.
     
  21. Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #21
    While it's a little bit of a disappointment, it isn't Apple's fault. If they've decided to stick with AMD, and AMD rebrands most of its cards for this release, then we as consumers get rebranded cards. I'm sure there will be an increase in performance, just not by a lot.

    On the other hand, this leaves everyone with the late 2014 model not too far behind (which feels good for us, haha). Had they put a GTX 980M in the late 2015 5K iMac, most would obviously HAVE to upgrade ;)
     
  22. soupcan macrumors 6502a

    soupcan

    Joined:
    Nov 21, 2014
    Location:
    Netherlands
    #22
    I'm still hoping they will. At some point Apple will get tired of all the bad PR for using rebranded AMD cards (and, by the way, it's AMD that does the rebranding).

    I'm still happy I got my MacBook Pro with the 750M, and if I were in the market for an iMac I would instantly get a second-hand one with the GTX 780M.
     
  23. Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #23
    Yeah, but I understand their decisions as well. People always complain that you can't game etc. on a Mac. The thing is:

    It. Is. Not. A. Gaming. Computer.

    Sure, it's got potential, but why on earth should Apple listen to gamers and put in graphics cards built for games when the current cards handle everything a Mac should do just fine...

    I understand the viewpoint of having more power, but not people who complain because it doesn't run Witcher 3 on high.

    A Mac is a workstation (give or take, depending on model), a gaming pc is a gaming pc.
     
  24. soupcan macrumors 6502a

    soupcan

    Joined:
    Nov 21, 2014
    Location:
    Netherlands
    #24
    It's not just gaming. There's this thing called GPU acceleration, and it's getting more and more support from companies like Adobe, Autodesk and Maxon. Thing is, they're using CUDA, and AMD GPUs don't support CUDA. Apple wants to push the industry to OpenCL, but the software makers simply refuse. Why bother when you already have a stable platform in CUDA? This is why people are upset. I know I was when Apple put the M370X in the MacBook Pro; I was expecting a GTX 950M or GTX 960M in there, which outperforms the M370X at almost everything.
     
  25. Dubadai macrumors regular

    Dubadai

    Joined:
    Jun 16, 2015
    Location:
    Stockholm, Sweden.
    #25
    I'm well aware of GPU acceleration. I'm also sure you're aware that Apple usually doesn't do things the mainstream way, even though leaving Nvidia was a mistake.

    There is always a reason for everything.
     