AMD M395X in new Retina iMac

Discussion in 'iMac' started by MandiMac, Aug 16, 2015.

  1. MandiMac macrumors 6502

    Joined:
    Feb 25, 2012
    #1
    Hi all,

    I did some research and came across an interesting fact: the AMD website already has the data for the M395X online. Link is here.

    It seems like the PC version of the R9 M395X is exactly the same as the R9 M295X.

    M295X: 28 nm, 32 Compute Units, 2048 Stream Processors, 723 MHz, 4 GB GDDR5, 1250 MHz memory clock, 160 GB/s memory bandwidth, 256-bit memory interface.

    M395X: 28 nm, 32 Compute Units, 2048 Stream Processors, 723 MHz, 4 GB GDDR5, 1250 MHz memory clock, 160 GB/s memory bandwidth, 256-bit memory interface.

    If AMD is doing such a rebrand, we could at least expect some MHz boost on the engine clock, right? Or does that mean the specs are the same, but the underlying architecture/logic would be more efficient?

    GPUBoss, however, paints another picture and differentiates the Mac version from the reference design of the M295X. There seems to be a difference in the clock speeds.
    The core clock in the Mac version of the R9 M295X seems to be 850 MHz instead of 723 MHz, the memory clock is 1362 MHz instead of 1250 MHz, and thus the floating-point performance and the pixel rate are higher.

    I'm at a loss here - GPU-wise it seems there's no point in waiting for the next revision of the riMac. Or will Apple do some magic and turn the MHz up even higher? I'm completely confused now. Any thoughts?
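    By the way, the listed numbers are internally consistent. A quick back-of-the-envelope check (assuming GDDR5's standard 4x effective data rate per pin and 2 FLOPs per stream processor per clock, with the 850 MHz Mac clock taken from GPUBoss):

```python
# Sanity check of the listed M295X/M395X specs.
# Assumptions: GDDR5 moves 4 bits per pin per command-clock cycle,
# and each stream processor does 2 FLOPs per cycle (fused multiply-add).

def bandwidth_gbs(mem_clock_mhz, bus_width_bits, data_rate=4):
    """Peak memory bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * data_rate * (bus_width_bits / 8) / 1e9

def fp32_tflops(stream_processors, core_clock_mhz):
    """Peak single-precision throughput in TFLOPS."""
    return stream_processors * 2 * core_clock_mhz * 1e6 / 1e12

print(bandwidth_gbs(1250, 256))  # 160.0 GB/s, matching AMD's figure
print(fp32_tflops(2048, 723))    # ~2.96 TFLOPS at the 723 MHz PC clock
print(fp32_tflops(2048, 850))    # ~3.48 TFLOPS at the 850 MHz Mac clock
```

    So the 850 MHz Mac clock alone would be worth roughly 18% more raw compute than the PC reference design.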
     
  2. mjohansen macrumors regular

    Joined:
    Feb 19, 2010
    Location:
    Denmark
    #2
    I seriously hope they will go with the rumoured GTX 990M (http://wccftech.com/nvidia-geforce-mobility-gtx-990m-q4-2015-faster-than-gtx-980/)
     
  3. siddhartha macrumors member

    siddhartha

    Joined:
    Aug 8, 2008
    Location:
    Northern Virgina
    #3

    You are assuming that they stay with AMD for the next GPU in the riMac. There's no guarantee of that.
     
  4. pescartina macrumors newbie

    Joined:
    Aug 16, 2015
    #4
    It's definitely possible, and I'd probably pay $200 more for an nVidia GPU, but when El Capitan code references the M395X, it doesn't look good.
     
  5. AsprineTm macrumors member

    Joined:
    Jun 14, 2014
    #5
    Yep, that's what I'm expecting as well. The 21" will come out with the rebranded M395X found in the El Capitan code.
    I assume the 27" will get a Skylake update in Q2 2016 with AMD graphics with HBM2.
    Apple is never in a hurry to put a decent graphics card in an iMac.
    Maybe they will even wait longer and first bring out a new Mac Pro.

    It's about time Apple invested in some better GPU solutions.
    They always have the best of the best CPUs but combine them with mediocre GPUs.
     
  6. MandiMac thread starter macrumors 6502

    Joined:
    Feb 25, 2012
    #6
    I don't think Apple switches GPU vendors for just one year. AMD is the better solution when it comes to ridiculously high resolutions such as 5K on the riMac. I hoped for a 980M too last year, and now we've got AMD on board. That's how it is in 2015 :)
     
  7. MandiMac thread starter macrumors 6502

    Joined:
    Feb 25, 2012
    #7
    Why do you think that the 21 incher will get the M395X powerhouse? The small iMac never got a decent GPU and I think that won't change anytime soon. My best guess is that the M390X will be in the riMac 27 inch as standard, and the M395X will be BTO just like it is today with the M200 series.
     
  10. johnnyturbouk macrumors 68000

    johnnyturbouk

    Joined:
    Feb 9, 2011
    Location:
    on the yellow [oled] brick road to tech nirvana.
    #10
    Another confounding issue was that the 980M had only just been announced when the RiMacs dropped last year. I had been of a mind to buy the RiMac, but despite usually being an early adopter on launch day, my spidey senses told me to hold off and test the machine out. I am certainly glad I did; from my personal tests on the machines, and from other reports on MacRumors, I am glad I am waiting for gen 2 of the RiMac.

    Please Apple, have the GTX 990M [at least] as an option in the gen 2 RiMac.
     
  11. johnnyturbouk, Aug 16, 2015
    Last edited: Aug 17, 2015

    johnnyturbouk macrumors 68000

    johnnyturbouk

    Joined:
    Feb 9, 2011
    Location:
    on the yellow [oled] brick road to tech nirvana.
    #11
    If Apple has taught us one thing, it's never to say never when predicting forthcoming Apple products. There may have been multiple reasons Apple went with AMD over the Nvidia 980M.


    Do you have any evidence that the AMD card is superior to the Nvidia card in the context of a high-res display? I have heard the contrary, that the 980M would have been the logical choice.
     
  12. jerwin macrumors 65816

    Joined:
    Jun 13, 2015
    #12
    It also seems like the PC version of the R9 M395X is exactly the same as the R9 M390X.
     
  13. mjohansen, Aug 16, 2015
    Last edited: Aug 17, 2015

    mjohansen macrumors regular

    Joined:
    Feb 19, 2010
    Location:
    Denmark
    #13
    But then again, if the new iMac gets Thunderbolt 3 and external GPU support, the iMac itself might not need an internal dedicated GPU as powerful as the GTX 990M.
     
  15. theSeb macrumors 604

    theSeb

    Joined:
    Aug 10, 2010
    Location:
    Poole, England
    #15
    Good point. With the TB3 I could connect a nice 980 Ti
     
  16. boto macrumors 6502

    boto

    Joined:
    Jun 4, 2012
    #16
    I highly doubt Apple will shift back to nVidia when they are getting great custom-made GPUs at much lower cost, but I hope I'm wrong. I'm willing to pay extra for nVidia!!
     
  17. MandiMac thread starter macrumors 6502

    Joined:
    Feb 25, 2012
    #17
    There's no clear evidence, but have a look yourself at these benchmarks over at AnandTech. They clearly show that Nvidia has the advantage at Full HD resolutions, but once you go to 1440p and beyond, the difference is minimal. Sometimes the green team wins, but more often than not, AMD takes the lead. I'm not saying it's clearly superior, but the difference is next to negligible.
     
  18. MandiMac thread starter macrumors 6502

    Joined:
    Feb 25, 2012
    #18
    You're right, this is odd. And the same table shows the difference between the R9 M375 and the R9 M375X - the X version's memory clock is around 25 MHz faster, and it has GDDR5 instead of slow DDR3 memory.

    Maybe this solves the confusion between the current R9 M290 and the R9 M290X? Then again, Apple's site clearly says that the R9 M290 has 2 GB of GDDR5 memory. A jump of around 25 MHz can't be the only difference... very vague.
     
  19. jerwin macrumors 65816

    Joined:
    Jun 13, 2015
    #19
    What you need to do to really compare the R9 series is to take note of how many compute units the card has. This is LuxMark, an OpenCL benchmark.

    [LuxMark screenshot]
    This is LuxMark 3.0. I actually prefer LuxMark 2.0; it's a bit more stable.
    Note the "Compute Units." The R9 M290X has 20 of them, and I believe the R9 M295X has 32.
    How many does the R9 M290 have? 14? 16?
    For that matter, it should be easy to verify that the R9 M370X has 10.

    Note also the clock speed: 975 MHz. This also varies from card to card.

    If you've ever glanced at the reviews of the Fury cards, you'll know that spec-wise they are beasts. Benchmarked with real games, though, they don't hold much of an advantage over the nVidia Titans.
    So specs are one thing; performance in the games you want to play is quite another.
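    To put numbers on that: GCN parts have 64 stream processors per compute unit, so the compute-unit count maps directly to shader count and peak throughput. A sketch (the 975 MHz clock is the one from the LuxMark readout mentioned above):

```python
# GCN GPUs have 64 stream processors per compute unit, so the
# "Compute Units" count LuxMark reports maps directly to shader count.
SP_PER_CU = 64  # fixed for AMD's GCN architecture

def shader_count(compute_units):
    return compute_units * SP_PER_CU

def peak_gflops(compute_units, clock_mhz):
    # 2 FLOPs per stream processor per cycle (fused multiply-add)
    return shader_count(compute_units) * 2 * clock_mhz / 1000

print(shader_count(20))       # R9 M290X: 1280 stream processors
print(shader_count(32))       # R9 M295X: 2048 stream processors
print(shader_count(10))       # R9 M370X: 640 stream processors
print(peak_gflops(20, 975))   # 2496.0 GFLOPS at a 975 MHz clock
```

    Of course, these are peak numbers only; game benchmarks are what matter in the end.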
     
  21. filmak macrumors 6502a

    Joined:
    Jun 21, 2012
    Location:
    between earth and heaven
    #21
    IMHO, whatever chip they put inside the new riMac, the priority should be to do something about its thermal design; there is a need for something better than in the current high-end model.
    Especially if they choose, and they probably will, an AMD solution.
     
  22. Nunyabinez, Aug 19, 2015
    Last edited: Aug 19, 2015

    Nunyabinez macrumors 68000

    Nunyabinez

    Joined:
    Apr 27, 2010
    Location:
    Provo, UT
    #22
    My personal experience is different. I have the riMac with the R9 M295X and a gaming rig with a GTX 970. With the same settings the GTX 970's FPS are considerably higher. And I should say this is Boot Camp vs. PC, so it's not a driver issue.

    I thought I would just use my riMac as a game machine, but I decided that I needed a platform that was future-proof, and while the riMac is an awesome Mac, my little i5 game machine kills it for gaming.

    Edit: Now that I think of it, I haven't tried this since I moved both machines to Windows 10. I'll give it a shot and see what happens. Of course I have lots of games that may not be DX12 compatible, but it would be an interesting twist.
     
  24. MandiMac thread starter macrumors 6502

    Joined:
    Feb 25, 2012
    #24
    First, how much higher are the FPS exactly? I always keep VSync active because I personally prefer a fixed 30/60 FPS rate to a hot graphics card ;)

    While your test would be an interesting move, Ars Technica already clearly showed the advantage of Nvidia - or rather, the bad AMD driver when it comes to DX11 games. It's just that DX12 and Metal are at their core the same (or at least should be). So that's where things are going to get interesting, and it should remove the dramatic FPS difference between OS X and Windows...
     
  25. aevan macrumors 68000

    aevan

    Joined:
    Feb 5, 2015
    Location:
    Serbia
    #25
    Not really, no. Not on its own. It really depends on the developer and the code. Metal and DX12 CAN be used properly, but shaders can be written poorly just like any other part of the code. Meaning, if developers invest the time and energy to make both the Metal and DX12 versions of a game work properly, then yeah, the difference in FPS should be smaller. But whether developers will actually take the time to optimise a game for both platforms is questionable. Mac ports are often done by third-party developers, and we'll see if they use Metal the way it should be used. And from what I read, such low-level coding requires some top-level coding skills. Just remember the whole Batman: Arkham Knight PC vs. PS4/Xbox One fiasco and you can see how it all comes down to proper coding for each platform.

    TLDR: Metal/DX12 won't do anything by themselves but will require some dedication from developers, so whether the FPS gap between Windows and OS X will be smaller remains to be seen.
     
