
MandiMac

macrumors 65816
Original poster
Feb 25, 2012
1,433
883
Hi all,

I did some research and came across an interesting fact: the AMD website already has the data for the M395X online. Link is here.

It seems like the PC version of the R9 M395X is exactly the same as the R9 M295X.

M295X: 28 nm, 32 Compute Units, 2048 Stream Processors, 723 MHz, 4 GB GDDR5, 1250 MHz memory clock, 160 GB/s memory bandwidth, 256-bit memory interface.

M395X: 28 nm, 32 Compute Units, 2048 Stream Processors, 723 MHz, 4 GB GDDR5, 1250 MHz memory clock, 160 GB/s memory bandwidth, 256-bit memory interface.
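For what it's worth, the 160 GB/s figure follows directly from those numbers. Here's a quick back-of-the-envelope check (just my own arithmetic, assuming GDDR5's usual 4 data transfers per memory clock):

Code:
# rough GDDR5 bandwidth check: clock * transfers per clock * bus width
memory_clock_hz = 1250 * 10**6   # 1250 MHz memory clock from the spec sheet
transfers_per_clock = 4          # GDDR5 moves 4 data transfers per clock
bus_width_bytes = 256 // 8       # 256-bit memory interface

bandwidth_gb_s = memory_clock_hz * transfers_per_clock * bus_width_bytes / 10**9
print(bandwidth_gb_s)            # 160.0 -> matches the 160 GB/s AMD lists

So identical clocks and bus width really do mean identical bandwidth on paper.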

If AMD is doing such a rebrand, we could at least expect some MHz boost on the core clock, right? Or does that mean the specs are the same, but the underlying architecture/logic is more efficient?

GPUboss, however, paints another picture and differentiates the Mac part from the reference design of the M295X. There seems to be a difference in the clocks.
The core clock in the Mac version of the R9 M295X seems to be 850 MHz instead of 723 MHz, the memory clock is 1362 MHz instead of 1250 MHz, and thus the floating-point performance and the pixel rate are higher.
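If GPUboss is right, the only real gap is the clock, and the FLOPS and pixel-rate numbers follow from that: with the same 2048 stream processors, peak FP32 throughput scales linearly with the core clock (counting 2 FLOPs per stream processor per cycle for a fused multiply-add, which is how these peak figures are usually quoted):

Code:
# peak FP32 throughput = stream processors * 2 FLOPs (FMA) * core clock
stream_processors = 2048
flops_per_sp = 2                       # one fused multiply-add per cycle

for clock_mhz in (723, 850):           # PC reference clock vs. reported Mac clock
    tflops = stream_processors * flops_per_sp * clock_mhz * 10**6 / 10**12
    print(clock_mhz, "MHz ->", round(tflops, 2), "TFLOPS")
# 723 MHz -> 2.96 TFLOPS, 850 MHz -> 3.48 TFLOPS; pixel rate scales the same way

So the Mac part would be roughly 17-18% faster on paper, purely from the clock bump.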

I'm at a loss here - GPU-wise, it seems there's no point in waiting for the next revision of the riMac. Or will Apple do some magic and turn the MHz up even higher? I'm completely confused now. Any thoughts?
 

mjohansen

macrumors regular
Feb 19, 2010
238
56
Denmark
I seriously hope they will go with the rumoured GTX 990M (http://wccftech.com/nvidia-geforce-mobility-gtx-990m-q4-2015-faster-than-gtx-980/)
 

siddhartha

macrumors regular
Aug 8, 2008
160
44
Northern Virgina


You are assuming that they stay with AMD for the next GPU in the riMac. There's no guarantee of that.
 

AsprineTm

macrumors member
Jun 14, 2014
89
47
Yep, that's what I'm expecting as well. The 21" will come out with the rebranded M395X as found in the El Capitan code.
I assume the 27" will get a Skylake update in Q2 2016 with AMD graphics with HBM2.
Apple is never in a hurry to put a decent graphics card in an iMac.
Maybe they will even wait longer and first bring out a new Mac Pro.

It's about time Apple invested in some better GPU solutions.
They always have the best of the best CPUs but combine them with mediocre GPUs.
 
  • Like
Reactions: AlifTheUnseen

MandiMac

macrumors 65816
Original poster
Feb 25, 2012
1,433
883
You are assuming that they stay with AMD for the next GPU in the riMac. There's no guarantee of that.

I don't think Apple will switch GPU vendors after just one year. AMD is also the better solution when it comes to ridiculously high resolutions such as 5K on the riMac. I hoped for a 980M too last year, and now we've got AMD on board. That's how it is in 2015 :)
 
  • Like
Reactions: kiopoptr877

MandiMac

macrumors 65816
Original poster
Feb 25, 2012
1,433
883
Yep, that's what I'm expecting as well. The 21" will come out with the rebranded M395X as found in the El Capitan code.
I assume the 27" will get a Skylake update in Q2 2016 with AMD graphics with HBM2.

Why do you think the 21-incher will get the M395X powerhouse? The small iMac never got a decent GPU, and I think that won't change anytime soon. My best guess is that the M390X will be standard in the 27-inch riMac, and the M395X will be a BTO option, just like it is today with the M200 series.
 
Another confounding issue was that the 980M was only just announced when the RiMacs dropped last year. I had been of a mind to buy the RiMac, but despite usually being an early adopter on launch day, my spidey senses told me to hold off and test the machine out. I am certainly glad I did; from my personal tests on the machines, and from other reports on MacRumors, I am glad I am waiting for gen 2 of the RiMac.

Please Apple, have the GTX 990M [at least] as an option in the gen 2 RiMac.
 
  • Like
Reactions: AlifTheUnseen
I don't think Apple will switch GPU vendors after just one year.

If Apple has taught us one thing, it's never to say never when predicting forthcoming Apple products. There may have been multiple reasons Apple went with AMD over the Nvidia 980M.


AMD is also the better solution when it comes to ridiculously high resolutions such as 5K on the riMac. I hoped for a 980M too last year, and now we've got AMD on board. That's how it is in 2015 :)

Do you have any evidence that the AMD card is superior to the Nvidia card in the context of a high-res display? I have heard the contrary, that the 980M would have been the logical choice.
 
Last edited:

mjohansen

macrumors regular
Feb 19, 2010
238
56
Denmark
You can keep hoping, but unfortunately I am 99% sure that this won't happen. If it did, then I would buy an iMac.
But then again, if the new iMac gets Thunderbolt 3 and external GPU support, the iMac itself might not need an internal dedicated GPU as powerful as the GTX 990M.
 
Last edited:

theSeb

macrumors 604
Aug 10, 2010
7,466
1,893
none
But then again, if the new iMac gets Thunderbolt 3 and external GPU support, the iMac itself might not need an internal dedicated GPU as powerful as the GTX 990M.
Good point. With TB3 I could connect a nice 980 Ti.
 

boto

macrumors 6502
Jun 4, 2012
437
28
I highly doubt Apple will shift back to nVidia when they are getting great custom made GPUs at much lower costs, but I hope I'm wrong. I'm willing to pay extra for nVidia!!
 
  • Like
Reactions: MandiMac

MandiMac

macrumors 65816
Original poster
Feb 25, 2012
1,433
883
Do you have any evidence that the AMD card is superior to the Nvidia card in the context of a high-res display? I have heard the contrary, that the 980M would have been the logical choice.

There's no clear evidence, but have a look at these benchmarks over at AnandTech. They clearly show that Nvidia has the advantage at Full HD resolutions, but once you go to 1440p and beyond, the difference is minimal. Sometimes the green ones win, but more often than not, AMD takes the lead. I'm not saying it's clearly superior, but the difference is next to negligible.
 
  • Like
Reactions: kiopoptr877

MandiMac

macrumors 65816
Original poster
Feb 25, 2012
1,433
883
It also seems like the PC version of the R9 M395X is exactly the same as the R9 M390X.

You're right, this is odd. And the same table shows the difference between the R9 M375 and the R9 M375X - the X version has a roughly 25 MHz faster memory clock, and it has GDDR5 instead of slow DDR3 memory.

Maybe this solves the confusion between the current R9 M290 and the R9 M290X? Then again, Apple's site clearly says that the R9 M290 has 2 GB of GDDR5 memory. A jump of around 25 MHz can't be the only difference... very vague.
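The DDR3-to-GDDR5 switch matters far more than the 25 MHz, by the way: DDR3 moves 2 data transfers per clock versus GDDR5's 4, so at comparable clocks the GDDR5 part has roughly double the bandwidth on the same bus. Quick illustration (the 128-bit bus and the clocks here are just made-up round numbers, not AMD's actual figures):

Code:
# illustrative only: same bus width, similar clocks, different memory types
def bandwidth_gb_s(clock_mhz, transfers_per_clock, bus_bits=128):
    return clock_mhz * 10**6 * transfers_per_clock * (bus_bits // 8) / 10**9

print(bandwidth_gb_s(1000, 2))   # DDR3-style:  2 transfers/clock -> 32.0 GB/s
print(bandwidth_gb_s(1025, 4))   # GDDR5-style: 4 transfers/clock -> 65.6 GB/s
# roughly double the bandwidth, which dwarfs a ~25 MHz clock bump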
 

jerwin

Suspended
Jun 13, 2015
2,895
4,651
What you need to do to really compare the r9 series is to take note of how many compute units the card has. This is luxmark, an opencl benchmark.

[Screenshot: LuxMark benchmark results]

This is LuxMark 3.0. I actually prefer LuxMark 2.0, it's a bit more stable.
Note the "Compute Units." The r9 m290x has 20 of them, and I believe the r9 m295x has 32.
How many does the r9 290 have? 14? 16?
For that matter, it should be easy to verify that the r9 m370x has 10.

Note also the clock speed: 975 MHz. This also varies from card to card.
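If anyone wants to check those numbers on their own machine without running a full LuxMark pass, the same compute-unit and clock figures come straight from OpenCL's device query. A small sketch using pyopencl (assuming you have it installed; the attributes map to CL_DEVICE_MAX_COMPUTE_UNITS and CL_DEVICE_MAX_CLOCK_FREQUENCY):

Code:
import pyopencl as cl

# list every GPU OpenCL can see, with the fields LuxMark reports
for platform in cl.get_platforms():
    for device in platform.get_devices():
        if device.type & cl.device_type.GPU:
            print(device.name)
            print("  compute units:", device.max_compute_units)      # e.g. 20 on the r9 m290x
            print("  max clock    :", device.max_clock_frequency, "MHz")
            print("  global memory:", device.global_mem_size // 2**20, "MiB")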

If you've ever glanced at the reviews of the Furys, you'll know that, spec-wise, it is a beast. Benchmarked with real games, though, it doesn't hold much of an advantage over the Nvidia Titans.
So specs are one thing, performance in the games you want to play is quite another.
 

filmak

macrumors 65816
Jun 21, 2012
1,418
777
between earth and heaven
IMHO, whatever chip they put inside the new riMac, the priority should be to do something about its thermal design; there is a need for something better than the current high-end model.
Especially if they choose an AMD solution, and they probably will.
 

Nunyabinez

macrumors 68000
Apr 27, 2010
1,758
2,230
Provo, UT
Small heads-up for everyone here: Ars Technica got their hands on a DirectX 12 benchmark.

Very, very interesting that the 290X even beats the 980 Ti in one test and comes close every time...

My personal experience is different. I have the riMac with the R9 M295X and a gaming rig with a GTX 970. With the same settings, the GTX 970's FPS are considerably higher. And I should say this is Boot Camp vs. PC, so it's not a driver issue.

I thought I would just use my riMac as a game machine, but I decided that I needed a platform that was future-proof, and while the riMac is an awesome Mac, my little i5 game machine kills it for gaming.

Edit: Now that I think of it, I haven't tried this since I moved both machines to Windows 10. I'll give it a shot and see what happens. Of course, I have lots of games that may not be DX12 compatible, but it would be an interesting twist.
 
Last edited:

MandiMac

macrumors 65816
Original poster
Feb 25, 2012
1,433
883
My personal experience is different. I have the riMac with the R9 M295X and a gaming rig with a GTX 970. With the same settings, the GTX 970's FPS are considerably higher. And I should say this is Boot Camp vs. PC, so it's not a driver issue.

I thought I would just use my riMac as a game machine, but I decided that I needed a platform that was future-proof, and while the riMac is an awesome Mac, my little i5 game machine kills it for gaming.

Edit: Now that I think of it, I haven't tried this since I moved both machines to Windows 10. I'll give it a shot and see what happens. Of course, I have lots of games that may not be DX12 compatible, but it would be an interesting twist.

First, how much higher are the FPS exactly? I always keep VSync active because I personally prefer a fixed 30/60 FPS to a hot graphics card ;)

While your test might be an interesting data point, Ars Technica already clearly showed the advantage of Nvidia - or rather, how bad the AMD driver is when it comes to DX11 games. It's just that DX12 and Metal are, at their core, the same (or at least should be). So that's where things are going to get interesting, and it should remove the dramatic FPS difference between OS X and Windows...
 

aevan

macrumors 601
Feb 5, 2015
4,507
7,176
Serbia
While your test might be an interesting data point, Ars Technica already clearly showed the advantage of Nvidia - or rather, how bad the AMD driver is when it comes to DX11 games. It's just that DX12 and Metal are, at their core, the same (or at least should be). So that's where things are going to get interesting, and it should remove the dramatic FPS difference between OS X and Windows...

Not really, no. Not on its own. It really depends on the developer and the code. Metal and DX12 CAN be used properly, but shaders can be written poorly just like any other part of the code. Meaning, if developers invest the time and energy to make both the Metal and DX12 versions of a game work properly, then yeah, the difference in FPS should be smaller. But whether developers will actually take the time to optimise a game for both platforms is questionable. Mac ports are often done by third-party developers, and we'll see if they use Metal the way it should be used. And from what I read, such low-level coding requires some top-level coding skills. Just remember the whole Batman: Arkham Knight PC vs. PS4/Xbox One fiasco and you can see how it all comes down to proper coding for each platform.

TLDR: Metal/DX12 won't do anything by themselves but will require some dedication from developers, so whether the FPS gap between Windows and OS X will be smaller remains to be seen.
 