
belleville

macrumors newbie
Original poster
Sep 29, 2006
Hi everyone!

I'm in the market for a Retina iMac. Owning a MacBook Air lets me wait for the upcoming Skylake refresh. If I were buying one now, I would upgrade to the i7, the M295X GPU, and an SSD.

My question is: what do you think will happen to the GPU? Will it be upgraded? Is there anything better than the M295X coming soon from AMD? I know that Apple likes to play ping-pong between AMD and Nvidia.

I've used Macs for over 10 years but have never really paid attention to how they update the GPU. I just wish they would have gone for the Nvidia GTX 980M, though.

What are your thoughts?
 
AMD just announced new video cards, and they will probably make mobile versions soon after, which could be included in the next iMac refresh. I prefer Nvidia cards in my PC; I've always preferred their drivers. Let's wait and see.
 
I had an Nvidia card in my Hackintosh, but I couldn't get it to stay silent at idle, which bothered me too much while doing light office tasks, so I ended up selling it.

14/16nm GPU? Is that the mobile version of a GPU?
 
No, that's the gate length of the manufacturing process. GPUs have been slow to shrink in comparison to CPUs; Intel is currently on a 14nm process, with plans to move to 10nm for Cannon Lake. These shrinks make chips smaller, faster, and more energy-efficient.

The mobile versions of GPUs usually have an M in the name, as in M290X; the M stands for mobile.
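To give a rough sense of why those node numbers matter: transistor area scales roughly with the square of the feature size, so a shrink packs more transistors into the same die area. This is an idealized back-of-the-envelope sketch only; real node names no longer map cleanly onto physical gate length:

```python
# Idealized density scaling between process nodes: area per transistor
# scales roughly with the square of the feature size, so a node shrink
# packs more transistors into the same die area.
def density_gain(old_nm: float, new_nm: float) -> float:
    """Approximate transistor-density multiplier from a node shrink."""
    return (old_nm / new_nm) ** 2

# Intel's planned move from 14nm to 10nm (Cannon Lake):
print(f"14nm -> 10nm: ~{density_gain(14, 10):.2f}x density")  # ~1.96x
# GPUs at the time were still on 28nm, hoping to jump to 14/16nm:
print(f"28nm -> 14nm: ~{density_gain(28, 14):.2f}x density")  # ~4.00x
```

Which is why a 28nm-to-14/16nm GPU generation was expected to be a much bigger jump than any single CPU node step.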
 
I just bought the Late 2014 Retina iMac. After about two months of contemplating waiting for the new one coming this year, I decided to buy it now instead.

The new Skylake chips aren't going to be a major step for desktops. The trustworthy benchmarks that have leaked show around a 10% increase. Put that into real use and there isn't much reason to wait, especially with El Capitan's optimizations.

Regarding GPUs: I really don't think Apple will use Nvidia. If we look at the current "new" AMD cards, we see rebrands except for the top cards. With that in mind I really don't see much of an improvement, 10-15% at most. Again, nothing worth the wait. Sure, there are other things such as the new USB standard, but the core performance won't be much better than the Late 2014 model's.

On the other hand, I do believe that people who opt for the high-end Retina iMac this fall will see much more of an improvement than those who opt for the base model, as the lower models mostly get rebranded cards.

One thing to note, though, is that the M290X is based on the M7970 (correct me if I'm wrong). So I could be completely wrong about this fall's update if there is a whole new chipset.

Went a bit off topic, but I hope I somehow answered your question. :)

Edit:
Read this:
http://www.notebookcheck.net/AMD-Radeon-R9-M390X.144432.0.html

Edit:
Read this:
http://www.notebookcheck.net/AMD-Radeon-R9-M390X.144432.0.html
 
The M390X is not the top of the line. The top of the line is the R9 Fury X, which should be about 60% faster than the older-generation R9 290X.

The Fury line uses HBM, which enables a smaller GPU PCB and up to a 2x improvement in performance per watt; if used in the iMac, that would delay throttling.
 
Yes, you are correct that the M390X isn't the top-of-the-line card, but I have a hard time believing the R9 Fury X will translate into a mobile card in the way we expect it to. The desktop model is extremely power hungry, so I honestly think it'll be a while until we see something much better than the M390X for mobile...

I guess we'll have to wait and see what mobile cards they do release...

I do hope they bring the Fury line into the mobile range, but I have a feeling it'll be next year rather than this year...

But I could be wrong, of course :)
 
I think the Fury line is better suited for mobile, because it consumes less power and hence produces less heat.
 
"Despite using two 8-pin power connectors, the Fury X's power consumption isn't as high as some feared: the TDP is 275W, just a tad higher than the R9 290X's, although it's worth bearing in mind that in real-world usage, the R9 290X was much closer to 300W. The Fury X supports up to 375W of power for overclocking." - Ars Technica

Not so much "less power" I'm afraid... We'll see how the mobile cards turn out.
 
What about the Fury Nano? It's a 175-watt card; I wonder if that would be usable in an iMac compared to the mobile x90X cards? Ars said 2x perf per watt against the 290X. If they can apply that to the mobile series, it might be a good update.

The third card is an odd one. Called the R9 Nano, it too is based on the Fiji chip with HBM, but comes in at just 15cm/6in, a size usually reserved for low-end cards. There's a large fan on the side for venting air inside a PC case, and AMD claims the R9 Nano offers up to two times the performance-per-watt over the R9 290X; higher than the improvement offered by the Fury and Fury X. No pricing was given for the R9 Nano, but it is supposed to launch "this summer."

So when will AMD release the top-of-the-line dedicated GPU, like an M395X?

That's not the high end anymore; it's been replaced by the Fury series, which looks interesting with HBM.
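Putting the numbers from this thread together: if the Nano really delivers up to 2x the performance per watt of the R9 290X at 175W, a rough relative-performance estimate looks like this. All figures are claims and TDPs quoted above, not measurements, so treat it as a sketch:

```python
# Rough perf-per-watt comparison using the figures quoted in this thread.
# These are vendor claims and TDPs, not benchmark results.
R9_290X_WATTS = 290.0      # real-world draw per the Ars quote (~300W class)
NANO_WATTS = 175.0         # AMD's stated TDP for the R9 Nano
PERF_PER_WATT_GAIN = 2.0   # AMD's "up to 2x perf/watt vs 290X" claim

# Normalize the 290X to 1.0 "performance units" to get a perf/watt baseline.
baseline_ppw = 1.0 / R9_290X_WATTS
nano_perf = NANO_WATTS * baseline_ppw * PERF_PER_WATT_GAIN

print(f"Estimated R9 Nano performance vs R9 290X: {nano_perf:.2f}x")
# ~1.21x the 290X at ~60% of the power, which is why a Nano-class chip
# looks plausible for a thermally constrained all-in-one like the iMac.
```

In other words, the interesting part for the iMac isn't peak performance but that the claimed efficiency gain delivers 290X-beating performance inside a much smaller power budget.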
 
One problem here: the Fury Nano has essentially double the processing power of the M295X. In AMD's chip nomenclature, Fiji (Fury) is essentially two Tongas (the M295X's chip) with HBM instead of GDDR5. Right now, it sounds like the cooling in the iMac already struggles with the M295X, so it's doubtful whether any increase in thermal load is tolerable. If the enclosure or cooling system doesn't change, either the Fury will have to be significantly downclocked, reducing its performance, or we will be stuck with something similar to the M295X. Of course, the alternative is something from Nvidia, but Apple has certainly been going out of its way to avoid them.
 
Yes, until next year, when we hope we'll get a 1080M from Nvidia.

While it is a little disappointing, it isn't Apple's fault. If they have decided to stick with AMD, and AMD rebrands most of its cards for this release, then we as consumers get rebranded cards. I'm sure there will be an increase in performance, just not by a lot.

On the other hand, this leaves everyone with the Late 2014 model not too far behind (which feels good for us, haha). Had they put a GTX 980M in the Late 2015 5K iMac, most would obviously HAVE to upgrade ;)
 
I'm still hoping they will. At some point Apple will be tired of all the bad PR for using rebranded AMD cards (which, by the way, AMD itself rebrands).

I'm still happy I got my MacBook Pro with the 750M, and if I were in the market for an iMac I would get a second-hand one with the GTX 780M instantly.
 
Yeah, but I do understand their decisions as well. People always complain that you can't game on a Mac. The thing is:

It. Is. Not. A. Gaming. Computer.

Sure, it's got potential, but why on earth should Apple listen to gamers and put in graphics cards that are built for games when the current cards handle everything a Mac should do just fine...

I understand the viewpoint of wanting more power, but not people who complain because it doesn't run The Witcher 3 on high.

A Mac is a workstation (give or take, depending on the model); a gaming PC is a gaming PC.
 
It's not just gaming. There's this thing called GPU acceleration, and it's getting more and more support from companies like Adobe, Autodesk, and Maxon. Thing is, they're using CUDA, and AMD GPUs don't support CUDA. Apple wants to push the industry toward OpenCL, but the software makers simply refuse; why bother when you already have a stable platform in CUDA? This is why people are upset. I know I was when Apple put the M370X in the MacBook Pro. I was expecting a GTX 950M or a GTX 960M in there, which outperform the M370X at almost everything.
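The practical consequence of that API split can be sketched in a few lines. This is an illustrative mock-up with hypothetical class names, not any real vendor's code: an app with only a CUDA path plus a CPU fallback quietly loses GPU acceleration on an AMD-only Mac.

```python
# Illustrative sketch (hypothetical names): an app that only implemented a
# CUDA backend falls back to the CPU when no Nvidia GPU is present, which
# is exactly what happens on an AMD-equipped Mac.
class CudaBackend:
    name = "CUDA"
    def available(self) -> bool:
        return False  # no Nvidia GPU in this machine
    def run(self, job: str) -> str:
        return f"{job} on CUDA"

class CpuBackend:
    name = "CPU"
    def available(self) -> bool:
        return True   # always works, just much slower
    def run(self, job: str) -> str:
        return f"{job} on CPU"

def pick_backend(backends):
    """Return the first usable backend in priority order."""
    return next(b for b in backends if b.available())

backend = pick_backend([CudaBackend(), CpuBackend()])
print(backend.run("render"))  # -> "render on CPU"
```

Supporting OpenCL as well would mean a second set of kernels, tests, and tuning for the same features, which is the maintenance cost the software vendors are refusing to take on.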
 
I am well aware of GPU acceleration. I am also sure you are aware that Apple usually doesn't do things the mainstream way, even though leaving Nvidia was a mistake.

There is always a reason for everything.
 