It's hard. The Nano is already a downclocked version of the Fury X. I don't think it's possible for them to cut the power consumption by another 30%. I'm quite sure AMD has already tried its best to find the optimum clock for the Nano; going even lower won't help much with power consumption.
 
It also seems like the PC version of the R9 M395X is exactly the same as the R9 M390X.
Good observation. It makes me wonder if there is a mistake in the listed M395X specs.

If the listed M395X specs are accurate, then I think it's a big disappointment only because there would be two products with the same specs but different names. If the listed specs are not accurate, then since the Tonga chip used in the M295X (and presumably the M390X) physically has a 384-bit bus, I think the M395X will have similar clocks to the M390X but with 50% more memory and memory bandwidth.
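For what it's worth, that "50% more" follows directly from the bus width: at the same memory clock, GDDR5 bandwidth scales linearly with bus width, so 384-bit vs. 256-bit is exactly +50%. A minimal sketch, using the 1250 MHz memory clock quoted later in this thread (an assumption, not a confirmed spec):

```python
# Bandwidth scales linearly with bus width at a fixed memory clock.
# GDDR5 transfers 4 bits per pin per memory clock, so effective rate = 4 x clock.

def gddr5_bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * (4 * mem_clock_mhz) / 1000   # bytes per transfer * MT/s -> GB/s

print(gddr5_bandwidth_gbs(256, 1250))   # 160.0 GB/s on a 256-bit bus
print(gddr5_bandwidth_gbs(384, 1250))   # 240.0 GB/s on a 384-bit bus, i.e. +50%
```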
 
If the listed specs are not accurate, then since the Tonga chip used in the M295X (and presumably the M390X) physically has a 384-bit bus, I think the M395X will have similar clocks to the M390X but with 50% more memory and memory bandwidth.
Maybe HBM, then?
 
This is an estimate, I think. You can't simply buy an M version out there as a customer, right?

Of course you can... not Mac-specific, since they're integrated into the custom logic board, but you can, for instance, upgrade an Alienware laptop... 900 bucks and you get the 980M!!! Soooo cheap!
Obviously Apple does not pay that much :D
 
Of course you can... not Mac-specific, since they're integrated into the custom logic board, but you can, for instance, upgrade an Alienware laptop... 900 bucks and you get the 980M!!! Soooo cheap!
Obviously Apple does not pay that much :D
You know what I meant. You can buy a fully loaded GTX 970 desktop variant for half the price, but you don't get your hands on the M version *alone*.
 
HBM will only be possible if the Fiji chip is used, and I don't think that will be the case.
There seems to be quite a bit of speculation around that one. But yeah, I'll take the (once again) rebrand too - if it's even more powerful, I'd be happy!
 
It's hard. The Nano is already a downclocked version of the Fury X. I don't think it's possible for them to cut the power consumption by another 30%. I'm quite sure AMD has already tried its best to find the optimum clock for the Nano; going even lower won't help much with power consumption.
Is there any link for that regarding Compute Units and the like? I didn't find anything so far :/
 

So, let's assume we're dealing with an M395X: 32 CU, 2048 stream processors, 723 MHz, 4 GB GDDR5, memory clock 1250 MHz (or higher?).

The Nano runs at 175 W, has 4096 processors, 256 texture units, 64 CU, an unknown GPU clock rate (1050 MHz in the R9 Fury X), 4 GB HBM, effective memory clock 1 GHz.

Basically, if AMD/Apple clocked those 4096 stream processors at around 360 MHz, we should get the same output as the M395X, but the resulting power envelope is unknown. For reference: the R9 Nano's 4096 processors at their unknown clock rate sip around 175 W, while the R9 Fury X at 1050 MHz (with the same 4096 processors) consumes around 275 W.

Speculation ahead!

The missing piece is the clock speed of the R9 Nano, because "up to 1 GHz" isn't a precise claim to begin with. Looking at the R9 290X, its range runs from 727 MHz up to 1000 MHz, so let's assume the Nano has a similar range between 700 and 1000 MHz. That roughly 30% drop in clock speed (coming down from the Fury X) translates into the Nano needing around 36% fewer watts. Assuming a mobile M395X needs around 125 W (another -29%), we would need a further 25-30% drop in clock speed, resulting in a base clock of around 500 MHz (with a range topping out around 700 MHz).

So basically the clock would sit somewhere between the aforementioned 360 MHz assumption (for the same output) and the 723 MHz currently claimed for the 2048-processor M295X/M395X. That would work out to around 40% faster graphics while keeping the same power envelope.
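
Since this is all napkin math, here's the same chain of assumptions written out as a tiny script, so the numbers are easy to check; every figure in it is one of the guesses above, not a confirmed spec:

```python
# Napkin math for a hypothetical "R9 Nano Mobile"; all inputs are guesses from this post.

m395x_sps, m395x_clock = 2048, 723            # claimed M295X/M395X config (MHz)
nano_sps = 4096

# Clock at which 4096 SPs would match the raw throughput of 2048 SPs @ 723 MHz
same_output_clock = m395x_clock * m395x_sps / nano_sps
print(f"same-throughput clock: ~{same_output_clock:.0f} MHz")   # ~362 MHz

# Fiji data points: Fury X ~275 W @ 1050 MHz, Nano ~175 W @ ~700 MHz (assumed base)
clock_drop = 1 - 700 / 1050                   # ~33 % lower clock
power_drop = 1 - 175 / 275                    # ~36 % lower power
print(f"clock -{clock_drop:.0%} -> power -{power_drop:.0%}")

# If power keeps falling roughly like that, a ~125 W mobile target (-29 % vs. the Nano)
# needs roughly another 28 % off the clock, i.e. a ~500 MHz base
mobile_clock = 700 * (1 - 0.28)
speedup = (nano_sps * mobile_clock) / (m395x_sps * m395x_clock) - 1
print(f"guessed base clock: ~{mobile_clock:.0f} MHz, ~{speedup:.0%} faster than M295X/M395X")
```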

Am I missing something here or does it seem to be about right when making the point of a "R9 Nano Mobile"?
 
While it’s all fine and good to speculate about whether Apple can fit an R9 Nano into the RiMac chassis/power envelope for the next revision (and I’d certainly love it if they could), I think that regardless of what comes in October, there are at least two good reasons to be hopeful that the next two generations (2015 and 2016) of iMacs are going to bring some key improvements in graphics performance.


1. The addition of Thunderbolt 3, with its officially sanctioned support for external GPUs, may well finally unshackle us from the limitations of mobile (non-upgradable) GPUs dictated by the iMac's form factor. Of course, this assumes that the GPUs themselves will support OS X, that they will be capable of writing back to the internal screen, that they won't be horrendously expensive, and that the bandwidth limitations of Thunderbolt 3 compared to PCIe won't be crippling at 4-5K resolution (rough numbers after this list), none of which is guaranteed. Still, I'll be excited to (hopefully) see Thunderbolt 3 in the 2015 iMac.

2. The transition from a 28nm manufacturing process to a 16nm manufacturing process, forecast to take place in 2016, should allow for a massive increase in the amount of GPU power that can fit within the iMac's space and thermal constraints. When combined with the engineering effort both Nvidia and AMD put into pulling as much out of 28nm as possible, and the trend towards more power-efficient (Nvidia Maxwell) and smaller (R9 Nano) full-featured GPUs, we could well be looking at another golden age of iMac graphics.
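
On the bandwidth concern in point 1, here's a rough comparison of nominal link rates; this is a sketch only, since it ignores protocol overhead and assumes the full 40 Gbit/s Thunderbolt 3 link were available to PCIe traffic, which it isn't in practice:

```python
# Nominal link rates only; PCIe tunneling over TB3 is actually capped lower in practice.
tb3_gbps = 40                                # Thunderbolt 3 total link rate
pcie3_lane_gbps = 8 * 128 / 130              # PCIe 3.0: 8 GT/s with 128b/130b encoding
pcie3_x16_gbps = 16 * pcie3_lane_gbps        # ~126 Gbit/s for a desktop x16 slot

print(f"TB3 {tb3_gbps} Gbit/s vs PCIe 3.0 x16 {pcie3_x16_gbps:.0f} Gbit/s "
      f"(~{tb3_gbps / pcie3_x16_gbps:.0%} of a full slot)")
```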

Of course, we've been burned by promises of what Thunderbolt could do before (I'm still pissed at Intel/Apple for not officially sanctioning external GPUs with Thunderbolt 1/2), and mobile graphics cards won't be the only ones that benefit from the move to 16nm. Still, I'm excited to see what the future holds for the iMac, as I no longer feel we're the only ones demanding good GPUs for computers with more limited form factors. Here's to hoping the new iMac expands our options, and that whatever Apple puts in it runs cooler and throttles less than the M295X…

Oh, and there's also the potential of Metal, DirectX 12 and Vulkan to be excited about :D
 
Apple would never do that, it's too much of a niche market.

We should be unbelievably happy that Apple chose the M290X series over Nvidia. The unfettered DX12 gains will be enormous going forward, whereas if we had ended up with the 980M, we could be sitting on our hands praying that Nvidia releases driver updates to include features that the M290X supports natively.

What's very exciting to me is Intel/AMD APUs in 2016 and 2017. If AMD could ever release an APU with reasonable single-threaded performance, or on the same process node as Intel, we might literally not see Apple go full AMD or full Intel for a very long time.

If AMD can get Zen to 16nm with an Arctic Islands GPU in 2016, combined with pretty big power savings (e.g. up to 35 W) from HBM, we could see APUs in iMacs. Apple would love to have an APU in an iMac because of the enormous space it saves.
 
Space it saves in an iMac? There's already a bit of empty space in there, not sure if they're concerned.

I don't think Apple will go to AMD APUs. Apple has always valued CPUs over GPUs. They're not moving away from intel. iMacs are not gaming machines, they're productivity machines.
 
Does the AMD R9 Nano even fit into the iMac?

If so, it sounds pretty impressive:
http://www.anandtech.com/show/9564/amd-announces-radeon-r9-nano-shipping-september-10th
One day after Apple's keynote - nice coincidence ;)

But yes, it sounds pretty interesting indeed. What I liked most was the image claiming "20° cooler than the R9 290X, 75° target operating temperature". If those values carried over to the mobile variants (possibly) used in future Retina iMacs, most of the heat-related problems of current Retina iMacs should be solved.
 
I doubt it, as long as Apple is happy with 105°C and the GPU is capable of handling that. I guess Apple will only let it run hotter if that comes with a higher clock speed. Otherwise, they could also downclock the M295X and let it run cooler.
 
We should be unbelievably happy that Apple chose the M290X series over Nvidia. The unfettered DX12 gains will be enormous going forward, whereas if we had ended up with the 980M, we could be sitting on our hands praying that Nvidia releases driver updates to include features that the M290X supports natively.

I am absolutely amazed at how many Nvidia fanboys have suddenly disappeared after this came out. Nvidia is just getting arrogant with its market share. This is why, as consumers, we need the competition from AMD. Nvidia's arrogance (less flexibility, higher prices) is probably what propelled Apple to switch to AMD in the first place.
 
You know what I meant. You can buy a fully loaded GTX 970 desktop variant for half the price, but you don't get your hands on the M version *alone*.

Again, you do; check on Amazon/eBay or Google it.
 
Space it saves in an iMac? There's already a bit of empty space in there, not sure if they're concerned.

I don't think Apple will go to AMD APUs. Apple has always valued CPUs over GPUs. They're not moving away from intel. iMacs are not gaming machines, they're productivity machines.

I don't think you understand what you're talking about at all. APUs are exactly that: productivity in a smaller package.

If AMD could release an APU on the same process node as Intel, it would be extremely attractive for Apple to use in iMacs. A single die for CPU + GPU means lower material cost, lower development cost, and fewer proprietary parts; you source the GPU, RAM, and CPU from one vendor. The added benefit of Arctic Islands performing well in DX12 is an afterthought.
 
I am absolutely amazed at how many Nvidia fanboys have suddenly disappeared after this came out. Nvidia is just getting arrogant with its market share. This is why, as consumers, we need the competition from AMD. Nvidia's arrogance (less flexibility, higher prices) is probably what propelled Apple to switch to AMD in the first place.
It's hilarious and amazing.
I hope the Hawaii/Tonga GPUs receiving a massive performance boost teaches every fanboy a lesson here.

It would be awesome to see Apple refresh the iMac, put in a 990/980M, only to see it get trashed a year down the road by the M290X cards because every game worth playing is DX12.

I'd love to see it, personally.

You never want to be beholden to any company, hoping for software support, when a competitor offers that support natively built into the hardware.

I mean, we're talking about a niche product, mobile GPUs, and people want to take the risk of Nvidia providing driver support for their mobile GPU to get DX12 support? That might never happen.
 
I don't think you understand what you're talking about at all. APUs are exactly that: productivity in a smaller package.

If AMD could release an APU on the same process node as Intel, it would be extremely attractive for Apple to use in iMacs.
Still, AMD is heavily trailing Intel when it comes to sheer power. The same process node would only help lower power consumption; it does nothing in regard to efficiency. When it comes to instructions per cycle, Intel is still king.
 