
Zarniwoop

macrumors 65816
Aug 12, 2009
1,036
759
West coast, Finland
The problem is that Metal is (at best) a subset of Vulkan. There are concepts you can express in Vulkan that simply can't be implemented in Metal, which means this MetalVK thing will never really work that well. If it were the other way around (implementing the Metal API on top of Vulkan), then you'd be in business.

For example, what if your Vulkan app uses geometry shaders? Metal does not support them at all, even in Sierra. How does MetalVK deal with that?

Yep, we need the third, complete version of Metal first. Now we're getting the beta version; El Capitan had an alpha version. The transition to full Metal takes time.

iOS 10, on the other hand, will have a pretty mature version of Metal.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
Yep, we need the third, complete version of Metal first. Now we're getting the beta version; El Capitan had an alpha version. The transition to full Metal takes time.

iOS 10, on the other hand, will have a pretty mature version of Metal.

You're missing the point. Apple added tessellation support to Metal in Sierra, but it's incompatible with the version in DX11 and OpenGL, which means it's practically impossible to port a DX11 game that uses tessellation to Metal. Apple is showing no indication that they care about DX11 feature parity.

Why do you think the version of Metal in iOS is more mature than the one for macOS? They have the same basic feature set, i.e. the desktop version is severely limited because Apple doesn't seem to want to expose all the functionality that their desktop/laptop GPUs actually have (and have had for 6-7 years now).
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Impossible? So far, everyone working with Metal says it is possible, but it needs a workaround. It comes down to putting tessellation on compute rather than relying on fixed-function hardware. Of course, that makes tessellation completely hardware agnostic: it can run on Intel, AMD, and Nvidia GPUs, and also on Imagination Technologies' mobile GPUs.

Excerpt from the What's New in Metal keynote: a compute kernel is preferred, as it can execute asynchronously with draw commands on the GPU.
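For illustration, the compute-based approach described above boils down to a kernel writing per-patch tessellation factors into a buffer that the fixed-function tessellator then consumes at draw time. Here is a CPU sketch in Python of the kind of per-patch heuristic such a kernel might implement; the distance-based formula and its constants are illustrative assumptions, not Apple's actual implementation:

```python
import math

# Metal's Sierra-era tessellation has a compute kernel write per-patch
# factors into a buffer; the tessellator reads them during the draw.
# This sketch shows one common heuristic: subdivide more when closer.
def tessellation_factor(patch_center, camera_pos, max_factor=64.0):
    """Return an edge tessellation factor for one patch."""
    distance = math.dist(patch_center, camera_pos)
    # Closer patches get more subdivision; clamp to the [1, 64] range.
    return max(1.0, min(max_factor / max(distance, 1.0), max_factor))

print(tessellation_factor((0.0, 0.0, 0.0), (0.0, 0.0, 2.0)))  # 32.0
```

Running such a kernel asynchronously alongside draw commands is what the keynote quote above is getting at: the factors for the next frame's patches can be computed while the current frame is still drawing.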
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
Impossible? So far, everyone working with Metal says it is possible, but it needs a workaround. It comes down to putting tessellation on compute rather than relying on fixed-function hardware. Of course, that makes tessellation completely hardware agnostic: it can run on Intel, AMD, and Nvidia GPUs, and also on Imagination Technologies' mobile GPUs.

Excerpt from the What's New in Metal keynote: a compute kernel is preferred, as it can execute asynchronously with draw commands on the GPU.

I'll refer you back to my earlier statements about geometry shaders then. How does one port a DX11 app that uses geometry shaders to Metal? Or how does MetalVK make a Vulkan app that uses geometry shaders work?
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
I'll refer you back to my earlier statements about geometry shaders then. How does one port a DX11 app that uses geometry shaders to Metal? Or how does MetalVK make a Vulkan app that uses geometry shaders work?
I will not bicker about this anymore, but first you talked about tessellation on Metal, and now about geometry shaders.

Everything is possible with a workaround. That does not mean Metal will have geometry shaders already; it does mean you can port DX11 games to Metal.
That is my estimate based on what we know about Metal. Previously, Mac OpenGL vs. Windows DX11 was a 20% performance difference (the Mac was slower on the same hardware). Currently, by the looks of things, Apple has narrowed it to 10% (still slower on the same hardware).
 

Fl0r!an

macrumors 6502a
Aug 14, 2007
909
530
Previously, Mac OpenGL vs. Windows DX11 was a 20% performance difference (the Mac was slower on the same hardware). Currently, by the looks of things, Apple has narrowed it to 10% (still slower on the same hardware).

Those differences are mostly due to high CPU overhead in OS X, which is most obvious on systems with poor single-core performance.

A CPU-bound system (e.g. a MP4,1 with 2.26 GHz CPUs, or a MP3,1) will usually perform much worse in OS X; I've seen differences way beyond 30%.

On the other hand a system with a fast CPU will almost close the gap:
[Attached: Unigine Heaven benchmark screenshots from OS X and Windows 10]
 

jeanlain

macrumors 68020
Mar 14, 2009
2,430
933
I don't expect MoltenVK to be used at all on macOS, because of the issues raised by Asgorath, plus the expected scarcity of Vulkan games that could be ported to the Mac.
 

Asgorath

macrumors 68000
Mar 30, 2012
1,573
479
I will not bicker about this anymore, but first you talked about tessellation on Metal, and now about geometry shaders.

Everything is possible with a workaround. That does not mean Metal will have geometry shaders already; it does mean you can port DX11 games to Metal.

I said I didn't think MetalVK would work that well, and that people will have a hard time porting DX11 games to Metal because of a lack of feature parity. I cited tessellation and geometry shaders as obvious examples. Have you looked at the differences between what DX11 exposes and what Metal on Sierra exposes? Do you really think it'll be easy for Feral or Aspyr to port modern DX11 games to Metal? I really don't think so.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Ask Ferazel (he works for Feral) about this in the Mac Gaming forum ;). He has already stated his point of view there, and it is quite a different view from yours (not so "doom and gloom").
There is also Marksatt, who works for Epic Games on the Metal version of Unreal Engine 4.
For example his post: https://forums.macrumors.com/threads/looks-like-metal-got-an-update.1977661/#post-23059102

Realistically, it is better to stay silent until something actually happens than to make claims about possibilities right now. That is the smart way.

I will wait and see, and then draw my conclusions.
 

buster84

macrumors 6502
Oct 7, 2013
428
156
Wow, such a long thread! I have a few questions; hopefully they can be answered by now. The first page should be updated with this basic info, since I couldn't find it and didn't want to read 25 pages lol.

1) Is this card flashable for the boot screen?

2) Does it support a 4K monitor?

3) If there's no boot screen because there's no flash, does that mean I can no longer see the screen when I hold down Option/Alt at startup to access my Boot Camp partition?
 

ActionableMango

macrumors G3
Sep 21, 2010
9,612
6,907
1) Is this card flashable for the boot screen?

2) Does it support a 4K monitor?

3) If there's no boot screen because there's no flash, does that mean I can no longer see the screen when I hold down Option/Alt at startup to access my Boot Camp partition?

Not yet, yes, correct.
 

lowendlinux

macrumors 603
Original poster
Sep 24, 2014
5,439
6,735
Germany
Wow, such a long thread! I have a few questions; hopefully they can be answered by now. The first page should be updated with this basic info, since I couldn't find it and didn't want to read 25 pages lol.

1) Is this card flashable for the boot screen?

2) Does it support a 4K monitor?

3) If there's no boot screen because there's no flash, does that mean I can no longer see the screen when I hold down Option/Alt at startup to access my Boot Camp partition?
If I had a Mac and the card, I would.
 

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
The 470 and 460 specs are out.
It seems AMD was rather conservative with the 470. Low memory clocks, not standard at all. Was this to keep power down to 120W? That will surely be the DX300 or whatever it will be called now. But I hope, now that they have some reserve power, that they will crank it up to 8 Gbps. That would suit me just fine; I like the round 2048 cores :)
And the 1:16 FP64 ratio is confirmed.
Too bad it doesn't even get to 5 TFLOPS; 4.9 max, it seems.
On the other hand, the lack of GDDR5X is really bugging me, and no one seems to even talk about it or care. There's no official word on support, even. Is the integrated controller really lacking here? AMD not supporting GDDR5X at this point? No way, Jose.
Maybe it's being left for the next-gen RX 4x5. And maybe Apple will also wait for the process to mature, so improvements can be had with the second gen.
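For what it's worth, the 4.9 TFLOPS figure follows directly from the shader count and clock: GCN cores do 2 FP32 FLOPs per clock (one fused multiply-add). A quick sanity check in Python, using the RX 470's 2048 cores and its rated 1206 MHz boost clock:

```python
# FP32 throughput = 2 FLOPs (one fused multiply-add) per core per clock.
cores = 2048                  # RX 470 stream processors
boost_clock_ghz = 1.206       # RX 470 rated boost clock
fp32_tflops = 2 * cores * boost_clock_ghz / 1000
fp64_tflops = fp32_tflops / 16    # the confirmed 1:16 FP64 ratio
print(f"FP32: {fp32_tflops:.2f} TFLOPS, FP64: {fp64_tflops:.2f} TFLOPS")
# FP32: 4.94 TFLOPS, FP64: 0.31 TFLOPS
```

Which matches the "4.9 max" in the post: the card would need roughly a 1221 MHz clock or more cores to clear 5 TFLOPS.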
 

Stacc

macrumors 6502a
Jun 22, 2005
888
353
On the other hand, the lack of GDDR5X is really bugging me, and no one seems to even talk about it or care. There's no official word on support, even. Is the integrated controller really lacking here? AMD not supporting GDDR5X at this point? No way, Jose.

The card is mainstream, and its performance doesn't justify faster memory. Remember that memory bandwidth isn't everything; it's wasted without a fast enough GPU to go with it. We have only seen GDDR5X on enthusiast-level cards from Nvidia, i.e. the GTX 1080 and the new Titan X. Both cost north of $600, with performance that well exceeds the RX 480. We probably won't see faster memory until Vega with HBM2, unless AMD releases a much higher-clocked Polaris 10 card that could justify GDDR5X.
 

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
That is all correct, but even the RX 480 is using 7 Gbps memory, not 8 Gbps. Maybe this is also due to power constraints, but that makes either the GPU or the process node look not so efficient.
Still, using 6.6 Gbps memory on quite a nice GPU like the 470 seems odd. Is it just to sit slightly below the 480? Come on... 8G on the 480 and 7G on the 470 would seem correct, but this?
I believe the GDDR5X support doesn't even exist; maybe AMD wanted to save die space and power and found it was not necessary for this type of card. OK, I buy it. I'm not sure anyone testing the card found a bottleneck there, did they?
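For reference, the bandwidth gap those memory clocks imply is simple arithmetic: peak bandwidth (GB/s) = bus width (bits) x data rate (Gbps per pin) / 8. A quick Python check using Polaris 10's 256-bit bus and the rates quoted above:

```python
# Peak bandwidth (GB/s) = bus width (bits) x per-pin rate (Gbps) / 8.
bus_bits = 256  # Polaris 10's memory bus width
for rate_gbps in (6.6, 7.0, 8.0):  # RX 470, RX 480 as quoted, hypothetical 8G
    print(f"{rate_gbps} Gbps -> {bus_bits * rate_gbps / 8:.1f} GB/s")
# 6.6 Gbps -> 211.2 GB/s
# 7.0 Gbps -> 224.0 GB/s
# 8.0 Gbps -> 256.0 GB/s
```

So the 470's 6.6 Gbps memory gives up about 6% of the 480's bandwidth, and a bump to 8 Gbps would add roughly 14% over 7 Gbps.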
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
I am pretty sure both memory controllers, for HBM2 and for GDDR5X, can work in a "legacy" mode compatible with HBM1 and GDDR5 respectively. The problem is AMD's willingness to adopt that technology when they have a much better and faster memory standard in HBM1. The costs and benefits of GDDR5X are exactly the same as HBM1's (higher cost than GDDR5, lower power consumption than GDDR5, higher bandwidth than GDDR5).
 

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
As much as I'd love to agree with that, I'm not so sure anymore.
Lower power would be a plus here, although the higher cost would render their price-point target impossible, I guess.
We'll see if a 490 comes out with GDDR5X and maybe higher clocks. If not, it would be moot to have the appropriate controller and not use it in any model at all, right?
 

Zarniwoop

macrumors 65816
Aug 12, 2009
1,036
759
West coast, Finland
GDDR5X is not cheap memory, making it less ideal for the low-cost cards AMD is offering. Maybe the Radeon Pro will use it? There's no mention of what memory type the Radeon Pro will use... no mention of ECC or anything.

GDDR5X sounds like a great alternative for laptops and other thin & low-power solutions. So Apple could use underclocked GDDR5X just to reach their power target.
 

buster84

macrumors 6502
Oct 7, 2013
428
156
Not yet, yes, correct.

Thanks for that. What about a program like rEFIt? Will that give you a boot menu, since rEFIt boots after the initial boot menu?

Also, I didn't see much posted about this: has anyone tested the Nvidia GeForce GTX 1060 yet? It's only $50 more than this card and seems like a good alternative.
 

h9826790

macrumors P6
Apr 3, 2014
16,614
8,546
Hong Kong
Thanks for that. What about a program like rEFIt? Will that give you a boot menu, since rEFIt boots after the initial boot menu?

Also, I didn't see much posted about this: has anyone tested the Nvidia GeForce GTX 1060 yet? It's only $50 more than this card and seems like a good alternative.

No.

The 1060 isn't working in OS X yet.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853


1120 MHz is the sweet spot of power consumption and performance, and it is, rather coincidentally, the base clock of the RX 480.

I wonder what power consumption Fiji would have on this process at 1050 MHz. As always, everything depends on the silicon design, but it looks like GlobalFoundries' 14 nm process is not as bad as people thought. It is, again, the architecture being pushed out of its comfort zone. AMD may just have underestimated the capabilities of Nvidia's silicon design; that's why they had to push the GPU out of its comfort zone at the last minute.
 

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
GDDR5X should become more affordable with widespread use now. But if AMD isn't using it (or supporting it, as I suspect), that will take longer.
Price was an important consideration for AMD when making the decision, but power consumption might have made it worth it, or not, if the extra bandwidth wasn't in fact needed. And if the GPU can't really go much faster...
koyoot, Fiji would be nice for compute, but I believe we can forget about it in the nMP. It lacks the output capabilities of Polaris, and I don't see Apple using two different GPUs in the nMP. My bet is that they'll stick to the same GPU setup, both the same SKU.
Would Polaris fare better on TSMC's process node? Who knows.
I guess Polaris 11 might shed some light on it.
 

koyoot

macrumors 603
Jun 5, 2012
5,939
1,853
Manuel, may I ask you a question? Why are you so attached to a GDDR5X Polaris GPU, when it appears that Vega 10 with HBM2 is coming this year?
 

ManuelGomes

macrumors 68000
Dec 4, 2014
1,617
354
Aveiro, Portugal
It's not really a matter of attachment; I find it odd that they don't even mention it anywhere.
It would, in my opinion, be dumb not to upgrade the controller to GDDR5X at this moment in time.
If they didn't, to save die space or power, and they have no plans to ever use it, OK, I can live with that. But that will hinder their future lineup (read: the 2nd-gen RX 4x5 models), where it could be something they claim as a value-added feature, even if it's just marketing BS with no practical effect.
But if it is indeed there and they're not talking about it, it sounds like something is wrong. Unless they're deliberately not unveiling it now, to have the wow effect later on with the 2nd revision.
Whatever the reason, AMD is still an awful player in the features-and-transparency department.
As for Nvidia, they also suck when it comes to being transparent, but they know how to take advantage of their cards' features, no doubt.

As for Vega, are you still so sure it's coming this year? Well, I'm not, but I sure hope it is.
 