Now that mobile CPUs have continued to get faster, draw call overhead isn't really the limiting factor it once was; it's now more about driver/shader efficiency. Some recent benchmarks on iOS, for example, have shown that OpenGL actually produces better frame times than Metal, now that CPU draw call batching isn't the limiting factor.

https://gfxbench.com/compare.jsp?be...S&api2=metal&hwtype2=GPU&hwname2=Apple+A9+GPU
You did notice that the numbers given on the page you linked completely contradict the findings in this graph, didn't you? According to these, Metal achieves 57.6 fps when rendering onscreen in the Manhattan test; OpenGL only 55.7.

The same is true for the other benchmark, T-Rex: Metal 59.7, OpenGL 59.1.

Interestingly, OpenGL scores higher than Metal when rendering off-screen, i.e. when not actually drawing anything to the display. I might be wrong, but that seems to suggest that the draw calls are still very much the limiting factor.
 
The thing is, when engines are ported to another API, they may perform less well initially. This is the cost of porting. I suppose that engines coded for Metal from the start would perform better. OpenGL cannot do efficient multithreading, for instance; Metal can. A 3D engine has to be rewritten to take advantage of that. There's a lot of optimisation to be done.

Metal was most likely not available when Blizzard began working on the Overwatch engine.
 
There are considerable gaps in the current Metal OS X API as it compares to modern 3D APIs:
- Unable to run both compute and 3D rendering on the same Metal device at the same time.

You absolutely can execute Compute tasks on a Metal device while also doing rendering - it uses exactly the same interleaved execution model as D3D 11. What you can't do is D3D 12's Asynchronous Compute, where compute execution runs on the GPU ALUs that aren't being used for rasterisation.
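
To make that concrete, here's a minimal Swift sketch of that interleaved model - `queue`, `computePipeline`, `renderPipeline` and `passDesc` are placeholder names for state you'd have created beforehand, not anything from an actual engine:

```swift
import Metal

// One command buffer: a compute pass followed by a render pass.
// The GPU executes them in order - interleaved, not concurrently.
let commandBuffer = queue.makeCommandBuffer()!

let compute = commandBuffer.makeComputeCommandEncoder()!
compute.setComputePipelineState(computePipeline)
compute.dispatchThreadgroups(MTLSize(width: 64, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 32, height: 1, depth: 1))
compute.endEncoding()

let render = commandBuffer.makeRenderCommandEncoder(descriptor: passDesc)!
render.setRenderPipelineState(renderPipeline)
render.drawPrimitives(type: .triangle, vertexStart: 0, vertexCount: 3)
render.endEncoding()

commandBuffer.commit()
```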
 
Well, that explains why we won't get the Elite Dangerous expansion….

No, I imagine that's because the Mac port of the Frontier renderer is built on OpenGL, and presumably adding an OpenCL compute backend, or porting to Metal (which lacks the geometry and tessellation shaders they need), isn't justified by the sales of the Mac version.

So far only Feral have shipped a full-fat Shader Model 5 game on Mac, because it is extremely problematic just now - believe me, I've been asking for permission to do the same for UE4, but for the moment I can only applaud my former colleagues' efforts.
 
The thing is, when engines are ported to another API, they may perform less well initially. This is the cost of porting. I suppose that engines coded for Metal from the start would perform better. OpenGL cannot do efficient multithreading, for instance; Metal can. A 3D engine has to be rewritten to take advantage of that. There's a lot of optimisation to be done.

Metal was most likely not available when Blizzard began working on the Overwatch engine.

Once games are developed primarily for next-gen consoles and DX12/Vulkan/Metal they'll be architected to take advantage of multi-threaded graphics command submission, but back-porting that into an existing engine is tough work.
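
For illustration, a hedged sketch of what that multi-threaded command submission looks like with Metal's parallel render encoder - `queue`, `passDesc` and `drawScene` are assumed placeholders:

```swift
import Dispatch
import Metal

// MTLParallelRenderCommandEncoder lets several CPU threads record draw
// calls into one render pass; the GPU replays the sub-encoders in the
// order they were created, regardless of which thread finishes first.
let commandBuffer = queue.makeCommandBuffer()!
let parallel = commandBuffer.makeParallelRenderCommandEncoder(descriptor: passDesc)!
let encoders = (0..<4).map { _ in parallel.makeRenderCommandEncoder()! }

DispatchQueue.concurrentPerform(iterations: 4) { i in
    drawScene(encoders[i])    // record this thread's slice of the scene
    encoders[i].endEncoding()
}

parallel.endEncoding()
commandBuffer.commit()
```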
 
No, I imagine that's because the Mac port of the Frontier renderer is built on OpenGL, and presumably adding an OpenCL compute backend, or porting to Metal (which lacks the geometry and tessellation shaders they need), isn't justified by the sales of the Mac version.
No, the thing is they are using compute shaders for the on-the-fly procedural generation of the planetary terrains in the upcoming expansion. Apple's OpenGL doesn't have compute shaders at all (as you know, I guess...), and Metal's compute shaders weren't suitable for reasons that weren't further elaborated (the guy bringing the news wasn't a programmer). That you can't compute and render on the same Metal device at the same time seemed to be the missing detailed explanation. (Metal's other shortcomings certainly might also have played a role.)

EDIT: Sorry I managed to miss that post:
You absolutely can execute Compute tasks on a Metal device while also doing rendering - it uses exactly the same interleaved execution model as D3D 11.
In that case it's probably another problem with Metal's compute shaders or indeed due to Metal's other shortcomings. The specific quote we had was this:
In terms of Metal. As I said earlier in the thread, I'm not a technical guy but from speaking to the devs we know that the issue is more in the finer detail of how the compute shaders work in Metal at this stage.
 
Interestingly, OpenGL scores higher than Metal when rendering off-screen, i.e. when not actually drawing anything to the display. I might be wrong, but that seems to suggest that the draw calls are still very much the limiting factor.

The Manhattan off-screen test was the one I was referencing with that link. There are actual situations now (given how little the CPU is the limiting factor) where OpenGL is faster than Metal on mobile. Granted, Metal is still probably the way to go, since OpenGL's computational overhead costs more power. However, that overhead is no longer what limits the frame time.

You absolutely can execute Compute tasks on a Metal device while also doing rendering - it uses exactly the same interleaved execution model as D3D 11. What you can't do is D3D 12's Asynchronous Compute, where compute execution runs on the GPU ALUs that aren't being used for rasterisation.

Thanks for the correction. I formed that opinion based on what you had said in another thread.

I did some digging into the topic and I now understand the concept better (I think). It seems like current AMD cards and the current consoles (which are powered by AMD) support this feature better by having multiple hardware task schedulers on the chip itself. AMD claims that console games are getting up to 30% performance gains by using this feature correctly and making sure the GPU is properly fed with compute tasks. It may take some years before enough PC hardware supports it for developers to target it, given market share. Regardless, Apple should take this feature seriously and hopefully add support for it in a future version of Metal.

I think the fundamental problem is that Apple still grabs new tech and can influence the market tremendously in terms of features. However, Apple is ALWAYS behind in 3D graphics, and they don't seem to care.
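
To make the idea concrete, here's a speculative Swift sketch of how overlapping compute could be expressed with two Metal command queues - whether the GPU actually runs them concurrently is entirely down to the driver and the hardware schedulers discussed above:

```swift
import Metal

let device = MTLCreateSystemDefaultDevice()!
let renderQueue  = device.makeCommandQueue()!
let computeQueue = device.makeCommandQueue()!

// Submit long-running compute work on its own queue...
let computeCB = computeQueue.makeCommandBuffer()!
// ... encode compute passes here ...
computeCB.commit()

// ...and the frame's rendering on another. Without hardware schedulers
// like AMD's ACEs there is no guarantee these ever overlap on the GPU.
let renderCB = renderQueue.makeCommandBuffer()!
// ... encode render passes here ...
renderCB.commit()

computeCB.waitUntilCompleted()  // synchronise before consuming the results
```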
 
No, the thing is they are using compute shaders for the on-the-fly procedural generation of the planetary terrains in the upcoming expansion. Apple's OpenGL doesn't have compute shaders at all (as you know, I guess...), and Metal's compute shaders weren't suitable for reasons that weren't further elaborated (the guy bringing the news wasn't a programmer). That you can't compute and render on the same Metal device at the same time seemed to be the missing detailed explanation. (Metal's other shortcomings certainly might also have played a role.)

Marksatt is very knowledgeable about these topics (he is one of the OS X developers of Unreal Engine 4) and should be considered a reliable source of truth in these matters.

I love these threads because they force me to read up on and understand the terminology better. :) I did some digging into compute shaders vs OpenCL to hopefully understand the differences.

The idea is that both compute shaders and OpenCL use the GPU to perform complex compute operations. OpenGL compute shaders (or DirectCompute on D3D11) are easier for most 3D applications because those applications already talk to the GPU through that API. You can use it to perform compute operations on OpenGL resources in sync with the rest of your rendering pipeline, and you can write compute shaders in GLSL rather than in OpenCL's special dialect of C. OpenCL, though, offers more accurate computational results and often more efficient program compilation. However, in most 3D rendering the extra accuracy and compilation efficiency are not worth the investment of getting the two systems to communicate within the same rendering pipeline.

I think Marksatt is referring to Feral having to do this very thing on one of their titles (I'm guessing Shadow of Mordor), where they needed GPU compute for the game to function correctly. So they rigged a solution so that both OpenCL and OpenGL were communicating within the same rendering pipeline - basically, they reverse-engineered an OpenGL compute shader using OpenCL and synced the resources between these two very different contexts. I imagine that was not an easy task.
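
For anyone curious what that rigging roughly looks like, here's a hedged sketch of Apple's CL/GL sharing handshake in Swift - `clContext`, `clQueue` and `glBuffer` are assumed to exist already, with the CL context created against the GL share group; I'm only guessing this mirrors what Feral did:

```swift
import OpenCL
import OpenGL.GL

// Wrap an existing GL buffer as a CL memory object (requires a CL context
// created with CL_CONTEXT_PROPERTY_USE_CGL_SHAREGROUP_APPLE).
var err: cl_int = 0
var shared = clCreateFromGLBuffer(clContext, cl_mem_flags(CL_MEM_READ_WRITE),
                                  glBuffer, &err)

glFlush()  // GL must be done with the buffer before CL takes it

clEnqueueAcquireGLObjects(clQueue, 1, &shared, 0, nil, nil)
// ... enqueue the "compute shader" kernel against `shared` here ...
clEnqueueReleaseGLObjects(clQueue, 1, &shared, 0, nil, nil)

clFinish(clQueue)  // results are now visible to the GL rendering pipeline
```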
 
This post by a blizzard employee: http://us.battle.net/d3/en/forum/topic/19743934595#2
And this one, as posted earlier in the thread: http://us.battle.net/wow/en/forum/topic/19580947919#9

Pretty much sums it up as to why Blizzard won't release their new game to OS X right now, and it makes a lot of sense to me.

Completely understandable, when the only desktop computers even remotely capable of playing the game are the maxed-out 5K Retina iMac or the now two-to-three-year-old Mac Pro. It's just too expensive.

Macs were never meant for gaming, except casual iOS-like stuff.
 
The Manhattan off-screen test was the one I was referencing with that link. There are actual situations now (given how little the CPU is the limiting factor) where OpenGL is faster than Metal on mobile.
This remains a brash hypothesis without further proof beyond that singular benchmark result, which is not only a single data point of no statistical relevance, but could also be an outlier. All the while, there are numerous benchmarks showing Metal outperforming OpenGL in pretty much every regard, albeit not necessarily by a huge amount (aside from AnandTech's Metal for Mac benchmarks, which they themselves say are to be taken with a massive grain of salt).

Marksatt is very knowledgeable about these topics (he is one of the OS X developers of Unreal Engine 4) and should be considered a reliable source of truth in these matters.
I don't doubt Mark's technical expertise and I know who he is. I also know he worked at Frontier in the past. He himself made clear, though, that he can only speculate about Frontier's reasons for not porting ED: Horizons to Mac, because he apparently is no longer privy to all the information in that regard.
 
This post by a blizzard employee: http://us.battle.net/d3/en/forum/topic/19743934595#2
And this one, as posted earlier in the thread: http://us.battle.net/wow/en/forum/topic/19580947919#9

Pretty much sums it up as to why Blizzard won't release their new game to OS X right now, and it makes a lot of sense to me.
These posts look more like informed guesses by forum moderators or support technicians than like primary sources of information. I don't question the validity of the claims, though; they make sense.
 
This post by a blizzard employee: http://us.battle.net/d3/en/forum/topic/19743934595#2
And this one, as posted earlier in the thread: http://us.battle.net/wow/en/forum/topic/19580947919#9

Pretty much sums it up as to why Blizzard won't release their new game to OS X right now, and it makes a lot of sense to me.

Completely understandable, when the only desktop computers even remotely capable of playing the game are the maxed-out 5K Retina iMac or the now two-to-three-year-old Mac Pro. It's just too expensive.

Macs were never meant for gaming, except casual iOS-like stuff.


Green = MVP = not a Blizzard employee. It's just guessing and rumor-mongering. I'm surprised they [the devs] didn't say anything at BlizzCon this past weekend about why there's no Mac release, but oh well.
 
This post by a blizzard employee: http://us.battle.net/d3/en/forum/topic/19743934595#2
And this one, as posted earlier in the thread: http://us.battle.net/wow/en/forum/topic/19580947919#9

Pretty much sums it up as to why Blizzard won't release their new game to OS X right now, and it makes a lot of sense to me.

Completely understandable, when the only desktop computers even remotely capable of playing the game are the maxed-out 5K Retina iMac or the now two-to-three-year-old Mac Pro. It's just too expensive.

Macs were never meant for gaming, except casual iOS-like stuff.

Good find. And yes, it makes much sense. In fact, all of their points have also been made in these forums, even in this thread. I'm also quite worried about the route Apple is taking with OS X lately, regarding both hardware and software.
 
Marksatt is very knowledgeable about these topics (he is one of the OS X developers of Unreal Engine 4) and should be considered a reliable source of truth in these matters.

While I know a lot about this stuff, please don't take my word as gospel - like anyone else, I'm not always right.

The idea is that both compute shaders and OpenCL use the GPU to perform complex compute operations. OpenGL compute shaders (or DirectCompute on D3D11) are easier for most 3D applications because those applications already talk to the GPU through that API. You can use it to perform compute operations on OpenGL resources in sync with the rest of your rendering pipeline, and you can write compute shaders in GLSL rather than in OpenCL's special dialect of C. OpenCL, though, offers more accurate computational results and often more efficient program compilation. However, in most 3D rendering the extra accuracy and compilation efficiency are not worth the investment of getting the two systems to communicate within the same rendering pipeline.

Pretty much. You can opt to get the same low-precision results in CL as in GL/D3D if you'd like. Basically, OpenCL is a cross-vendor alternative to CUDA and so is geared toward getting the most out of the GPU for your compute tasks; it forgoes any rasterisation and instead has bindings to GL and D3D...

I think Marksatt is referring to Feral having to do this very thing on one of their titles (I'm guessing Shadow of Mordor), where they needed GPU compute for the game to function correctly. So they rigged a solution so that both OpenCL and OpenGL were communicating within the same rendering pipeline - basically, they reverse-engineered an OpenGL compute shader using OpenCL and synced the resources between these two very different contexts. I imagine that was not an easy task.

Yep, exactly right. OpenCL gives you an API to do this - but it has some idiosyncratic and quite arbitrary restrictions. I was asking to be allowed to do this work at Feral back in early 2012, as we started working on the first DX11 games. I'm glad they pulled it off for Mordor.

No, the thing is they are using compute shaders for the on-the-fly procedural generation of the planetary terrains in the upcoming expansion. Apple's OpenGL doesn't have compute shaders at all (as you know, I guess...), and Metal's compute shaders weren't suitable for reasons that weren't further elaborated (the guy bringing the news wasn't a programmer). That you can't compute and render on the same Metal device at the same time seemed to be the missing detailed explanation. (Metal's other shortcomings certainly might also have played a role.)

EDIT: Sorry I managed to miss that post:

In that case it's probably another problem with Metal's compute shaders or indeed due to Metal's other shortcomings. The specific quote we had was this:

I was suggesting that they do what Feral did and use OpenCL & its GL bindings to implement their compute shaders on Mac. Talking to some friends there, I'm fairly sure they aren't doing anything that would be impossible with OpenGL + OpenCL. Like Mordor it might be a bit tricky - but it could work, and they wouldn't need to change much else in their renderer.

Metal would effectively be another brand-new port of their renderer - one that doesn't have geometry and tessellation shaders and where there are other more subtle problems that are harder/impossible to fix. The compute shaders in Metal *should* do everything but I'm still battling away with them for UE4 so I can understand if Frontier don't think their Mac sales justify that effort.
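
For a flavour of what a Metal compute shader involves, a tiny hedged sketch - the kernel and its name are made up purely for illustration:

```swift
import Metal

// Compile a Metal compute kernel from source and build its pipeline state.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "scale")!)
// Encode against a MTLComputeCommandEncoder as in any other compute pass.
```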
 
I don't doubt Mark's technical expertise and I know who he is. I also know he worked at Frontier in the past. He himself made clear, though, that he can only speculate about Frontier's reasons for not porting ED: Horizons to Mac, because he apparently is no longer privy to all the information in that regard.

Spot on. I still have friends there, but I don't have any more insight into the whys and wherefores of management decisions like this than any other external party. I can well understand their decision, though.
 
This post by a blizzard employee: http://us.battle.net/d3/en/forum/topic/19743934595#2
And this one, as posted earlier in the thread: http://us.battle.net/wow/en/forum/topic/19580947919#9

Pretty much sums it up as to why Blizzard won't release their new game to OS X right now, and it makes a lot of sense to me.

Completely understandable, when the only desktop computers even remotely capable of playing the game are the maxed-out 5K Retina iMac or the now two-to-three-year-old Mac Pro. It's just too expensive.

Macs were never meant for gaming, except casual iOS-like stuff.

I don't think that's it, man. Look at the horsepower Blizzard games require - that's right, not much. We've got the new cutting-edge UT being developed by Epic for Mac right now, we've got Shadow of Freaking Mordor available natively for Mac, Tomb Raider 2013, and Metro 2033 and Metro: Last Light for Mac by the original developer. These games push pixels far more intensely than anything Blizzard has EVER put out. Would anyone have forgiven any of those developers or porters for passing on those titles as not being worth it? I wouldn't. But they did it anyway.

I think that Blizzard's call has more to do with money than anything else. I just don't know whose money.
 
I don't think that's it, man. Look at the horsepower Blizzard games require - that's right, not much. We've got the new cutting-edge UT being developed by Epic for Mac right now, we've got Shadow of Freaking Mordor available natively for Mac, Tomb Raider 2013, and Metro 2033 and Metro: Last Light for Mac by the original developer. These games push pixels far more intensely than anything Blizzard has EVER put out. Would anyone have forgiven any of those developers or porters for passing on those titles as not being worth it? I wouldn't. But they did it anyway.

I think that Blizzard's call has more to do with money than anything else. I just don't know whose money.

Overwatch is an FPS designed for 1080p/60Hz. IMHO I'd take the min specs they published as aspirations rather than a realistic assessment of the hardware required to play it acceptably.

The only Macs that could even conceivably manage to run a PS4/XBONE-level game at 1080p @ 60Hz are the 2013 Mac Pro, the 5K Retina iMac, or a 27" iMac with an NV 675M/680M/775M/780M. That's a very small install base - I know I've said elsewhere that in my experience most Mac gamers at least have a dedicated GPU, but it is usually in a 15" MacBook Pro - which makes it difficult to justify investment in a game pitched firmly toward the high-end.

There's no need for conspiracy theories here - porting such a game requires serious investment and an understanding that you will have to make trade-offs to target a much smaller audience.

For the games you mention Feral and 4A have done that and should be applauded. At Epic it is just one part of the ongoing effort to maintain and develop UE4. I know how hard it is to do, so I can totally understand Frontier's decision with Elite: Horizons and Blizzard's with Overwatch.
 
The only Macs that could even conceivably manage to run a PS4/XBONE-level game at 1080p @ 60Hz are the 2013 Mac Pro, the 5K Retina iMac, or a 27" iMac with an NV 675M/680M/775M/780M.
Why not the more recent iMacs?

(OK you mentioned the 5K models).
 
Or older ones with upgraded memory and graphics cards, etc.?

The statement was quite clear: a list of Macs that have sufficient GPU power to run a PS4/XBONE-level game at 1080p/60Hz. With regard to iMacs in particular, the models prior to 2012 have GPUs slower than those found in the consoles, though the 2011 27" iMac's AMD 6970 might be close - but it doesn't support Metal, so you are stuck with OpenGL 4.1. Of the 2012 & 2013 iMacs, only the Nvidia 675M, 680M, 775M & 780M are sufficiently powerful.

The models with Intel GPUs, AMD GPUs below the 6970, and the Nvidia 640M, 650M, 750M & 755M definitely won't be fast enough for 1080p/60Hz - which is why all the MacBook Pros are eliminated too. That isn't to say that they are all bad machines - just that they aren't the kind of machine Blizzard is targeting with Overwatch.

If anyone mentions the 2012 & earlier Mac Pros - yes, you can upgrade the GPU, but Apple don't support such configurations. This means they don't necessarily realise all the performance of the GPU in OpenGL applications, which is something I've had problems with.
 
The 5K iMac's GPU isn't really capable, even though spec-wise it could be (at 1440p), because of thermal throttling once you start really using it. So no, Macs are not and have never been made for people who want to kick back and play some games in between working sessions.
 