AFAIK @leman said Metal has no direct equivalent for the mesh shader pipeline.
[Attached image: MeshShaderPipeline.png]

3DMark has a feature test that shows the framerate difference when using mesh shading versus not. I think it exaggerates the performance improvement, but there is one.
This seems like a reasonable omission that developers might care about, thanks!

That said, it doesn't seem like there is any hardware limitation here; updated Metal APIs (or even some shared open-source implementation) could overcome this disparity, but who's to say whether Apple will actually implement these directly. If not, the Metal shading language already has hooks for implementing customized pipelines, so I'd be surprised if some of the developers who really enjoy using a mesh shader pipeline haven't already implemented their own versions.
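
For what it's worth, here's a minimal sketch of the kind of hook I mean: Metal's indirect command buffers, which let a compute pass generate draw calls on the GPU. The descriptor values and command count are made up for illustration.

```swift
import Metal

// Sketch only: GPU-driven draws via an indirect command buffer (ICB),
// roughly the building block you'd reach for to emulate a mesh-shader-style
// "compute generates geometry, render pass consumes it" flow.
let device = MTLCreateSystemDefaultDevice()!

let icbDescriptor = MTLIndirectCommandBufferDescriptor()
icbDescriptor.commandTypes = [.draw]       // GPU-encoded non-indexed draws
icbDescriptor.inheritPipelineState = true  // reuse the pipeline bound on the encoder
icbDescriptor.maxVertexBufferBindCount = 2
icbDescriptor.maxFragmentBufferBindCount = 0

// Room for up to 4096 draws (e.g. one per meshlet surviving a culling pass);
// a compute kernel fills these in via MSL's `command_buffer` argument type.
let icb = device.makeIndirectCommandBuffer(descriptor: icbDescriptor,
                                           maxCommandCount: 4096,
                                           options: [])!

// Later, a render encoder executes whatever the GPU wrote:
//   renderEncoder.executeCommandsInBuffer(icb, range: 0..<4096)
```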

Not only does the Mac not get hybrid RT in the normal game, but 4A Games didn't even bother trying to convert the Full RT Enhanced Edition engine at all.
Hardware real-time raytracing is incredibly cool, but so far only available in some of the higher-end gaming GPUs. The vast majority of gamers don't have raytracing-capable GPUs --- at the moment this is very much a future-forward thing.

On the other hand, I don't disagree; even if Apple didn't care about games, there's no reason not to have hardware raytracing, as they license GPU IP from Imagination Technologies, a leader in (power-efficient!) raytracing tech, and gaming isn't the only viable use case for hardware-accelerated raytracing (e.g., it could significantly boost creative graphical workflows).

With as much raw power as the M1 Pro/Max chips have shown within their thermal envelope, I'm guessing Apple is going to release things like raytracing as "magical" new features in coming iterations. We'll see.
 
The impending Baldur's Gate 3 compelled me to re-download the original Enhanced Edition and start playing again for the first time in literal years. I reckon by the time I get through re-playing the original and the sequel, it will be time for a new Apple silicon laptop and BG3. I wholeheartedly own that this is just a strategy to stop me from buying a new MBP this year.
 
Also, Apple numbers are misleading because Google buys 20,000 exclusively for the office.

I doubt you’re playing Tomb Raider at work…

I would hope not.
To be fair, some of the big tech companies encourage this type of behavior. I have a few colleagues at Google, Facebook (Meta?), LinkedIn, etc., and as one example, a friend of mine at LinkedIn has beer on tap and is encouraged to spend at least half a day drinking and playing at least one day a week, as a company policy.

Relatively unheard of outside of the tech industry, but it's becoming fairly common to treat tech employees this way, as it's been shown to significantly boost performance.
 
This seems like a reasonable omission that developers might care about, thanks!

That said, it doesn't seem like there is any hardware limitation here; updated Metal APIs (or even some shared open-source implementation) could overcome this disparity, but who's to say whether Apple will actually implement these directly. If not, the Metal shading language already has hooks for implementing customized pipelines, so I'd be surprised if some of the developers who really enjoy using a mesh shader pipeline haven't already implemented their own versions.


Hardware real-time raytracing is incredibly cool, but so far only available in some of the higher-end gaming GPUs. The vast majority of gamers don't have raytracing-capable GPUs --- at the moment this is very much a future-forward thing.

On the other hand, I don't disagree; even if Apple didn't care about games, there's no reason not to have hardware raytracing, as they license GPU IP from Imagination Technologies, a leader in (power-efficient!) raytracing tech, and gaming isn't the only viable use case for hardware-accelerated raytracing (e.g., it could significantly boost creative graphical workflows).

With as much raw power as the M1 Pro/Max chips have shown within their thermal envelope, I'm guessing Apple is going to release things like raytracing as "magical" new features in coming iterations. We'll see.
I forgot to add DirectStorage as a thing, though I think Apple may have an equivalent API (@leman?).
 
Cascading consequences. If you can’t run a PS5 game at 4K, how can you guarantee that your After Effects explosion is 4K?
 
I forgot to add DirectStorage as a thing, though I think Apple may have an equivalent API (@leman?).
As I'm unfamiliar, I just did a quick search --- it seems typical streaming budgets are on the order of 50 MB/s, and DirectStorage opens that up; something like this (if it's not already accomplished automagically with Metal, which wouldn't surprise me) would be a big gain for AS's UMA bandwidth.
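
To make the comparison concrete, this is roughly the manual version of what a DirectStorage-style API streamlines: asynchronous, chunked reads that overlap decompression and upload with rendering. A sketch using plain DispatchIO; the asset-pack path and chunk size are hypothetical.

```swift
import Dispatch
import Foundation

// Sketch only: stream a chunk of a (hypothetical) packed-asset file without
// blocking the render thread.
let ioQueue = DispatchQueue(label: "streaming.io")
let channel = DispatchIO(type: .random,
                         path: "/tmp/level0.assets",  // hypothetical asset pack
                         oflag: O_RDONLY,
                         mode: 0,
                         queue: ioQueue) { err in
    if err != 0 { print("open failed: \(err)") }
}!

// Pull in a 4 MB chunk at offset 0; the handler fires on the I/O queue.
channel.read(offset: 0, length: 4 << 20, queue: ioQueue) { done, data, error in
    guard error == 0, let data = data else { return }
    // Hand `data` to the texture/geometry upload path here.
    if done { channel.close() }
}
```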
 
This seems like a reasonable omission that developers might care about, thanks!

That said, it doesn't seem like there is any hardware limitation here; updated Metal APIs (or even some shared open-source implementation) could overcome this disparity, but who's to say whether Apple will actually implement these directly. If not, the Metal shading language already has hooks for implementing customized pipelines, so I'd be surprised if some of the developers who really enjoy using a mesh shader pipeline haven't already implemented their own versions.

Mesh shaders, at least as I understand them, are actually a tricky point. This is where the difference between immediate rasterizers like Nvidia GPUs and TBDR rasterizers like Apple GPUs becomes significant. Nvidia invented mesh shading because it works well for their architecture — you can run these little compute shaders that will generate triangles which are immediately rendered, the entire process stays within the GPU cache, and all is fast and efficient (as much as immediate renderers are efficient anyway). Apple has a problem though — their approach requires them to store intermediate information about all the triangles that have been processed until they are fully shaded. So if you start generating millions of triangles in your shaders, those will create a lot of RAM overhead and hurt performance. I can imagine ways to mitigate this, but it's all tricky, and in the end it kind of shoots past the idea of mesh shaders.
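
A rough back-of-the-envelope illustration of that overhead (my numbers, purely for scale): if mesh shaders emit 2 million triangles per frame and the binner has to keep, say, ~50 bytes of post-transform data per triangle around until shading, that's on the order of 2,000,000 × 50 B ≈ 100 MB of intermediate storage per frame, before a single pixel is shaded.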

That said, I am not even sure that Metal needs mesh shaders... it has fairly advanced GPU-driven rendering capabilities, so you can probably cover most interesting cases (such as LOD) with standard Metal functionality. Geometry generation is also possible; I am just not sure that Apple GPUs will scale to more generous mesh shader uses.

But who knows, maybe next year Apple will somehow come out with a TBDR-friendly version of mesh shaders...

On the other hand, I don't disagree; even if Apple didn't care about games, there's no reason not to have hardware raytracing, as they license GPU IP from Imagination Technologies, a leader in (power-efficient!) raytracing tech, and gaming isn't the only viable use case for hardware-accelerated raytracing (e.g., it could significantly boost creative graphical workflows).

Imagination RT technology sounds very cool on paper, but so far they haven't actually shipped a single product as far as I know. They just talk about it a lot. And there is also the question of whether their RT tech is flexible enough for Apple's RT vision, which seems to go towards programmable RT — Apple wants this hardware to be usable in production raytracers.

I forgot to add DirectStorage as a thing, though I think Apple may have an equivalent API (@leman?).

Does Apple even need something like DirectStorage? Their SSDs are already ridiculously fast, unified memory makes it unnecessary to shuffle things around between the CPU and the GPU, not to mention that Apple is currently the only vendor shipping a usable sparse texture implementation (good for dynamic streaming). Then again, I have no idea how efficient these new Macs — or APFS — are at handling large numbers of small read requests.
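
For the sparse texture point, a minimal sketch of what that streaming path looks like in Metal. Heap size, texture dimensions, and the mapped region are all illustrative, and sparse heaps require a recent Apple-family GPU.

```swift
import Metal

// Sketch only: back just the texture tiles the camera currently needs;
// unmapped tiles of a sparse texture cost no RAM.
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!

// Sparse textures live in a heap of type .sparse.
let heapDesc = MTLHeapDescriptor()
heapDesc.type = .sparse
heapDesc.size = 64 << 20                      // 64 MB of backing store (illustrative)
let heap = device.makeHeap(descriptor: heapDesc)!

let texDesc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .rgba8Unorm, width: 8192, height: 8192, mipmapped: true)
let sparseTex = heap.makeTexture(descriptor: texDesc)!

// Map a small region of tiles as the streamer brings data in.
let tile = device.sparseTileSize(with: .type2D, pixelFormat: .rgba8Unorm, sampleCount: 1)
let cmd = queue.makeCommandBuffer()!
let encoder = cmd.makeResourceStateCommandEncoder()!
encoder.updateTextureMapping(sparseTex,
                             mode: .map,
                             region: MTLRegionMake2D(0, 0, tile.width * 4, tile.height * 4),
                             mipLevel: 0,
                             slice: 0)
encoder.endEncoding()
cmd.commit()
```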
 
Does Apple even need something like DirectStorage? Their SSDs are already ridiculously fast, unified memory makes it unnecessary to shuffle things around between the CPU and the GPU, not to mention that Apple is currently the only vendor shipping a usable sparse texture implementation (good for dynamic streaming). Then again, I have no idea how efficient these new Macs — or APFS — are at handling large numbers of small read requests.
I don't know, as I have yet to see a PC game with RC:RA style loading (or lack thereof).
 
Mesh shaders, at least as I understand them, are actually a tricky point. This is where the difference between immediate rasterizers like Nvidia GPUs and TBDR rasterizers like Apple GPUs becomes significant. Nvidia invented mesh shading because it works well for their architecture — you can run these little compute shaders that will generate triangles which are immediately rendered, the entire process stays within the GPU cache, and all is fast and efficient (as much as immediate renderers are efficient anyway). Apple has a problem though — their approach requires them to store intermediate information about all the triangles that have been processed until they are fully shaded. So if you start generating millions of triangles in your shaders, those will create a lot of RAM overhead and hurt performance. I can imagine ways to mitigate this, but it's all tricky, and in the end it kind of shoots past the idea of mesh shaders.

That said, I am not even sure that Metal needs mesh shaders... it has fairly advanced GPU-driven rendering capabilities, so you can probably cover most interesting cases (such as LOD) with standard Metal functionality. Geometry generation is also possible; I am just not sure that Apple GPUs will scale to more generous mesh shader uses.

But who knows, maybe next year Apple will somehow come out with a TBDR-friendly version of mesh shaders...
Thanks for the insight! I've never worked in GPU programming, so this is all foreign to me. It seems that the major work "necessary" here isn't necessarily supporting mesh shaders, but some higher abstractions that can help non-Metal developers translate to something more amenable to TBDR.
Imagination RT technology sounds very cool on paper, but so far they haven't actually shipped a single product as far as I know. They just talk about it a lot. And there is also the question of whether their RT tech is flexible enough for Apple's RT vision, which seems to go towards programmable RT — Apple wants this hardware to be usable in production raytracers.
The reason I brought up Imagination in particular is because this happened yesterday: https://www.imaginationtech.com/graphics-processors/img-cxt-gpu/

If Imagination is actually shipping something, now, the related IP could very well make its way into future AS. Who knows.
 
THIS IS A JOKE. It has to be.

I have had Macs my entire life but have never pretended that they even remotely compete with the PC that sits next to it in the realm of gaming.

Title should have read, "Top 5 App Store games you love on your iPhone - now on your Mac."
 
Am I in the minority wishing games didn't exist for Mac at all?
I presume so! Why on earth would you wish that?!?!

I never wish something or somebody was limited or made incapable of something it could do. Usually I wish everything could do more than it does. Or do you just wish people couldn't enjoy themselves or be as creative as possible by adding pointless intentional barriers to things in life generally? I'm genuinely confused why you'd wish Macs were worse computers!

I mostly use my Mac for audio production. But I definitely don't wish it didn't do something I don't do, just to make it a less useful machine!
 
Thanks for the insight! I've never worked in GPU programming, so this is all foreign to me. It seems that the major work "necessary" here isn't necessarily supporting mesh shaders, but some higher abstractions that can help non-Metal developers translate to something more amenable to TBDR.

Yeah, and this is going to be a huge issue... because there is no one-size-fits-all translation path here. Without mesh shaders, you'll have to rethink your approach. It doesn't mean that you can't do it as efficiently on Apple's hardware with its GPU-driven pipelines (you can probably do some things even more efficiently), but that thought alone is not going to simplify porting.

Of course, similar considerations apply in the other direction as well. Apple GPUs have features that are simply impossible or impractical with traditional GPUs. Leveraging Apple-specific features can save one a lot of work and simplify the code tremendously.
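
One concrete example of such a feature (my illustration, not from the post above): memoryless render targets, where a transient attachment lives entirely in on-chip tile memory on Apple's TBDR GPUs and never gets a RAM allocation, which traditional immediate-mode GPUs can't offer.

```swift
import Metal

// Sketch only: a depth attachment that exists purely in tile memory.
let device = MTLCreateSystemDefaultDevice()!

let depthDesc = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .depth32Float, width: 1920, height: 1080, mipmapped: false)
depthDesc.usage = .renderTarget
depthDesc.storageMode = .memoryless           // tile memory only, no RAM backing
let depthTexture = device.makeTexture(descriptor: depthDesc)!

let pass = MTLRenderPassDescriptor()
pass.depthAttachment.texture = depthTexture
pass.depthAttachment.loadAction = .clear
pass.depthAttachment.storeAction = .dontCare  // required: contents never leave the tile
```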
 
Yeah, and this is going to be a huge issue... because there is no one-size-fits-all translation path here. Without mesh shaders, you'll have to rethink your approach. It doesn't mean that you can't do it as efficiently on Apple's hardware with its GPU-driven pipelines (you can probably do some things even more efficiently), but that thought alone is not going to simplify porting.
That's too bad -- I've done enough compiler work to know how difficult these issues can be on different architectures, but didn't appreciate the differences that TBDR makes. I wonder if Vulkan translates to DX mesh shaders, and, if so, how MoltenVK accomplishes translation for those code paths to Metal?

Of course, similar considerations apply in the other direction as well. Apple GPUs have features that are simply impossible or impractical with traditional GPUs. Leveraging Apple-specific features can save one a lot of work and simplify the code tremendously.
This would be beneficial for those treating AS as a first-class citizen, but this won't help much in translation or porting efforts, or will it?
 
That's too bad -- I've done enough compiler work to know how difficult these issues can be on different architectures, but didn't appreciate the differences that TBDR makes. I wonder if Vulkan translates to DX mesh shaders, and, if so, how MoltenVK accomplishes translation for those code paths to Metal?

As far as I can tell, Vulkan's mesh shader extension is more or less a direct copy of the DX functionality, so there is no issue there. MoltenVK does not support mesh shaders, and so far there don't seem to be any plans to do so.


This would be beneficial for those treating AS as a first-class citizen, but this won't help much in translation or porting efforts, or will it?

That's how it looks, yes.
 
That's too bad -- I've done enough compiler work to know how difficult these issues can be on different architectures, but didn't appreciate the differences that TBDR makes. I wonder if Vulkan translates to DX mesh shaders, and, if so, how MoltenVK accomplishes translation for those code paths to Metal?


This would be beneficial for those treating AS as a first-class citizen, but this won't help much in translation or porting efforts, or will it?
There are vendor-specific extensions that allow you to do some of the things in Vulkan that you can do in DX.

edit: @leman beat me to the reply button.
 
AR/VR is still niche. I think we are still several years away, and the efforts will still be focused more on PCs.
There are already many AR/VR titles, so I wouldn't say it's "niche". At some point, you have to ask yourself if it appears to be niche because of software limitations or hardware ones?

Is VR accessible to most people? Does the best pair of VR goggles that money can buy today provide the kind of experience customers expect for the money they spend? Admittedly, the field is wide open and that's exactly why gaming for Mac has a bright future because Apple is the company best positioned to solve all these problems.

You can't make predictions about the future purely based on the trajectory of the past. Otherwise, you could never have seen the success of the iPhone coming, and you'd have thought Sony would dominate AV forever. If you can't see what a game-changer Apple Silicon is by now, there is no point in trying to convince you. To each his own. We'll talk in a few years.
 
At the bottom, the article mentions Fortnite as something to check out. I thought Macs couldn't get Fortnite? Or at least the newer version of it that is actually worth playing...? Is there a way to do this that I'm missing?
 
The performance difference between the Nvidia 2000 and 3000 series, and between the 900 and 1000 series, was more than just "marginal".

You're being a bit optimistic about "developers will have no choice but to start developing native games for macOS". If there's no viable user base for games, no software shop is going to release games for Mac. There's no point; it's a money sinkhole. If there are no games for the Mac, no "serious gamer" is going to bat an eyelid at the Mac. Additionally, when it comes to gaming, PCs and consoles are better value for money than Macs, so there's no compelling reason for gamers to jump to Macs.

AMD and Intel / consoles could switch to ARM, and game developers would absolutely follow because there would be a viable gaming audience. There still wouldn't be a compelling reason to port games over to the Mac.



Unfortunately, the vast majority of those suck, IMO - your mileage may vary. They also don't take full advantage of the M1s.
Ask the Google god how the performance of Nvidia GPUs has changed from year to year. There is really no point in arguing over this. I'm talking about GPUs of the same tier, not comparing cards across tiers. The improvement is marginal in every sense of the word. In fact, if you factor in power consumption and measure performance per watt, the efficiency of some cards actually got worse.

The Mac has not traditionally been good for gaming because:
  1. Apple favours all-in-one solutions in their mid-range and entry-level offerings, which are not amenable to customization. High-end graphics cards are usually not available as options in this product line.
  2. High-end Apple desktops are prohibitively expensive.
  3. Prior to the advent of Metal, Apple had to rely on OpenGL, a standard they don't control. They were also dependent on Nvidia and/or AMD (ATI prior to 2006), whose graphics cards they don't design and make themselves. Engineers at these companies have famously said that they have a hard time making graphics cards that are up to Apple's exacting standards (as graphics cards often need to be custom-made for the Mac).
  4. After their row with Nvidia, Apple's sole graphics card supplier has been AMD, whose offerings weren't as great as Nvidia's (and still aren't).
Apple Silicon pretty much solved all of the above. Not to mention that whatever higher specs are required by games ported to the Mac are more than met by the superior processing power of Apple Silicon. It's really no coincidence that the Metal Developer Tools for Windows were introduced last year. The foundation that they laid for gaming is about to bear fruit.

The tight integration of macOS/iOS and software/hardware not only incentivizes game developers to develop games for both platforms but also allows Apple to take a lead in AR/VR over its competitors, as a tightly integrated software/hardware environment makes developing games much easier.

Metro Exodus is really a harbinger of things to come. The kind of solution favoured by gamers, i.e. an eGPU or a customized desktop, will have lost its attraction by the time Apple completes its transition to Apple Silicon (around 2023), and then it will start to decline. Why get a $3000 graphics card when a MacBook Pro (of 2023) will offer the same kind of performance for $4000 or less? This is also assuming that you can find a high-end graphics card for sale.

If AMD, Intel, and the consoles do switch to ARM, then it only adds to the likelihood of Apple Silicon becoming a dominant force in gaming, because games will be just that much easier to port to the Mac when everyone is on ARM.

And all Apple really needs is a blockbuster game. They might even develop that game themselves. Once game developers see the Mac as a viable platform for their games and start developing games native to Macs, there is really no stopping the Apple Silicon train.
 
Ask the Google god how the performance of Nvidia GPUs has changed from year to year. There is really no point in arguing over this. I'm talking about GPUs of the same tier, not comparing cards across tiers. The improvement is marginal in every sense of the word. In fact, if you factor in power consumption and measure performance per watt, the efficiency of some cards actually got worse.

The Mac has not traditionally been good for gaming because:
  1. Apple favours all-in-one solutions in their mid-range and entry-level offerings, which are not amenable to customization. High-end graphics cards are usually not available as options in this product line.
  2. High-end Apple desktops are prohibitively expensive.
  3. Prior to the advent of Metal, Apple had to rely on OpenGL, a standard they don't control. They were also dependent on Nvidia and/or AMD (ATI prior to 2006), whose graphics cards they don't design and make themselves. Engineers at these companies have famously said that they have a hard time making graphics cards that are up to Apple's exacting standards (as graphics cards often need to be custom-made for the Mac).
  4. After their row with Nvidia, Apple's sole graphics card supplier has been AMD, whose offerings weren't as great as Nvidia's (and still aren't).
Apple Silicon pretty much solved all of the above. Not to mention that whatever higher specs are required by games ported to the Mac are more than met by the superior processing power of Apple Silicon. It's really no coincidence that the Metal Developer Tools for Windows were introduced last year. The foundation that they laid for gaming is about to bear fruit.

The tight integration of macOS/iOS and software/hardware not only incentivizes game developers to develop games for both platforms but also allows Apple to take a lead in AR/VR over its competitors, as a tightly integrated software/hardware environment makes developing games much easier.

Metro Exodus is really a harbinger of things to come. The kind of solution favoured by gamers, i.e. an eGPU or a customized desktop, will have lost its attraction by the time Apple completes its transition to Apple Silicon (around 2023), and then it will start to decline. Why get a $3000 graphics card when a MacBook Pro (of 2023) will offer the same kind of performance for $4000 or less? This is also assuming that you can find a high-end graphics card for sale.

If AMD, Intel, and the consoles do switch to ARM, then it only adds to the likelihood of Apple Silicon becoming a dominant force in gaming, because games will be just that much easier to port to the Mac when everyone is on ARM.

And all Apple really needs is a blockbuster game. They might even develop that game themselves. Once game developers see the Mac as a viable platform for their games and start developing games native to Macs, there is really no stopping the Apple Silicon train.
I am still somewhat bitter that Metro Exodus on Mac is as gimped as it is.
 
I'll disagree with that a bit, actually - I'd say the reasoning is that it's a fairly well-optimized game for the Mac that looks like it has graphics- and performance-intensive gameplay. In actuality, the graphics are pretty well optimized to hide performance issues - short draw distances and relatively few active elements on the screen at a time. They just make very good use of what they have.

Now if *Rise* of the Tomb Raider runs well, I'd be more impressed. It's got much longer draw distances, leading to more active elements, etc. (I've had a couple of machines able to run both; Shadow always outperforms Rise.)
I get your point here. I always thought Shadow was the more intensive of the two due to its increased polygon count, further upgraded lighting, all the mud effects, etc.; however, I didn't think about the draw distances there. Good catch!
 
That's especially impressive since the "port" is basically the Windows client running in a Crossover wrapper.

How does the laptop handle the load? Does it heat up significantly?
Nope. While it does get a bit warm, it's nowhere near hot, and best of all, if the fans are on at all I can't hear them. Dead silent.

Compare that to my 2019 i9 MBP where just the main title page would make the system sound like it was about to take flight!

I loved my 2019 and was debating keeping the 2021 or not, but that right there sold it for me.
 
Considering these are at ultra settings in 4K, and most of them are not even Apple Silicon native, it is quite impressive. These SoCs are faster than most notebook 3080 implementations.
Impressive, sure, but unless it's a stable 60 fps I would certainly not call it buttery smooth regardless.
 