Which is why Apple is investing in Blender and not video games. 3D/VFX work uses actual 3D modeling programs like Blender, Maya, etc., not video game engines.
This is ignorant and short-sighted, as more and more game rendering tech is being incorporated into said 3D authoring tools. This adoption has been accelerating over the last ten years because of the increased prominence and efficacy of GPU processing, which requires more specific parallel instruction set type coding... which hasn't been happening on the Mac over all this time.

To not get into gaming is to voluntarily put Apple tech, in practical application, at the back of the second tier of the 3D development processing line at this point in time.

You can dislike it all you want, but for Apple not to be 100% in on gaming tech is a major strategic disadvantage that's so much larger than games.
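To make the "parallel instruction set type coding" point concrete, here is a minimal sketch of the kind of data-parallel GPU work being described, written against Apple's Metal compute API in Swift. The kernel name, array size, and scale factor are illustrative choices, not taken from any shipping tool.

```swift
import Metal

// Minimal data-parallel example: scale a million floats on the GPU.
// The MSL kernel below runs once per element, in parallel.
let kernelSource = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data     [[buffer(0)]],
                  constant float &factor [[buffer(1)]],
                  uint id                [[thread_position_in_grid]]) {
    data[id] *= factor;
}
"""

guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue() else {
    fatalError("No Metal-capable GPU available")
}

let library  = try! device.makeLibrary(source: kernelSource, options: nil)
let function = library.makeFunction(name: "scale")!
let pipeline = try! device.makeComputePipelineState(function: function)

// Illustrative input data shared between CPU and GPU.
var input: [Float] = Array(repeating: 1.0, count: 1_000_000)
var factor: Float = 2.0
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let commandBuffer = queue.makeCommandBuffer()!
let encoder = commandBuffer.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.setBytes(&factor, length: MemoryLayout<Float>.stride, index: 1)

// One thread per element; Metal maps the grid onto the GPU's SIMD hardware.
let threadsPerGrid = MTLSize(width: input.count, height: 1, depth: 1)
let threadsPerGroup = MTLSize(width: pipeline.maxTotalThreadsPerThreadgroup, height: 1, depth: 1)
encoder.dispatchThreads(threadsPerGrid, threadsPerThreadgroup: threadsPerGroup)
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
```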
 
  • Like
Reactions: Irishman
This is ignorant and short-sighted, as more and more game rendering tech is being incorporated into said 3D authoring tools. This adoption has been accelerating over the last ten years because of the increased prominence and efficacy of GPU processing, which requires more specific parallel instruction set type coding... which hasn't been happening on the Mac over all this time.

To not get into gaming is to voluntarily put Apple tech, in practical application, at the back of the second tier of the 3D development processing line at this point in time.

You can dislike it all you want, but for Apple not to be 100% in on gaming tech is a major strategic disadvantage that's so much larger than games.
Hasn’t Apple always been 3rd tier for 3D work? I am not sure Apple can scale to the point where they are first tier, especially since they are so slow to adopt technologies (like hardware ray tracing).
 
  • Like
Reactions: Irishman
Hasn’t Apple always been 3rd tier for 3D work? I am not sure Apple can scale to the point where they are first tier, especially since they are so slow to adopt technologies (like hardware ray tracing).
That's the problem I see. If you have 1st-tier hardware, which they very well might have now, but you're last in line at utilization, eventually you're going to stop even trying to build for the first tier (as they've already done for years and years with 3D), or you won't even know what 1st-tier hardware is relative to everything else. The last time Apple had hardware anywhere near 1st tier for 3D was around 2007-2008.
 
Hasn’t Apple always been 3rd tier for 3D work? I am not sure Apple can scale to the point where they are first tier, especially since they are so slow to adopt technologies (like hardware ray tracing).

In my opinion, hardware-accelerated ray tracing is still relatively new as far as adoption goes: fewer than 50 games, mostly on Nvidia hardware. Nvidia jumped in first, hoping it would help sell RTX GPUs, then AMD followed suit, as much on consoles as on PCs. The problem with growing adoption of the "RTX ON" mindset is that we're still dealing with siloed implementations, even on the PC side of things. Nvidia's implementation requires their hardware, and so does AMD's. Everyone involved is acting as though open standards don't exist, and that's greatly slowing its adoption.

There are also a lot of performance questions about whether hardware-accelerated ray tracing will even be necessary for the best graphics. I'm thinking primarily of Crytek, which seems to be doing ray tracing in software in its engine, and of Apple, where it's been supported at the OS level since Big Sur.

So it seems like there's a lot of work left to be done; the current implementations of RT are far from being the final word.
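As a small illustration of the OS-level support mentioned above: since Big Sur, Metal exposes a ray-tracing capability flag on each device. A quick Swift check might look like the sketch below; only MTLCopyAllDevices, supportsRaytracing, and supportsFamily are real API, the surrounding logic is illustrative.

```swift
import Metal

// Enumerate GPUs and report whether Metal's ray-tracing support is exposed.
// supportsRaytracing says the API is available; it does not by itself
// distinguish dedicated RT hardware from a compute-shader implementation.
for device in MTLCopyAllDevices() {
    let rayTracing = device.supportsRaytracing
    let apple7 = device.supportsFamily(.apple7)   // A14/M1-class GPUs and newer
    print("\(device.name): ray tracing \(rayTracing ? "available" : "unavailable"), Apple7 family: \(apple7)")
}
```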
 
Last edited:
Hardware-accelerated ray tracing is still relatively new as far as adoption goes: fewer than 50 games, mostly on Nvidia hardware. Nvidia jumped in first, hoping it would help sell RTX GPUs, then AMD followed suit, as much on consoles as on PCs. The problem with growing adoption of the "RTX ON" mindset is that we're still dealing with siloed implementations, even on the PC side of things. Nvidia's implementation requires their hardware, and so does AMD's. Everyone involved is acting as though open standards don't exist, and that's greatly slowing its adoption.

There are also a lot of performance questions about whether hardware-accelerated ray tracing will even be necessary for the best graphics. I'm thinking primarily of Crytek, which seems to be doing ray tracing in software in its engine, and of Apple, where it's been supported at the OS level since Big Sur.

So it seems like there's a lot of work left to be done; the current implementations of RT are far from being the final word.
What is the likelihood that the implementation Apple comes up with will become the standard one that all others emulate?
 
  • Like
Reactions: Irishman
This is ignorant and short sighted as more and more game rendering tech is being incorporated into said 3D authoring tools.
Care to elaborate? I have no idea what you're trying to say. I've yet to see anyone developing a game in Blender, Maya, 3ds Max or similar tools. Or are you talking about previews, like effects that an artist needs when creating models?
This adoption has been accelerating over the last ten years because of the increased prominence and efficacy of GPU processing which requires more specific parallel instruction set type coding...which hasn't been happening over all this time for the Mac.
You lost me again, what "specific parallel instruction set type coding"? And why would that not have been done on the Mac?

Sure, back in the 80s and 90s, when we created games and graphics demos for the C64 and Amiga, there wasn't much parallelism there. And neither was there in the CGA/EGA/VGA days with PCs, at least not compared to a modern GPU. Lightwave (which came out around 1990) didn't have parallel rendering support until ScreamerNet came along later. Parallelism really took off in the G4/G5 days with GPUs and has been utilized ever since.
 
  • Like
Reactions: Irishman
Care to elaborate? I have no idea what you're trying to say. I've yet to see anyone developing a game in Blender, Maya, 3ds Max or similar tools. Or are you talking about previews, like effects that an artist needs when creating models?

You lost me again, what "specific parallel instruction set type coding"? And why would that not have been done on the Mac?

Sure, back in the 80s and 90s, when we created games and graphics demos for the C64 and Amiga, there wasn't much parallelism there. And neither was there in the CGA/EGA/VGA days with PCs, at least not compared to a modern GPU. Lightwave (which came out around 1990) didn't have parallel rendering support until ScreamerNet came along later. Parallelism really took off in the G4/G5 days with GPUs and has been utilized ever since.
I guess none of this matters, though, with respect to making games, since, as you and others have tirelessly pointed out, that's a market Apple doesn't care about because it makes them no money.
 
Care to elaborate? I have no idea what you're trying to say. I've yet to see anyone developing a game in Blender, Maya, 3ds Max or similar tools. Or are you talking about previews, like effects that an artist needs when creating models?

Maybe he's referring to examples like Blender's roadmap moving from OpenGL to Vulkan. Both APIs are used in gaming and in rendering-tool pipelines.

https://code.blender.org/2021/10/blender-3-x-roadmap/
"A big target for the 3.x series is to move Blender to entirely use the Vulkan open standard. Vulkan and Metal backends for Blender’s GPU API are being developed. We expect these to be ready to replace the OpenGL backend by the end of 2022."
 
  • Like
Reactions: Irishman
I guess none of this matters, though, with respect to making games, since, as you and others have tirelessly pointed out, that's a market Apple doesn't care about because it makes them no money.
Well, it's brought up and discussed, so it must matter? I'm trying to understand what the claim is here.
And I'm not sure if Apple cares or not; I'm pretty sure Apple would be happy to take 30% of a sale/transaction without having invested anything. ;)

It's more that developers find it not feasible whenever the initial investment and cost of porting are too high. Ever since dropping Mac support for everything I and others have worked on in recent years (and that includes off-the-shelf engines like Unity and Unreal), our development life has become much easier. There's still Linux, which throws the occasional curveball, but it's manageable. But let's face it: if games are the focus and the only thing one does, with the goal of maximizing profit, then Windows would be the choice... even for me, and I hate Windows with a passion and pretty much use it only for games. I guess I'll be a lawyer in my next life and won't have to bother with these things. :p

That doesn't mean one can't make money with graphics-related work on Macs. Go into highly specialized fields, like scientific applications which sell for a premium, and you're all set when charging thousands for a single software license. I've done that with Macs in the past. Siemens, GE and others do it with Windows software too. They take a $25k workstation from HP (or similar), throw in their software and sell it for $150k. So I guess all it needs is Mac users buying games for $1k per title (Neo Geo, anyone?).
 
Maybe he's referring to examples like Blender's roadmap moving from OpenGL to Vulkan. Both APIs are used in gaming and in rendering-tool pipelines.
Ah, I see. Sure, APIs are used for visualization. Movie & image sequencing too. I'm still not sure how that would fit into game development, except for artists.
 
  • Like
Reactions: Irishman
I have no clue, if I’m being honest.

How about you, mate?
I think that, at least for gaming, the AMD way will become the “standard” because that is what current-gen consoles use. Doing it the Nvidia way is a larger performance hit on AMD hardware than doing it the other way around.

Vulkan (if I am not mistaken) just took Nvidia’s extensions and made them “standard” instead of coming up with its own. So on the PC side you are more or less “stuck” with how Nvidia implemented it (I am not sure if DXR is the same, but I would be unsurprised if it were).

Yeah disregard this, it just feels like that is what happened, lol.
 
Last edited:
  • Like
Reactions: Irishman
Maybe he's referring to examples like Blender's roadmap moving from OpenGL to Vulkan. Both APIs are used in gaming and in rendering-tool pipelines.

https://code.blender.org/2021/10/blender-3-x-roadmap/
"A big target for the 3.x series is to move Blender to entirely use the Vulkan open standard. Vulkan and Metal backends for Blender’s GPU API are being developed. We expect these to be ready to replace the OpenGL backend by the end of 2022."

Isn’t Vulkan just the newest version of OpenGL?
 
That doesn't mean one can't make money with graphics-related work on Macs. Go into highly specialized fields, like scientific applications which sell for a premium, and you're all set when charging thousands for a single software license. I've done that with Macs in the past. Siemens, GE and others do it with Windows software too. They take a $25k workstation from HP (or similar), throw in their software and sell it for $150k. So I guess all it needs is Mac users buying games for $1k per title (Neo Geo, anyone?).
Ah, the intrigue of specialized software! I remember reading about a curling app that costs $100. That's 1/4 the cost of a typical $400 iPad then! However, it was highly rated, and popular.

As far as Neo Geo goes... thank goodness for emulation, because that was the only way I got to try that out. The console was $700, and games were roughly $200 to $300 in 1991 money (so double those amounts to account for inflation in 2022).
 
The problem with Apple and gaming has always been the chicken and egg problem.
There are not enough games, so there are no gamers. There are no gamers, so there are no games.

But I think Apple is missing a huge bit of revenue by ignoring gaming on Apple devices beyond mobile gaming.

The Apple TV could have evolved to be a casual gaming platform, but Apple's attempt to bring games to Apple TV was halfhearted.
Apple Arcade is another halfhearted attempt.

I think Apple has the hardware technology to create a new gaming experience; it has just never been able to deliver on that promise, even in the Steve Jobs days.
 
The problem with Apple and gaming has always been the chicken and egg problem.
There are not enough games, so there are no gamers. There are no gamers, so there are no games.
I think there is also an attitude problem in the gaming community. I was listening to the latest Moore's Law is Dead podcast, and they briefly covered the M1 Ultra. I realize that Apple isn't what he usually covers, and I do give him credit for saying that the Ultra probably has graphics performance close to a 3090, based upon Hardware Unboxed benchmarks they did on the Pro.

However, you can tell by the tone of their voices that they just don't care. It's not part of their world, just a curiosity. They dismissively say that the M1 series performs the way that it does simply because of process technology at 5nm. No mention of the integration of the SoC, no thoughts about the variable length instruction set of x86 that @cmaier has often pointed out, forgotten is the massive power efficiency advantage of Apple Silicon, no hints at the quality of engineers that Apple has on their design team.

Assuming they even considered it, they just go on to say that Nvidia will surpass the 3090 soon enough, and therefore the M1 Ultra. Keep in mind that in another part of this same podcast, Tom points out that his sources claim that Lovelace will go over 600w, and some specialized cards may hit 800w.

Regardless, this dismissive attitude is found throughout the hardcore gaming community. This is important because PC game developers are also PC gamers. The reason I use Tom and his podcast as an example is that he is one of the least judgmental folks in that arena. He doesn't hate the Mac, he just has no interest in it, because it isn't a platform for hardcore gamers, which is his audience.
 
  • Like
Reactions: Homy
Care to elaborate? I have no idea what you're trying to say. I've yet to see anyone developing a game in Blender, Maya, 3ds Max or similar tools. Or are you talking about previews, like effects that an artist needs when creating models?

You lost me again, what "specific parallel instruction set type coding"? And why would that not have been done on the Mac?

Sure, back in the 80s and 90s, when we created games and graphics demos for the C64 and Amiga, there wasn't much parallelism there. And neither was there in the CGA/EGA/VGA days with PCs, at least not compared to a modern GPU. Lightwave (which came out around 1990) didn't have parallel rendering support until ScreamerNet came along later. Parallelism really took off in the G4/G5 days with GPUs and has been utilized ever since.



Apple has been out of the parallelism loop and/or significantly behind in coding for parallelism for 3D specifically for YEARS. Apple had acquired a toehold when switching to Intel. Immediately after the Intel shift they were still behind and then inched forward a bit. Once they dropped Nvidia GPU support right before Nvidia implemented CUDA core tech, they really started sliding, and then things started changing dramatically. Apple stayed with AMD exclusively (who themselves were constantly playing catch-up to Nvidia from way behind until recently).


In that 10 or so years Apple got left behind in GPU accelerated physics execution for simulation, GPU accelerated procedural texture rendering for industry standard tools like Substance with all those PBR textures, GPU accelerated Subsurface Scattering, GPU accelerated hair rendering, GPU accelerated atmospherics, GPU accelerated instancing, GPU accelerated caustics, GPU accelerated reflection/blurry reflection, GPU accelerated Ray Tracing (already mentioned)…it’s really more numerous than what I can list, but GPU coders have been busy for the last decade plus.


Most of these features showed up in gaming-based real-time renderers first and were only made possible by the effort of hundreds of developers in gaming and VFX coding features specifically for CUDA core hardware, and later for Tensor core hardware from Nvidia, which is also highly parallel for 3D. Now OptiX is seemingly poised to elevate performance for Nvidia users again, and Apple may not be able to keep pace even with native Apple Silicon 3D software.

These major changes ultimately pushed 3D rendering software development, as a whole, in a completely different direction than it had been going, and made GPU-accelerated rendering the major convergence point for game rendering and professional CGI and VFX rendering development. This shift also made most of the uber-expensive “Pro” GPUs (Red or Green) far less relevant, because all that was necessary for the high-performance rendering advantages were the CUDA cores and Tensor cores, and those were the same regardless of card type.

This obviously left AMD and Apple out in the cold. AMD has recently stepped its game up, but in truth there are still quite a few final-frame rendering efficiencies that can only be had with Nvidia hardware. CPU advancements have kept a lot of tools, as a whole, advancing at a steady clip, but when you get to rendering there's a sizable chunk of software engineering missing from Apple's puzzle.

My understanding is that Arnold for Maya should be able to take advantage of all the new M1 Ultra cores for CPU-based rendering, Maxon has an experimental version of Redshift running on Metal, SideFX has an unsupported developmental build of Houdini running natively on M1 available for their customers, and Octane Render is in its first Apple Silicon native version. That's all good for a start. However, there aren't benchmarks, and acceleration code is still missing for a multitude of other specific features.

GPU-based renderers have fully replaced traditional CPU renderers in many studios and render farms. Where that's not the case, it's become common for software-based renderers to have significant portions of GPU-accelerated code that can be active or not depending on whether Nvidia hardware is available, and various parts of the apps themselves are GPU accelerated in the same way.
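To make that conditional-acceleration point concrete, here is a rough Swift sketch of a renderer picking a GPU path when a capable Metal device is present and falling back to the CPU otherwise. The renderer types are hypothetical; only the Metal device queries are real API.

```swift
import Metal

// Hypothetical renderer interface; the implementations are placeholders.
protocol Renderer { func renderFrame() }
struct CPURenderer: Renderer { func renderFrame() { /* multithreaded CPU path */ } }
struct MetalRenderer: Renderer {
    let device: MTLDevice
    func renderFrame() { /* GPU-accelerated path */ }
}

// Pick the accelerated path only when the hardware supports what we need,
// mirroring how some commercial renderers gate GPU paths on specific hardware.
func makeRenderer() -> Renderer {
    if let device = MTLCreateSystemDefaultDevice(),
       device.supportsFamily(.apple7) || device.supportsFamily(.mac2) {
        return MetalRenderer(device: device)
    }
    return CPURenderer()
}
```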

Macs have been shut out of all of these advances for years. It has only been many creatives' preference for macOS that has kept tool developers engaged enough to keep non-GPU-augmented general tool updates coming to the Mac. It's about time Apple started paying back these customers' constant loyalty.

Many 3D tools that run on Macs have been crippled in various ways in comparison to their PC counterparts, because the features that have been updated to take advantage of GPU acceleration haven't advanced on the Mac, and there's no drop in the price of the software, so it's just a net loss for Mac users.

The point of all of this is that Apple really should go out of its way to finally acquire the specialists that build the 3D content and tools, to keep hardware development on a good path and keep themselves from going back in the ditch for the next ten years. Also, if they're going to advance the software to that degree on their end for VFX/CGI, there's no reason not to acquire game makers/games-tools makers and get the games on the Mac as well... which provide a very useful practical test bed for a lot of this tech and can be immediately profitable.
 
Last edited:
Apple has been out of the parallelism loop and/or significantly behind in coding for parallelism for 3D specifically for YEARS.
I'm still not sure what you're saying. For what were they out of this loop? Hardware, software or both?
And what paradigm are they missing?
Immediately after the Intel shift they were still behind and then inched forward a bit.
For what? Overall, sure, in other fields they were leading.
Once they dropped Nvidia GPU support right before Nvidia implemented CUDA core tech, they really started sliding, and then things started changing dramatically.
Again, not sure what you're saying. Before Apple and Nvidia ran into disagreements, CUDA had been available for OS X for years.
In that 10 or so years Apple got left behind in GPU accelerated physics execution for simulation, GPU accelerated procedural texture rendering for industry standard tools like Substance with all those PBR textures, GPU accelerated Subsurface Scattering, GPU accelerated hair rendering, GPU accelerated atmospherics, GPU accelerated instancing, GPU accelerated caustics, GPU accelerated reflection/blurry reflection, GPU accelerated Ray Tracing (already mentioned)…it’s really more numerous than what I can list, but GPU coders have been busy for the last decade plus.
Are you talking about hardware features (if so, which)? Because all of this has been possible in software running on GPUs on Apple systems. I've implemented many of these myself for Windows, Linux and OS X/macOS.
Now OptiX is seemingly poised to elevate performance for Nvidia users again, and Apple may not be able to keep pace even with native Apple Silicon 3D software.
So we're talking software only and not hardware? I have absolutely no doubt that software in general is behind on Apple systems, especially compared to Nvidia, but is that really Apple's fault? In the end, every developer is free to port software to macOS/iOS, but the point here is really the same as with games: why bother? As long as it's not financially feasible, people are not going to do it. I've done it in the past, exclusively for macOS/iOS, because it paid off financially. But doing it with extremely high development cost and low return ($50 per game?) is just not worth it for the user base. On the other hand, when I can sell an Apple system plus a software license for a very specific market and make anywhere between $10k and $500k for a single system, then that can be well worth it. I rely on Nvidia myself, simply because they have a large market share and provide the tools (as do 3rd-party tool providers). That doesn't mean it couldn't be done on Apple, but they need significantly more market share before that happens, and not just in general (email, office, web browsing), but for the target group.
This shift also made most of the uber-expensive “Pro” GPUs (Red or Green) far less relevant, because all that was necessary for the high-performance rendering advantages were the CUDA cores and Tensor cores, and those were the same regardless of card type.
That depends massively on the target group; I'm still running RTX 8000s in workstations and even more expensive cards in servers. For gaming not so much, but we're looking at a market that Apple has no hardware for; they'll have to do much better than an M1 Ultra for the "pro market". Sure, for gamers not so much.
Macs have been shut out of all of these advances for years. It has only been many creatives' preference for macOS that has kept tool developers engaged enough to keep non-GPU-augmented general tool updates coming to the Mac. It's about time Apple started paying back these customers' constant loyalty.
Again, they've not been shut out, developers chose not to port software due to lack of user base.
Many 3D tools that run on Macs have been crippled in various ways in comparison to their PC counterparts, because the features that have been updated to take advantage of GPU acceleration haven't advanced on the Mac, and there's no drop in the price of the software, so it's just a net loss for Mac users.
I've dropped macOS support myself, as much as I love macOS and I've been using Macs for decades. I can't blame anyone for doing the same.
The point of all of this is that Apple really should go out of its way to finally acquire the specialists that build the 3D content and tools to keep hardware development on a good path and keep themselves from going back in the ditch for the next ten years.
So Apple should buy... everyone? And even then, how long would it take to port the whole Nvidia software stack and everything else to macOS? That's just not going to happen. This is also not Apple's core business, which is apps and services. Do you really think Apple cares about an additional 10% or 20% market share vs Windows? That's not where the money is for them; apps/services are much more profitable than selling a new computer to everyone every 3 years or so.
 
  • Like
Reactions: Irishman
"...That depends massively on the target group, I'm still running RTX8000 in workstations and even more expensive cards for servers. For gaming not so much, but we're looking at a market where Apple has no hardware for, they'll have to do much better than a M1 Ultra for the "pro market". Sure, for gamers not so much.

Again, they've not been shut out, developers chose not to port software due to lack of user base.

I've dropped macOS support myself, as much as I love macOS and I've been using Macs for decades. I can't blame anyone for doing the same..."
...Ok, so basically you're clueless as it relates to computationally robust graphics hardware/software implementation. You should have led with that.

I'll leave this here as a bit of a primer:

Apple killed CUDA support/Nvidia GPU driver support so early in CUDA's development that what existed in that period doesn't qualify as CUDA in any way compared with what CUDA is today.

The fact is that if Apple's hardware is up to snuff, and I suspect that it is, they don't need CUDA. However, Nvidia has already railroaded the graphics industry into writing a lot of low-level APIs around Nvidia's closed hardware standards, and Apple doesn't have equivalent Metal-based APIs, or the software to interface with those APIs and drivers, to compete apples to apples; the cupboard is bare. That's where the work has to be done, and the APIs/tools have to be tested with actual developed content, which also doesn't exist on the Mac.

Relative to other entertainment and tech sectors, the acquisitions I would suggest/have suggested wouldn't include Nvidia, are software-oriented, and would be pretty minuscule on an industry corporate scale... not really all that different from acquiring Final Cut from Macromedia however many years ago that was, and for the same reasons.

Apple doesn't care about offering server tech, and they shouldn't, since Apple at its core has always been a creative/design software/tech company, going all the way back to the importance Steve Jobs placed on the Mac system fonts and the connection to type design history.

I've always maintained a hybrid OS facility, but would prefer to standardize on MacOS if/when possible.

Games development, games, and VFX are the current era's desktop-publishing equivalent that Apple should be pursuing, and maybe they will; only time will tell.

Apple getting into the server business would be as relevant to the core business as Apple getting into the dump truck business.

Honestly, it's just down to whether Apple continues on the stubborn path for 3D and continues to slowly plod along, eventually allowing hardware progression to languish, or not.
 
...Ok, so basically you're clueless as it relates to computationally robust graphics hardware/software implementation. You should have led with that.
Ah, we're going here again without answering actual questions. You should have told Apple about the clueless thing before they handed out two developer awards for graphics-related software, and before Jobs stood on stage presenting benchmarks with software I've worked on. And maybe all the students I've taught at a university for their BS/MS degrees. Maybe you should just answer a simple question instead.
Apple killed CUDA support/Nvidia GPU driver support so early in CUDA's development that what existed in that period doesn't qualify as CUDA in any way compared with what CUDA is today.
Wait really? Don't tell anyone, but CUDA from three years ago is different from CUDA today.

I'm going to skip the rest, because it's "marketing speech", and instead ask again: you said Apple is out of the parallel loop and has been for ages. Which parallel programming paradigm was missing on Macs in the PPC/Intel era, with no replacement for it?

I'm not aware of any; if you know, please elaborate in detail. Again, I'm not talking about using Blender or some other software, but actual algorithms. What could be done on Windows/Linux at the lowest level that could not be done on Macs?
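For context on what has long been available on the CPU side: data-parallel work has been a first-class citizen on the Mac since Grand Central Dispatch shipped in Snow Leopard. Here is a minimal Swift example; the workload itself is made up.

```swift
import Dispatch
import Foundation

// Data-parallel CPU work via Grand Central Dispatch: the closure runs
// concurrently across available cores, one invocation per index.
let count = 1_000_000
var results = [Double](repeating: 0, count: count)

results.withUnsafeMutableBufferPointer { buffer in
    DispatchQueue.concurrentPerform(iterations: count) { i in
        // Each index is written by exactly one iteration, so there is no data race.
        buffer[i] = sin(Double(i)) * 0.5
    }
}
print("Processed \(count) elements in parallel")
```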
 
  • Like
Reactions: Irishman
To point out some things I didn't address: yes, it is Apple's fault that they are way behind in hardware/software infrastructure for using 3D tools and games on their hardware. Nvidia (whose entire business success has been built on games) attempting to acquire ARM for $40 billion, and Microsoft spending close to $80 billion on video game publisher acquisitions in the last 12 months alone, kind of hints that there might be more than a few nickels to be made in the games industry.

Also, Apple can't get a larger market share by ignoring large swathes of existing hardware buyers that want hardware that actually also gives them the option to effectively play games and/or build games/VFX. If the M1 Ultra is 10%-15% slower than a moderately spec'd high-end PC with an Nvidia RTX 3090, which is what it's looking like (before Apple has announced an M2), Apple is plenty close enough to start developing roads for more 3D on the Mac now.

Aside from all that, with Epic having successfully sued Apple over locking purchases inside the App Store, and a judge forcing Apple to unlock secondary app purchases from the App Store and Apple's cut of that revenue, Apple's apps and services revenue is going to fall. Apple needs to look at 3D on the desktop as one arm that can compensate for future steep revenue losses from apps/services.

CNBC-
"According to the ruling, gaming apps account for approximately 70% of all App Store revenue."
"That 70% is generated by less than 10% of all App Store consumers, the court said."

With Apple's current infrastructure, the games and tools that are available run dramatically slower in a number of categories than they do on Windows while still carrying premium pricing, so it's not just about developers/3D users not liking Apple for whatever reason. The only place where Macs are at or beyond the standard for performance is video editing, and that alone isn't enough.
 
Nvidia (whose entire business success has been built on games) attempting to acquire ARM for $40 billion, and Microsoft spending close to $80 billion on video game publisher acquisitions in the last 12 months alone, kind of hints that there might be more than a few nickels to be made in the games industry.
You should check the server market and how much Nvidia is involved. Sure they're known largely by gamers for games... Nvidia is massive in the GPU server market and research/science. And no one said there's no money to be made with games, the gaming market is huge. The gaming market on macOS is not.
If the M1 Ultra is 10%-15% slower than a moderately spec'd high-end PC with an Nvidia RTX 3090, which is what it's looking like (before Apple has announced an M2), Apple is plenty close enough to start developing roads for more 3D on the Mac now.
They've always been able to do that. There was leading 3D software on Macs in the PPC/Intel days, usually Mac exclusive, and way better than the Linux/Windows equivalents from other manufacturers.
Apple needs to look at 3D on the desktop as one arm that can compensate for future steep revenue losses from apps/services.
Apps/services are cheap for Apple; they get their slice of the pie when others did the work. Doing 3D on their own is expensive, though, and they're not in a position to make any money off it right now. In the end, if apps/services revenue goes down, they can always bring back the old developer model and charge developers as they did many, many years ago.
CNBC-
"According to the ruling, gaming apps account for approximately 70% of all App Store revenue."
"That 70% is generated by less than 10% of all App Store consumers, the court said."
Sure, those are all low-cost games (<$10) which are cheap to develop (in time and money). See Flappy Bird: under 50 hours of development time, and it generated over $50k per day. That's $5M in less than four months.

What's the revenue for AAA titles with $100M+ development costs? (Yes, it's a rhetorical question, because I've done the math from an actual development project of a AAA game in another thread.)
With Apple's current infrastructure, the games and tools that are available run dramatically slower in a number of categories than they do on Windows while still carrying premium pricing, so it's not just about developers/3D users not liking Apple for whatever reason.
What tools? 3rd-party tools? Again, that's up to the developer. There's absolutely no reason you can't port tools to run with top performance on Macs; it's just not worth it. Porting and developing games (and tools) for Macs is only worth it if development costs are low (which is the case for "simple" games, and simple games are easy to port) or if the user base is large enough.

Technically, developers can port modern DX12 games to Metal/macOS. One has to work around technology differences, such as resource binding tiers, since Mac hardware and Metal have different limits. It isn't a trivial task; it's not super difficult either, it's just very, very time consuming, which means it costs a lot. Things just work differently, but it can be done. But in the end, why bother for under 5% of the gaming market? Again, we're talking AAA games and not "Flappy Bird".
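As one concrete example of the resource-binding-tier point, a port can query what the Metal device supports before deciding how to lay out its bindings. The branching below is an illustrative sketch; argumentBuffersSupport and maxArgumentBufferSamplerCount are real Metal device properties, the strategy choices are hypothetical.

```swift
import Metal

// A DX12 port typically assumes generous, bindless-style resource limits.
// On Metal the equivalent depends on the device's argument buffer tier, so a
// port would query it up front and pick a binding strategy accordingly.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal device")
}

switch device.argumentBuffersSupport {
case .tier2:
    // Large, mutable argument buffers: closest match to DX12 resource binding.
    print("Tier 2 argument buffers: bindless-style layout is feasible")
case .tier1:
    // Tighter limits: the port needs to bucket resources per draw/dispatch.
    print("Tier 1 argument buffers: restructure binding around per-pass tables")
@unknown default:
    print("Unknown tier; fall back to classic setBuffer/setTexture binding")
}

print("Max samplers in an argument buffer: \(device.maxArgumentBufferSamplerCount)")
```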
 
  • Like
Reactions: Irishman
You should check the server market and how much Nvidia is involved.
I have, and to a large extent this only serves to reinforce my point. Nvidia is using the same technology in servers for AI development as they are for game GPUs. It's ONLY BEEN POSSIBLE FOR NVIDIA TO DO WHAT IT'S DOING TODAY, TECHNICALLY AND FINANCIALLY, BECAUSE OF VIDEO GAMES.

Apple could elect to go in that direction or not, but it's better to have all available options, you know, available. Releasing servers for general-purpose use, with the enterprise support required to really make that work, is prohibitive for Apple, but servers for predictive simulation and AI development are feasible.

It's clear that Nvidia was afraid Apple would continue plowing forward at a good clip with their SOC development, which is why Nvidia moved to acquire ARM...just so they could block Apple if/when Apple Silicon SOCs got too competitive with forthcoming Nvidia SOC/APUs.

The new Grace SOC/APU is beefed up, but architecturally very similar to the M1 Ultra without Apple's relative power saving advantages.

They've always been able to do that.
Not really. Apple has always done their little Apple-y obscure benchmarks, where they mostly did Apple-to-older-Apple comparisons, and the other comparisons were ones that made it appear Macs were somewhere close to parity (like they still do). Later it would turn out that the performance spec didn't bear out in more comprehensive practical-use testing, and, even worse for Apple, it always seemed to happen that within a couple of months of new Apple hardware being released we would see new Intel and/or Nvidia hardware that would run laps around Apple's offerings.

The main difference was that on the PC side they always had big data sets from photorealistic games to crunch, forcing them to keep advancing the hardware, and they never stooped to the pettiness of separating, by "social importance", one big data set from another big data set. In truth, you're not going to sell a big-data-set cruncher to the customer that will make you feel more socially important without proving your hardware can crunch actual big data sets from somewhere.
Apps/services are cheap for Apple; they get their slice of the pie when others did the work.
That's nice and all, but with the vast majority of the money coming from 10% of the users and Epic poking holes in the hull of that little boat, it's already taking on water. The clock is ticking until Apple has to give up all that apps and services money, so it's the perfect time to start building that bigger, sturdier games boat.
What's the revenue for AAA titles with $100M+ development costs?
The profits aren't where you'd expect; you've probably been looking in the wrong places, at the individual game publisher rather than at how Apple would benefit. The "AAA" games are marginally profitable, but the platform holder that has the "AAA" bangers that bring gamers to the hardware uses those games as a loss leader to sell other, cheaper games in greater quantities at better profit margins, while selling to far more than 10% of their users. This model makes a high-end games play far less risky and more profitable, and it raises unit sales of high-performing hardware overall.

It's not all that different from what Apple does with Final Cut, Motion, Logic X, etc. Virtually every other non-platform-holding, software-only developer in these spaces has moved almost exclusively to subscription services, and Apple doesn't have to, because the software moves a lot of hardware. These days, most of the people buying Macs to run Final Cut also play games and are generally pissy about constantly having to maintain two systems, but they accept it because they're always told Apple's hardware doesn't have the grunt to run a Ghostrunner or Elden Ring at 4K 60fps. That's been mostly true up to this point, but the Mac M1 Max/M1 Ultra probably could run those games and be performant doing so IF Metal weren't such a pain in the ass for game developers.

it's just very, very time consuming, which means it costs a lot.
Apple spent all kinds of time, money, and energy to facilitate the development and marketing/promotion of all the low-level API tools in Swift used to build and run all the waify mobile games that are on the iPhone now, without a stitch of information in advance that mobile game development would flourish on iOS. Apple could and should do the same thing with desktop hardware and games.
 
Nvidia is using the same technology in servers for AI development as they are for game GPUs.
Same technology as in silicon, sure. Architecture, not so much; just compare their gaming cards with their Quadro/server cards and you'll see a massive difference in performance between these two fields.
Releasing servers for general-purpose use, with the enterprise support required to really make that work, is prohibitive for Apple, but servers for predictive simulation and AI development are feasible.
And that worked so well for Apple in the past... to the point that they discontinued their server line. You need a full lineup, something that Dell, Lenovo and HP offer. Apple really isn't interested in this, because they won't be able to offer the services.
It's clear that Nvidia was afraid Apple would continue plowing forward at a good clip with their SOC development, which is why Nvidia moved to acquire ARM...just so they could block Apple if/when Apple Silicon SOCs got too competitive with forthcoming Nvidia SOC/APUs.
This has been discussed up and down on this forum, but once more... even if Nvidia would own ARM, there would be no way to block Apple from developing ARM chips.
The new Grace SOC/APU is beefed up, but architecturally very similar to the M1 Ultra without Apple's relative power saving advantages.
Grace is similar to what Nvidia has been doing for years. The use case for Grace is improved I/O for the GPU, nothing else.
Not really,
Yes really.
This surely wouldn't have been possible if Apple had been so far behind back in the day? (And yes, I worked on parts of that; it's also what Jobs used for benchmarks on stage.)
https://arxiv.org/pdf/astro-ph/0611400.pdf
https://lweb.cfa.harvard.edu/~agoodman/Presentations/ANDALUSIA_01_2010/andalusia_iic_10.pdf
The main difference was that on the PC side they always had big data sets from photorealistic games to crunch, forcing them to keep advancing the hardware, and they never stooped to the pettiness of separating, by "social importance", one big data set from another big data set. In truth, you're not going to sell a big-data-set cruncher to the customer that will make you feel more socially important without proving your hardware can crunch actual big data sets from somewhere.
I have no idea what you're saying, maybe because you still haven't answered a simple question about parallel programming paradigms. Any data set on the PC side has also been available to developers on the Mac side.
That's nice and all, but with the vast majority of the money coming from 10% of the users and Epic poking holes in the hull of that little boat, it's already taking on water. The clock is ticking until Apple has to give up all that apps and services money, so it's the perfect time to start building that bigger, sturdier games boat.
Then they will change the license model, just like other manufacturers, and Apple had a different model in the past as well. I really don't see Microsoft providing all those tools (again, maybe you could actually specify what exactly you need) for developers out there, and yet somehow developers get their stuff running on Windows on Intel/Nvidia/AMD graphics.
The profits aren't where you'd expect; you've probably been looking in the wrong places, at the individual game publisher rather than at how Apple would benefit.
So looking at actual numbers from AAA game development studios is the wrong place to look when trying to make money with games?
It's not all that different from what Apple does with Final Cut, Motion, Logic X, etc. Virtually every other non-platform-holding, software-only developer in these spaces has moved almost exclusively to subscription services, and Apple doesn't have to, because the software moves a lot of hardware.
"A lot" of hardware? For that lot of hardware, they sure have a very small market share. And the gaming market is even smaller. And no, people in the professional world, be it film/music studios, dub stages, etc. do not play games on their Macs, they do actually work. Youtubers? Sure, but that's even a smaller target market.
That's been mostly true up to this point, but the Mac M1 Max/M1 Ultra probably could run those games and be performant doing so IF Metal weren't such a pain in the ass for game developers.
So maybe we're getting somewhere here... why exactly is it that Metal is such a pain in the ass for developers? Be technical here; let us know what parts of the API are problematic and what parts are not. I hope this isn't another one of your statements we'll never get an answer for. Please, no marketing talk.
Apple spent all kinds of time, money, and energy to facilitate the development and marketing/promotion of all the low-level API tools in Swift used to build and run all the waify mobile games that are on the iPhone now, without a stitch of information in advance that mobile game development would flourish on iOS. Apple could and should do the same thing with desktop hardware and games.
What low level API tools in Swift?

As for mobile games, if I were just into making money, then the mobile iOS (and Android) market is precisely where I'd go. I'd release a new game every week or two, do some fancy in-game purchases and have a good cash flow. If a game doesn't do so well, then I've wasted little time and money. The other option would be to put down $100M+ first, do years of development, only to find people don't like the game and make a loss on it.
 
Last edited: