Nvidia is using the same technology in its servers for AI development as in its gaming GPUs.
Same silicon technology, sure. Same architecture, not so much: just compare their gaming cards with their Quadro/server cards and you'll see a massive difference in performance between the two fields. The gaming parts cut double-precision (FP64) throughput way down and drop ECC memory; the data-center parts keep both.
Releasing general-purpose servers, along with the enterprise support required to really make that work, makes it prohibitive for Apple to get in there, but servers for predictive simulation and AI development are feasible.
And that worked so well for Apple in the past... to the point that they discontinued their server line. You need a full lineup, something that Dell, Lenovo, and HP offer. Apple really isn't interested in this, because they wouldn't be able to offer the services.
It's clear that Nvidia was afraid Apple would keep plowing forward at a good clip with its SoC development, which is why Nvidia moved to acquire ARM... just so it could block Apple if/when Apple Silicon SoCs got too competitive with forthcoming Nvidia SoCs/APUs.
This has been discussed up and down on this forum, but once more: even if Nvidia owned ARM, there would be no way to block Apple from developing ARM chips. Apple holds an architectural license and designs its own cores from scratch; it only licenses the instruction set, so a new owner couldn't simply pull the rug out.
The new Grace SoC/APU is beefed up, but architecturally it's very similar to the M1 Ultra, without Apple's relative power-saving advantages.
Grace is in line with what Nvidia has been doing for years. The use case for Grace is improved I/O for the GPU, nothing else: the CPU sits on a coherent, high-bandwidth NVLink-C2C link whose main job is to keep the GPU fed.
Yes, really.
This surely wouldn't have been possible if Apple had been so far behind back in the day? (And yes, I worked on parts of that; it's also what Jobs used for benchmarks on stage.)
https://arxiv.org/pdf/astro-ph/0611400.pdf
https://lweb.cfa.harvard.edu/~agoodman/Presentations/ANDALUSIA_01_2010/andalusia_iic_10.pdf
The main difference was that on the PC side they always had big datasets from photorealistic games to crunch, which forced them to keep advancing the hardware, and they never stooped to the pettiness of ranking one big dataset above another by "social importance". In truth, you're not going to sell a big-data cruncher to the customer who makes you feel more socially important without proving your hardware can crunch actual big datasets from somewhere.
I have no idea what you're saying, maybe because you still haven't answered a simple question about parallel programming paradigms. Any dataset on the PC side has also been available to developers on the Mac side.
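For what it's worth, parallel paradigms have been right there on the Mac for ages. A minimal data-parallel sketch using Grand Central Dispatch, which has shipped on every Mac since Snow Leopard (the squaring workload is purely illustrative):

```swift
import Foundation

let input = (0..<1_000_000).map(Double.init)
var output = [Double](repeating: 0, count: input.count)

output.withUnsafeMutableBufferPointer { buffer in
    let out = buffer.baseAddress!
    // Each index is written exactly once, so the concurrent writes are disjoint.
    // Real code would chunk the iterations so each unit of work amortizes the
    // dispatch overhead instead of doing one multiply per dispatch.
    DispatchQueue.concurrentPerform(iterations: input.count) { i in
        out[i] = input[i] * input[i]
    }
}

print(output[3])  // 9.0
```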
That's nice and all, but with the vast majority of the money coming from 10% of the users, and Epic poking holes in the hull, that little boat is already taking on water. The clock is ticking until Apple has to give all that apps-and-services money up, so it's the perfect time to start building a bigger, sturdier games boat.
Then they'll change the license model, just like other manufacturers; Apple had a different model in the past as well. I really don't see Microsoft providing all those tools (again, maybe you could specify what exactly you need) for developers out there, and yet somehow developers get their stuff running on Windows for Intel/Nvidia/AMD graphics.
The profits aren't where you'd expect; you've probably been looking in the wrong places. Look at how Apple makes money from games, not at the individual game publisher.
So looking at actual numbers from AAA game development studios is the wrong place to look when trying to make money with games?
It's not all that different from what Apple does with Final Cut, Motion, Logic Pro, etc. Virtually every other non-platform-holding, software-only developer in these spaces has moved almost exclusively to subscriptions, and Apple doesn't have to because the software moves a lot of hardware.
"A lot" of hardware? For that lot of hardware, they sure have a very small market share. And the gaming market is even smaller. And no, people in the professional world, be it film/music studios, dub stages, etc. do not play games on their Macs, they do actually work. Youtubers? Sure, but that's even a smaller target market.
That's been mostly true up to this point, but the M1 Max/M1 Ultra Macs probably could run those games, and be performant doing so, IF Metal weren't such a pain in the ass for game developers.
So maybe we're getting somewhere here... why exactly is Metal such a pain in the ass for developers? Be technical here: let us know what parts of the API are problematic and what parts are not. I hope this isn't another one of your statements we'll never get an answer for. Please, no marketing talk.
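To give that question something concrete to bite on, here's roughly what a minimal Metal compute dispatch looks like in Swift. This is a sketch only: the "doubleValues" kernel name is made up, it's assumed to be compiled into the app's default Metal library, and error handling is elided. Point at which steps in this flow are the painful ones:

```swift
import Metal

// Assumes a "doubleValues" kernel in the app's default Metal library.
let device = MTLCreateSystemDefaultDevice()!
let pipeline = try! device.makeComputePipelineState(
    function: device.makeDefaultLibrary()!.makeFunction(name: "doubleValues")!)

let input: [Float] = [1, 2, 3, 4]
let buffer = device.makeBuffer(bytes: input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!  // unified memory on Apple silicon

let queue = device.makeCommandQueue()!
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)

// One GPU thread per element; a real dispatch would size threadgroups
// against pipeline.maxTotalThreadsPerThreadgroup.
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: input.count, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```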
Apple spent all kinds of time, money, and energy facilitating the development, marketing, and promotion of low-level API tooling in Swift, so all the lightweight mobile games now on the iPhone could be built and run, without a stitch of advance information that mobile game development would flourish on iOS. Apple could and should do the same thing with desktop hardware and games.
What low-level API tools in Swift?
As for mobile games, if I were just in it for the money, the mobile iOS (and Android) market is precisely where I'd go. I'd release a new game every week or two, add some fancy in-game purchases, and have a good cash flow. If a game doesn't do well, I've only wasted a little time and money. The other option would be to put down $100M+ up front and do years of development, only to find people don't like the game and take a loss on it.