The problem isn't hardware. It's a cultural problem that Apple has, one that is holding them back from gaming and will continue to hold them back.
The hardware isn't the blocker. It's not great, but it's not bad, either. The minimum-spec Apple gaming machine is an M1 MacBook Air, which is, honestly, a capable machine: it's at least as powerful as a Nintendo Switch, and the Switch has a ton of games, so theoretically the M1 MacBook Air could be a good gaming machine too.
But the Switch has something besides hardware, too. It has a company behind it that values gaming. And Apple does not.
AMD and NVIDIA push out driver updates at a monthly pace, give or take. Sometimes a AAA game engine does something the GPU wasn't expecting, and it causes a crash. At that point, the studio works with the GPU manufacturer and the OS vendor to properly identify and resolve the issue. A driver fix is issued, or sometimes even an OS-level fix, and game development continues. When the game is released, the publisher recommends a specific minimum driver version.
But watch what happens when you're a AAA game studio developing for the Mac: you find a driver bug. Full stop. There's a high likelihood you don't even bother letting Apple know, because Apple is notorious for ignoring bug reports. But let's say you do. There's a very real possibility that Apple does nothing. You may never even know whether Apple fixed the bug. So you spend development cycles trying to work around it.
Eventually, you release your AAA game. It's great.
Four years later, Apple switches architectures, and the game no longer runs. You can a) spend money putting a team together to fix the game and update it for free, or b) update it for a fee, but let's be real: the amount of money you'll make off a half-decade-old Call of Duty or the like is minimal. So you take option c) do nothing, and eventually the game dies a slow death as new hardware ceases to run it.
And the reality is that this is the best-case scenario. Prior to Metal, Apple used OpenGL, which was stuck at version 4.1 for 9 years. Over that same stretch, Microsoft released (and shipped Windows with) DirectX 11.1, 11.2, 11.3, and 12, adding API features such as ray tracing, variable rate shading (VRS), refresh-rate switching, and low-level hardware access (similar to Metal), among other things.
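You can see that ceiling for yourself. Here's a minimal sketch, assuming GLFW is installed (e.g. via `brew install glfw`), that asks macOS for a core-profile OpenGL context and prints what it actually hands back; requesting anything above 4.1 simply fails, because that's where the implementation was frozen:

```c
/*
 * Sketch: query the newest core-profile OpenGL version macOS will provide.
 * Assumes GLFW is installed. Build: cc glversion.c -lglfw -framework OpenGL
 */
#define GLFW_INCLUDE_GLCOREARB  /* pulls in <OpenGL/gl3.h> on macOS */
#include <GLFW/glfw3.h>
#include <stdio.h>

int main(void) {
    if (!glfwInit()) return 1;

    /* macOS only exposes modern GL through a forward-compatible core
     * profile. Request 4.1; asking for 4.2 or higher fails outright. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);  /* context only, no window */

    GLFWwindow *win = glfwCreateWindow(64, 64, "gl-version", NULL, NULL);
    if (!win) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(win);
    printf("OpenGL: %s\n", (const char *)glGetString(GL_VERSION));
    printf("GLSL:   %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

Run the equivalent query on Windows against a current AMD or NVIDIA driver and you'll see OpenGL 4.6; on a Mac, the first number printed never moved past 4.1.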
> Even Apple not giving the APIs enough attention and ignoring game developers wasn't the real issue.
You're right: it's nowhere near close. But it's a pretty damn good barometer for how Apple feels about gaming. It's one thing to get a PR person to say some nice fluff; it's another thing entirely to put your money where your mouth is, and Apple hasn't. All Apple has done, so far, is more of the same. It's telling that Blizzard, probably the most Mac-centric game studio of all time, has dropped the Mac from its supported platforms.