You're kidding, right? Yeah, Apple could do more to encourage game developers to support the platform but the hardware itself is definitely not the problem.
Unfortunately they are right. I believe the best way to describe it is: "Barely adequate".
Not for gaming, mind you; it's completely inadequate even for casual games these days (Trine, Hearthstone). It gets by for rendering the desktop environment and a few GPGPU tasks.
The very fastest GPU that Apple is shipping today is the high-end SKU on the 2017 27-inch iMac, the Radeon Pro 580. It's around the same speed as an Nvidia GTX 970 from September 2014.
Unfortunately, Apple is all aboard the AMD train for their future GPUs. We've seen Polaris and now we've seen Vega... Nvidia is crushing them. I don't know what Nvidia did to get Apple to dump them, but it's making me sad.
Correct me if I'm wrong, but aren't "mobile GPUs" just desktop silicon, but modified for lower power consumption? It's largely the same architecture and the exact same capabilities. I just worry that people discussing this topic use "mobile" as some pejorative without actually explaining what exactly is wrong with the GPU.
I even see people complaining that MacBooks should have Nvidia GTX 1080s in them, which is confounding to hear. This would of course mean using multiple USB-C chargers just to power and charge the MacBook.
It seems that the majority of comments on the Internet about PC hardware are by people who "know enough to be dangerous". They know little more than how to plug in a PCIe card and the model number of the latest Nvidia card. There's never any mention of how Apple could get the cooling to work or maintain battery life and convenient charging.
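To see why the charging point matters, here's a back-of-the-envelope power budget. The TDP figures are approximate, circa-2017 numbers (180W for a desktop GTX 1080, 45W for a quad-core mobile CPU) and the 20W "everything else" figure is a rough guess on my part, purely for illustration:

```python
# Rough power budget for a hypothetical "GTX 1080 MacBook".
# USB Power Delivery tops out at 100 W (20 V x 5 A), so anything
# above that needs more than one charger just to break even.

USB_C_PD_MAX_W = 100   # USB-PD maximum per port/charger
GPU_TDP_W = 180        # desktop GTX 1080 reference TDP
CPU_TDP_W = 45         # typical quad-core mobile CPU
OTHER_W = 20           # display, SSD, fans, conversion losses (guess)

def chargers_needed(total_draw_w, charger_w=USB_C_PD_MAX_W):
    """Minimum chargers to cover the draw with no battery assist."""
    return -(-total_draw_w // charger_w)  # ceiling division

total = GPU_TDP_W + CPU_TDP_W + OTHER_W
print(total, chargers_needed(total))  # 245 W -> 3 chargers
```

So even before the GPU boosts or the battery charges, the draw overshoots a single USB-C charger by more than 2x.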
Historically, mobile GPUs have been custom designs with fewer functional units, lower clocks, and narrower memory interfaces: a smaller, less capable chip built to hit a power and thermal target. This has been problematic for laptops, as well as the iMac, for a long time.
Process improvements, first to 28nm and now to 14/16nm, have brought an enormous improvement in GPU efficiency. We saw the same thing with Ivy Bridge in CPUs: on-demand frequency boosting combined with the process shrink to 22nm was a major step forward for mobile CPUs.
Today the dies are small enough and the GPUs are efficient enough that you can put a full-fat chip into a laptop and run it at reduced clocks. That means the GTX 1080 in a laptop is a full desktop GTX 1080. It isn't clocked as high, but it's within 20-30% of the desktop part, which is amazing.
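The arithmetic behind this is simple. With the same die and the same shader count, performance scales roughly with clock, while dynamic power scales roughly with voltage squared times frequency, so a modest clock cut buys a disproportionate power saving. The clocks and voltage ratio below are illustrative (in the ballpark of the ~1733 MHz desktop and ~1550 MHz notebook GTX 1080 boost clocks), not measured figures:

```python
# Toy model: same chip, same shader count, just lower clocks/voltage.

def relative_perf(mobile_clock_mhz, desktop_clock_mhz):
    """Performance scales roughly linearly with clock on the same die."""
    return mobile_clock_mhz / desktop_clock_mhz

def dynamic_power_ratio(f_ratio, v_ratio):
    """Dynamic power ~ C * V^2 * f, so the ratio is v_ratio^2 * f_ratio."""
    return v_ratio ** 2 * f_ratio

gap = 1.0 - relative_perf(1550, 1733)
print(f"~{gap:.0%} behind desktop from clocks alone")  # ~11%

# e.g. 10% lower clock at 15% lower voltage:
print(f"{dynamic_power_ratio(0.90, 0.85):.2f}x the power")  # ~0.65x
```

Clocks alone only account for part of the 20-30% gap; sustained boost under a laptop's thermal limits makes up the rest. The power side is the real story: cutting clocks ~10% lets you drop voltage too, shaving roughly a third of the dynamic power in this toy model.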
Apple has completely missed the boat on this revolution in the iMac. With the GTX 10-series, for the first time, it became possible to build an all-in-one with no drawbacks. Skylake, Pascal, and SSDs mean a no-compromise iMac is possible: fast desktop CPU, fast desktop GPU, 32-64GB of RAM, 2TB of PCIe SSD, a great screen. It gives up nothing at all. They should have been ready with this product on day one; it's what the iMac has always wanted to be.
Instead, through some combination of incompetence, politics, and arrogance, we have a mediocre product instead of a great one. Such is life. The tech is there, on the shelf to buy.