Today the dies are small enough and the GPUs are efficient enough that you can put a full-fat chip into a laptop and run it at reduced clocks. That means the GTX 1080 in a laptop is a full desktop GTX 1080. It isn't clocked as high, but it's within 20-30% of the desktop part, which is amazing.
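That "within 20-30%" claim is easy to sanity-check with a back-of-envelope throughput estimate that scales with sustained clock speed. The clock figures below are illustrative assumptions (not official sustained-boost specs), but both desktop and laptop GTX 1080 variants do use the full GP104 die with the same core count:

```python
# Rough FP32 throughput comparison, desktop vs. laptop GTX 1080.
# Clock values are assumed sustained boosts for illustration only.
DESKTOP_BOOST_MHZ = 1733   # assumed desktop sustained boost
LAPTOP_BOOST_MHZ = 1400    # assumed laptop boost under a tighter power limit
CUDA_CORES = 2560          # full GP104 die in both variants

def fp32_tflops(clock_mhz: float, cores: int = CUDA_CORES) -> float:
    """Approximate FP32 TFLOPS: cores * 2 ops/cycle (FMA) * clock."""
    return cores * 2 * clock_mhz * 1e6 / 1e12

desktop = fp32_tflops(DESKTOP_BOOST_MHZ)
laptop = fp32_tflops(LAPTOP_BOOST_MHZ)
deficit_pct = (desktop - laptop) / desktop * 100

print(f"desktop: {desktop:.2f} TFLOPS, laptop: {laptop:.2f} TFLOPS")
print(f"laptop deficit: {deficit_pct:.0f}%")  # about 19% with these clocks
```

With these assumed clocks the laptop part lands roughly 19% behind, consistent with the "within 20-30%" figure; the real gap depends on the chassis's sustained power and thermal limits.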

Apple has missed the boat on this revolution completely in the iMac.

They are crap; the heat generated will burn out internal components or even crack the boards. This has been a very well-known issue in the laptop industry for over two decades.

Still, I've been using various Apple laptop models for gaming over the years, and while they sure get hot and the fans spin like crazy, nothing has broken down on them.

Apple has suffered many issues with defective graphics chips in laptops causing expensive recalls and repair programs. They might be *really* wary of suffering the same hits again with the new laptop 1080 graphics chips.
 

True that. That doesn't mean every unit in an affected generation had problems. Many of the 2010 MacBook Pro computers had issues because of manufacturing problems on NVIDIA's side, but that was also the case for brands other than Apple. And it didn't have anything to do with overheating (if I remember correctly).
 
Hopefully it works on 10.9. I'm sure there are some updated graphics drivers and whatnot that it takes advantage of, but I've been able to play other Mac games on older OS versions before.
It's very doubtful that this game will run on 10.9.
 
Process improvements, first to 28nm and now to 14/16nm, have brought about an enormous change in the efficiency of GPUs. We saw the same thing with Ivy Bridge in CPUs: boosting frequency on demand, combined with a process shrink to 22nm, was a major improvement in the capability of mobile CPUs.

AMD underwent those same process improvements; this is not an nVidia advantage. Nor are process improvements a magic wand that makes the latest full-power desktop GPUs practical in laptops. Of course desktops are always going to use 500W+ power supplies and push things to the limit, while notebooks have finite power. GTX 1080 laptops are enormous and noisy, and require massive external power supplies. Then there is the issue of inflated prices and fewer compute cores than AMD.

Maybe when nVidia releases a low-power chip we can rehash this debate, but right now it's about the same as arguing the Nintendo Switch should have a 1080 Ti instead of a Tegra.
 
The Switch should have used a Tegra 2...
Apple uses AMD because the compute on AMD is "better" than Nvidia's. Or at least that's what they tell us...
 
Nvidia's advantage is that they are producing better chips than AMD on those processes. It's probably a smarter architecture, or maybe a better fab partner (TSMC rather than GloFo or Samsung), or maybe they did a better job taping out the design. We don't know exactly why, but Nvidia's designs are crushing AMD's once you take power and heat into account, which matters for Apple because they like to make things small.

The fact that the GTX 1080 fits in laptops of pretty much any size means it can fit in the iMac, which has a higher thermal envelope than even 17-inch Alienware monsters. The iMac has both the power and the thermal headroom to use Nvidia's high-end GPUs, but doesn't. That's dumb.
 