In American slang, “unreal” means something like “amazing” (similar to “incredible”) — nobody is disputing the chart
Yeah that's what I meant there - unreal as in astounding. I work in tech (though not at NVIDIA, otherwise I might be looking at early retirement lol) so I am familiar with what was going on there :)
 
When you look at performance per watt, I think everyone is behind Apple. I could run my M1 Mini for about five hours off a 20,000 mAh Anker power bank that I have.
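Back-of-envelope, that runtime claim implies a very low average draw. A quick sketch of the math, assuming a 3.7 V nominal Li-ion cell voltage and ~90% converter efficiency (my assumptions, not stated above):

```python
# Rough sanity check: estimate the average power draw implied by
# ~5 hours of runtime from a 20,000 mAh power bank.
# Assumed (not from the post): 3.7 V nominal cell voltage,
# ~90% conversion efficiency at the bank's output.
CAPACITY_MAH = 20_000
CELL_VOLTAGE_V = 3.7
EFFICIENCY = 0.9
RUNTIME_H = 5

energy_wh = CAPACITY_MAH / 1000 * CELL_VOLTAGE_V * EFFICIENCY  # usable watt-hours
avg_power_w = energy_wh / RUNTIME_H                            # implied average draw

print(f"~{energy_wh:.0f} Wh usable, ~{avg_power_w:.0f} W average draw")
```

That lands around 13 W average for the whole system, which is roughly idle-to-light-load territory for an M1 Mini.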
 
When the M1 Max launched, Apple said that it rivaled the flagship NVIDIA GPU at the time, the RTX 3080.

However, in 2025, when you compare the M4 Max MacBook Pro to PC laptops equipped with NVIDIA's current flagship GPU, the RTX 5090 (at a similar price to an M4 Max MacBook Pro), the MacBook Pro gets destroyed; it is not even close.

And this is true even on battery power.

[Two benchmark chart attachments]

I have a Windows PC and two Macs on my desk. I run tasks on the systems for which they are optimized. I can't really run everything on macOS but I can live with that.
 
The consumer products don't matter (at least as far as revenue/profit is concerned). I actually think the internet has overblown some of the issues Nvidia has had (especially from the driver side).

The more Nvidia says publicly, the dumber they have looked with respect to the consumer side of their business. The 5060 review situation was proof that Nvidia wants a controlled, predefined narrative instead of legitimate reviews of their consumer products. That has actually created even more criticism from independent reviewers who wouldn't agree to Nvidia's conditions even if they came with a Brinks truck full of cash.
 
The more Nvidia says publicly, the dumber they have looked with respect to the consumer side of their business. The 5060 review situation was proof that Nvidia wants a controlled, predefined narrative instead of legitimate reviews of their consumer products. That has actually created even more criticism from independent reviewers who wouldn't agree to Nvidia's conditions even if they came with a Brinks truck full of cash.
That may be true, but folks still overwhelmingly choose these compromised products over another brand.
 
The more Nvidia says publicly, the dumber they have looked with respect to the consumer side of their business.
And yet they can barely keep their products on the shelves, or sell them at MSRP. I have no love for Nvidia, but they are the dominant force in GPUs; AMD is a distant second, and Intel is barely hanging on.
 
And yet they can barely keep their products on the shelves, or sell them at MSRP. I have no love for Nvidia, but they are the dominant force in GPUs; AMD is a distant second, and Intel is barely hanging on.

It's easy to keep the shelves empty when you cut production. Nvidia even cut 40xx series production well in advance of the 50 series launch, specifically to drive up demand artificially and avoid having to discount the prior generation the way they had to with the 30 series when the 40s were released. The above-MSRP pricing is largely due to scalpers and bots buying up inventory and reselling at greatly inflated prices, coupled with the AIBs purposely focusing on card variants they can sell above MSRP and largely ignoring the base-spec/MSRP market in the process.
 
It's easy to keep the shelves empty when you cut production. Nvidia even cut 40xx series production well in advance of the 50 series launch, specifically to drive up demand artificially and avoid having to discount the prior generation the way they had to with the 30 series when the 40s were released. The above-MSRP pricing is largely due to scalpers and bots buying up inventory and reselling at greatly inflated prices, coupled with the AIBs purposely focusing on card variants they can sell above MSRP and largely ignoring the base-spec/MSRP market in the process.

And here we now have the RTX PRO Blackwells being discounted (12% off a new car is still a new car *smile*) and highly available . . .

. . . is this the precursor to the release of something new (an RTX Pro Caudate Max-C, or some similarly silly nomenclature)?
 
The more Nvidia says publicly, the dumber they have looked with respect to the consumer side of their business.

The consumer side of their business is now the distracting sideline, just as the Mac has become Apple's distracting sideline.
 
For the products relevant to the market (mostly ultraportable laptops, all-in-ones, and small form factor desktops) I'd much rather have my M4 Max than a 5090.

Is it as fast as a 5090 at gaming?

No

Almost no one cares.


The fact that you're even comparing a 100-watt SoC with integrated graphics to a flagship discrete GPU shows how impressive the M series is.
 
The consumer side of their business is now the distracting sideline, just as the Mac has become Apple's distracting sideline.
Very wrong. Entities like Apple and Nvidia are not like the little greeting card business you run out of your spare bedroom. Apple's Mac business alone makes it the fourth-largest computer vendor in the world. A CEO with the competence of a Tim Cook is fully capable of managing multiple large profit centers, such that the fourth-largest computer manufacturing operation in the world is not a distracting sideline.
 
Eh, this doesn’t really work when Apple brought the comparisons themselves. Also they’re both GPU designers, of course they’ll be compared just as Apple’s CPUs are compared to Intel/AMD’s.




Again, eh. I doubt Apple is perfectly fine with where the Mac Pro is right now. When they can compete at the level of Nvidia’s highest end products they’ll shout about it.
I suspect that Apple is perfectly fine with where the Mac is right now. Apple is the world's fourth-largest personal computer vendor and is probably the most profitable one. The Mac Pro is just a subset; you may be right about it specifically, but it is by far the Mac that Apple sells the least of.
 
Such clickbait is just that, only clickbait. Excuse me if I do not waste bandwidth paying attention to such obvious clickbait nonsense.

One reason it can immediately be identified as obvious clickbait nonsense is because it is comparing Apples and oranges. Apple makes and sells devices while Nvidia makes and sells chips; both entities are among best-in-class at the different things that they do. Forcing some hardware comparison is just Apples/oranges clickbait.
 
Apples and oranges
I don't think so

Nvidia doesn't just make chips; they produce GPUs, both the IP and the actual graphics cards. These GPUs are capable of handling tasks like AI, crypto, and obviously gaming. Apple designs and sells computers with their own in-house GPU, so the comparison is fair. Apple themselves have compared Apple silicon GPU performance to Nvidia's; they thought it was an apples-to-apples comparison, so why should we be so quick to dismiss it?

Apple has tried to improve the Mac's ability to play games; at a prior WWDC event, they announced that Cyberpunk 2077 was being ported to Apple silicon and talked up how well it was performing. Again, this is very much in Nvidia's wheelhouse.

So for AI (which Apple is pushing) and gaming, I think it's fair to compare how Apple is doing vs. the dominant force in both areas.
 
I don't think so

Nvidia doesn't just make chips; they produce GPUs, both the IP and the actual graphics cards. These GPUs are capable of handling tasks like AI, crypto, and obviously gaming. Apple designs and sells computers with their own in-house GPU, so the comparison is fair. Apple themselves have compared Apple silicon GPU performance to Nvidia's; they thought it was an apples-to-apples comparison, so why should we be so quick to dismiss it?

Apple has tried to improve the Mac's ability to play games; at a prior WWDC event, they announced that Cyberpunk 2077 was being ported to Apple silicon and talked up how well it was performing. Again, this is very much in Nvidia's wheelhouse.

So for AI (which Apple is pushing) and gaming, I think it's fair to compare how Apple is doing vs. the dominant force in both areas.
They are kind of disparate systems, though. You can't get macOS on anything that runs modern Nvidia hardware, so there is no real way to compare apples to apples.
 
so there is no real way to compare apples to apples.
I think you're over-thinking it. The comparison is how many FPS game X gets on a 5090 vs. a given Mac. Does the 5090 process more tokens in AI, or does Apple silicon? Are other GPU-related tasks faster on the 5090 than on the Mac?

My M4 Max Studio is roughly equivalent to a 4070; basically, one of the higher-end Macs is as fast as an upper-mid-range GPU from the prior generation. The M4 Pro and base M4 will fare even worse in the comparison.
 
Nvidia doesn't just make chips; they produce GPUs, both the IP and the actual graphics cards.
... and workstations, servers, edge devices, low-power systems (Jetson), and more recently the Spark systems, which are more comparable to a Mac Studio. Just a different target market. And that's not even considering the software ecosystem Nvidia has in place.
 
I think you're over-thinking it. The comparison is how many FPS game X gets on a 5090 vs. a given Mac. Does the 5090 process more tokens in AI, or does Apple silicon? Are other GPU-related tasks faster on the 5090 than on the Mac?

My M4 Max Studio is roughly equivalent to a 4070; basically, one of the higher-end Macs is as fast as an upper-mid-range GPU from the prior generation. The M4 Pro and base M4 will fare even worse in the comparison.

To be honest, for many years - pretty much the whole decade of the 2000s - games ran at 60 fps or even 30 fps, and gamers didn't complain but took it in their stride. It's only recently that 150+ fps is seen as a pro, and it's a sign of the hardware outperforming the sweet spot of market size and spectacular looks. You just don't get much bang for your buck by going to more GPU-intensive techniques when you are already providing plenty of eye candy.
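For what it's worth, part of why chasing higher fps gets expensive is that the per-frame render budget shrinks hyperbolically. A quick sketch of the plain arithmetic (no benchmark data involved):

```python
# Per-frame render budget in milliseconds: budget_ms = 1000 / fps.
# Going from 30 to 60 fps halves the budget; 150 fps leaves under 7 ms.
for fps in (30, 60, 120, 150):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:5.1f} ms per frame")
```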
 
To be honest, for many years - pretty much the whole decade of the 2000s - games ran at 60 fps or even 30 fps, and gamers didn't complain but took it in their stride. It's only recently that 150+ fps is seen as a pro, and it's a sign of the hardware outperforming the sweet spot of market size and spectacular looks. You just don't get much bang for your buck by going to more GPU-intensive techniques when you are already providing plenty of eye candy.

My M4 Pro struggles to reach 60 fps on the few games I play.
 
To be honest, for many years - pretty much the whole decade of the 2000’s - games ran at 60 fps or even 30 fps,
Yes, 20 years ago there was an expectation of 60 FPS, with mid- and low-tier hardware struggling to achieve that threshold. 30 FPS was mostly the domain of consoles and Mac gaming; it's a threshold that is largely regarded as unplayable today.

This is 2025, with demanding games and heavier rendering features like ray tracing and path tracing; frames per second is an indicator of a GPU's ability to perform.

Honest question: do you think Apple's GPU is keeping pace with Nvidia's performance? Do you think Apple is leading, part of the pack, or falling behind the leader in GPU performance?
 