artificially increase demand for both 40xx and 50xx GPUs and drive prices upward.
This is my take: just as prices were decreasing for the 40 series, Nvidia halted production. Every day, they seem to be less and less consumer-focused.
 
The thing is, performance-wise the 40 series and 50 series were not so far apart if you didn’t count the DLSS implementation. A lot of cheap 40-series cards could have proved quite popular with upgraders from earlier generations.
 
The thing is, performance-wise the 40 series and 50 series were not so far apart if you didn’t count the DLSS implementation. A lot of cheap 40-series cards could have proved quite popular with upgraders from earlier generations.

To be fair, nobody should count DLSS or frame generation as measures of true performance. The same goes for both AMD and Intel GPUs. In Nvidia's case, the price premiums for 50xx cards over 40xx are far greater than the performance improvement between generations. Nvidia might pretend to care about the consumer market, but they're so heavily focused on AI and raw margins that they honestly couldn't care less about the market that helped them jump into AI in the first place.
 
  • Like
Reactions: Bodhitree
What Apple does best above anything else is marketing. They make people feel like they need to upgrade when their device is only a year old. There are plenty of posts from people who upgrade every single year, and it certainly makes no sense financially for 99% of users to do this.

I use Adobe CC; InDesign, Illustrator and Photoshop are what pays the bills. It would be a complete waste of money to upgrade to anything newer from my M1 Max, and I would say the same even for an M1 Pro variant. We have hit a ceiling for pretty much everyone, where speed and performance are great for everything the vast majority of users do. People will convince themselves that they need that extra boost coming out of newer chips, but I would suggest that is just the marketing machine at work doing what it does best. This is not to say that the newer chips are not getting better and faster, but they really make very little difference in someone's daily life on a computer.

I went from an iPhone 13 to a 17 Pro Max and feel no substantial improvements outside of the camera and the screen size. YouTube videos, websites, and apps all loaded fast on my 13. I know the iPhone 17 Pro Max is technically much faster, but we were so fast before that those regular day-to-day things are not really improved anymore.
 
Apple has a significant advantage over Intel in this regard because all of the process/node development and retooling is handled by TSMC rather than by anyone in Cupertino. Look at how the M-series has moved from 5nm to the 3rd iteration of TSMC's 3nm process in five years while Intel is stuck in a rut from a node perspective. Intel has so many issues with their own fabs and process nodes that they're actively looking to use TSMC and possibly even Samsung to build CPUs.
No, TSMC works with Apple and others on process development; they couldn't do that without designs... but sure, the retooling happens at TSMC for the most part.
Now, Intel used to be the master of it all, but those days are long gone. A lot of that has to do with everything being homegrown, but they seem to be on the rise again now that they're starting to use EUV. Their main problem, though, is a lack of (foundry) customers; Intel chips alone aren't enough to keep the fabs running.
 
This is my take: just as prices were decreasing for the 40 series, Nvidia halted production. Every day, they seem to be less and less consumer-focused.
Well, where are their profits coming from nowadays? Data center, with OpenAI driving it. I think the "AI bubble" will burst sooner or later, and Nvidia will be one of the losers because they took their eyes off consumers... but only time will tell.
 
Well, where are their profits coming from nowadays? Data center, with OpenAI driving it
The amount of money Nvidia is making in the data center business unit is staggering; only 11% of their revenue comes from gaming GPUs, so them cutting production to squeeze the consumer is all the more galling.
 
  • Like
Reactions: jz0309
Nvidia stopped production of the 40xx series in advance of the 50-series announcement. Many people have speculated that Nvidia did that for two reasons:

a) to avoid having to discount 40xx GPUs like they had to with the 20xx and 30xx when the next generation was released, and
b) to artificially increase demand for both 40xx and 50xx GPUs and drive prices upward.
Nvidia's 40xx and 50xx GPUs are both made using the same process, with similar die sizes. It doesn't make much sense to continue producing old chips if you can produce a similar number of new chips for the same price.
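To make the opportunity-cost point concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it (wafer cost, dies per wafer, selling prices) is a made-up placeholder for illustration only, not an actual Nvidia figure; the point is just that with the same process and similar die sizes, every old die produced is a new die not produced.

```python
# Back-of-the-envelope illustration (all numbers are hypothetical placeholders).
# Same process + similar die size => roughly the same cost per die either way.
wafer_cost = 15_000       # hypothetical cost per wafer, in dollars
good_dies_per_wafer = 60  # hypothetical yielded dies per wafer (old or new chip)

cost_per_die = wafer_cost / good_dies_per_wafer  # identical for both generations

asp_old = 500  # hypothetical average selling price of an older-generation chip
asp_new = 700  # hypothetical average selling price of a newer-generation chip

print(f"cost per die:         ${cost_per_die:,.0f}")
print(f"margin on an old die: ${asp_old - cost_per_die:,.0f}")
print(f"margin on a new die:  ${asp_new - cost_per_die:,.0f}")
# Every wafer spent on old chips gives up the higher margin of the new ones,
# so continuing the old part only makes sense if it commands a similar price.
```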

And, having recently replaced my gaming PC, I found GPU prices surprisingly reasonable. Desktops with RTX 5080 and laptops with RTX 5090 are roughly the same price as Macs with the full M4 Max. The desktop 5090 is still expensive, but everything below that is acceptable. I guess consumer GPUs with limited memory no longer have that many alternate uses to drive the prices up.
 
The amount of money Nvidia is making in the data center business unit is staggering; only 11% of their revenue comes from gaming GPUs, so them cutting production to squeeze the consumer is all the more galling.

Nvidia quit caring about consumers several years ago, and breakdowns like this show why they stopped caring. What would interest me more would be a chart showing the YoY change for Graphics and Compute + Networking separately rather than just the combined revenue.
 
  • Like
Reactions: throAU
Apple devices up for a refresh

  • Aug 2020 iMac 27" 5K Intel

  • Oct 2022 Apple TV 4K

  • Jun 2023 Mac Pro M2 Ultra

  • Oct 2024 iMac 24" 4.5K M4
  • Oct 2024 Mac mini M4
  • Oct 2024 iPad Mini A17

  • Feb 2025 iPhone 16e A18
  • Mar 2025 iPad Air M3
  • Mar 2025 iPad A16
  • Mar 2025 MBA 13" M4
  • Mar 2025 MBA 15" M4
  • Mar 2025 Mac Studio M4 Max
 
Well, where are their profits coming from nowadays? Data center, with OpenAI driving it. I think the "AI bubble" will burst sooner or later, and Nvidia will be one of the losers because they took their eyes off consumers... but only time will tell.

While the percentage of overall revenue went down, the Graphics revenue itself has held up (as snapped from Perplexity):

"
...
Year | Gaming Revenue (USD) | Year-over-Year Change | Notes
2025 | $11.35 billion | +9% | Growth despite supply constraints due to prioritization of AI chips
2024 | $10.45 billion | -16% | Consumer GPU sales slowed as AI GPU demand surged
2023 | $9.07 billion | -27% | Decline from the crypto mining downturn and limited GPU update cycle
2022 | $12.46 billion | +61% | Boost from RTX 30-series demand during global GPU shortage
..."

[ Yeah... the percentage math there isn't all that great... 2024 is roughly a +15% gain over 2023, not a -16% decrease (the -16% looks like 2024 measured against 2022). The revenues are likely lifted directly from the reports, though. Mainly looking for an easy table. ]

The average is $10.8B, so the low there is only off by $0.9B and the high by $1.66B; it is all within a +/- 10% boundary. (2022 is mainly an outlier.)
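For what it's worth, here is a minimal Python sketch that recomputes the year-over-year changes and the average from the four gaming-revenue figures quoted above. It assumes the dollar amounts are correct as quoted and uses no other data; it confirms the 2024 change is roughly +15%, not -16%.

```python
# Recompute YoY change and average from the quoted gaming-revenue figures ($B).
revenue = {2022: 12.46, 2023: 9.07, 2024: 10.45, 2025: 11.35}

for year in sorted(revenue)[1:]:
    prev = revenue[year - 1]
    change = (revenue[year] - prev) / prev * 100
    print(f"{year}: ${revenue[year]:.2f}B ({change:+.0f}% YoY)")
# Prints: 2023 -27%, 2024 +15%, 2025 +9%

average = sum(revenue.values()) / len(revenue)
print(f"average over the four years: ${average:.2f}B")  # about $10.8B
```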

2019 levels were $6.25B and 2017 was $4B. For customers being "soo pissed off", they sure are buying a large quantity of stuff. If AMD were eating away at Nvidia's consumer GPU market share at the same pace they are eating away at Intel's, there might be some substance there. However, they are not. AMD is mainly fighting the 'war' at the much higher margin product levels (and at iGPUs... which is as much taking GPU business away from Intel as from Nvidia).
A major problem for AMD (and other competitors) is that there is a decent amount of synergy between the AI data center parts and the very top end gaming cards.

AMD is stretched so thin trying to compete against both Intel and Nvidia at the same time that they have to choose where to do 'battle'. This RDNA4 generation, Nvidia got a free pass on the top-end gaming GPU card; RDNA1 had mostly the same issue.

Before the AI bubble, Nvidia was living off the 'crypto craze' bubble on off-the-shelf GPUs (a similar issue, where that bubble bled down into the top-end gaming cards).

IF Nvidia takes Intel iGPU share away faster than AMD can displace it, then Nvidia will be in decent shape in the gaming business even with an AI bubble collapse (as long as they don't get caught up in some circular AI-bubble payments Ponzi scheme). Some Nvidia stockholders may get fleeced (and some folks will get laid off), but the company itself wouldn't have huge problems.

Nvidia would need to lower their pricing slightly, but it is still mainly just a duopoly in which they have the overwhelmingly dominant share (and if Intel's iGPU business completely craters, it becomes even more of a duopoly on Windows PCs).
 
I remember hearing at the time of the M1’s release that one of the reasons Apple ditched Intel chips was that they weren’t supplying frequent-enough product updates. It’s interesting because, looking at it a few years later, Apple has managed to establish a yearly or nearly-so update schedule. But has it really helped them in sales?
Amazing how deadlines force us to get a lot of work done. Having multiple deadlines (whether 10, 12, or x months apart) is a good project management way to crack the whip and keep the teams moving forward.
 
Nvidia quit caring about consumers several years ago, and breakdowns like this show why they stopped caring. What would interest me more would be a chart showing the YoY change for Graphics and Compute + Networking separately rather than just the combined revenue.
They still care the same amount for consumers. It is just that their care for data centers dwarfs their care for consumers.

We know they still care about consumers because AMD is getting pummeled in gaming GPUs still.
 
Boy, I'll tell you — contra some others, raw performance and AI inference matters to me. I made the mistake of getting into video processing for upscaling as a hobby, as well as local LLM and AI work (lawyers have ethical prohibitions or limitations on putting client materials into the cloud). Seeing that the base M5 chip now outstrips all Apple Silicon save for M4 Max makes me very excited to see what M5 Max or Ultra might be able to do. And even being just a few months into my M4 Max Mac Studio purchase, I just might pull the trigger on an M5 Max MacBook Pro this spring. The future sure looks bright for Apple Silicon.

But dammit, Tim Apple, put a 5G modem into your MacBook Pro. That's way overdue.
 
I think we’ll still see yearly updates in silicon for a few years… or almost yearly.

We’ll see Apple completing the M5 lineup in the first half of 2026. Previously I thought the M4 would remain the widely adopted baseline M chip, but now, with the huge improvements the M5 brings to AI features, I really expect Apple to update all their devices to at least the M5 during the next year. I didn’t expect it, but the M5 is more than just an evolutionary step up, at least where the GPU and especially the Neural Engine are concerned.

Once all the devices are updated to the M5 (if Apple leaves the iMac out, the only reason I see for that is how much hotter the M5 runs), the M6 update will come gradually to the best-selling devices at the end of 2026. That is, mainly, the MacBook Pro, especially with the new design, which will raise its price again. Let’s not forget that the M6 will be the first SoC built on TSMC’s 2nm process; it will be more expensive and initially available in lower volumes.

So, at least for the next two years (until October 2027), I foresee desktops staying on the powerful (especially GPU-wise) M5, a chip that’s ready for more intensive AI tasks, while some premium products such as the MacBook Pro or the iPad Pro adopt the new M6 SoC.
 
  • Like
Reactions: Admiral
They still care the same amount for consumers. It is just that their care for data centers dwarfs their care for consumers.

We know they still care about consumers because AMD is getting pummeled in gaming GPUs still.

AMD chose not to pursue the high end of the GPU market, so of course Nvidia is winning on the high end. If Nvidia cared about consumers, they wouldn't have ended 40xx production early to drive up demand and pricing for those chips, and then slow-rolled 50xx production to maintain that artificially inflated demand. Nvidia has also let its AIB partners charge whatever they want for their graphics cards, which is why so many 70/80/90 cards are selling for well over MSRP. Nvidia also still thinks 8GB GPUs are sufficient for modern games, even though there are games out there that will not run at all on 8GB of VRAM.
 