Well, this is getting ridiculous
> But it's not about buying the best; you are buying a certain level of performance, not a claim to having the best.

There will come a time when Apple's M chips approach their peak and stop delivering huge performance boosts annually, and I'd rather wait for that than buy the hype now and be left disappointed because Apple's chips keep getting "too fast" year after year. That has been the point of my argument from the beginning.
It's sad, not good that we have so little competition in high end PC GPUs. The NVIDIA pricing is well out of hand.
The fact that the 4090 isn't really surpassed doesn't mean new games pushing the limits run better. It doesn't give you more frames or stability; it just means you know you can't do better. I'd rather that people who want to spend for the best frame rate possible today can, but also that people can buy something affordable that still plays today's games well 12 months from now. That is to say, who cares if the 4090 is the best if it could only run (a fictional) The Witcher 4 at 15 fps? That would be damning, not reassuring.
I would absolutely prefer to live in a world where every year a $500 computer beats last year's $5000 computer. That sounds like paradise.
> Well, this is getting ridiculous

Well, one benchmark never tells the full story.
> Not always. Getting an RTX 4090 is a solid tech purchase. It's been the fastest GPU on the planet for 2 years now. You can play and render with very little compromise.
>
> Even if the 5090 is coming next year, it WILL BE a faster GPU. But I'm sure I don't want to feel gutted by the fact that a 5060 is so much cheaper and faster.
>
> Given the chance to buy the best tech, wouldn't you want it to stay the best for a little while longer? I would 🤷🏻♂️

Huh?
> Because it creates confusion as to when I should buy a Mac. Imagine spending $4000 on the most expensive, fully maxed-out dream-setup Mac, and just a year later Apple can squeeze the same performance into a baseline Mac Mini?

Imagine how much performance you could get if you waited ten years!
> Can you imagine if the iPhone were getting 3x faster each year? You'd always worry and wait for the next gen because you don't want to feel left out and 3x slower when a new iPhone launches.

No, I'd get an iPhone when I need it, and if I really needed the performance increase, trade it in for the faster one.
Huh?
The top tier GPU always gets knocked to mid tier by NVIDIA. This is how it has been for decades. The 5090 will be faster and the 4090 will instantly become mid tier.
Again, you’re looking silly.
> As a poster wrote above, iPhones in the early 2010s were exactly like Macs now. Huge performance gains, up to 2-3x faster with each iteration, and it was a bad time to just keep buying the newest, shiniest iPhone knowing you'd be left with half the performance next year. Now iPhone releases are boring but predictable updates, so you can hold off upgrading a bit longer. You get my point?

Apple's iPhone iterations feel lighter simply due to software maturity. The hardware in the iPhone isn't slowing down each year. But the gains are so far out ahead that it takes years before the average user hits a ceiling.
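The "left with half the performance next year" worry is just compounding arithmetic. A rough sketch with hypothetical rates (the 3x and 15% figures below are illustrative, not measured benchmarks):

```python
# How far a device falls behind the newest model under different
# annual improvement rates (hypothetical numbers for illustration).

def relative_performance(annual_gain: float, years: int) -> float:
    """Newest model's performance relative to a device bought `years` ago."""
    return annual_gain ** years

# "2-3x faster each iteration" (early-2010s iPhone pace): after just one
# year, your phone has only a third of the newest model's performance.
fast_pace = relative_performance(3.0, 1)   # 3.0

# A mature, "boring but predictable" ~15% yearly bump: even after three
# years you're still within ~2/3 of the newest model.
slow_pace = relative_performance(1.15, 3)  # ~1.52
```

Which is the point about cadence: at a 3x pace, skipping even one generation is painful; at a 15% pace, holding a device for several years costs relatively little.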
> In that benchmark of yours, the 3090 is still a bit faster than the 4070, so you've been so kind as to prove my point. And nothing has beaten the 4090 for 2 years now, until the 5090 is released later, and that's fine knowing I've been enjoying the best tech for 2 years with no one else on top. What, you think I don't know that tech always evolves?

A 10% difference is a yawn. Especially considering frame generation amplifies the 4070's performance well beyond that benchmark.
Yes, buying the most expensive GPU is a rare bragging right in the tech world that actually works out well. I don't feel silly at all, so that's on you 😊
> Nothing definitive? Apple released the varying specs for the various chips as each machine dropped:

Thanks. I saw those specs in a sketchy, unsourced publication and was citing them as non-authoritative. It is obvious that memory bandwidth in the 800GB/s range will have to wait for an M4 Ultra, and by then bandwidth should be even higher.
The 4090 lasts as long as NVIDIA's refresh cycle lasts, and that cycle has been shorter than 2 years in the past. But again, your argument is silly.
Apple has no desire to adopt an NVIDIA release cycle. They’ll continue to release their products when they want to. Not when you or anyone else thinks they should.
That’s classically Apple.
Meaning at some point in the future, M-chip iteration might slow down, just like what you're seeing with the iPhone right now, or at Intel, AMD, and NVIDIA. I'd rather wait for that time to come. Surely my decision won't affect yours, no? 🤷🏼♂️👋🏼
Again, what are you talking about?
Apple uses the same chips largely across their lineup. iPhones aren’t completely divorced from their M series chips. So iPhones also see heavy performance increases each release.
> Meaning at some point in the future, M-chip iteration might slow down, just like what you're seeing with the iPhone, or what's happening at Intel, AMD, and NVIDIA right now. I'd rather wait for that time to come instead of being stuck on the rapid train of change. Surely my decision won't affect yours, no? 🤷🏼♂️👋🏼

Since Intel didn't slow down for decades, and since Apple left for the ARM architecture, Intel is growing again, and AMD as well, I don't see the M series slowing down for another decade. The materials, transistors, and tooling for the Apple SoC have only just gotten started. Let's not forget the M4 runs much cooler than the M2, which is a positive for injecting more performance into both passively and actively cooled devices. Even battery life increases while performance increases, and that means more health for the battery cells as well.
> Unless they're running stress tests, this means nothing, to be honest.

If all the tests are done the same way, then it's apples to apples and it makes a point, a big one.
> Impressive performance. Seems like the Mac Studio is an awful purchase until it gets updated with the M4 family.

I guess it depends what you need. I doubt people were buying the Studio for the CPU performance alone. It will be interesting to see the benchmarks for the Max factoring in GPU performance. For some people, the Studio will still do something for their workflow that the Mini won't.
> Right, or anything else it can do that the Mini can't.

Not having a powerful enough GPU makes these new Macs look silly, especially since everything is unified and you can't upgrade manually like before if you need more storage or RAM. I miss the good ole days.
You need a very narrow view of what a computer is and can do to think CPU speed is the only determining factor in what gives a computer value. No one with a Mac Pro is going to give it up for a Mini. People who need what a Mac Pro does won't be buying a Mini.
For example, a Mini might suit my needs, but I’m waiting on GPU performance numbers against tasks I do. I don’t give much of a crap about the CPU speed, as those have been fast enough for a few generations.
> We don't have to imagine; for a long, long time, computer performance doubled every 18 months like clockwork. That was actually the industry standard until around 2010.

That was pretty much the case from circa 1984 to 2010. During that time I was upgrading roughly every 6 years and expected a 10x improvement in performance. I bought an M1 Mini 16GB/1TB in early 2021 and am now thinking about the M4 Pro Mini with 64GB of RAM. So in about 4 years the top-of-the-line Mini is ~4x faster, with 4x the RAM and 4x the memory bandwidth. This is slower than the "golden era" of rapidly increasing computer power, but still respectable growth.
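The "doubled every 18 months" rule of thumb can be checked with quick arithmetic. A sketch, with the doubling period as an assumed parameter rather than a measured figure:

```python
# Rule-of-thumb compounding: performance doubles every `doubling_period_years`.
# The 1.5-year default reflects the Moore's-law-era claim above; it is an
# illustrative assumption, not a measurement.

def speedup(years: float, doubling_period_years: float = 1.5) -> float:
    """Performance multiplier after `years` at the given doubling cadence."""
    return 2 ** (years / doubling_period_years)

# Upgrading every 6 years in that era: 2^(6/1.5) = 16x, which lines up
# with expecting roughly a 10x improvement per upgrade cycle.
six_year_gain = speedup(6)        # 16.0

# The ~4x-in-4-years M1 -> M4 pace corresponds to a doubling period of
# about 2 years instead of 1.5: 2^(4/2) = 4x.
m_series_gain = speedup(4, doubling_period_years=2.0)   # 4.0
```

So the M-series pace really is slower than the 18-month clockwork, but not by a dramatic margin: roughly a 2-year doubling period instead of 1.5.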