Could you even fit that 4090 chip into a chassis the size of the Mac Studio, much less cool it adequately?
I don't know whether to laugh or cry...
How do you even compare the M1 Ultra to a 4090 card?
The M1 Ultra barely produces 21 TFLOPS, while the 4090 reaches almost 100 TFLOPS!!! On top of that, the 4090 supports real-time ray tracing at the hardware level.
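(Editor's aside, for context on where those headline numbers come from: peak FP32 throughput is just ALU count × clock × 2 ops per cycle, since each ALU can retire one fused multiply-add per clock. The sketch below uses the commonly cited shader counts and boost clocks, so treat the figures as approximate.)

```python
# Back-of-the-envelope peak FP32 throughput: ALUs x clock x 2 (one FMA
# per ALU per cycle). Counts and clocks are commonly cited specs, not
# measured values; treat the results as approximate.

def peak_tflops(alus: int, clock_ghz: float) -> float:
    return alus * clock_ghz * 2 / 1000  # GFLOPS -> TFLOPS

print(f"M1 Ultra (64-core GPU): {peak_tflops(8192, 1.3):.1f} TFLOPS")   # ~21.3
print(f"RTX 4090:               {peak_tflops(16384, 2.52):.1f} TFLOPS") # ~82.6
```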
 
A computer based on an SoC cannot be a professional computer!

That's so weird, what with people using it professionally.

How much would it cost Apple to create 1.5 TB of shared RAM? More than your house. Don't be ridiculous.

I don't think they'll do that. They'll either reduce the RAM or go for some hybrid approach.

 
I don't know whether to laugh or cry...
How do you even compare the M1 Ultra to a 4090 card?
The M1 Ultra barely produces 21 TFLOPS, while the 4090 reaches almost 100 TFLOPS!!! On top of that, the 4090 supports real-time ray tracing at the hardware level.

And as a bonus, it can also melt your computer while it's at it. 🫠


I don’t really see these “absolute performance at all costs” paradigms as something to crow about, honestly. It’s easy to do more with more, and at some point you really have to wonder whether companies are going in this direction because there is a genuine benefit to be had, or for purely marketing reasons, or because they know it’s the only way they can distinguish themselves from the M1 chip, which has the performance-per-watt trophy in the bag.
 
This thread is amusing. You don't need a 13th-gen i9 to beat the M1 Max; a 12700H will do it, although only by a very small margin. The M1 Max is not an ultra-low-power chip, but it is certainly more efficient than Intel's, and for throttling and battery life that makes a real difference.

The thing is, if you are doing macOS tasks, you enjoy macOS, and it does everything you need, it really doesn't matter how much faster another chip is. Certainly we all want the best performance, but Apple is not concerned only with maximum compute performance; they are looking at the user and how they use a laptop, and designing a package around that.

What did people hate about Intel Macs? Poor battery life, lukewarm performance, and thermal throttling. M-series chips solved all of those problems in one generation. They are not the absolute fastest, nor do they have the absolute best GPU, but they perform extremely well, run a long time on battery, and offer the same performance plugged in or not. They also have media-acceleration encoders designed specifically for Mac software.

The fact that people compare an M2 MacBook Air with an M1 Pro MacBook Pro shows how much performance these new chips deliver, and they will only get better over time.

Will Intel keep absolute domination in performance, and will Nvidia remain the leader in GPUs? Probably, but all of that comes at a cost in heat, battery life, and throttling if not properly cooled.

I do think the performance-per-watt argument gets a little stale, in the sense that if all I care about is having the most CPU power possible, why do I care how efficient it is? If my number one priority is the absolute best GPU or CPU performance, and I can cool the chip and give it as much juice as it needs, then that is all I care about, and for that Intel and Nvidia are still the best.
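(To make that trade-off concrete, here is a toy points-per-watt comparison. The numbers are made-up round figures, not measurements of any real chip.)

```python
# A toy illustration of the performance-per-watt trade-off, using
# made-up round numbers (NOT measured figures for any real chip):
# one part is slower but sips power, the other is faster but needs
# several times the package power.

chips = {
    "efficient laptop SoC": {"score": 12_000, "watts": 40},   # hypothetical
    "desktop-class CPU":    {"score": 16_000, "watts": 150},  # hypothetical
}

for name, spec in chips.items():
    pts_per_watt = spec["score"] / spec["watts"]
    print(f"{name}: {spec['score']} pts @ {spec['watts']} W "
          f"-> {pts_per_watt:.0f} pts/W")

# If raw score is the only constraint (desktop, unlimited cooling), the
# faster chip wins outright; on battery or in a thin chassis, the pts/W
# column is what actually limits sustained performance.
```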

If I am a gamer or need ray tracing for content creation, then an Intel 12700H or 13700H with an Nvidia 4080 would be a lot better than any MacBook Pro.

So let's just be honest: Apple's approach is more holistic, and for more people it makes more sense, because they don't need extreme performance all the time and would rather have long battery life and no thermal issues while still getting great performance. You can even have a powerful laptop with no fans at all that still posts the fastest web-browsing scores.
 
Apple's approach to CPU design has been wildly successful, and I think Google is following Apple's lead with its Tensor G2 chip. The Tensor G2 is not even as fast as a Snapdragon 8 Gen 1, and Qualcomm is about to release the 8 Gen 2, which is supposed to be much faster.

So all of the new Pixels have a chip that isn't even as fast as the last-generation Snapdragon flagship SoC, but use a Pixel for anything other than benchmarks or gaming and you would never know.

Sure, MediaTek, Qualcomm, and maybe even Exynos chips might be faster, but does that mean Pixel phones are worse than other Android phones?

No, except for certain use cases, just like with Apple's M-series SoCs. Brute speed and force are certainly nice to have, but they aren't everything, and without good software optimization the best hardware is meaningless.

All of these posts about this AMD chip or that Intel chip beating Apple's chip are fun for superficial comparisons, but in the end they don't really mean a whole lot. Okay, your Intel PC is super fast and has a great GPU; well, the same is true of a MacBook Pro.

I like PCs, but I also like Macs, so I keep up to date with the changing tech in processors. At the end of the day, though, I am going to buy based on my priorities, available budget, and design preferences. CPU speed is a very important part of that equation, but say I had to choose between a laptop with the absolute best processor, RAM, GPU, and SSD capacity, but with poor driver updates, a cheap plastic body, an SVA panel with 45% NTSC coverage, a small battery, poor thermals, and terrible speakers, and a laptop with good build quality and materials, a fast processor, a decent amount of RAM and SSD, a great screen, a great battery, decent speakers, and adequate thermals: I am going to get the latter every time. A laptop or a phone or a tablet is the sum of its parts, not just a processor.
 
Apple's approach to CPU design has been wildly successful, and I think Google is following Apple's lead with its Tensor G2 chip. The Tensor G2 is not even as fast as a Snapdragon 8 Gen 1, and Qualcomm is about to release the 8 Gen 2, which is supposed to be much faster.

So all of the new Pixels have a chip that isn't even as fast as the last-generation Snapdragon flagship SoC, but use a Pixel for anything other than benchmarks or gaming and you would never know.

Sure, MediaTek, Qualcomm, and maybe even Exynos chips might be faster, but does that mean Pixel phones are worse than other Android phones?

No, except for certain use cases, just like with Apple's M-series SoCs. Brute speed and force are certainly nice to have, but they aren't everything, and without good software optimization the best hardware is meaningless.

All of these posts about this AMD chip or that Intel chip beating Apple's chip are fun for superficial comparisons, but in the end they don't really mean a whole lot. Okay, your Intel PC is super fast and has a great GPU; well, the same is true of a MacBook Pro.

I like PCs, but I also like Macs, so I keep up to date with the changing tech in processors. At the end of the day, though, I am going to buy based on my priorities, available budget, and design preferences. CPU speed is a very important part of that equation, but say I had to choose between a laptop with the absolute best processor, RAM, GPU, and SSD capacity, but with poor driver updates, a cheap plastic body, an SVA panel with 45% NTSC coverage, a small battery, poor thermals, and terrible speakers, and a laptop with good build quality and materials, a fast processor, a decent amount of RAM and SSD, a great screen, a great battery, decent speakers, and adequate thermals: I am going to get the latter every time. A laptop or a phone or a tablet is the sum of its parts, not just a processor.
The Intel CPU is a good solution, especially if combined with Nvidia; however, pull the power cord and it's a different story. This 17" W10 system can barely manage 2 hours off the mains, while the 13" M1 MBP can stretch to 20 hours, and arguably the MBP is faster in many a use case...

Very much agree that there's a lot more to a computer than just the speed of the CPU; it has to be said Apple knows this and nails it for the most part...

Q-6
 
Finally, the i9 lives up to its promise.
The only caveat is that it burns the building to the ground around you.

Too bad the CAD world is stuck with Intel. I’d much rather support US, Indian & Korean computer makers with our purchases than [whatever unspeakable country Intel’s chipsets are produced in].
 
Finally, the i9 lives up to its promise.
The only caveat is that it burns the building to the ground around you.

Too bad the CAD world is stuck with Intel. I’d much rather support US, Indian & Korean computer makers with our purchases than [whatever unspeakable country Intel’s chipsets are produced in].
Are you serious? Intel's fabs are mostly in the USA and Israel, at least for now, and with the geopolitical situation in China, probably for the foreseeable future. I don't know of any high-end fabs in India, and Korea is fine. TSMC is also fine, as Taiwan is a good ally, but they are in a position where China will intervene at some point and take over their IP.

Apple uses TSMC, and so does AMD; Intel is starting to as well.


They have one fab for older process nodes in China.
 
Are you serious?
Yes. The last time I remember paying attention to where Apple was looking into producing chips, they were optimistic about India and about opening at least one more shop in the US. Korea's situation is what it is, and we'll happily support them as much as possible. On the other hand, Israel has now become such an ethical dilemma that supporting them financially is as good as out of the question. The trouble is, old CAD software has 50 years of development behind it, and neither Autodesk nor Dassault will rewrite it all for ARM or anything else; in all this time they've hoovered up as much IP as possible, leaving newcomers with some pretty kludgey options for basic functionality. So trying to recommend transitioning our industry to new, unproven CAD software, which isn't as capable, refined, or commonplace, or compatible with decades of existing work, is such a major hurdle that it's what's keeping the design & engineering firms glued to Intel, and allowing the two dominant CAD software companies to innovate at a snail's pace.
 
Yes. The last time I remember paying attention to where Apple was looking into producing chips, they were optimistic about India and about opening at least one more shop in the US. Korea's situation is what it is, and we'll happily support them as much as possible. On the other hand, Israel has now become such an ethical dilemma that supporting them financially is as good as out of the question. The trouble is, old CAD software has 50 years of development behind it, and neither Autodesk nor Dassault will rewrite it all for ARM or anything else; in all this time they've hoovered up as much IP as possible, leaving newcomers with some pretty kludgey options for basic functionality. So trying to recommend transitioning our industry to new, unproven CAD software, which isn't as capable, refined, or commonplace, or compatible with decades of existing work, is such a major hurdle that it's what's keeping the design & engineering firms glued to Intel, and allowing the two dominant CAD software companies to innovate at a snail's pace.
Intel has most of its fabs in the USA, Ireland, and Israel, with finishing and testing in Asia. I don't see the big problem.
 
I don’t really see these “absolute performance at all costs” paradigms as something to crow about, honestly. It’s easy to do more with more, and at some point you really have to wonder whether companies are going in this direction because there is a genuine benefit to be had, or for purely marketing reasons, or because they know it’s the only way they can distinguish themselves from the M1 chip, which has the performance-per-watt trophy in the bag.

I mean, there is a benefit to be had if you need that much horsepower. But I wish more journalists approached the story as "Nvidia has once again failed to scale down their GPU", not "it's even faster OMG".

Here's a 4090 vs. the original GeForce 256:

[Image: the original GeForce 256 next to a new GeForce 4090]
 
Ah yes, the 4090 with the true “FireWire” port. YouTubers like Gamers Nexus are going nuts trying to root-cause why the power cables are catching fire on this model.
 
Ah yes, the 4090 with the true “FireWire” port. YouTubers like Gamers Nexus are going nuts trying to root-cause why the power cables are catching fire on this model.
They figured it out: folks were not plugging the power cable in all the way.
 
The Intel CPU is a good solution, especially if combined with Nvidia, however pull the power cord different story. This 17" W10 system can barely manage 2 hours off the mains, 13" M1 MBP can stretch to 20 hours and arguably the MBP is faster in many a use case...

Very much agree there's a lot more to a computer than just the speed of the CPU, has to be said Apple knows this and nails it for the most part...

Q-6
What do you mean, "20 hours"??
My MacBook Pro with M1 Max, maxed out, will have an empty battery after 1 hour.
And that's the case when I'm using only the CPU.
 
LOL, my 1080 is struggling. Gaming on the Mac is just an arse-kicking contest; it's been like that for years. Move on and enjoy. I don't love W10/W11, but for gaming it's a way better experience...

Q-6
My 1080 is just fine. I don’t need MAX SETTINGS!!!! Or 4K!!!!!
 
What do you mean, "20 hours"??
My MacBook Pro with M1 Max, maxed out, will have an empty battery after 1 hour.
And that's the case when I'm using only the CPU.
Well, then your machine might be faulty. My 16" MBP M1 Max makes it to 10 hrs rather easily doing programming, running 1-2 VMs in Parallels in parallel.
 
A computer based on an SoC cannot be a professional computer! How much would it cost Apple to create 1.5 TB of shared RAM? More than your house. Don't be ridiculous.

Cyberpunk at 1080p with high settings will not reach 60 FPS on your 1080, even without ray tracing, a technology that doesn't even exist on the Mac, so please don't reinvent the wheel.

COD MW2 is a competitive online FPS; to get the best experience in this game you need a fast GPU and a monitor that runs at 144-165 Hz, and I don't see that happening without Nvidia's RTX 2000 or 3000 series.

It's good that you didn't also include a 286 processor or a 1060 card in the list. We are already in the generation of 4000-series cards...

If you can afford it, then what's the problem with enjoying the best 4K gaming experience? Furthermore, the 4090 is the best card for 3D content creators, especially on Unreal Engine 5. It's amazing what you can buy for the price of a Mac Studio...

And by the way, this is the best time to be a PC gamer, when the biggest gaming companies, Sony and Microsoft, are developing their exclusive games for PC as well.
Not everything is competitive. Holy cow, people. It's like, according to this site, the only GPU that can even manage 1 fps of gaming is a 4090. I'm still gaming just fine on my 1080. I don't need max settings. Ray tracing is a joke, IMO. I'm still getting reasonable FPS in brand-new games.

And in a few years, even today's games will be discussed on this site as absolutely, without a doubt, NEEDING a 5090. As if you are not a gamer unless you buy these GPUs.
 
It's a 13" so can stretch to 20 hours with basic tasks. On the go I want battery life first.

Q-6
You can buy a MacBook Pro with an M2 Extreme chip and have the same battery life.
On the go, it's up to you whether you push the M2 Extreme to its limit or not.
With the M2 Extreme you can still use the same number of cores you are using now.
 
You can buy a MacBook Pro with an M2 Extreme chip and have the same battery life.
On the go, it's up to you whether you push the M2 Extreme to its limit or not.
With the M2 Extreme you can still use the same number of cores you are using now.
I'll likely look at a new Mac when the M3 is released; until then the M1 13" is more than good enough.

Q-6
 
Uh, what? My MacBook Pro M1 Max and Mac Studio M1 Ultra are just as professional as my $4,000 Windows system.
You have no idea what a workstation is.
A workstation is not intended for editing YouTube videos.
A workstation is designed for simulations and complex engineering CAD programs that require professional graphics cards, such as the Quadro series, in addition to ECC error-correcting RAM.
It also needs the ability to install at least four graphics cards together, a large bay for RAID disk arrays, and of course support for PCI cards.
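(Editor's aside on the ECC point, since it's the least self-explanatory item in that list: ECC memory stores extra check bits alongside each word so that a flipped bit can be detected and corrected on read. Real ECC DIMMs use a wider SECDED code over 64-bit words; the Python toy below is just a minimal Hamming(7,4) sketch of the principle, not how any real memory controller is implemented.)

```python
# Toy sketch of the idea behind ECC memory: a Hamming(7,4) code that
# corrects any single flipped bit in a 4-bit payload.

def encode(nibble):
    """Encode 4 data bits (list of 0/1) into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4            # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4            # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4            # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def decode(codeword):
    """Return the corrected 4 data bits; fixes any single-bit error."""
    c = list(codeword)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # re-check positions 1,3,5,7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # re-check positions 2,3,6,7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]   # re-check positions 4,5,6,7
    syndrome = s1 + 2 * s2 + 4 * s3  # 0 = clean, else 1-based error position
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = [1, 0, 1, 1]
cw = encode(word)
cw[5] ^= 1                  # simulate a cosmic-ray bit flip in "memory"
assert decode(cw) == word   # the error is detected and corrected
```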
 
You have no idea what a workstation is.
A workstation is not intended for editing YouTube videos.
A workstation is designed for simulations and complex engineering CAD programs that require professional graphics cards, such as the Quadro series, in addition to ECC error-correcting RAM.
It also needs the ability to install at least four graphics cards together, a large bay for RAID disk arrays, and of course support for PCI cards.

“Workstation” stopped meaning much after DEC, SGI, Sun, etc. died. It’s just a fancy way of saying “high-end computer” now.
 