I feel that, if there’s been a problem with the M-series branding, it’s that the Pro chips haven’t catered to a specific audience. Max is perfect for professionals, Ultra is for a minority and the base chip is for the majority.

Hmm. To me, the M1 Max and M2 Max seemed like a waste for my use case (software development), because the CPU part was virtually identical (more memory bandwidth, but you can't really saturate it), and I don't need more GPU cores. The Max does offer more RAM, but aside from that, it was a pretty easy "yeah, I'll go with Pro".

The M3 Pro vs. Max complicates this.

I wonder if the M4 or so will give more choices. I really don't need those GPU cores, but I would benefit from more CPU cores.
 
I wish Apple just released the whole line and updated all the M's in all models at the same time.
My guess:

They still have to figure a few things out for the M3 Ultra, and updating just the M2 Max Studio would have been weird.

With things going the way they are, I'd expect my M1 Max Studio (base model) to last me far beyond the point where even a base Mac mini would be a big upgrade…
 
I'd throw that right back: yes, of course high multi-core results are great in isolation. But in reality, you'll either

a. be using that machine for a lot more mundane stuff as well, at which point single-thread matters more, and e-cores help keep it cool, quiet, and environmentally friendly / cheaper for your energy bill, or
Then I throw it back too: for this you have the single-core result; testing for it again with the multi-core is redundant.
Well, what else would you end up with? Many high-performance tasks either run on the GPU these days (including, well, rendering), or they aren't heavily parallelizable (software development — too reliant on I/O, and the dependency tree makes it hard to sync, so you're probably not gonna be scaling your build system to 96 cores, especially when you're in a JIT toolchain like Java or .NET).
Probably a gazillion things neither of us has heard about. In science there are lots of projects that should run on the GPU, but nobody has the time and/or expertise to port them. Or it's just more cost-effective to leave them as-is. And in software dev, it also depends on the project. Do you have a large monolith, or 100+ microservices? The latter will scale well.

But I guess we've strayed too far from the topic now.
 
After factoring in the exchange rate, Australian Government taxes, and ACCC regulation requiring Apple to provide seven days' care plus a two-year warranty, the MBP 💻 M3 Pro has a starting price of:

Sucks to live in a highly taxed and regulated country. Enjoy your liberties, my American cousins.
I live in Ukraine, where this configuration starts at $4340. So you don't have it so bad. :)
 
This is a great take.

I feel that, if there’s been a problem with the M-series branding, it’s that the Pro chips haven’t catered to a specific audience. Max is perfect for professionals, Ultra is for a minority and the base chip is for the majority.

But now that the base M3 is so good, it could realistically cater to previous Pro chip customers. Now the Pro sits with those who just want a little more and want the other specs that go along with it (i.e. multiple displays, I/O).
I see it backwards. The M2 Pro with 6 and 8 p-cores catered to every professional who didn't need graphics, and the Max to those who did. Now both are forced to buy the Max or jump ship, and the benefit of the Pro is essentially reduced to display support. That creates a huge hole in the lineup, where you pay ungodly amounts of money for a feature most cheap PC laptops have.
 
22 hours almost idling (web browsing, movie watching): totally worth nearly $5k, really nothing fancy.
Get back to me once you get 22h of decent graphics-related work or gaming; then Apple can show off with 22h.

Anyway, regarding professional graphics, there is no way around Nvidia RTX (formerly Quadro) cards; Apple graphics is a toy compared to Nvidia.

But yeah, a speedy Apple laptop might be useful for compiling Apple software, and that's it, at least until the heat and throttling kick in.
The GPU speed of a 76-core M2 Ultra is roughly that of a GPU somewhere between an RTX 4070 Ti and a 4080, hardly something to sneeze at. A simple web search turned up a Tom's Hardware article on its speed. The graphics cores are what Apple concentrated on most when moving from M2 to M3, so I would guess the future M3 Ultra will rival top-end Nvidia and might beat the highest-end card, since it will probably have 80 GPU cores along with each core's individual improvements.

I would also point out that the M2 Ultra's GPU typically uses 45 watts of power, with a peak of 90 watts, compared to the over 500 watts Nvidia cards can draw these days. IIRC, the RTX 4090 can use up to 600 watts. The entire Mac Studio Ultra has only a 370-watt power supply for the whole computer, while you'd need a 1000-1200 watt power supply to run a PC with a high-end graphics card.

Nvidia cards run so hot that they'll raise the temperature of a room dramatically, which is the chief reason I gave my gaming PC to my son. Imagine what Apple could do with their GPUs if they used even half the power Nvidia's require. I have yet to hear the fans on my M2 Ultra Mac Studio, and I've run some serious stuff on it that would make my MacBook Pro M1 Max's fans scream. If I only have the power of a 4070 Ti but get a silent computer that doesn't turn my room into a sauna, I'll take it.
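For the sake of the numbers: a back-of-the-envelope sketch of that efficiency gap, taking the 90-watt peak quoted above and the RTX 4070 Ti's roughly 285-watt board power as assumed figures, and assuming (per the Tom's Hardware comparison) roughly comparable throughput:

```swift
import Foundation

// Rough performance-per-watt comparison. Both wattages are assumptions:
// the ~90 W peak quoted above for the M2 Ultra GPU, and the RTX 4070 Ti's
// published ~285 W board power.
let m2UltraGPUPeakWatts = 90.0
let rtx4070TiBoardWatts = 285.0

// If the two deliver roughly comparable throughput, the efficiency
// ratio reduces to the ratio of their power draws.
let perfPerWattRatio = rtx4070TiBoardWatts / m2UltraGPUPeakWatts
print(String(format: "≈ %.1fx the performance per watt", perfPerWattRatio))
// Prints "≈ 3.2x", which is the quantitative core of the heat/noise argument.
```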
 
STOP! No forum cred for you today hurling a positive comment like that around here.

Next time try and use crowd-pleasing words like underwhelmed/lame/pathetic/pitiful/bored/snoozer/hot garbage/etc. somewhere in your comment. Bonus points are awarded when that's accompanied by an oversized high school-ish eye-roll.
You forgot calling Tim Cook "Timmy".....
 
We need, for serious 3D rendering, the power of a 4090 in a laptop, and that could be achieved with the Ultra version of this new M3, not the Max version. I wouldn't mind using more watts to get more power on these laptops.
 
We need, for serious 3D graphics, the power of a 4090 in a laptop, and that could be achieved with the Ultra version of this new M3, not the Max version. I wouldn't mind using more watts to get more power on these laptops.
All of that power can't be used for most games, because so few are on the Mac (even fewer than on Linux).
 
I thought the switch from Intel was supposed to stop long waits
The “long waits” were several years where Intel just shipped the same old thing with tweaked higher speeds. What we've gotten from Apple is actually modified core designs, none of which took several years to become available.
 
Looking forward to the M3 Pro in the Mac Studio. I thought the switch from Intel was supposed to stop long waits; hopefully this is a hangover from Covid disruptions and things will change.
Intel released new models every year, but given their prior history of tick-tock, they had nearly a decade of tock, tock, tock releases. Apple got fed up with Intel's promises, which led to thermal disasters on MBP models, since Apple had designed them for the chips Intel promised rather than the chips Intel actually delivered.

Apparently the long waits are over. Initially Apple had a one-year cycle for base M chips, with the Pro and higher chips on an 18-month cycle. That is apparently over. Rumor has it the M2 Pro/Max/Ultra were supposed to have come out a year ago instead of in January, so it's likely Apple is settling into a one-year cycle for all of their SoCs. That 18-month cycle was a bit worrying because the Pro/Max/Ultra chips were falling behind the A-series architecture. But the M3 family completely skips the A16 architecture, going directly from the A15-based M2s to the A17 Pro-based M3s. On top of that, instead of the M-series chips coming out a year after the A-series chip they're based on, the M3s came out only a month and a half after the A-series chip.
 
Then I throw it back too: for this you have the single-core result; testing for it again with the multi-core is redundant.

I think the idea of the Geekbench 6 multi-core result is to give a realistic measure.

I guess you could argue that Geekbench 6 should really offer three results — the single-core result, the 5-like multi-core result, and a "realistic core utilization" result.
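As a toy illustration of what those three numbers might look like (hypothetical scoring with an assumed core count and parallel fraction, not anything Geekbench actually computes):

```swift
// Toy model of the three proposed results. Both inputs are assumptions.
let n = 12.0   // hypothetical core count
let p = 0.85   // hypothetical fraction of the workload that parallelizes

let singleCore = 1.0                        // baseline
let gb5Like = n                             // independent copy per core: scales linearly
let realistic = 1.0 / ((1.0 - p) + p / n)   // one shared task, Amdahl-limited

print(singleCore, gb5Like, realistic)       // 1.0 12.0 ≈4.5
```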


Probably a gazillion things neither of us has heard about. In science there are lots of projects that should run on the GPU, but nobody has the time and/or expertise to port them. Or it's just more cost-effective to leave them as-is.

I would wager that if it hasn't been ported to the GPU, it probably also doesn't scale well to 96 cores…

And in software dev also depends on the project. Do you have a large monolith, or 100+ microservices? The latter will scale well.

Well, only if all of those microservices need a recompile, which they hopefully rarely do (or else you really have a monolith after all :) ).

Swift scales to many cores better than .NET does, but I think that's mostly a function of AOT. And even with Swift, we see a typical Amdahl's law-like curve in the toolchain — 10 cores are only 11% faster than 8 (rather than 20%), and 20 cores are only about 30% faster than 10 (rather than, y'know, 100%).
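For what it's worth, those two data points are roughly self-consistent under Amdahl's law. A minimal sketch, where the ~89% parallel fraction is a fit to the quoted numbers rather than a measurement:

```swift
import Foundation

// Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
func speedup(cores n: Double, parallelFraction p: Double) -> Double {
    1.0 / ((1.0 - p) + p / n)
}

// p ≈ 0.89 is what makes 10 cores come out ~11% faster than 8,
// matching the first data point above.
let p = 0.89
let gain10vs8 = speedup(cores: 10, parallelFraction: p) / speedup(cores: 8, parallelFraction: p)
let gain20vs10 = speedup(cores: 20, parallelFraction: p) / speedup(cores: 10, parallelFraction: p)
print(String(format: "10 vs 8: +%.0f%%; 20 vs 10: +%.0f%%",
             (gain10vs8 - 1) * 100, (gain20vs10 - 1) * 100))
// Prints roughly "+11%; +29%", in line with the curve described above.
```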

I think the point of diminishing returns is important, including for the M3 Max: yeah, it reaches roughly the same result as the M2 Ultra, but it needs a third fewer cores to do so. That means I'm skeptical about the usefulness of the Ultra even for most Mac Studio buyers, and I'm especially skeptical about the usefulness of the Threadripper.

Another complication here, though, is that I think Threadripper offers a Turbo Boost-like mechanism, and I think Apple's SoCs so far do not.

 
the M3 family completely skips the A16 architecture, going directly from the A15-based M2s to the A17 Pro-based M3s.

Technically, we don't know yet that it does. I've seen speculation that the M3 has the A17's GPU, but the A16's CPU. It's hard to tell so far from the data we have.
 
The “long waits” were several years where Intel just shipped the same old thing with tweaked higher speeds. What we've gotten from Apple is actually modified core designs, none of which took several years to become available.
Intel released new models every year, but given their prior history of tick-tock, they had nearly a decade of tock, tock, tock releases. Apple got fed up with Intel's promises, which led to thermal disasters on MBP models, since Apple had designed them for the chips Intel promised rather than the chips Intel actually delivered.

Apparently the long waits are over. Initially Apple had a one-year cycle for base M chips, with the Pro and higher chips on an 18-month cycle. That is apparently over. Rumor has it the M2 Pro/Max/Ultra were supposed to have come out a year ago instead of in January, so it's likely Apple is settling into a one-year cycle for all of their SoCs. That 18-month cycle was a bit worrying because the Pro/Max/Ultra chips were falling behind the A-series architecture. But the M3 family completely skips the A16 architecture, going directly from the A15-based M2s to the A17 Pro-based M3s. On top of that, instead of the M-series chips coming out a year after the A-series chip they're based on, the M3s came out only a month and a half after the A-series chip.
I think you missed my point. The chips are progressing fast. Slow chip updates can no longer be used as an excuse for long gaps between hardware updates. The Mac Studio is likely to lag a long time behind the chip release again.
 
I really need to dig up some couch and dryer money for one of these. The performance to battery life ratio is out of this world.
 
Wow. Just step back and ponder how crazy it is to be able to buy a laptop with almost 100 billion transistors that also gets 22-hour battery life and is two-thirds of an inch thin and weighs under 5 pounds. The continual scaling and shrinking of technology is amazing to watch.
I remember when they released the G4 and promoted it as a "supercomputer on a chip" capable of a sustained gigaflop. Never thought it could get to where it is today, and it keeps going…
 