This makes me want to buy one even though I have absolutely no use for it except for commenting on MacRumors.

Better wait for the M3 Ultra. Safari is super snappy on the M3 Ultra, which boosts your productivity when commenting on MacRumors.
 
  • Haha
Reactions: Jacoblee23
With respect to the "Pro" laptops: starting at 8GB in 2023 is terrible

With respect to the iMacs: leapfrogging over the M2 is weak

My base M2 Mini has TB4, while the base M3 has TB3: a hardware regression. Although 3 and 4 both support 40Gb/s, if past Apple history proves anything, they'll use version 3 versus 4 as an artificial cutoff point for macOS upgrades in the future...

And then there's the 150/300/400/800 GB/s bandwidth of their unified memory, which is kind of ridiculous. Before it was 85-ish, 100, 200, or 400 GB/s of throughput in versions one and two. They have the die shrink available to them, but they opted not to push the boundaries, and to me that seems lazy, as they own the whole stack now.

And finally, they rarely compared anything of real value to the M2. It was all M1 or Intel stuff, so clearly they're just trying to lure all the people on old Intel hardware into an upgrade, and that's a money decision, not a technology decision. And I thought we were told that "we ain't seen nothing yet." They are just itching to drop support for Intel machines, and it's gonna come sooner rather than later, I think.

To me, it's clear that they artificially slowed things down since they were so far ahead of the competition. IMO, a sad trombone is warranted.

And, a really minor point: they took away support for high-impedance headphones... Really, just give us a decent DAC/amp, which is already in my base M2 Mini.

I will, however, cut them slack for the lack of Wi-Fi 7...

Finally, from reading multiple websites, nobody's running out to trade in their M2 machines hand over fist, so it seems like it's initially going to be a slow roll until M4.

I absolutely could be wrong though...
Wow, some people just can’t take good news.
 
GPU compute benchmarks are live. M3 Max about the same as M1 Ultra.


It's at the level of an AMD Radeon RX 6850M XT.

If it maintains the same performance improvement over the M2, the M3 Ultra will be at the level of an Nvidia GeForce RTX 2080 Ti or an AMD Radeon RX 7800 XT.

While Apple Silicon can compete with the best x86 offerings in single- and multi-core CPU performance, it's not the same for raw GPU power.

Btw, the M3 Ultra will be at the very top by a good margin in multi-core CPU performance, but it will miss the single-core top by a 5-10% margin. If Apple is able to push a 10% clock improvement on both the Studio and the Pro, they would be at the top even in single-core.
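For anyone who wants to play with that extrapolation, here is a minimal back-of-the-envelope sketch in Python. All of the scores and the ~1.5x Ultra-over-Max scaling factor are assumptions for illustration, not measured results:

```python
# Back-of-the-envelope sketch of the extrapolation above.
# Every number here is an assumption for illustration, not a measured result.

m3_max_multicore = 21_000   # assumed ballpark Geekbench 6 multi-core score for M3 Max
m3_single_core   = 3_100    # assumed ballpark Geekbench 6 single-core score for M3
x86_single_top   = 3_300    # assumed ballpark for the fastest x86 single-core score

ultra_scaling = 1.5         # doubling cores tends to give well under 2x in practice
m3_ultra_multicore = m3_max_multicore * ultra_scaling

single_core_gap = (x86_single_top - m3_single_core) / m3_single_core

print(f"Estimated M3 Ultra multi-core: {m3_ultra_multicore:,.0f}")
print(f"Single-core gap to the x86 top: {single_core_gap:.0%}")           # ~6%, inside the 5-10% claim
print(f"Single-core with a 10% clock bump: {m3_single_core * 1.10:,.0f}")  # would clear the assumed x86 top
```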
 
  • Like
Reactions: Chuckeee
GPU compute benchmarks are live. M3 Max about the same as M1 Ultra.

Why use an OpenCL benchmark, a standard macOS doesn't support and hasn't for a while? That's essentially like testing the speed of an emulator, which will never look good. Aztec Ruins Offscreen and other Metal-based tests are much better and put the GPUs far ahead of the weak OpenCL scores. The Windows equivalent would be to turn off DirectX 11 and 12 and expect Windows machines to do well in graphics performance.
 
Why use an OpenCL benchmark, a standard macOS doesn't support and hasn't for a while? That's essentially like testing the speed of an emulator, which will never look good. Aztec Ruins Offscreen and other Metal-based tests are much better and put the GPUs far ahead of the weak OpenCL scores. The Windows equivalent would be to turn off DirectX 11 and 12 and expect Windows machines to do well in graphics performance.

Because Metal benchmarks haven't shown up yet.

And OpenCL isn't fully deprecated. It's still used by many apps, including Photoshop, which Apple benchmarks themselves. There's no Metal version of Photoshop.

So comparing the OpenCL benchmarks against other Macs is fine. Just don't compare against PCs, because they use a more modern OpenCL version.
 
Because Metal benchmarks haven't shown up yet.

And OpenCL isn't fully deprecated. It's still used by many apps, including Photoshop, which Apple benchmarks themselves. There's no Metal version of Photoshop.

So comparing the OpenCL benchmarks against other Macs is fine. Just don't compare against PCs, because they use a more modern OpenCL version.
That's precisely my point. Using OpenCL benchmarks is not going to express the true abilities of the GPU cores, but rather a handicapped version that isn't even available in macOS when first installed and has to be installed from other sources. Using OpenCL is worthless as a measurement of true graphics performance, so I wonder why anyone still runs those benchmarks. If Apple were still supporting OpenCL, then I would agree that the benchmark would be valid.

If the objective is to do a comparison between Windows and Mac, which many have done in this thread and is what prompted me to post on this story, an old version of OpenCL would make Windows look terrible, too. If someone were to wonder about gaming on a Mac, looking at those scores would make people think that Mac graphics are ten years behind when in reality, the graphics are quite competitive with Windows. We’ll find out just how competitive once the new Macs are released and better benchmarks are reported. If the new M3 Max is indeed close to the M2 Ultra, then the M3 Max will be competitive with some of the fastest Windows graphics cards.

I will agree with you that this OpenCL benchmark should not be used for any comparison with Windows.
 
It's at the level of an AMD Radeon RX 6850M XT.

If it maintains the same performance improvement over the M2, the M3 Ultra will be at the level of an Nvidia GeForce RTX 2080 Ti or an AMD Radeon RX 7800 XT.

Those AMD and Nvidia cards are more powerful…only when they're plugged in. And they consume what, 5x the power?
 
Those AMD and Nvidia cards are more powerful…only when they're plugged in. And they consume what, 5x the power?

Just to let you know, we also have mobile 3D chips, which aren't as powerful but are a fraction of the size and power consumption. Asus offers some of these.
 
I agree, those Geekbench results seem very fishy. Apple says that the performance on the M3 Max can be up to 50% better and then some Geekbench results show up that indicate an almost exact 50% improvement?

That said, on the day of the "Scary Fast" event I posted that the "M3 is a good upgrade" and that it "is more than fine in most respects and the video upgrades are really significant." Whereas it seemed that many here on CN were saying that the event was completely disappointing with some even claiming that they saw no justification to upgrade from their Intel Macs.

But, I also questioned the fact that Apple didn't show CPU results comparing the M2 Pro to the M3 Pro and that left me wondering whether the M3 Pro was any faster than the M2 Pro (it may not be for the CPU, but still with better graphics and probably some power savings).

So, that was Monday and now based upon a few Geekbench results of unknown origin everyone seems blown away by what Apple has done with the M3.

I say wait until the M3 actually ships and then we'll know soon enough how much better it is than the M2. In most cases I expect it will be quite good.
 
Guys, Apple is just maintaining the tradition of releasing a new gen Mac Pro and then outclassing it six months later with a cheaper machine.
I think the whole Mac Pro strategy went in the bin, as manufacturing was badly delayed and design failures meant we never saw the Extreme version. I expect Apple's strategy was a Mac Pro with Ultra and Extreme chips. If they have managed to fix the design failures, we may see an M3 Extreme next year.
If we do, it will be the real end of the Apple silicon transition. The Mac Pro with the M2 Ultra was just a placeholder.
 
  • Wow
Reactions: foliovision
Nice figures, but in memory bandwidth and GPU core count the M2 Ultra is king. That does not impact Geekbench performance that much, but my guess is that in demanding graphical tasks the M2 Ultra will remain around 35-40% faster than the M3 Max.

But I can't wait to see what the M3 Ultra is capable of; expect to see it within the next 12 months.
 
That's precisely my point. Using OpenCL benchmarks is not going to express the true abilities of the GPU cores

The Metal scores are up now and the result is the same. M3 Max performance aligns with M1 Ultra, except that there is now ray tracing for apps/games that will support it.
 
Those CPU multicore scores are not only the fastest ever for a laptop, they're the fastest ever for a laptop - by a LOT. Both the Intel Core i9-13980HX and the AMD Dragon Range Ryzen 9 7945HX are averaging around 16000 in Geekbench 6. There are some high outlier scores in the 17000s, but at least some of those scores either are not laptops (they're desktops and all-in-ones using laptop chips, possibly overclocked) or are VERY questionable laptops at best (10 lb gaming rigs 2 inches thick). The models I can identify as real laptops are in the low 16000s.

That's a 25% difference against the biggest, baddest, most powerful gaming rigs and mobile workstations, and ~20% against what seems to be a rogues' gallery of compact desktops and overclocked machines.
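As a quick sanity check on those percentages, here is a short sketch. The M3 Max figure is an assumed ballpark based on the leaked results discussed in this thread; the x86 figures are the averages quoted above:

```python
# Sanity check on the percentage gaps quoted above.
# The M3 Max figure is an assumed ballpark, not an official result.

m3_max_score      = 20_000   # assumed Geekbench 6 multi-core ballpark for M3 Max
x86_laptop_avg    = 16_000   # i9-13980HX / Ryzen 9 7945HX average cited above
x86_outlier_score = 17_000   # the "questionable laptop" outliers cited above

def lead(a: int, b: int) -> float:
    """Percentage by which score a leads score b."""
    return (a - b) / b * 100

print(f"Lead over typical x86 laptops: {lead(m3_max_score, x86_laptop_avg):.0f}%")    # ~25%
print(f"Lead over the outliers:        {lead(m3_max_score, x86_outlier_score):.0f}%") # ~18%
```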
 
Holy smokes! All aboard the 3-nanometer express! A laptop nearly besting the Mac Pro, a mere 4.5 months after its debut.
Indeed, the performance jump is outstanding vs the M2 Mac Pro. Yet you gotta think: just what on earth were Apple engineers thinking when designing it?

I'm hoping the design of the M series being fixated to a certain amount of on-die RAM isn't inherently set in stone forever and Apple can re-design for future iterations. At some point, as RAM capacities increase, modular RAM will inevitably be cheaper than bundling it on-die as part of the SoC fabrication process.

It makes perfect sense given that the M1 and M2 Max chips had 12 cores (8 performance + 4 efficiency), while the M3 Max has a big jump to 16 cores (12 performance + 4 efficiency). 50% more performance cores, in addition to all the cores being faster, would logically lead to a result like this.

I'm not so sure that getting 50% more performance cores is resulting in only 15-25% actual SoC performance - at least from what we know via benchmarks vs. real-world tasks, scripts, and consistent deadlines needing to be reached daily/weekly/monthly/quarterly. We'll see soon enough if it's a lot more performance than what the benchies are showing us.
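To put some rough numbers on the core-count reasoning quoted above, here is a toy model. The per-core throughput values and the per-core uplift are assumptions, purely to show why 50% more performance cores plus modestly faster cores can land near the claimed figure, even if real workloads scale worse:

```python
# Toy model of multi-core scaling from M2 Max (8P + 4E) to M3 Max (12P + 4E).
# The per-core numbers are assumptions for illustration only.

p_core = 1.0              # relative throughput of one performance core
e_core = 0.35             # assumed relative throughput of one efficiency core
per_core_uplift = 1.05    # assumed modest per-core gain from M2 to M3

m2_max            = 8 * p_core + 4 * e_core
m3_max_cores_only = 12 * p_core + 4 * e_core          # same per-core speed, just more cores
m3_max_uplifted   = m3_max_cores_only * per_core_uplift

print(f"Gain from extra cores alone: {m3_max_cores_only / m2_max - 1:.0%}")   # ~43%
print(f"Gain with per-core uplift:   {m3_max_uplifted / m2_max - 1:.0%}")     # ~50%
```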

Mac Studio Ultra users just lost half of its value.
^ ? Not if the work they're doing on the Studio Ultras still pays them $$ to cover the mortgage payments, food, and other expenses with lots of leisure time, and they're getting a LOT of use out of them. Value is always perceived by both the outsider looking in and the user using the product/service. Sure, an inflection point will be reached, but not this soon.
 
Holy smokes! All aboard the 3-nanometer express! A laptop nearly besting the Mac Pro, a mere 4.5 months after its debut.
I am actually wondering if this is why they never produced the Extreme version with M2. We could see it with M3, since this is the big jump, and there won't be another jump like it for a number of years, which makes an M3 Extreme make more sense.
 
I think you missed my point. The chips are progressing fast. Slow chip updates can no longer be used as an excuse for long gaps between hardware updates. The Mac Studio is likely to lag a long time behind the chip release again.
Long gaps for non-mobile machines were almost always due to Apple not prioritizing those systems. That still exists today. Apple went to their own chips because 80% of their sales were in mobile systems and, due to missteps by Intel, they were forced to, for example, ship laptops with desktop memory because Intel didn't support high-capacity LP RAM.

So the improved rate of release applies ONLY to Apple's mobile solutions. Desktops will get what they get when Apple's ready to sell it. It's not like anyone who needs what the Mac Studio offers will ever be able to buy a Studio NOT made by Apple.
 
The M3 Ultra, assuming a 50%+ jump over the Max (doubling the cores will yield more like a 50% improvement than a doubling, i.e. a 100% improvement), will be scarily close to some VERY large PC workstations. Another 50% improvement from a hypothetical Extreme should beat ANY single-CPU workstation by quite a bit, including things running the new 96-core Threadripper and the like. The GPU will similarly beat any single card I can think of (maybe not some exotic AI-centered card - I know nothing of what they're like).

Even then, there will be PC workstations that can beat it - dual top-line Xeons or Epycs, with two to four GPUs.

There's a huge difference in livability, though. M2 Max is about a 90-watt chip (CPU plus GPU), and I think M3 Max should be similar (more performance cores add power draw, but the die shrink takes it away). M3 Ultra will run on about 180 watts. A hypothetical M3 Extreme is still under 400 watts, including the GPU load. Oversize the PSU a bit to be sure, add in the load from (potentially a lot of) PCIe storage and the like, and it could run off a 750-watt power supply, with 1 kW providing quite a bit of headroom in case someone did something unexpected with the PCIe slots. Heck, if they put it in a Mac Studio case and avoided the possibility of heavily loaded slots, you could run the thing off a 600-watt PSU quite comfortably (though you might not be able to cool it in the Mac Studio case).

The fans would be nearly silent in the Mac Pro case - that thing is WELL cooled. It plugs into a household 15-amp circuit, going through any 1500 VA UPS you pick up at Best Buy. It doesn't even need a dedicated circuit - the monitor (although not three monitors or a laser printer), a phone charger, and an LED desk lamp will fit just fine. It only takes a little caution to make sure the coffee maker isn't inadvertently on the same circuit.

Have you seen what big, exotic Windows workstations take for power? EACH of their two CPUs draws as much as the whole M3 Extreme (including the GPU). Then there are two to four 350-watt GPUs. The power supply is not A power supply; it is multiple power supplies, each well over 1 kW. Forget about plugging THAT into a 15-amp outlet - many of them won't fit on a 20-amp outlet. Most commercial buildings (unless they're old) have 20-amp outlets, but anything beyond that is a custom wiring job. Your top designer isn't going to love hearing that her new office is the former copier room, because the only place her workstation will plug in is the outlet the copier used to plug into. Some big workstations will run off a 2200 VA UPS, and some won't. A 2200 VA UPS will JUST fit on a 20-amp office circuit - that's why they are popular. Once you get into a 3000 VA UPS, you're into strange outlets or hardwired power.

Big Windows workstations also run HOT and NOISY. If you're Pixar, it doesn't matter (except for environmental reasons) - these things live in the server room and they just use remote rendering. The computers on employees' desks are relatively ordinary, while the big iron is in racks in dedicated space. The server room has hundreds or even thousands of amps of power, with hardwired UPS and dedicated cooling available. Heck, some of those server rooms used to have Crays in them....

What if you're a freelancer or part of a five-person firm? Your office might be in an old three-story building downtown. Wouldn't you rather just set your workstation on the floor next to your desk, rather than pay the electrician to install it and then deal with its banshee wail?
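To make the power-budget arithmetic above concrete, here is a rough sketch. The chip wattages are the estimates from this post, the system overhead is a guess, and the 15 A / 120 V / 80%-continuous-load figures are standard North American circuit assumptions; the M3 Extreme is, of course, hypothetical:

```python
# Rough power-budget sketch for the systems discussed above.
# Chip wattages are the estimates from this post; the Extreme is hypothetical.

chip_watts = {
    "M3 Max (est.)":             90,   # CPU + GPU package estimate
    "M3 Ultra (est.)":          180,
    "M3 Extreme (hypothetical)": 360,
}

system_overhead_watts = 150   # assumed PCIe storage, fans, RAM, I/O, and margin

# A 15 A / 120 V household circuit is 1800 W peak, ~1440 W for continuous load.
circuit_continuous_watts = 15 * 120 * 0.8

for name, watts in chip_watts.items():
    total = watts + system_overhead_watts
    verdict = "fits" if total < circuit_continuous_watts else "does NOT fit"
    print(f"{name}: ~{total} W system draw, {verdict} on a shared 15 A circuit")

# A big dual-CPU PC workstation with four 350 W GPUs, for comparison:
pc_total = 2 * 350 + 4 * 350 + 300    # two ~350 W CPUs, four GPUs, plus overhead
verdict = "fits" if pc_total < circuit_continuous_watts else "exceeds"
print(f"Dual-CPU workstation: ~{pc_total} W, {verdict} a 15 A circuit")
```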
 
  • Like
Reactions: Giant_enemy_crab
My base M2 Mini has TB4, while the base M3 has TB3: a hardware regression. Although 3 and 4 both support 40Gb/s, if past Apple history proves anything, they'll use version 3 versus 4 as an artificial cutoff point for macOS upgrades in the future...
TB4 started with the M1 family. The tbMBP and the base M3 MBP14 have Thunderbolt 3, and it's the same with the iMac 2021 and the iMac 2023. It wouldn't have been different if the iMac had been updated with M2. It looks like they segment TB4 only for the higher-end MBPs and their headless desktops. It really didn't surprise me at all when they announced it.
 
The M3 Ultra, assuming a 50%+ jump over the Max (doubling the cores will yield more like a 50% improvement than a doubling, i.e. a 100% improvement), will be scarily close to some VERY large PC workstations. Another 50% improvement from a hypothetical Extreme should beat ANY single-CPU workstation by quite a bit, including things running the new 96-core Threadripper and the like. The GPU will similarly beat any single card I can think of (maybe not some exotic AI-centered card - I know nothing of what they're like).

Even then, there will be PC workstations that can beat it - dual top-line Xeons or Epycs, with two to four GPUs.

There's a huge difference in livability, though. M2 Max is about a 90-watt chip (CPU plus GPU), and I think M3 Max should be similar (more performance cores add power draw, but the die shrink takes it away). M3 Ultra will run on about 180 watts. A hypothetical M3 Extreme is still under 400 watts, including the GPU load. Oversize the PSU a bit to be sure, add in the load from (potentially a lot of) PCIe storage and the like, and it could run off a 750-watt power supply, with 1 kW providing quite a bit of headroom in case someone did something unexpected with the PCIe slots. Heck, if they put it in a Mac Studio case and avoided the possibility of heavily loaded slots, you could run the thing off a 600-watt PSU quite comfortably (though you might not be able to cool it in the Mac Studio case).

The fans would be nearly silent in the Mac Pro case - that thing is WELL cooled. It plugs into a household 15-amp circuit, going through any 1500 VA UPS you pick up at Best Buy. It doesn't even need a dedicated circuit - the monitor (although not three monitors or a laser printer), a phone charger, and an LED desk lamp will fit just fine. It only takes a little caution to make sure the coffee maker isn't inadvertently on the same circuit.

Have you seen what big, exotic Windows workstations take for power? EACH of their two CPUs draws as much as the whole M3 Extreme (including the GPU). Then there are two to four 350-watt GPUs. The power supply is not A power supply; it is multiple power supplies, each well over 1 kW. Forget about plugging THAT into a 15-amp outlet - many of them won't fit on a 20-amp outlet. Most commercial buildings (unless they're old) have 20-amp outlets, but anything beyond that is a custom wiring job. Your top designer isn't going to love hearing that her new office is the former copier room, because the only place her workstation will plug in is the outlet the copier used to plug into. Some big workstations will run off a 2200 VA UPS, and some won't. A 2200 VA UPS will JUST fit on a 20-amp office circuit - that's why they are popular. Once you get into a 3000 VA UPS, you're into strange outlets or hardwired power.

Big Windows workstations also run HOT and NOISY. If you're Pixar, it doesn't matter (except for environmental reasons) - these things live in the server room and they just use remote rendering. The computers on employees' desks are relatively ordinary, while the big iron is in racks in dedicated space. The server room has hundreds or even thousands of amps of power, with hardwired UPS and dedicated cooling available. Heck, some of those server rooms used to have Crays in them....

What if you're a freelancer or part of a five-person firm? Your office might be in an old three-story building downtown. Wouldn't you rather just set your workstation on the floor next to your desk, rather than pay the electrician to install it and then deal with its banshee wail?

A few areas I disagree with:

1. The whole power-consumption argument is overrated. It depends on the region and needs. In our state, power costs 9-10 cents per kWh. And most companies/labs where these workstations will sit have lights running near non-stop and other areas where power consumption could be optimized.

2. Noctua/large fans and liquid cooling became mainstream last decade. You cannot buy a $2.5-3k Windows gaming PC without liquid cooling, and the large fans make many high-end workstations virtually quiet. That, and even with the challenges, developers will want their CUDA and OpenCL platforms.

3. The niche borderline scenarios you're dreaming up aren't usually a factor. I'd wager that's <1% of the target population. Having said that, there are limitations of the macOS environment, which is why Windows has a hold in areas like gaming, development IDEs, etc.

I split my time between macOS and Windows on the desktop side, and all my server-side stuff is on Linux. It's great to have choice, and instead of building an i9 with a 4090 for my next machine, I've decided to get an M3 Max. I don't quite think, though, that an Ultra will result in developers abandoning the platforms that make them money today, or in my discovering out of the blue that my office's circuit can't support the power draw it was supporting yesterday, just because the M3 Ultra came out.
 
  • Like
Reactions: tobybrut
3. The niche borderline scenarios you're dreaming up aren't usually a factor. I'd wager that's <1% of the target population. Having said that, there are limitations of the macOS environment, which is why Windows has a hold in areas like gaming, development IDEs, etc.
There are three reasons why the Mac is so inferior in gaming. One is the inadequacies of Metal versus far more mature technologies like DirectX. Two is that Windows PCs have sold so many more units than Macs over the decades that gaming companies simply don't bother with the Mac. The opposite holds true in mobile gaming, where iOS is the place to be. Three is that Microsoft evangelized gaming to a much larger degree, even buying up gaming companies to do it themselves, while Apple's efforts have been at best half-hearted. Apple needs to really step it up and buy a few gaming companies themselves if they ever want to get serious. It's the one area where Apple simply can't compete, but they could if they actually tried. Every time we think they are, Apple drops the ball and doesn't follow through. Why they don't just defies explanation. Apple is now pushing mesh shading and ray tracing, something PCs have had for eons, but will Apple follow through and take advantage? I'm not holding my breath. They'll probably disappoint us again with three or four AAA games, but nothing more. Microsoft just closed the deal on Activision Blizzard, one of the biggest names in gaming. Here I am hoping Microsoft doesn't shut down one of the few native Mac AAA games, World of Warcraft, and make it exclusively Windows/Xbox. I'd have to go build another gaming PC.

What I want to see is for Apple to buy up a few gaming companies, publish some of their own AAA games, beef up the Apple TV to the point where it can compete with PS5s and Xboxes, and put a priority on making Metal top-notch. Just think how many people have a gaming PC but use Macs for everything else. If Apple really pushed, a lot of people would just forgo that gaming PC.

I'll agree with you that Xcode is not nearly as good as Microsoft's Visual Studio line of products. As a software developer for over 25 years, VS was my environment of choice, sometimes even using Xamarin over native Xcode for building iOS apps, though I've used both.
 
We're looking at two completely different markets here (actually three, because the upper end M-series chips are fast enough to attract interest from scientists and other folks who do intensive modeling). There are consumer and business markets too, but we've been talking about the M3 Max and above, which are mostly of interest to the creative and data-crunching markets.

There's the gaming market, which is the largest single market for high-end PC hardware, and IN WHICH APPLE HAS NOT REALLY PARTICIPATED IN YEARS. As a photographer, I'm actively unhappy to see Game Mode in Sonoma, because it means that some of the instability games bring to Windows (the hooks that allow games to function, even when not playing them) may be coming to the Mac. Games want to access the hardware more directly than Photoshop or Final Cut Pro, and allowing that makes the OS less stable for other uses. I'd rather have a machine that can't game past the Candy Crush level, but doesn't have the instabilities, than one that runs AAA games but crashes more.

There's the creative market, which is what Apple has been courting. At the M3 Max level, it includes photographers (professional, artistic, and serious hobbyists), artists working in film and video, and musicians, among others. There are other smaller categories, but those are the big three. Most still photographers will probably not look at the M3 Ultra because the M3 Max is fast enough to stitch our largest panoramas and run our gnarliest noise reduction. I'm a landscape photographer working in 100 MP medium format and printing as large as 40x60", and I'm going M3 Max, thinking it will last me through several generations of larger and larger sensors.

Film and video is the core of the market for M3 Ultra performance, especially folks doing a lot of 3D renders, but also folks working in 8K. Musicians with especially power-intensive needs may also be looking at M3 Ultra.

If Apple were to introduce an M3 Extreme, the market would largely be the top end of the 3D animation world. It might well be a $20,000 machine, but Pixar is happy to pay $20,000 a shot. $20,000 is also a lot more accessible to independent animators or those who work in small firms than $100,000. A lot of those 3D animators are actually game DESIGNERS - very high-end Macs have a role in gaming, but in designing them rather than playing them.

Finally, there's the scientific (and related data-crunching) market. Back in the PowerPC days, when Power Mac G5s could outrun any PC in a lot of numerical work, scientists built several Frankenstein supercomputers composed of a bunch of G5 Macs (once the G5 Xserves came out, they preferred those because they stack better) linked together. Now, they like Mac Studios for this, either individually (or in small piles) on their desks or linked in large piles in the data center. They get a lot of crunching power for their grant dollars (and they're easy to administer).

Assuming you're a creative pro or a scientist (used broadly to include everyone who crunches tons of data), you are very likely to care about stability and power consumption. You don't want your work computer running as close to the edge as Windows gaming machines do. Windows workstations don't run as close to the edge (on purpose) as gaming rigs, but they're still dealing with power dissipation issues. Macs' performance per watt is enormously attractive in a range of fields.

If you're rendering or data-crunching on a stack of computers in a data center, you probably don't care about idle power, since it'll be running full-out continuously, but photographers (for example) often use the same machine to edit and print that they use to do e-mail and send out invoices. If you're a photographer and also a writer (for example), 15-hour battery life in Word is VERY relevant, since you may like to write in coffee shops and own one computer (I know this particular use case because it's my own). I don't want to carry 9-12 pounds of computer plus power adapter (and have to find the one outlet in the coffee shop) when I'm writing an article. If you're an indie filmmaker, you probably write your own scripts and grant proposals.

If you're a scientist, you'd like your desktop computer to plug right into your office outlet without worrying whether your coffee cup warmer is going to shut down the computer in the middle of a model run. A scientist (or a 3D designer) can also build a small cluster (say, four Mac Studios linked by 10Gb Ethernet) that STILL sits in a corner of her office and plugs into regular power.
 
there are limitations of the macOS environment, which is why Windows has a hold in areas like gaming, development IDEs, etc.

The Mac is a perfectly common environment to run IDEs.

As a photographer, I'm actively unhappy to see Game Mode in Sonoma, because it means that some of the instability games bring to Windows (the hooks that allow games to function, even when not playing them) may be coming to the Mac. Games want to access the hardware more directly than Photoshop or Final Cut Pro, and allowing that makes the OS less stable for other uses. I'd rather have a machine that can't game past the Candy Crush level, but doesn't have the instabilities, than one that runs AAA games but crashes more.

Game Mode doesn't give more direct hardware access, nor does it cause instability. It simply prioritizes resources for the frontmost fullscreen app when that app happens to be a game.
 