Makes sense for Apple to branch off the Pro and Max into their own designs rather than using the Pro as the basis for the Max. The fact that the previous Pro and Max offered essentially the same CPU performance left a hole in their lineup. The M3 now allows them to offer more variations in performance across their product lines; just look at the M3 MacBook Pro product line now...

14": M3/8/10, Pro/11/14, Pro/12/18, Max/14/30
16": Pro/12/18, Max/14/30, Max/16/60

There's no reason for concern at the "lack" of performance in the M3 Pro; it's much better situated between the base M3 and the Max. And a score of 15,000 is nothing to scoff at for a "mid-range" product (see the quick arithmetic after the scores below)...

M3: 11,000
M3 Pro: 15,000
M3 Max: 21,000
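
To make "better situated" concrete, here's a quick back-of-the-envelope check, sketched in Python using only the approximate Geekbench multi-core scores above:

```python
# Approximate Geekbench multi-core scores quoted above.
scores = {"M3": 11_000, "M3 Pro": 15_000, "M3 Max": 21_000}

base, mid, top = scores["M3"], scores["M3 Pro"], scores["M3 Max"]

print(f"M3 Pro over base M3: +{mid / base - 1:.0%}")   # +36%
print(f"M3 Max over M3 Pro:  +{top / mid - 1:.0%}")    # +40%

# Where does the Pro land in the spread between base and Max?
print(f"Pro's position in the M3-to-Max range: {(mid - base) / (top - base):.0%}")  # 40%
```

By that measure the Pro lands roughly 40% of the way between the base chip and the Max, a genuine middle tier rather than a near-Max part.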
A mid-range £2,099 product? 😬
 
I think part of the reason PowerPC failed was that Apple relied on others (Motorola and IBM) to make their chips, and it was an incredibly niche market for those companies, so they didn't have as strong an incentive, or the resources, to keep the chips current with Intel and AMD. Indeed, the PowerPC G5 was basically a re-hashed IBM POWER4 chip, and IBM had little incentive to scale POWER4 down to an energy-efficient laptop design, given that it was otherwise a beefy workstation/server chip.

To expand on the historical context: IBM and Motorola were interested in that alliance because they thought they could compete with "Wintel". They wanted to — and briefly tried to — make desktops and laptops that ran PowerPC. But it turned out that Windows + x86 was (at the time) too strong for a different architecture to compete with. This was the early 1990s, so it was basically the tail end of "IBM still thinks it can do PCs". So both of them eventually gave up on that dream: IBM focused PowerPC (nowadays essentially the same as POWER) on server stuff and some consoles; Motorola focused PowerPC on embedded stuff and eventually spun it off altogether to Freescale (now NXP), who ended up abandoning it in favor of ARM.

If AIM's position at the time had been stronger, and Intel's and Microsoft's weaker, things might've worked out differently. For example, if IBM ThinkPads with PowerPC had been a moderate success, they might've been more interested in continuing to make efficient cores.
 
🤔

There's a lot of debate and some strong opinions...

Has anyone here actually used an M3 machine of any description yet?

😆

Well, no.

Only the reviewers have them, and the reviews are embargoed, probably until Monday.
 
Since about the A14, the year-over-year gains have slowed down. Is that a result of some key people leaving for Nuvia? Is it that Apple is preparing a bigger bang that isn't quite ready?

Not at all. Qualcomm's Nuvia-based SoC is due out next summer; with 12 P-cores at 80 W it hit 15,000 on Geekbench. So the M3 Pro with 6P/6E matches that performance at a much lower TDP, today, not eight months from now. So the loss of those Nuvia engineers has not slowed Apple down. More than likely, Apple has refined the CPU design to the point that the only gains come from the fab process, at least until they make some significant ISA changes.
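
For the perf-per-watt comparison, a rough sketch: the 15,000-at-80 W figure for the Qualcomm part is from the post above, while the ~30 W package power for the M3 Pro is purely an assumed number for illustration, since Apple doesn't publish TDPs:

```python
# Rough performance-per-watt comparison using the Geekbench multi-core
# figures discussed above. The 80 W figure is as quoted in the post; the
# ~30 W package power for the M3 Pro is an assumption for illustration only.
chips = {
    "Qualcomm Nuvia-based SoC (12P)": (15_000, 80),  # (score, watts)
    "Apple M3 Pro (6P + 6E)":         (15_000, 30),  # assumed wattage
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:,.0f} points per watt")
```

Even if the M3 Pro's real package power is somewhat higher than the assumed 30 W, the efficiency gap would remain large.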
 
I think neither is a good deal in 2023. You're keeping that thing for three to five years, and at that point, macOS Death Valley will make 8 GiB feel very limited.
I'd argue that the kind of people buying the base model are also likely to be the ones to keep that thing for 8+ years.
 
A mid-range £2,099 product? 😬

"Mid-range" is relative to the product line, i.e., to the other products in that line, not to the price of other computers on the market. What you might consider "expensive" someone else might consider reasonable, especially someone who has used Macs before and gotten many, many years of use out of them.
 
M3 is for people upgrading from M1 or older Intel Macs. People generally don’t upgrade their laptops as often as they would a phone.
 
I'd argue that the kind of people buying the base model are also likely to be the ones to keep that thing for 8+ years.
Looking at it another way...

As a happy owner of an M1 Air with 8GB of memory, I'd guess macOS is now going to have to be capable of running on 8GB machines for a bit longer than if the base had just been upped to 12 or 16GB... 🤷‍♀️ maybe?

(Obviously I like to see progress - whilst being a little self-centred! 🤣)
 
M3 is for people upgrading from M1 or older Intel Macs. People generally don’t upgrade their laptops as often as they would a phone.
I got my current phone three months ago; my previous phone was five years old. I got my current iPad a year ago; the previous iPad was five years old. I plan on getting a new computer soon, given my current computer is twelve years old.
 
I think neither is a good deal in 2023. You're keeping that thing for three to five years, and at that point, macOS Death Valley will make 8 GiB feel very limited.
The thing is that Apple has tons of market research into what actual real users who aren’t tech geeks use. We don’t know exactly what that market research says, but if it said that nobody wants 8/256, Apple would immediately bump their minimum specs to 12/512 or 16/512. That they don’t tells you a lot of people are fine with 8/256.

I know several people who get along fine with that much and still have room left over, including my wife who spent most of her career writing documents. If your drive is full of Word or Excel documents, you don’t need a whole lot. If your drive is full of 8K ProRes footage, 4TB might not be enough, but no sane company’s going to set their base models to that.

Remember the base configurations are for the least needy users of that tier. This is why the M3 Max starts at 36GB of RAM and not 8GB. The M3 Max’s target audience can’t get by on 8GB, but the M3 users can.

You simply cannot base the entire market on your own needs. Me, I need a minimum of 1TB, but I’m not yelling at Apple to make 1TB the minimum because I know I’m not the typical user. Neither is anyone else in this forum.
 
It's thinking like this at Intel which led to them being in their current predicament.
No - Intel's problem is that they have been "too big to fail" and able to succeed through politics rather than innovation ever since IBM "made" them by picking their kludgey stopgap 8/16-bit 8088 for the original IBM PC. (This was paired with Microsoft's kludgy CP/M knock-off operating system to make a deeply mediocre machine whose only notable feature was the three magic letters on the front and an army of pinstripe-suited salespeople with an existing office equipment and mainframe empire.) Sure, Intel arguably got the whole ball rolling in the 1970s with the 8080 - but by the '80s they'd lost ground to the Zilog Z80 and even the 6502, and several "proper" 32-bit chips were in the pipeline. Intel and Microsoft managed to lock the industry into 1970s CPU architecture for a couple of decades - when we could have had 68k, a clean 32-bit instruction set, a Unix-based OS and high-level source compatibility.

Even the current x86-64 instruction set was designed by AMD, not Intel.

The first ARM in 1987 wiped the floor with the contemporary Intel 80286 performance-wise - but it didn't run Windows (except under emulation), so it went nowhere.

Intel's Unique Selling Point is legacy compatibility, and this is now being undermined by a perfect storm of industry changes. There's Intel's almost total failure in the emerging mobile market, which is now dominated by ARM. The rise of Linux and web-based tech with its roots in Unix/Linux - and its culture of hardware-agnostic software design and open source (there's some advantage to running Linux on x86, but most of the key Unix/Linux software packages have supported ARM and other architectures since forever) - is giving Intel serious competition in the server space, where power consumption is also a big issue. Apple - even if they remain the underdog in the personal computer market - have demonstrated to the world that you can make high-performance laptops and low-end desktop workstations without x86, and Microsoft seem to be having a serious attempt at Windows on ARM this time.

Then, in general, as computers have got more powerful, more and more software is written in portable, high-level code and uses operating system frameworks for things like graphics, multithreading, vector processing and neural networks, rather than direct hardware access. An increasing amount of stuff is written in scripting languages like Python or JavaScript. Even modern Windows ships applications as Common Language Runtime bytecode rather than x86 or ARM binaries, Android apps typically ship as bytecode, and even the Apple App Store has the capability to distribute "compile on delivery" bytecode. Apple have also demonstrated - with Rosetta 2 - how effectively x86 binaries can be translated to ARM once you've kicked out all of the legacy stuff. I think Windows on ARM's main problem is with bits of legacy Win16, Win32, etc. binaries still floating around.

Put simply, Intel's business model relies on people needing to run legacy code, and people ain't writing legacy code any more - so the clock is ticking.

The one area where Apple Silicon has failed to convince is in the high-end personal workstation market - the new Mac Pro satisfies a very limited niche of people who need high PCIe bandwidth but not discrete GPUs. If you really needed the 2019 Mac Pro, an x86 tower is probably still the tool for the job. Trouble for Intel there is that the only people buying Intel are the dogmatic "workstations need Xeon because they just do" crowd, and AMD are slaughtering them on price/performance. Plus, the whole idea of a high-powered personal workstation may not be one for the ages, as the industry shifts to cloud computing (and NVIDIA have some nice ARM-based datacentre-grade iron, such as Grace/Hopper, to show you).

Whatever you do with x86 tech, it is always going to be carrying around the extra weight of supporting the complex x86 instruction set(s) that more modern RISC-like ISAs don't need. Intel's future is probably to make chips based on ARM, RISC-V or some other ISA - and rely on Rosetta-esque translation for x86 support. Trouble is, then they're going to have to compete with multiple rivals on technical merit, which they haven't needed to do for 40 years.

While a part of this is down to Apple shaking up the industry from time to time, it certainly isn't about what Apple released recently - it goes back as far as the Newton (when they invested money in ARM) plus of course the iPhone's role in promoting the modern mobile scene.
 
I mean… was anyone with an M2 Pro really looking to jump at an M3 Pro? I understand why Apple compared these new chips to Intel and M1.

It may not be a big jump from the M2, but for folks with older chips the differences will be bigger. 🤷🏾‍♂️
 
I think neither is a good deal in 2023. You're keeping that thing for three to five years, and at that point, macOS Death Valley will make 8 GiB feel very limited.

Love that macOS name!!! Haha!

Disagree though. No one but geeks and nerds gives a crap about specs. Most young people use their phones as their only computing device. Apple's base systems are made for these people when they want to move "up". They're not going to need much more than what their phone already provides.

I've had an M1 mini with 8/256 for almost 3 years now and haven't had any problems, and I'm not an average user; I'm a programming hobbyist using Xcode, drawing floor plans, designing furniture, etc.
 
I am hoping Apple did this so that they can put the Pro in the 15-inch Air. That would be a great machine.
 
The thing is that Apple has tons of market research into what actual real users who aren’t tech geeks use. We don’t know exactly what that market research says, but if it said that nobody wants 8/256, Apple would immediately bump their minimum specs to 12/512 or 16/512. That they don’t tells you a lot of people are fine with 8/256.

Well, it tells you two things: that there are people who get by with 8/256, and that there are also people who will begrudgingly give Apple another $400 to upgrade each of those.

I know several people who get along fine with that much and still have room left over, including my wife who spent most of her career writing documents. If your drive is full of Word or Excel documents, you don’t need a whole lot.

Sure. But the question should always also be: "Will this still work for me four years from now?" Because you're going to want to keep that machine for a while.

You simply cannot base the entire market on your own needs. Me, I need a minimum of 1TB, but I’m not yelling at Apple to make 1TB the minimum

Neither am I, although you'd be surprised how cheap a 1 TB SSD is today.
 
No - Intel's problem is that they have been "too big to fail" and able to succeed through politics rather than innovation ever since IBM "made" them by picking their kludgey stopgap 8/16-bit 8088 for the original IBM PC. …
Don't sugar coat it like that, kid, tell 'em straight. :)
 
He's comparing the M2 Max to the M3 Pro. Why?

Why does the article say M2 Pro vs. M3 Pro? I just checked the tweet, and unless I got something wrong, he compared the M2 Max to the M3 Pro.
 
I'm gonna wait a couple of years, save up my money and buy an M5. The BMW, not the Mac. This M1 MacBook Air I'm using right this minute is way more powerful than I need. My 2013 Volvo S60, not so much.
 
I am hoping Apple did this so that they can put the Pro in the 15-inch Air. That would be a great machine.
The Air won't get the Pro chip, and it won't need it, as the base M3 beats the M1 Pro and is nearly equal to the M2 Pro.

This will likely be Apple's practice going forward: the iMac and MacBook Air will get the base chips (even as they continue to get more powerful), while the Pro and Max chips will be for the MacBook Pros and the Mac Studio.
 
If Apple was going to start limiting certain aspects of their chips generation by generation, they should have done so from the start.

The SSD speeds, the memory bandwidth, the number of cores. Then its customers would not have known they were being limited on purpose, or at least it wouldn't have been as obvious. Less backlash.

For now, M1 and M2 users should be pretty happy that their systems weren't limited, or at least not as limited, compared with the M3. I'm glad they didn't go that route for M1 and M2, for the most part.

Apple isn't "limiting" anything. They've made a machine that is faster and more efficient, at the same base price point, than the previous one. In the end, the overall performance counts, not the design decisions made to get there.

As a group, Apple's M-series computers are impressively fast and power efficient, and they've been upgraded significantly since they were introduced three years ago. No one is going to give a damn about the finer points of the specs, because these machines have already proven themselves in real-world applications. M3 will be no different.

Exactly. Most users don't care about the specs, they just want a machine that does what they need at a price and performance level they feel is a good value.

8GB isn't a disaster for the average user on a Mac, I totally agree. 256GB of base storage, however, is a spit in the face, whether Mac or PC.

I disagree. Unless you are saving large amounts of video or massive numbers of photos, most users never get near filling a 256GB drive.

Anecdotally, I have 20+ years of client work on my drive, and the only time I ever need more than 256GB is when I decide to take over 50 ripped videos with me on a trip (most of which never get watched anyway). An external drive would work just fine in such a scenario.
 