Let’s see how the chip fares when it gets released later this year. My guess is that the claims, like many from PC manufacturers, are grossly inflated.
AMD is now releasing their 16-core Zen 4 chips in laptops. Apple will be far behind there.

PC laptop performance is increasing 80 percent this year. Apple needs the M2 Pro to be 80 percent faster to keep up. Won't happen.
 
Backwards compatibility will always be a structural limitation of x86-based CPUs. For every step forward they take, they have to make sure old 16-bit instructions can still be decoded alongside 32- and 64-bit ones. That complexity can only be reduced by breaking backwards compatibility, and neither AMD nor Intel wants to be the first to say, "My chip isn't compatible with what you do today."
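To make the decode-complexity point concrete, here's a toy sketch; the length rule below is made up purely for illustration and is not real x86 or ARM encoding:

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Toy illustration only; these are NOT real x86 or ARM encodings. */

/* Fixed 4-byte instructions (ARM-style): instruction n starts at 4*n,
 * so a wide decoder can locate many instructions in parallel. */
static size_t fixed_start(size_t n) { return n * 4; }

/* Hypothetical length rule standing in for x86's prefix/opcode rules. */
static size_t insn_len(const uint8_t *code) { return (*code & 0x80) ? 3 : 1; }

/* Variable-length instructions (x86-style): where instruction n starts
 * is only known after measuring instructions 0..n-1, so the decoder
 * must walk them serially (or speculate and discard wrong guesses). */
static size_t variable_start(const uint8_t *code, size_t n) {
    size_t off = 0;
    for (size_t i = 0; i < n; i++)
        off += insn_len(code + off);
    return off;
}

int main(void) {
    const uint8_t code[] = {0x81, 0x00, 0x00, 0x05, 0x90, 0x00, 0x00, 0x00};
    printf("fixed:    insn 2 starts at byte %zu\n", fixed_start(2));
    printf("variable: insn 2 starts at byte %zu\n", variable_start(code, 2));
    return 0;
}
```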
That's better than not being able to run any of my Mac apps anymore. Apple killed my x86 compatibility, my Boot Camp, and my 32-bit app compatibility. I now have a giant library of 32-bit Steam games for Mac, not one of which I can play. Can't play the newer games either without Boot Camp. So they wiped out my 32-bit and my new 64-bit Windows apps in one swoop.

Ticks me off actually. I'd like to see Boot Camp for the M2 Pro before I commit to buying another Mac product. I'd much rather have a MacBook Air with this Ryzen chip and Boot Camp than the M2 we have now.
 
AMD is now releasing their 16-core Zen 4 chips in laptops. Apple will be far behind there.

PC laptop performance is increasing 80 percent this year. Apple needs the M2 Pro to be 80 percent faster to keep up. Won't happen.
Is an 80% improvement in speed that impressive if users almost never end up benefiting from it?

The unique value proposition of the M1 chip so far is that it offers long battery life and sustained performance even when your laptop is not plugged in to an external power source.

With that many cores, it sounds like these processors will either need lots of thermal headroom (meaning thick and bulky laptops), or they throttle all too quickly. We also don't know the impact this has on battery life.

I am willing to bet that these advertised paper gains will not translate into significant real-world benefits.
 
as long as you love a broken ecosystem 😀
Ecosystem? I'd rather use third-party services that work across any device than services that only work with one company's products. Using Dropbox, Google Photos, Prime Video, etc. allows me to switch between devices with ease: Windows, macOS, iPhone, Android, etc. Now THAT is an ecosystem to me.
 
Ecosystem? I'd rather use third-party services that work across any device than services that only work with one company's products. Using Dropbox, Google Photos, Prime Video, etc. allows me to switch between devices with ease: Windows, macOS, iPhone, Android, etc. Now THAT is an ecosystem to me.
Then great for you.
To me, Apple's ecosystem is far more integrated than anything out there... and the hardware is nothing but amazing: build quality, software stability and updates for years, design language, etc.
 
Then great for you.
To me, Apple's ecosystem is far more integrated than anything out there... and the hardware is nothing but amazing: build quality, software stability and updates for years, design language, etc.
You have to understand that while you may see that as a good thing (which I'm sure Apple Marketing would love to see), others may see that ecosystem as a 'walled garden' meaning once you're in, it's difficult to get out. YMMV.
 
Is an 80% improvement in speed that impressive if users almost never end up benefiting from it?

The unique value proposition of the M1 chip so far is that it offers long battery life and sustained performance even when your laptop is not plugged in to an external power source.

With that many cores, it sounds like these processors will either need lots of thermal headroom (meaning thick and bulky laptops), or they throttle all too quickly. We also don't know the impact this has on battery life.

I am willing to bet that these advertised paper gains will not translate into significant real-world benefits.
Agreed... every year the PC world promises these fantastic gains, and most of them turn out to be half-truths or broken promises. Laptops gaining 80 percent sounds far-fetched; just paper gains.

All I see is the PC CPU makers now including Apple in their comparisons more and more…
 
That's better than not being able to run any of my Mac apps anymore. Apple killed my x86 compatibility, my Boot Camp, and my 32-bit app compatibility. I now have a giant library of 32-bit Steam games for Mac, not one of which I can play. Can't play the newer games either without Boot Camp. So they wiped out my 32-bit and my new 64-bit Windows apps in one swoop.

Ticks me off actually. I'd like to see Boot Camp for the M2 Pro before I commit to buying another Mac product. I'd much rather have a MacBook Air with this Ryzen chip and Boot Camp than the M2 we have now.
Yes, you're describing exactly WHY Apple has the performance-per-watt advantage they have AND will maintain it. No overly complex decoder/scheduler taking up valuable space and wasting valuable energy, and no spending cycles processing known inefficient code. It leaves legacy applications behind, but those buying a Mac for the first time, who don't have any legacy apps, gain a serious benefit.

For anyone who places "legacy code execution" fairly high on their list of requirements, I'd tell them to do like you and not pick up anything Apple makes.
 
I could see Apple investing a comparable amount of money into chip design as AMD. Back in 2011, they had a thousand people working on chip design, and in 2019 they picked up 2,200 people from Intel to do wireless chip work. AMD has around 12,000 employees, and not all of them are likely to be chip designers, given that number also has to cover finance, marketing, investor relations, security, software development, and more. It's not inconceivable that Apple's head count for silicon is not too far off AMD's actual number. I agree nothing guarantees it, though.

It's also useful to put in perspective that Apple probably sells more chips than AMD does when you tally up the iPhone, iPad, Apple Watch and Mx series devices they sell per year.

Versus Intel, it's harder to say, since Intel has so much noise because of their much wider portfolio.
I have no idea who spends more on R&D. Like you, I could see Apple investing a comparable amount to AMD. Not so sure about Intel. My gut tells me they spend more.

Ultimately, though, my point was just that no processor architecture has scaled consistently for decades. Everything hits a wall. Apple has switched processors three times because of this. The same kind of people (maybe even the same people!) saying Apple Silicon is invincible and everything else is trash were saying the exact same thing about PowerPC at one point.
 
Apple has the advantage of needing to design chips only for their unique use cases. In this case, they are able to prioritise battery life, sustained performance when not plugged in to an external power source, and better thermals.

This also means that Apple is willing to cede certain markets, like Intel’s emphasis on absolute performance, because Apple has no products designed to work under such conditions.

I feel it’s this combination of design, hardware, OS, software, and ecosystem integration that give Macs their unique advantage. As such, I don’t feel it is very meaningful to say “oh, this alternative beats the Mac in one area under very specific situations, therefore Apple is doomed”.
I agree with your assessment. That said, I see a lot of hubris among Apple fans when Apple Silicon is discussed, the same kind of hubris Microsoft fans exhibited in the 90s. My point was just that no processor architecture has managed to keep scaling and deliver impressive performance gains generation after generation. At some point, everything seems to hit a wall. I've been an Apple customer for 40 years. I've been through multiple architecture changes for this very reason. Anyone who thinks Apple will maintain its performance lead forever, or who writes off Intel and others, is foolish.
 
You have to understand that while you may see that as a good thing (which I'm sure Apple Marketing would love to see), others may see that ecosystem as a 'walled garden' meaning once you're in, it's difficult to get out. YMMV.
Sure, as long as you have tech skills you can leave pretty much anytime you want, within a week... I already left once and returned.

IMO, "walled garden" is a smoke screen: it really means integration, while an "open garden" means fragmentation. Every company implements one or the other in some way and wants to "lock" you in; it all depends on the degree.
 
You got proof or just spreading FUD? Get this nonsense out of here.

But instead, no, y’all gotta attack, when this should be a great thing that drives competition and doesn’t allow Apple to sit around and make idle improvements.
Considering that's been the industry's problem since forever, and since there's no mention they've come up with another technology or solution for getting rid of CPU heat, he's more than likely NOT wrong!
 
I agree with your assessment. That said, I see a lot of hubris among Apple fans when Apple Silicon is discussed, the same kind of hubris Microsoft fans exhibited in the 90s. My point was just that no processor architecture has managed to keep scaling and deliver impressive performance gains generation after generation. At some point, everything seems to hit a wall. I've been an Apple customer for 40 years. I've been through multiple architecture changes for this very reason. Anyone who thinks Apple will maintain its performance lead forever, or who writes off Intel and others, is foolish.

Those architecture changes happened for different reasons.

68K to PowerPC happened because Apple found PowerPC to be a more compelling architecture for the future than 68K. The 68060 existed, but Apple, one of its biggest potential customers, wanted something more forward-looking.

But over time, IBM's and Motorola's interests diverged too much from Apple's. And moving to x86 came with other advantages, such as better Windows support. So Apple moved to x86, which had just seen a renaissance due to the Pentium M.

Then, for a variety of reasons (but mostly complacency), Intel did poorly in the late 2010s. And x86 mattered less. Meanwhile, Apple had done really well on their mobile CPUs since the late 2000s. So Apple moved again.

These are all quite different. Part of the reason Apple moved away from PowerPC was that they didn't, at the time, hold much weight against IBM and Motorola to compel them to make good desktop and laptop CPUs. This is very different from today.
 
Those architecture changes happened for different reasons.

68K to PowerPC happened because Apple found PowerPC to be a more compelling architecture for the future than 68K. The 68060 existed, but Apple, one of its biggest potential customers, wanted something more forward-looking.

But over time, IBM's and Motorola's interests diverged too much from Apple's. And moving to x86 came with other advantages, such as better Windows support. So Apple moved to x86, which had just seen a renaissance due to the Pentium M.

Then, for a variety of reasons (but mostly complacency), Intel did poorly in the late 2010s. And x86 mattered less. Meanwhile, Apple had done really well on their mobile CPUs since the late 2000s. So Apple moved again.

These are all quite different. Part of the reason Apple moved away from PowerPC was that they didn't, at the time, hold much weight against IBM and Motorola to compel them to make good desktop and laptop CPUs. This is very different from today.
Each time Apple has switched processor architecture, it has been for performance reasons. The backstory might be different each time, but the need has always been the same. Every processor architecture hits a wall at some point, or at least loses its momentum generation over generation. Each time Apple has switched architectures, it has been to a new architecture that promises better performance over a stagnating one.

Yes, the 68060 existed, but Apple wanted something more forward-looking, as you say, and PowerPC promised a better roadmap, better performance, etc. How is that different from the Intel-to-AS switch? PowerPC promised better performance over the legacy 680x0 series, just like AS promises (and currently delivers) better performance over the stagnating x86 architecture. When PowerPC hit a wall and it became clear the G5 would never work in a laptop... they switched again. Same underlying reason. The architecture was stagnating and another architecture promised better performance.

I don't see either of the previous switches being any different from the switch to AS, except that this time around it's Apple designing the chips, which no doubt gives them a number of advantages.
 
I don't see either of the previous switches being any different from the switch to AS, except that this time around it's Apple designing the chips, which no doubt gives them a number of advantages.
One of those advantages being that they can drastically change the underlying infrastructure to whatever provides the performance envelope they desire at a given time. It's not like Apple could have gone to Intel and asked them to remove the decoder section because "our OS will ONLY be dealing with 64-bit instructions going forward and removing that will make the chips you make for us SO much more efficient."

At any point, if it's determined that any "wall" is on the horizon, the hardware folks have direct access to the software folks and they can work together to materially alter how the CPU functions and still ensure that the compiled code is as performant or more (see Apple's bitcode).

Apple Silicon at this point just means "the chips Apple produces". Other than being compatible with the ARM instruction set, what "Apple Silicon" IS can and I believe will change such that it'll be as performant as Apple needs it to be at a given time. They won't so much switch from "Apple Silicon" to "something else"; the "something else" will just be what Apple calls "Apple Silicon" at that time.

I think of Apple’s architecture changes as “what’s the best platform for macOS” and actually having the flexibility to make the move as the times and technology allows. For all other vendors their choices are x86 or… another slightly different x86. Going forward, Apple’s choices will be “silicon custom designed and tailored to provide the best performance for the OS that’s been designed to run on it” or “general purpose CPU”.
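To make the "just recompile for the new target" idea concrete: most application code is already architecture-agnostic, and on macOS a single source file can be compiled into one universal (fat) binary covering both ISAs. A minimal sketch (the file name is just an example):

```c
/* hello.c: build as a universal (fat) binary on macOS with
 *   clang -arch arm64 -arch x86_64 -o hello hello.c
 * The same source serves Apple Silicon and Intel Macs; the
 * loader picks the matching slice at launch. */
#include <stdio.h>

int main(void) {
#if defined(__aarch64__)
    puts("Native ARM64 slice");
#elif defined(__x86_64__)
    puts("x86-64 slice (native on Intel, or via Rosetta 2)");
#else
    puts("Some other architecture");
#endif
    return 0;
}
```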
 
One of those advantages being that they can drastically change the underlying infrastructure to whatever provides the performance envelope they desire at a given time. It's not like Apple could have gone to Intel and asked them to remove the decoder section because "our OS will ONLY be dealing with 64-bit instructions going forward and removing that will make the chips you make for us SO much more efficient."

At any point, if it's determined that any "wall" is on the horizon, the hardware folks have direct access to the software folks and they can work together to materially alter how the CPU functions and still ensure that the compiled code is as performant or more (see Apple's bitcode).

Apple Silicon at this point just means "the chips Apple produces". Other than being compatible with the ARM instruction set, what "Apple Silicon" IS can and I believe will change such that it'll be as performant as Apple needs it to be at a given time. They won't so much switch from "Apple Silicon" to "something else"; the "something else" will just be what Apple calls "Apple Silicon" at that time.

I think of Apple’s architecture changes as “what’s the best platform for macOS” and actually having the flexibility to make the move as the times and technology allows. For all other vendors their choices are x86 or… another slightly different x86. Going forward, Apple’s choices will be “silicon custom designed and tailored to provide the best performance for the OS that’s been designed to run on it” or “general purpose CPU”.
You bring up some good points. However, let's not forget that other custom architectures (e.g., SPARC, MIPS) have existed, and they all faced obstacles. Most of them are gone today. Apple being in control no doubt gives them a lot of advantages, but it doesn't guarantee anything. That said, I think we're in a very different era in general. Whereas most of yesterday's devices used "general purpose" CPUs, mostly from Intel, a huge percentage of today's devices use custom chips (mostly ARM-based). With the way software development has changed too, the underlying processor architecture isn't as important as it once was. All of that said, I stand by my original point, which was simply that it's foolish to assume Apple's impressive performance gains will continue forever or that Intel won't be able to compete.
 
Each time Apple has switched processor architecture, it has been for performance reasons.

Steve's main argument for switching to Intel was "performance per watt". It's not that the G5 was particularly bad at desktop performance; it's that IBM wasn't interested in making it efficient enough for laptops.

Things are quite similar now: yes, AMD and Intel can make high-performance CPUs. But Apple is currently king in efficiency, which serves most of their line-up better.

Every processor architecture hits a wall at some point, or at least loses its momentum generation over generation.

Maybe, but I'd argue the reasons matter. Is it that the vendor stopped being interested in the market segment? (See: IBM and Motorola mostly leaving desktops and laptops behind in the early 2000s; IBM focused on servers, and Motorola/Freescale on embedded devices.) Is it that they grew complacent? (See: Intel, ca. 5 years ago.)

One might worry that Apple grows complacent because they're sufficiently ahead.

I don't see either of the previous switches being any different from the switch to AS, except that this time around it's Apple designing the chips, which no doubt gives them a number of advantages.

Right, if Apple had been designing their own chips as part of the A-I-M alliance back then, then maybe things would've been different. But they had neither the money nor the interest in doing so. Now they do.
 
Steve's main argument for switching to Intel was "performance per watt". It's not that the G5 was particularly bad at desktop performance; it's that IBM wasn't interested in making it efficient enough for laptops.
Wasn't interested? Or wasn't able? Do we really know?

Things are quite similar now: yes, AMD and Intel can make high-performance CPUs. But Apple is currently king in efficiency, which serves most of their line-up better.
Agreed.

Maybe, but I'd argue the reasons matter. Is it that the vendor stopped being interested in the market segment? (See: IBM and Motorola mostly leaving desktops and laptops behind in the early 2000s; IBM focused on servers, and Motorola/Freescale on embedded devices.) Is it that they grew complacent? (See: Intel, ca. 5 years ago.)
I think the reasons matter only in regard to the potential solution. I don't personally believe that lack of interest in a particular market segment was the sole reason for the PowerPC decline. I think they had some design/engineering issues that ultimately made the PowerPC architecture less suitable for desktop/laptop in the long run. Switching to Intel was an acknowledgement of that fact. Intel might have grown complacent in the past few years, but that's a solvable problem. Architecture design problems are much harder to solve.

One might worry that Apple grows complacent because they're sufficiently ahead.
That's one possibility. There's also the possibility that someone else simply builds a better chip.

Right, if Apple had been designing their own chips as part of the A-I-M alliance back then, then maybe things would've been different. But they had neither the money nor the interest in doing so. Now they do.
Agreed. And we're in a very different era of computing. Back in the PowerPC days, there were a lot of custom architectures on the fringes, like SPARC and MIPS, whereas mainstream consumer devices were all powered by general purpose third party CPUs. That has changed. Even Microsoft is designing CPUs now. Who would have imagined that even 10 years ago? Code is also a lot more portable these days, making the underlying architecture less relevant. Apple has always wanted to build the whole widget and they've certainly had an impressive run to date, but I certainly wouldn't dismiss Intel and AMD.
 
Wasn't interested? Or wasn't able? Do we really know?

I don't see why they wouldn't have been able to.

The history of PowerPC, though, is that IBM wanted it in part to get more volume on the POWER architecture (for servers), but also in part for their desktops. That latter part crumbled rather quickly; OS/2 was dying, and running Windows on PowerPC was only briefly even possible. It never caught on. Motorola wanted it in part as a 68K replacement (that worked, but there were far fewer customers at this point; for example, Amiga and Atari were dying platforms, and NeXT moved from 68K to x86 instead of to PowerPC), in part to do their own computers (that worked especially well when Apple allowed clones), and in part for embedded. The embedded stuff lived on at Freescale and eventually NXP, but hasn't seen much love since either.

But if things had happened as originally hoped for, all three would've made PowerPC "PCs" throughout the 1990s and 2000s, and others would've licensed PowerPC as well. That would've provided enough volume to give Motorola and/or IBM an incentive to make an efficient fifth-generation PowerPC laptop chip. Instead, Motorola's G5 project (the 8x00) became embedded-only, and IBM's G5 project (the 970) was mostly focused on workstations.

I don't personally believe that lack of interest in a particular market segment was the sole reason for the PowerPC decline. I think they had some design/engineering issues that ultimately made the PowerPC architecture less suitable for desktop/laptop in the long run.

I'm not a chip engineer, but my understanding is that, on paper, PowerPC is far better suited than x86, which at this point is a bunch of hacks on top of each other. Intel wanted to replace it long ago, e.g. with the 860 and later the Itanium.

I haven't found someone who can explain to me why ARM would be inherently better suited for what Apple is doing than PowerPC. Instead, I imagine it's purely an accident of history: Apple was one of the founding partners anyway, so they had an architecture license; on top of that, producing ARM at scale was already very common in the early 2000s, so they could easily ask Samsung to make the original iPhone SoC. Everything else is just history. Apple made an ARM-compatible design with the A6 (the A4 and A5 were largely still Samsung-designed), and bit by bit made it more to their liking. They could've done this with PowerPC as well, but it had since fallen out of favor. (And I imagine their circa-2005 relations with Freescale weren't rosy, given how unhappy they were with later G4 revisions.)

Switching to Intel was an acknowledgement of that fact. Intel might have grown complacent in the past few years, but that's a solvable problem. Architecture design problems are much harder to solve.

Yes, but I don't think there's anyone who would say that x86 is a particularly good architecture design.

That's one possibility. There's also the possibility that someone else simply builds a better chip.

Two sides of the same coin, but yes.

Given how much money and long-term investment you need, I think that version is unlikely. They'd have to outspend Apple and have a lot of patience, and that raises the question: why? This also immediately rules out Qualcomm, because they mostly sell to low-margin customers.

Agreed. And we're in a very different era of computing. Back in the PowerPC days, there were a lot of custom architectures on the fringes, like SPARC and MIPS, whereas mainstream consumer devices were all powered by general purpose third party CPUs.

Right.

That has changed. Even Microsoft is designing CPUs now. Who would have imagined that even 10 years ago?

Well, Microsoft's designing them perhaps to the level Apple's A4 was Apple-designed. The SQ1 and SQ2 are very similar to Qualcomm's 8cx, and those in turn just use an ARM Cortex design. Apple, meanwhile, doesn't use Cortex at all, and hasn't since the A6.

Qualcomm, Nvidia and Samsung have at times toyed with their own designs, but have mostly gone back to not doing that. Apple is mostly the odd one out. Why? Because they can afford to, and it benefits them.

Code is also a lot more portable these days, making the underlying architecture less relevant.

Yep.

Apple has always wanted to build the whole widget and they've certainly had an impressive run to date, but I certainly wouldn't dismiss Intel and AMD.

Neither would I. But take any marketing claim like this with a bucket of salt. Really, what they're saying is "if you make the laptop thicker and add a louder fan, we're faster". To which Apple probably internally responds with: "yeah, we could do the same, but we really don't want to".
 
They did. Second picture. Showed AMD faster.
I should have said, "They should have shown only the M2 as a reference."
By showing the M2 and M1 Pro, they left themselves open to the issues I mentioned. My first reaction to it was: "Why are you not showing the M1 Max? You can get those. Or even the Ultra? Why did they pick the M1 Pro and then the M2, while also not mentioning what the power levels are to meet or, in this case, beat them? Plugged in, on battery, etc."

I mean, lots of folks complained about Apple's graphs when the M1 came out. Then they tested them, and it more or less showed them to be pretty darn accurate. Again, within the power (performance-per-watt) envelope.
Everything Lisa showed could be at the full 45 watts, with the exception of the 30+ hour battery life; that could be on the 15-watt side. Whatever; it's not like their graphs have shown them to be underperforming against, say, Intel or Nvidia when they launch new products. A lot of the time, they are pretty close or better, and CHEAPER.
And I would be very fine with that same logic applied to the M1/M2. They should be proud of this achievement, even if it is against a year-and-a-half-old M1 Pro. Intel isn't going to be in this power envelope for another year or so.
 
That's better than not being able to run any of my Mac apps anymore.
Did something happen to your previous Mac?
Apple killed my x86 compatibility, my Boot Camp, and my 32-bit app compatibility.
Rosetta works pretty well. Stating they "killed" x86 compatibility isn't accurate.
Boot Camp, sure. But they were under no obligation to provide it. It was a nice-to-have because they had Intel chips. They don't anymore. That doesn't break your previous Mac's ability to work. I have a Mac Studio at work and an iMac Pro at home. Both work fine. I even run Windows 11 ARM on my Mac Studio. It works VERY well for my needs. I'm not a gamer, so if that was what you had it for, then I would have advised against it. But, again, you can still keep working with your original Intel Mac.
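For what it's worth, you can check whether a given process is running natively or translated; Apple documents a sysctl for exactly this. A minimal sketch:

```c
#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if this process is running translated under Rosetta 2,
 * 0 if native, -1 if the sysctl doesn't exist (e.g., on Intel Macs). */
static int running_under_rosetta(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
        return -1;
    return translated;
}

int main(void) {
    printf("proc_translated = %d\n", running_under_rosetta());
    return 0;
}
```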
I now have a giant library of 32-bit Steam games for Mac, not one of which I can play. Can't play the newer games either without Boot Camp. So they wiped out my 32-bit and my new 64-bit Windows apps in one swoop.
You bought a Mac to play Windows games? Even if that is the case, there are plenty of YouTube videos that can help you get an M1/M2 Mac to play Windows games, or even PS1/PS2/PS3/Switch/N64/etc. games as well. It may take some work, but from what I have seen, the results can be pretty good.
Ticks me off actually. I'd like to see Boot Camp for the M2 Pro before I commit to buying another Mac product.
Would you rather have Intel, slowly coming out with new processors that are barely faster than the previous model?
AMD is only just getting better than the M1 Pro, and only within that performance-per-watt range. Boot Camp is not coming back; it was a feature they could only provide because they ran Intel chips. That ship has sailed. You can run much of the older Mac (Intel) software with Rosetta. If you bought strictly for gaming, that was a mistake, straight up.
I'd much rather have a MacBook Air with this Ryzen chip and Boot Camp than the M2 we have now.
That Ryzen chip would melt the Air. It wouldn't be possible without downclocking it. The M1/M2 chip is literally cooled with a thin piece of aluminum, not much thicker than foil, and even it can thermally throttle if pushed for too long.
 