If Intel had shown this level of competitiveness, commitment, and focus, say 5-6 years back, Apple would still be using Intel processors in its Macs. A good outcome of the birth of Apple's M-series processors is that it lit a fire under Intel. Having lost Apple's business, and with AMD coming after its market share fast and furious, Intel will have to pull a rabbit out of its hat.
Bottom line: Apple will be ahead by the time Intel comes out with those processors in late 2023/24. Moreover, Intel's history of meeting its announced deadlines is not encouraging. One big difference from the past is that Intel and Apple will be using the same TSMC fab process.
 
Well, making a CPU faster isn't that hard if you don't care about power efficiency.
It's not as easy as you might think. Apple can't just increase the clock to >5GHz if their architecture and manufacturing process isn't designed for it. The frequency scalability of Intel's current CPUs is ahead of everyone else's. Not even AMD can match it. Also, the power efficiency of the mainstream mobile Alder Lake parts isn't that far behind Apple. Saying that they "don't care about power efficiency" is hyperbole. They are still hobbled by the stumbles in their manufacturing process, which is why they are going to use TSMC for some of their CPUs.
 
Claiming to be faster than a 2+-year-old processor is a lot like my phone company telling me "20 Mbps DSL is 100 times faster than dialup!" Well, yeah, but...
 
I’ve been developing (in my spare time) on Linux on ARM since 2016, in the form of Raspberry Pis (don't mistake them for toys, they can do quite a lot for their size, but they’re specifically aimed at being small and cheap). And Linux on ARM wasn’t some new experimental thing when I got there; it had been in wide use for years.

I'm aware that ARM is widely used in phones, calculators, small devices, etc. True enough. Cards on the table, though: my thoughts are purely on the future of x86. Now, I bought the hype for the M1. (Literally! My wife and I each use one!) And I was curious, from an IT infrastructure side, what this means. I saw the coming of the M1 as the possible sunset of x86. Or that it should be. Wasn't Nvidia in a hurry to begin their ARM transition, for instance?

Everyone responding to this throws back their heads and laughs that "ARM has been in use for yeeeears!" I cock my head and say "really?" Where I'm coming from is that Microsoft's ARM endeavor was half-done and essentially DOA. Until the M1, you didn't have major desktop software, such as Adobe's, being compiled for ARM. I know Microsoft had compiled something for ARM, as had Chromium, but clearly the M1 made ARM versions of major software widespread rather than niche.

I maintain the M1 was the biggest blow to x86 in decades. Not that anyone is disagreeing with me on that. But my point was that it made a lot of everyday x86 workflows that people rely on look suddenly replaceable.

What I'm thinking about is how that will affect standard desktop development for everyday applications. A lot of the software that runs everyday life is either Windows- or Linux-based. (In my own experience, with specialized medical equipment.) That's because it's expensive to develop something boutique, so if you can code an application to run as an .exe on an x86 chip, you're going to stick with that. If you can modify an existing Linux distro so your software just runs on x86, you will do so.

If x86 can actually compete that well, the Apple Silicon approach becomes just Apple's approach. And Wintel staying with x86 means software developers will know they can focus exclusively on x86 for everything that runs work apps in a workplace/school/hospital, etc.

Forget ARM; Microsoft is only now porting Visual Studio to 64-bit. In the time Apple has moved from PPC (32-bit/64-bit) to Intel (32-bit/64-bit) and now to Apple Silicon (64-bit), Microsoft hasn't managed to port its own flagship IDE to 64-bit. Of course they haven't supported ARM; they don't even fully support the 64-bit instruction set introduced over fifteen years ago! (Windows XP 64-bit came out in 2005.)

Good gravy. I'm all for dunking on Microsoft. But the PC world is a Microsoft world, with Linux dev as the remoras. That sounds like I am insulting Linux. I am not. I'm very pro Linux. Forgive the metaphor. It's more about who shoves the most water around and who has to cling on. New PC equipment has to run Microsoft well first and foremost, and it's up to Debian to follow through.

My bias is that I would like a more open space in IT. If I do want something as an alternative to Apple equipment, such as a Framework laptop, I expect I will be loading an x86 version of Linux on it for many versions to come if x86 continues to get hacked to compete. I was hoping we'd see ARM open up the desktop software space to compete directly with Apple Silicon. But maybe just not yet.
 
As said elsewhere, Intel's history is filled with delays, unfulfilled promises, and wrong statements. I don't believe them any more. Remember the Intel Optane 3D XPoint fiasco? It was going to be 1,000 times faster than current SSDs, yet in practice it is only marginally faster on random reads/writes, requires a lot of energy, is not available in large sizes, and is very expensive.
 
The roadmap also says that Intel will utilize TSMC's 3nm process. Apple currently utilizes the 5nm process for its latest chips and is expected to adopt the 3nm chip architecture in 2023 with the "M3" Apple silicon chip and A17 chip in the iPhone 15.

This will be an interesting time. Process parity means we'll see a better comparison of the architectural directions each company has taken. I'll also be curious to see if Intel continues to have trouble getting product out on schedule even when they don't have process issues to point to.
 
If that’s the case, it’s an utterly dumb statement, as M3 is a completely unknown quantity at the moment.
Is it? Considering the M-series is simply a beefed-up A-series, which has been through many iterations, it's not that hard to take a decent educated punt at what the M3 will be capable of.
 
Except Intel has been moving at a glacial pace for almost a decade. Their manufacturing was a generation ahead of everyone else, now they're behind. It would be very welcome news if they turned this around.

Of course, during that decade, their roadmaps were always promising. It just was delayed again and again.
Yes, but they did have a wake-up call, and cleaned out their CEO and much of their management for a fresh start. The sleeping giant got poked. Like everyone, I'll be waiting to see what happens, but I suspect things are vastly different now.
 
Everyone responding to this throws back their heads and laughs that "ARM has been in use for yeeeears!" I cock my head and say "really?" Where I'm coming from is that Microsoft's ARM endeavor was half-done and essentially DOA. Until the M1, you didn't have major desktop software, such as Adobe's, being compiled for ARM. I know Microsoft had compiled something for ARM, as had Chromium, but clearly the M1 made ARM versions of major software widespread rather than niche.

Microsoft's was half-done; they didn't commit to it and tried to push Windows RT as the strategy. Microsoft's consistent value over the years has been the ability to run your earlier Windows apps on later versions, and their ARM strategy tried to move away from that with technology that wasn't mature. Microsoft tried to one-up Apple with one operating system for both tablet and desktop, which failed because the need to rewrite for ARM meant the apps weren't there, and the emulation at launch wasn't sufficient to handle the use cases. Microsoft looked at Apple's delineation between iOS and macOS, put it down to a technical problem, but didn't address the human factors: customers told the device ran Windows expected it to run Windows apps like any other device. Microsoft's strategy and marketing let it down, in addition to under-investing in the transition.

That sets up a juxtaposition with Apple's strategy of keeping iOS and iPadOS as platforms distinct from macOS, with different interface expectations and capabilities. Apple worked on Catalyst to let mobile apps come to the desktop but didn't push the other way around (Microsoft's failing). They keep the OSes independent while removing the barriers to mobile apps running on the desktop. A not-dissimilar strategy is Google permitting Android apps to run on Chrome OS devices.

When Apple did (finally) do ARM on the desktop (and seriously, I feel I'd been expecting this shift for half a decade or more), they handled the platform-compatibility problem significantly better than Microsoft on two fronts. The first is the most obvious: they built enough support into Apple Silicon to efficiently translate x86-64 code under the Rosetta branding, with performance good enough that users coming from the earlier Intel machines would barely notice the drop (it also helps that they had already killed support for 32-bit apps in Catalina). The second is that Apple had already built an easy way to make universal binaries and libraries, turning Apple Silicon support into an extra checkbox. This builds on their last platform shift, from PPC to Intel, where they did much the same: old and updated apps ran side by side, and updated apps ran native code wherever they ran. Microsoft has something similar with .NET, but anything that ships native code means end users need to know their architecture and pick the right download. Together these made the transition smooth: from day one most of the old Intel apps just worked, and the incremental work to add ARM support mostly depended on how long your dependencies took to update and on rewriting any x86-specific code portions.


What I'm thinking about is how that will affect standard desktop development for everyday applications. A lot of the software that runs everyday life is either Windows- or Linux-based. (In my own experience, with specialized medical equipment.) That's because it's expensive to develop something boutique, so if you can code an application to run as an .exe on an x86 chip, you're going to stick with that. If you can modify an existing Linux distro so your software just runs on x86, you will do so.

To be honest, I think the shift will be that the desktop OS increasingly falls away, with Android taking over the cheap consumer platforms and integration with other hardware done through there, or through dedicated Linux installs. That's already increasingly the case, with control systems leveraging Linux to handle the more complicated use cases.


Good gravy. I'm all for dunking on Microsoft. But the PC world is a Microsoft world, with Linux dev as the remoras. That sounds like I am insulting Linux. I am not. I'm very pro Linux. Forgive the metaphor. It's more about who shoves the most water around and who has to cling on. New PC equipment has to run Microsoft well first and foremost, and it's up to Debian to follow through.

Your statement was that Linux has to follow the Wintel market, which might be true if you consider desktop PCs alone, but the world has changed. Linux on ARM isn't obscure: Samsung alone sells almost as many Linux-on-ARM devices as the entire PC industry shipped in 2021. That reflects a pivot in the dominant platform, following where we actually do our business. Increasingly apps aren't written for Windows; they're written for the web, iOS, and Android. Even on the desktop, Microsoft has been losing market share, apparently now at 75%, down from 80% in 2020 and 85% in 2019. The server space is more interesting still, with more than 50% of Microsoft's Azure platform running Linux machines, while both Google and Amazon are almost entirely Linux deployments (including their hypervisor tiers). Most of the top tech companies run Linux as well.

In bringing up Debian's ARM support since 2000, my point is that the Linux folks have treated porting to other architectures as routine for a long time now. They don't put all their eggs in the x86 basket; they continue to support multiple ISAs. The open-source world has the advantage that, with all the source code available, one can readily recompile for different platforms and fix issues as you go. If anything, the Linux world is leading the way in multi-platform support; it doesn't need to follow through, it's already there.
 
This is great news! Given the screwy world political situation at the moment, there’s the distinct possibility that TSMC may not be an option for Apple in the future. If Intel can make ultra efficient chips that rival Apple’s, that could be a future source for Mac computers. Intel is making major investments in U.S. facilities possibly making them a more attractive option for the future.
 
It doesn't matter what Intel's roadmap says. What matters is what actually ships, and when.
That's probably why the rumor suggests they are using TSMC's 3nm process, which is the same process Apple is expected to use for the M3.
 
This is great news! Given the screwy world political situation at the moment, there’s the distinct possibility that TSMC may not be an option for Apple in the future. If Intel can make ultra efficient chips that rival Apple’s, that could be a future source for Mac computers. Intel is making major investments in U.S. facilities possibly making them a more attractive option for the future.
Except Intel will be using TSMC as well.
 
In two years, Intel will release a chip that is faster than a chip that will then be three years old. For context, Intel hasn't executed to plan on a two-year roadmap in half a dozen years. Mark me down as not impressed?
 
That's probably why the rumor suggests they are using TSMC's 3nm process, which is the same process Apple is expected to use for the M3.
Intel is moving to a "chiplet" approach, and some of those chiplets are expected to be fabbed on TSMC's 3nm, but certainly not all; exactly which remains to be seen.
 
But Intel does actually manufacture their own chips, Apple doesn’t, so as I said…

Yes, but “Intel's recent investor day roadmap stated the use of 'Intel 20A' and 'TSMC N3' for the Arrow Lake CPUs so it looks like the Compute tiles which include the CPU and the GPU are going to utilize an external foundry node from TSMC while certain SOC/IO IPs will be relying on Intel's own 20A node”
 
But Intel does actually manufacture their own chips, Apple doesn’t, so as I said…

Intel does manufacture many of their own chips, but per the article:
The roadmap also says that Intel will utilize TSMC's 3nm process.

In this case, TSMC will be producing these chips just as it produces Apple's. That tells me Intel doesn't think it will have its own process where it needs to be by then.
 
Just because something is on a roadmap doesn't make it true or even plausible. I have my own roadmap showing that I'll have magical powers by 2023 (or, conservatively, by 2024).
 