Apple is only making chips for Apple notebooks, which means x86 is still very much important for Windows machines. Intel losing Apple as a customer is a big blow, but it is far from a coup de grâce.

Intel should be more worried about AMD at this point, because AMD is making superior silicon for both desktops and laptops.

AMD is a short-term threat. OEMs using Arm with the blessing and support of Microsoft is the greater long-term threat.
 
We can speculate and extrapolate based on Tim Cook's ethos. He likes to control the whole widget, and Apple's own chip design has become more than good enough that there's no need to rely on a third party.
It's not just Tim Cook; Apple's ethos has been heading that way for some time, and I feel comfortable saying it started with Jobs.

Apple isn't a typical "tech" company. No matter how they describe themselves, they are fundamentally a consumer products company that sells a user experience within their ecosystem. It's why they don't really care about the specs of their products, and why being a generation or two behind the bleeding edge isn't a big deal to them. They know the bulk of their customers only care about how the phone (or whatever) works for them and couldn't care less, for example, about the refresh rate of the screen. Like most consumer products companies, relying on a third party for a critical component (in this case the CPU) often doesn't make business sense. You don't see GM buying engines from Ford, for example, and if they do, it's definitely not for their flagship vehicles.
 
I thought Intel said benchmarks don't matter. How strange.
Intel would rather use "real world scenarios," in which the CPUs run different pieces of code: preferably one that is highly optimised for Intel CPUs and Windows, and another running under Rosetta.
These tests are so flawed for comparing CPUs that they can hardly be called benchmarks.
So for Intel, benchmarks really don't matter. At least they're being honest. What they show may amount to something, but it doesn't amount to a benchmark.
 
What I find most fascinating is that Microsoft now wants to build its own ARM-based CPUs and GPUs in house and standardize on them (getting away from x86 entirely for all Microsoft Surface hardware) in 2-3 years. It's an acknowledgement that while, yes, AMD can eke out a bit more efficiency and multi-core performance than Intel, to *really* compete in this space, x86 in general has to go. With Microsoft putting more investment into ARM for Windows, improving the x86 emulation (which has been crap so far but will get better), and pushing developers to start porting Windows apps to ARM, I see x86 as a whole going away much more quickly than people probably expect. Even for gaming, these sorts of chips will enable 8K VR at high frame rates on, say, an all-in-one headset, because the performance per watt finally makes it viable (light, long battery life) versus having some monster tower you have to hook up to.
 

Yep. CISC designs have no inherent advantage in an era where we aren’t hand optimizing assembly code to save every byte of instruction memory. And they have some real disadvantages.
 
Intel is starting to resemble the IBM of the 1990s: arrogant, "fat Roman Empire" syndrome.
In IBM's case, Bill Gates came into their building and laid down his licensing demands, and IBM management said, "well, all the money is in HW." Over the next 15 years IBM began a downward slide while Microsoft went on to conquer the world.
Intel just got fat and lazy being #1 for so long... their future is before them: a long slide down.
 
What I can't understand is how "grown-ups" (Intel and PCWorld) are taking seriously this comparison of top Intel/AMD chips vs. Apple's base M1 chips.

It's like saying I'm as strong as a 10-year-old kid - or even stronger!

Just wait for those bullies to run to their mothers when the M1's older brother comes to town after spring break. What will they say then?
 
They have a license. They used to sell Arm processors (remember StrongARM, which they acquired from DEC?).

The problems, of course, with your suggestion:

1) So what's their value added? Why would people buy an Intel Arm processor instead of an nVidia one (which will presumably trounce Intel in graphics) or a Qualcomm one (which will come with all sorts of radio functionality)?

2) Intel’s business model is selling commodity chips to customers. It doesn’t customize them for the needs of each customer. So how does an Intel chip meant for “everyone” compete with Apple chips, which are customized for Apple’s needs? Doesn’t that mean that, over time, superior Apple machines will encroach into the “rest of the market?”

3) Intel's fabs have been bad for years. Their entire reason for existence in the past was the Wintel alliance and their superior fabs. With the Wintel alliance broken and their fabs a mess, what's their reason for existence? To design chips to be fabbed at TSMC? There are many companies that are more efficient and better at designing chips, so if everyone is using TSMC, why would anyone buy Intel's inferior designs?
Not that I love Intel, but I do work in a (not Intel) fab.
You are confusing chip DESIGN with MANUFACTURING. Those are very, very different things.
For example, nVidia designs GPUs but does not manufacture them; TSMC manufactures GPUs but does not design them.

Process integration (we figure out how to actually make what the design guys drew) is a huge part of making chips at the bleeding edge. A 5nm chip has polysilicon lines less than 25 atoms wide; 3nm will cut that to 15.
Each device is an astounding feat that would have been practically magic even in the early 1960s.
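As a rough back-of-the-envelope check of those atom counts, here is a minimal sketch; it assumes a silicon nearest-neighbour spacing of about 0.235 nm, and the line widths are illustrative rather than vendor figures:

# Estimate how many silicon atoms span a line of a given width,
# assuming ~0.235 nm between neighbouring Si atoms in the crystal.
SI_ATOM_SPACING_NM = 0.235

for line_width_nm in (5.0, 3.0):
    atoms = line_width_nm / SI_ATOM_SPACING_NM
    print(f"{line_width_nm} nm line is about {atoms:.0f} atoms wide")

# Prints roughly 21 atoms for 5 nm and 13 atoms for 3 nm, which is
# broadly consistent with the "less than 25" and "15" figures above.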
 
What I can't understand is how "grown-ups" (Intel and PCWorld) are taking seriously this comparison of top Intel/AMD chips vs. Apple's base M1 chips.

It's like saying I'm as strong as a 10-year-old kid - or even stronger!

Just wait for those bullies to run to their mothers when the M1's older brother comes to town after spring break. What will they say then?
The problem is that the M1 chip is just a start. There are several red flags that a company like Intel must watch out for. A typical _huge_ red flag is someone coming in from nowhere and competing - not (yet) beating you, but taking market share. In the next two years, Apple will take 10% of the market from Intel. And if Apple and Microsoft make a deal, that could be quite fatal for Intel - especially if Microsoft makes Windows on Arm run better than Windows on Intel.
 
Not that I love Intel, but I do work in a (not Intel) fab.
You are confusing chip DESIGN with MANUFACTURING. Those are very, very different things.
For example, nVidia designs GPUs but does not manufacture them; TSMC manufactures GPUs but does not design them.

Process integration (we figure out how to actually make what the design guys drew) is a huge part of making chips at the bleeding edge. A 5nm chip has polysilicon lines less than 25 atoms wide; 3nm will cut that to 15.
Each device is an astounding feat that would have been practically magic even in the early 1960s.
Not sure why you think I'm confusing design and manufacturing. My point was that Intel is bad at design and used to be (but is no longer) good at manufacturing.

I designed many CPUs at Sun, Exponential, and AMD, so I'm well aware of the differences.
 
Hey Intel, the time you took to put together this silly little presentation - you should have spent it on fixing your processors.
 
Well, I’ll give Intel this much - anyone who wasted their time on the PowerPoint in question was certainly not someone who’d be of any use in fixing their manufacturing and design.
 
AMD is a short-term threat. OEMs using Arm with the blessing and support of Microsoft is the greater long-term threat.

I agree with you that this was probably aimed mostly at OEMs considering a switch to ARM, but given the poor quality of these charts, I honestly have to wonder why - if they matter at all, this could even backfire. OEMs considering adding WoA lineups are (or should be) going to do their own intensive cost-benefit analysis, including performance. And these charts have such obvious flaws that even to semi-informed consumers they actually make TGL look worse than it is. Those tasked with making such decisions at OEMs would probably be bemused, if not outright dismissive, of these charts. Uninformed consumers who *might* be influenced are unlikely to come across them, at least not in a context that doesn't cast doubt on them.

I mean, making the hardware inconsistent between charts is especially... obvious.
 

If Intel is no good at making chips anymore, what makes you expect them to be good at making PowerPoints? ;-)
 
Intel should say something like they are excited by the competition and promise people that they have great things coming up and that fans won't be disappointed.
The fans in my Intel Macs are never disappointed. They cheer loudly when I use just about any app other than TextEdit.
 
Agreed, all that matters is real world performance; however, I don't see how Microsoft can tightly integrate software and hardware the way Apple does, so Intel is still very much alive.
The only thing I can see Microsoft doing is creating their own ARM processor and then licensing the processor and Windows together. That would basically price Intel out, because Intel is always expensive AND doesn't come with a Windows license.
 
I'm sorry... I don't remember you calling Apple's rather vague language during the M1 MacBook announcement "carefully crafted"... Intel is being stupid, but let's not be hypocrites here.
Well, to be fair: one, this is a Mac fan website, and two, Apple was trying to be vague on purpose. They didn't want to throw too much shade on Intel - because they are still using Intel. At least, that is the impression I got. Because even for Apple, those charts and comparisons were way too vague.
 
Agreed, all that matters is real world performance; however, I don't see how Microsoft can tightly integrate software and hardware the way Apple does, so Intel is still very much alive.
Microsoft did work with Qualcomm to make a semi-custom version of one of its ARM processors. They would need tighter coordination, and Qualcomm would need to use its design license rather than take a stock ARM design in order to replicate what Apple has done with M1, but now that they can see the benefits in a shipping product, perhaps they would be willing to make the investment.
 
They got a new CEO about two weeks ago; he must be exercising his intelligence.
Boy, is Intel in trouble if this is an example of their CEO's intelligence. :p This kind of FUD might have worked 10 years ago, but with everybody and his brother testing the M1 Macs against higher-end Intel chips and posting the results on YouTube, it isn't going to work. Typical Intel, though - using antiquated methods to deal with modern problems. :D
 
Well, to be fair: one, this is a Mac fan website, and two, Apple was trying to be vague on purpose. They didn't want to throw too much shade on Intel - because they are still using Intel. At least, that is the impression I got. Because even for Apple, those charts and comparisons were way too vague.
I think Craig Federighi did say that the charts were “real” and based on actual tests. Apple didn’t release too many specifics, but did say which Intel Macs they were comparing the M1 Macs to (they used the 2020 i7 Air and 8th-gen i7 MacBook Pro). I agree that they weren’t trying to cast too much shade since they are still selling Intel models, and it appears that this time they really might need the full 2 years to make the transition.

Apple also appeared to be making general observations, for which the graphs may well be accurate. Obviously certain tasks will be better suited to different processors and operating systems, particularly depending on software optimization. Intel's comparison of exporting PDFs from PowerPoint presentations says as much about Microsoft's coding as it does about the M1's capabilities.

My own "real-world" experience is that the M1 MacBook Air feels roughly comparable to the Ice Lake 13" MacBook Pro that it replaced. I used both for a couple of months (the extended holiday return period) before I concluded I'd keep the M1 Air and sell the Ice Lake Pro. Sure, I took a hit on the 13" Pro, but still netted enough to pay for the M1 Air. Ditching the fan noise and gaining battery life were worth it, plus Crossover allows Quicken for Windows (my last remaining Windows app) to run just as well as it did on the 13" Pro under Parallels or Crossover.

I had a hard time recommending the 2020 Intel Air (having tried it briefly last year), but have no qualms at all recommending the M1 Air, even to people to whom I'd previously have recommended the 13" Ice Lake Pro (provided they don't need four Thunderbolt ports, 32GB of RAM, or support for two external displays).
 