The cognitive dissonance with which people discuss Intel is just incredible. Fastest chips? So what! Top talent moving to Intel from Apple? So what!

When will people realize that Intel is moving back to its historic place in the industry, and has built-in advantages that make that position inevitable in all but the shortest term?

It took a year and a half for Intel to pre-announce a chip that beats M1 on a single benchmark? So what! Intel's top talent left the building years ago to help Apple and others obsolete x86? So what! Apple has been shipping more processors and spending more on R&D than Intel for years? So what!

It's hard to know for sure what the future holds for Intel, but their historic place in the industry has been that of a one-trick pony that uses process and market power to beat back competition. Looking forward, it's hard to see how Intel regains hegemony when people are no longer enthralled by that trick, they no longer have a process advantage, and their market power is severely eroded by the rise of mobile devices and thermal limits in the server room.
 
  • Like
Reactions: jdb8167
Folks who jump ship like this are not looking to cherish their legacy, which is a major shift for the industry; they are just looking at the money. And knowing Intel, this guy will find out that he cannot change Intel's work culture.
It’s all about money. He learned that at Apple. Brand loyalty is the loser here.
 
Apple might just be hoping that the security of Intel’s chips gets as messed up as the T2’s, which can be compromised by a trivial USB gadget.
Oh, requires physical access. Basically, if your device’s physical security is sufficient, nothing to worry about. Ping me when there’s a remote attack that doesn’t require you allowing access to your computer. :)
 
Wow.

@cmaier you ever designed a pizza at Domino’s? Just wondering how similar it is. :)

Yes, that's my side job.


Just in case anyone is wondering: no, chip design is not “marketing fluff” and a “pretty easy part of chip development.”

Design is when you take the architectural specification, break it down into high-level logical blocks (and blocks within blocks), then implement all the way down to the physical level. You have to create the logic design, the circuit design, and the physical design. It means you figure out where every one of the billions of transistors is going to go, how they will be connected, where the wires that connect them will go, and the shape of each transistor. The design team has to take into account not just timing constraints and power constraints, but manufacturability requirements, electromigration and reliability concerns, the impact of process variations, etc. Most of the time it takes to develop a chip is spent in the “Design” stage.
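To make "timing constraints" a bit more concrete, here's a deliberately toy Python sketch of one tiny sliver of the design stage: a gate-level netlist and a crude longest-path timing check. Real flows use commercial EDA tools and model far more (power, electromigration, process corners); every name, delay, and clock number here is invented purely for illustration.

```python
# Toy static timing check on a tiny gate-level netlist.
# All delays and the clock period are made-up illustrative numbers.

from dataclasses import dataclass, field

@dataclass
class Gate:
    name: str
    delay_ps: float                              # propagation delay, picoseconds
    fanin: list = field(default_factory=list)    # names of the gates driving this one

def longest_path_ps(gates: dict, output: str) -> float:
    """Worst-case arrival time at `output`, assuming primary inputs arrive at t=0."""
    g = gates[output]
    if not g.fanin:
        return g.delay_ps
    return g.delay_ps + max(longest_path_ps(gates, f) for f in g.fanin)

# Buffers on the primary inputs feeding an AND, which feeds an XOR.
netlist = {
    "buf_a": Gate("buf_a", 10.0),
    "buf_b": Gate("buf_b", 10.0),
    "and1":  Gate("and1", 25.0, fanin=["buf_a", "buf_b"]),
    "xor1":  Gate("xor1", 30.0, fanin=["and1", "buf_b"]),
}

CLOCK_PERIOD_PS = 250.0  # a 4 GHz clock
arrival = longest_path_ps(netlist, "xor1")
print(f"critical path: {arrival} ps, slack: {CLOCK_PERIOD_PS - arrival} ps")
```

A real signoff tool does this across billions of transistors, per process corner, with wire parasitics extracted from the physical layout — which is part of why the design stage takes so long.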
 
Literally no one cares that Macs aren't great for gaming, except non-Mac owners. It's as valid an argument as telling a Ferrari owner that their car is sh-t because they can't fit a month's groceries in the trunk. It's technically true but they didn't buy their Ferrari to go grocery shopping.

You make no sense in my world. I'm an owner of several Macs and I do care.

PS: I had a Ferrari. Believe me … gaming speed would be closer to a Ferrari ethos than how fast it runs Adobe Premiere. Adobe Premiere would be like groceries in the trunk :). I buy Macs because the product ethos is way closer to, say … a nice safe Volvo than a Ferrari …

But hey if buying a Mac makes you feel closer to a Ferrari … who am I to judge.
 
  • Like
Reactions: nebojsak
The cognitive dissonance with which people discuss Intel is just incredible. Fastest chips? So what! Top talent moving to Intel from Apple? So what!

When will people realize that Intel is moving back to its historic place in the industry, and has built-in advantages that make that position inevitable in all but the shortest term?
While I agree that Intel has some advantages in legacy tech, they have completely missed the shift into vast new markets and future trends.

  • Mobile phones have become the main daily "computer" for the bulk of the world's population - Intel missed out on that explosive growth in CPUs, GPUs, memory, modems, etc, with virtually a 0% market share. Huge chunks of the world skipped directly to phones without ever owning a computer...
  • Tablets - again, close to 0% market share
  • Servers - while Intel has had a decent showing to date, I suspect that the major cloud providers are looking to migrate a large portion of future business to non-Intel processors.
  • AI - not sure what Intel's road map is here, but I see little that I would call a "built-in advantage".
  • Electric vehicles - again, I don't see much of a presence in this huge, potential market.
  • Medical monitoring and wearable devices - in some ways Apple is charging ahead in this area. Many people make fun of the Apple Watch, but I see the watch as their ongoing "beta" program to test and develop wearable health sensors. The health care industry is probably the biggest worldwide market (it is 20% of US GNP), so any company that figures this out will have massive growth for decades to come. This market mixes sensors and software, and demands extremely power-efficient hardware.
I do see Intel as historically a strong manufacturer, hence the noise about combining a complete in-house solution with providing foundry services to other companies that design their own chips. But their x86 path has many potential pitfalls heading down the road.
 
If Intel is only managing to tempt Apple engineers now, two years into Apple Silicon's public release, Intel has bigger problems.

PS: I'm amazed Apple doesn't have clauses in its contracts to prevent this. But I get that it's hard to enforce.
 
I'm honestly not sure why people are so quick to discount Intel. When Intel is on top, they're slow, iterative, and reactive. Now that they're the underdog, I'm frankly impressed with the quick moves they're making. They're investing a hundred billion dollars into new state-of-the-art fabs and pivoting towards becoming a viable contract manufacturer. They're setting aside massive amounts of money for the retention of their top talent as well as the acquisition of new talent. I see an Intel firing on all cylinders, and although it takes time to turn a massive boat such as Intel, the right moves are in place. The things that set x64/x86 and Arm apart aren't as simple as RISC vs. CISC, and my bet is that with the right engineers the line will be blurred even further. Apple didn't pick Arm because it was more advanced; it picked Arm because it wanted to design its own chips, and x64 isn't licensed out, so it wasn't an option in the first place.
 
If Intel is only managing to tempt Apple engineers now, two years into Apple Silicon's public release, Intel has bigger problems.

PS: I'm amazed Apple doesn't have clauses in its contracts to prevent this. But I get that it's hard to enforce.
Such clauses are not enforceable in California, at least for jobs like chip engineers.
 
I'm honestly not sure why people are so quick to discount Intel.
It's not quick, though. Skylake was a while ago, but Intel hasn't 100% recovered from that. They would need a string of successes to offset Skylake and the failures that followed, plus more, before people will give Intel the “benefit of the doubt”.

Intel became a big name in a world where everyone assumed it was BETTER to source chips from Intel than to put in the effort to design your own (unless your use case was highly custom). Companies that, 10 years ago, would have been working with Intel to design their small, power-efficient devices around Intel solutions are now looking at designing their own, because Intel either can't deliver performant, power-efficient parts or doesn't WANT to (to avoid cannibalizing its more expensive solutions). Add to that, the market described by Gartner is moving to “ultra mobile” devices, again an area where Intel either can't or won't provide a performant solution. All of those things together are reasons to be concerned about Intel's future.

They’ll likely always be somewhat of a household name, like IBM is a household name, but how the company makes money and what they produce may be VERY different from today.
 
Yes, that's my side job.


Just in case anyone is wondering: no, chip design is not “marketing fluff” and a “pretty easy part of chip development.”

Design is when you take the architectural specification, break it down into high-level logical blocks (and blocks within blocks), then implement all the way down to the physical level. You have to create the logic design, the circuit design, and the physical design. It means you figure out where every one of the billions of transistors is going to go, how they will be connected, where the wires that connect them will go, and the shape of each transistor. The design team has to take into account not just timing constraints and power constraints, but manufacturability requirements, electromigration and reliability concerns, the impact of process variations, etc. Most of the time it takes to develop a chip is spent in the “Design” stage.
All of what you're describing is child's play compared to fabbing. Designing a chip is a joke.
 
Good stuff - I hope Apple moves back to Intel if it makes sense performance-wise. Being chip-agnostic is a strength and something Apple has proven it can do in the past. There's no rational reason for users to root for M1 chips as if they were a football team.
 
All of what you're describing is child's play compared to fabbing. Designing a chip is a joke.

What? Fabbing a chip is an automated process. The fractured design is provided to the fab as a GDSII file which describes, for each layer of the chip, the exact dimensions and locations of each polygon. These are converted into masks for lithography. Then the whole process is automated. The people in the fab keep the machines running, but they don't know anything about what they are making. I've designed CPUs for a very long time, and spent a lot of time in a fab as well, and I can assure you that you don't know what you are talking about.
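For the curious, this is roughly the kind of information a GDSII stream carries, sketched in plain Python rather than the actual binary record format (BOUNDARY/LAYER/XY records and so on). The layer numbers, coordinates, and the bounding-box helper are all invented for illustration, not taken from any real PDK.

```python
# Illustrative stand-in for GDSII content: per-layer polygon geometry,
# which the mask shop turns into photomasks. Not the real binary encoding.

from dataclasses import dataclass

@dataclass(frozen=True)
class Polygon:
    layer: int        # layer number meaning is fab-specific (e.g. diffusion, metal1)
    vertices: tuple   # (x, y) pairs in database units (often 1 nm)

def bounding_box(polys):
    """Overall extent of the layout - e.g. a quick sanity check on die size."""
    xs = [x for p in polys for (x, y) in p.vertices]
    ys = [y for p in polys for (x, y) in p.vertices]
    return (min(xs), min(ys)), (max(xs), max(ys))

# A made-up two-layer snippet: one diffusion rectangle, one metal strap over it.
layout = [
    Polygon(layer=1,  vertices=((0, 0), (400, 0), (400, 120), (0, 120))),
    Polygon(layer=17, vertices=((50, 40), (350, 40), (350, 80), (50, 80))),
]

lo, hi = bounding_box(layout)
print("die extent:", lo, "to", hi)  # each layer becomes one or more masks
```

The point of the post above stands either way: by the time the design reaches this representation, all the intellectual decisions about logic, circuits, and placement have already been made by the design team.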
 
What? Fabbing a chip is an automated process. The fractured design is provided to the fab as a GDSII file which describes, for each layer of the chip, the exact dimensions and locations of each polygon. These are converted into masks for lithography. Then the whole process is automated. The people in the fab keep the machines running, but they don't know anything about what they are making. I've designed CPUs for a very long time, and spent a lot of time in a fab as well, and I can assure you that you don't know what you are talking about.
The design of the chip isn't difficult to replicate, nor is it the main driver of performance. You're talking about the low-level workers at the fab who're only responsible for running the machines. The people developing the lithography machines aren't going to be talking to you.
Don't give yourself more credit than you deserve. Chip design is child's play compared to fabbing. That is why there is only 1 company fabbing at sub-10nm at an acceptable yield while even Intel is trashing Apple in performance-per-watt with the i9-12900KF.
 
The design of the chip isn't difficult to replicate, nor is it the main driver of performance. You're talking about the low-level workers at the fab who're only responsible for running the machines. The people developing the lithography machines aren't going to be talking to you.
Don't give yourself more credit than you deserve. Chip design is child's play compared to fabbing. That is why there is only 1 company fabbing at sub-10nm at an acceptable yield while even Intel is trashing Apple in performance-per-watt with the i9-12900KF.
The people designing the lithography machines are not in the fab. So is the most difficult part of making a car the design of the robots? Your argument makes no sense.

And the design of the chip is absolutely difficult to replicate. It takes Intel two years to design a chip, using several hundred engineers (probably more than 500), many with advanced degrees. I can assure you that the fab machines you are talking about cost a lot less to R&D.
 
  • Like
Reactions: ericwn and jdb8167
The people designing the lithography machines are not in the fab. So is the most difficult part of making a car the design of the robots? Your argument makes no sense.

And the design of the chip is absolutely difficult to replicate. It takes Intel two years to design a chip, using several hundred engineers (probably more than 500), many with advanced degrees. I can assure you that the fab machines you are talking about cost a lot less to R&D.
Face it. He knows more than you. He's a multimillionaire, don't you know? /s
 
  • Haha
Reactions: Unregistered 4U
The people designing the lithography machines are not in the fab. So is the most difficult part of making a car the design of the robots? Your argument makes no sense.

And the design of the chip is absolutely difficult to replicate. It takes Intel two years to design a chip, using several hundred engineers (probably more than 500), many with advanced degrees. I can assure you that the fab machines you are talking about cost a lot less to R&D.
lol...
I also mentioned ASML, not just TSMC. Both companies are making breakthroughs and Apple is just taking the credit for someone else's innovations like they always do. I guarantee you that if TSMC wanted, they could make an M1 design easily while Apple wouldn't even be able to fab a 28nm chip without turning the chip into a bomb.

Also, about your car analogy: the great Elon Musk stated that "Prototypes (designs) are easy. Manufacturing is hard." Any half-decent engineer can make a good design in whatever field they're in. However, it's the manufacturing process that's the most important part of any product, as the capabilities of the manufacturing process ultimately determine whether the engineer's design is feasible or not.

Notice how Intel started losing to AMD right when Intel lost the fab advantage? Notice how Intel's designs remained stagnant while they were stuck on 14nm? If design were so important, why couldn't Intel just out-design AMD's 7-nm CPU and beat them? Hint: They couldn't, because chip design isn't that important. It's a commodity skillset that any half-decent engineer can do.
 
  • Haha
Reactions: jdb8167
If that was true every chip produced on the same or similar node would have near identical performance.

Which is as far from the truth as it could get.

There is a difference between older A series and current Non-Apple ARM being made at 7 or 10nm.
There is a difference between the 1st and last gen of 14nm Intel chips.


But yeah, every schmuck could do an A15 after a few weeks of training. So please show me where to get that training, as I really want to take over that guy's job…
 
  • Like
Reactions: Unregistered 4U
If that was true every chip produced on the same or similar node would have near identical performance.

Which is as far from the truth as it could get.

There is a difference between older A series and current Non-Apple ARM being made at 7 or 10nm.
There is a difference between the 1st and last gen of 14nm Intel chips.


But yeah, every schmuck could do an A15 after a few weeks of training. So please show me where to get that training, as I really want to take over that guy's job…

It depends a lot on what the chip designers wanted and what their corporate goals were. If they decided to use more, smaller cores, you would get a different performance profile than what Apple is getting. They might have decided to down-clock the cores to achieve higher yields and save on costs. Some of them were on Samsung's nodes, which are behind TSMC's.
However, these design decisions are child's play, on par with customizing a pizza from Domino's in terms of intellectual rigor. They don't come close to the difficulty of developing the fab process.

Also, you are wrong about the 14nm Intel chips. The fab process was improved incrementally (14nm+, 14nm++, and so on), because the 14nm era was prolonged by the 10nm fab difficulties. The 11th-gen Intel 14nm CPUs are on a better fabrication process than the first ones.

Notice how, when Intel finally jumped to the 10nm process (which has some advantages compared to TSMC 5nm), they finally had a CPU with better performance per watt than the M1?

Apple fans are giving way too much weight to Apple's "designs". It's mostly just marketing junk.
To make a good analogy, it's like me ordering 100 pineapple-pepperoni pizzas from Domino's and then reselling them while claiming that I custom-designed the pizza. While that is true, the taste of the pizza (the performance of the chip) is mainly thanks to how Domino's makes the pizza. If you ordered the same type of pizza elsewhere, the taste would be different.
 
  • Haha
Reactions: jdb8167
The days of multi-trillion-dollar companies are ending.

For this to be true, Apple would need to lose about 30% of its value, which might seem plausible on its own, but inflation will push in the opposite direction, and Apple would push back over 2 trillion whenever the next bull run starts (even if that's just 500 billion in today's money).

Even if Apple completely fails and goes the way of C= (that would require a decade-long perfect storm), others would rise from its ashes.
 
The people designing the lithography machines are not in the fab. So is the most difficult part of making a car the design of the robots? Your argument makes no sense.

And the design of the chip is absolutely difficult to replicate. It takes Intel two years to design a chip, using several hundred engineers (probably more than 500), many with advanced degrees. I can assure you that the fab machines you are talking about cost a lot less to R&D.
I had to show ignored content to figure out why you seemed to keep restating the obvious...

I think you're getting taken for a ride. It's just layer after layer of unfounded assertions:
  • Intel and Apple don't even have the same design goals (one doesn't care what it does as long as it's x86, the other doesn't care what instructions it runs as long as it does what they need).
  • It's false on the face of it that two designs with the same goals will be equally performant. See: Intel/AMD, Apple/Qualcomm/Samsung. You shouldn't need to be a designer to understand that there are good and bad designs.
  • Sure, fabrication process technologies are a continuing physics challenge and important to the end product, but lithography isn't even the hard part of the fab process. I feel like they last read an article in the early '00s (diffraction limits, whatever shall we do?!)
  • The designer needs to understand the process to get the most performance from the design, so as hard as the process development might be, the design team needs to integrate that information into the design.
  • There are a lot of hard parts in making the best chips in the world; pointing at one machine as "the one" is like saying rolling the brass for the pipes is the hard part of Bach's Toccata and Fugue.
 
I had to show ignored content to figure out why you seemed to keep restating the obvious...

I think you're getting taken for a ride. It's just layer after layer of unfounded assertions:
  • Intel and Apple don't even have the same design goals (one doesn't care what it does as long as it's x86, the other doesn't care what instructions it runs as long as it does what they need).
  • It's false on the face of it that two designs with the same goals will be equally performant. See: Intel/AMD, Apple/Qualcomm/Samsung. You shouldn't need to be a designer to understand that there are good and bad designs.
  • Sure, fabrication process technologies are a continuing physics challenge and important to the end product, but lithography isn't even the hard part of the fab process. I feel like they last read an article in the early '00s (diffraction limits, whatever shall we do?!)
  • The designer needs to understand the process to get the most performance from the design, so as hard as the process development might be, the design team needs to integrate that information into the design.
  • There are a lot of hard parts in making the best chips in the world; pointing at one machine as "the one" is like saying rolling the brass for the pipes is the hard part of Bach's Toccata and Fugue.
  • Intel and Apple don't use the same fab, so I'm not sure what your point is.
  • Apple is on TSMC 5nm. AMD is on TSMC 7nm. Intel is on Intel. Samsung and Qualcomm are on Samsung. Not an even comparison.
  • Sure.
  • lol... The fab tells the designer where the strength lies in the fab process. The designer just follows the fab's advice 99% of the time.
  • Fabrication is the hardest part.
 
  • Haha
Reactions: jdb8167