Apple's former Director of Mac System Architecture Jeff Wilcox this week announced that he has left Apple to take on a new role at Intel. As noted on LinkedIn (via Tom's Hardware), Wilcox was part of Apple's M1 team and he had a key role in the transition from Intel chips to Apple silicon.

[Image: Apple M1 chip]

Wilcox's profile says that he "led the transition" for all Macs to Apple silicon, and prior to that, he developed the SoC and system architecture for the T2 coprocessor used in Intel Macs.

When Wilcox announced his departure from Apple in December, he said that he was pursuing a new opportunity and that he was proud of what he had accomplished at Apple.

Wilcox spent eight years working at Apple, and as of this week, he is the Design Engineering Group CTO at Intel. Wilcox says that he will be responsible for the architecture of all SoCs for all Intel client segments. Prior to working for Apple, Wilcox was at Intel, where he served as a principal engineer on PC chipsets, and before that, he worked at Magnum Semiconductor and Nvidia.

The Apple silicon team is led by Johny Srouji, Apple's vice president of hardware technologies, and it's not clear if Wilcox's departure will have much impact on the development of Apple silicon chips going forward. Apple is well into its Apple silicon transition and is expected to complete it in 2022 with the launch of new Mac Pro and iMac Pro machines that use Apple silicon chips.

Intel CEO Pat Gelsinger in October said that he hopes to win back Apple's business in the future by creating "a better chip" than Apple can make. He also said that he is planning to ensure that Intel's products are "better than theirs," and that Intel has a more open and "vibrant" ecosystem. "I'm going to fight hard to win Tim's business in this area," he said.

Intel earlier this week introduced a new Core i9 processor designed for laptops, and the company has claimed that it is faster than Apple's M1 Max chip used in the 14 and 16-inch MacBook Pro models.

Article Link: Key M1 Mac Engineer Departs Apple for Intel
This is great; competition benefits the consumer.

Also, someone in his position will have an NDA limiting what they can do if they leave for a competitor, and for how long. Plus, an SoC isn't anything new, and Apple is using ARM as its base, so it's not a totally Apple design. The only thing of interest he has knowledge of at this point is Apple's five-year timeline, so he'll have to talk around it, not directly about it. But if you've worked in the industry, you know these big companies know a lot about each other. The media are the ones who create the big tech-war talk to sell their stories.
 
This is great; competition benefits the consumer.

Also, someone in his position will have an NDA limiting what they can do if they leave for a competitor, and for how long.

Nope. He can't use Apple's trade secrets in the new job, but in California you can't enforce non-compete clauses except in rare circumstances that don't apply here.
 
HA HA HA!!!! FUNNY!!!

You're joking, right??

He knows where the future is.

X86 and Windows 11

Not the closed garden of Mac.
There's a good reason that Intel hired him: They want someone with ARM design experience because Intel can see which way the wind is blowing. Intel will be coming out with its own ARM-based chips soon.
 
Does Intel make SoCs for any client segments?

Most of the laptops in the client segment.

Here is an Intel-composed image.

[Image: Intel processor package]


[The image credit in Tom's Guide's article says Intel.]

The CPU and the I/O chip are bonded onto the same package. One can try to nitpick that this isn't an SoC, but it has got:

i. CPU
ii. GPU
iii. I/O (USB, SATA, external PCIe lanes, audio, base Wi-Fi unit, base Ethernet unit, etc.)
iv. Thunderbolt 4

That is pretty much most of a computer. Yes, there are some other chips on the logic board, but there are other chips on Apple's logic boards too.

That was Intel Gen 11. For Gen 13 and up, there are going to be even more chiplets/tiles on the package.

Wherever Intel can dump the LGA socket for a client platform product, the solution is increasingly going to be an SoC across the board. Intel is similar to Apple: more than 65% of client sales are laptops, which means the bulk of the client division at Intel is SoCs.

Another view of the SoC in a system:

[Image: Tiger Lake CPU package]




AnandTech's launch coverage of Tiger Lake:

[Image: Intel Blueprint Series slide on 11th Gen Intel Core processors]




For laptops there are two "containers" here, but pragmatically you can't buy them decoupled. If you get one, you have to get the other in the same request.

[Image: Intel 11th-generation Core processor package]



Pragmatically, that makes that class of products a "chip". Folks can try to "protest" that the Tiger Lake multi-chip module (MCM) is a printed circuit board more than a "chip", but Intel Foveros and EMIB over the next 2-5 years are going to make that distinction less and less relevant. [Coming in as CTO, the stuff this guy is going to work on will see market entry at the tail end of that five-year time frame.] This isn't about where Intel was in 2019-2020... or even 2021. It is about where the stuff that is coming is going. And yes, Intel had already started using that terminology in 2020 because they knew it was coming.



P.S. The top-end laptop configurations look more like desktops, with a large, discrete (often significantly displaced) PCH controller. That is where Intel is now, but that isn't making them more competitive in the 2020s laptop market. They are going to have to collapse into a smaller set of packages to keep up with AMD, let alone other competitors.
 
Intel will be coming out with its own ARM-based chips soon.

Highly doubt it, since Intel sold off its ARM XScale business to Marvell. More likely, Intel will continue to evolve the x64 architecture with mobile attributes, like Alder Lake. Until software availability improves, ARM for PC will remain a <10% niche while x64 remains >90%.
 
Most of the laptops in the client segment. […]
Thanks for taking the time to pull that all together-- it's a really useful summary.
 
I think they're forced to, though. It's the nature of the beast; any market they decide NOT to compete in is a market where someone else has an opportunity to make an effort, learn from the experience, iterate, and eventually challenge Intel. Lucky for them, the last time that happened, the company that gained from the experience only makes chips for ONE company :)
Pretty much the same narrative was trotted out for Apple in the late '80s through the mid '90s: Apple has to get to a 25% or better share of the PC market to survive; gotta grab as many PC seats as possible, otherwise you can't be a controlling player.... And that basically drove the company almost into bankruptcy.

Intel isn't "forced" to exert monopolistic control. That is a choice. (I guess if all that matters is greed, then it is "forced to be greedy".)

That mentality is one of the reasons their foundry-business start-up attempts have repeatedly failed. It is a slippery enough slope to build and offer services; if you take overt steps to "knife" everybody in the back... long term, you won't find any good partners.

The notion that Apple was the only one to sneak through is myopic. Intel tried to "buy" their way into the smartphone market; Samsung, Qualcomm, and MediaTek are doing just fine. The strategy of taking money from one segment where you can almost "print money" on margins and buying this and that related thing typically fails over the long term, because you're not winning on competence... just throwing money around. At some point you either attract greenmailers (who want a cut of the "print money" profits) or just get lazy and dumb. [Never mind when governments go on the antitrust warpath.]



Part of the mindset of doing "everything" was that it was one way to fill up high-volume fabs. That might have worked before fabs got "crazy" expensive. With the costs now of staying on the bleeding edge, it is much, much less risky to share the load (R&D and investment) with other products and organizations.
 
Highly doubt it, since Intel sold off its ARM XScale business to Marvell. More likely, Intel will continue to evolve the x64 architecture with mobile attributes, like Alder Lake. Until software availability improves, ARM for PC will remain a <10% niche while x64 remains >90%.

They sold the business but kept some of the intellectual property. Besides, Intel has several products that have ARM cores in them. It isn't like they don't send a check to ARM on a regular basis anyway; it is just not a ginormous check.

The reason why Intel will probably stick with x86-64 is that the inertia adds a multiplier to the money they invest; lots of customers don't want to move. That said, Intel (and AMD) have a problem in the laptop space, and not just from the Apple M series.

Intel does need to come up with a cleaned-up (de-constipated) upgrade to x86-64: e.g., dump the dead-end '80s, '90s, and perhaps early-2000s stuff that modern Windows 11 and ChromeOS don't really need (quirky boot and obsolete security modes from the '80s; half a dozen extremely redundant attempts at SIMD instructions; etc.). They have more of a hoarding problem than an instruction-set problem (at least for mainstream PC CPUs).
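
To make the "redundant attempts at SIMD instructions" point concrete, here is a minimal C sketch (my illustration, not from the post above): the same float addition written against three generations of Intel vector extensions, each with its own register width and intrinsics, all of which the silicon still has to carry. Assumes an AVX-512-capable CPU; build with something like gcc -mavx512f simd_layers.c.

```c
// Three generations of Intel SIMD doing the same job: adding floats.
// SSE (1999), AVX (2011), and AVX-512 (2016) each introduced new
// register widths and new intrinsics; a modern x86-64 CPU supports all.
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[16] = {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16};
    float b[16] = {16,15,14,13,12,11,10,9,8,7,6,5,4,3,2,1};
    float out[16];

    // SSE: 128-bit registers, 4 floats at a time.
    __m128 s = _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b));
    _mm_storeu_ps(out, s);
    printf("SSE:     %f\n", out[0]);

    // AVX: 256-bit registers, 8 floats at a time.
    __m256 v = _mm256_add_ps(_mm256_loadu_ps(a), _mm256_loadu_ps(b));
    _mm256_storeu_ps(out, v);
    printf("AVX:     %f\n", out[0]);

    // AVX-512: 512-bit registers, 16 floats at a time.
    __m512 w = _mm512_add_ps(_mm512_loadu_ps(a), _mm512_loadu_ps(b));
    _mm512_storeu_ps(out, w);
    printf("AVX-512: %f\n", out[0]);
    return 0;
}
```

Three code paths for one job: the hoarding problem in miniature.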
 
Anyone who’s ever worked a corporate job knows this will have zero impact at Apple. Depending on how good he really is, he might have an impact at Intel considering his new title. The irony is, he was already at Intel but nobody recognized his skills so he moved on while Intel has cratered. So typical of corporate… overlook true talent for a** kissers.
I honestly feel that if this made the news on this site, he was a big player at Apple, although not famous to us.
 
Pretty much the same narrative was trotted out for Apple in the late '80s through the mid '90s: Apple has to get to a 25% or better share of the PC market to survive; gotta grab as many PC seats as possible, otherwise you can't be a controlling player.... And that basically drove the company almost into bankruptcy.
You said they try to build everything for everybody, I’m acknowledging that point. I’m only adding that the main reason why companies usually do this is some fear that another company will take ANY opening and exploit it. Intel, for example, could do what Apple did and simplify their instruction set by cutting 32-bit instructions. However, it’s almost guaranteed that the moment they announce that plan, other companies would try to become the favored vendor for those still needing 32-bit capable processors and could potentially use the money and experiences obtained from the venture to challenge both Intel and AMD in the future.
 
Intel, for example, could do what Apple did and simplify their instruction set by cutting 32-bit instructions.

No, they couldn't; Intel relies on Windows, which they don't control.

If MS removed 32-bit from Win12 (after making sure, years before, that only 64-bit apps can use Win11 features), then Intel could start thinking about it AFTER the Win12 adoption rate is >50%.

Remember, Apple removed support for 32-bit CPUs long before they removed support for 32-bit apps, and that long before they removed 32-bit support in hardware. Both for macOS and iOS.
 
You said they try to build everything for everybody, I’m acknowledging that point. I’m only adding that the main reason why companies usually do this is some fear that another company will take ANY opening and exploit it. Intel, for example, could do what Apple did and simplify their instruction set by cutting 32-bit instructions. However, it’s almost guaranteed that the moment they announce that plan, other companies would try to become the favored vendor for those still needing 32-bit capable processors and could potentially use the money and experiences obtained from the venture to challenge both Intel and AMD in the future.
They could start by cutting 16-bit instructions and segmented memory. They could cut the fat off their many generations of vector-processing and multimedia instruction sets. There's a lot they could do.

But I don't think they're worried that someone is going to come in and cater to legacy customers. I think they're worried about breaking the spell they have over their customers. They, and Microsoft, have created this arsenal of FUD around backwards compatibility: you're supposed to feel safe using those products because the code you write today will always run until the end of time. It preys on customers' fears of losing their investments to obsolescence.

Instead of worrying about another vendor stealing their legacy customer base, though, they should worry about those being their only customers left. 3 old banks and a NASA computer tracking Voyager.
 
No, they couldn't; Intel relies on Windows, which they don't control.
This is part of the point I’m making. Part of Intel’s business is being “the CPU for everyone” and because of that, there’s a large part of their business plan that’s already written for them. Even if it might make technical sense to take certain actions, they absolutely can’t due to that running counter to them being “the CPU for everyone”.

Now, if one day they decide that the overriding direction is to be “the FAB for everyone”, THEN it’s a possibility.
 
They could start by cutting 16-bit instructions and segmented memory. They could cut the fat off their many generations of vector-processing and multimedia instruction sets. There's a lot they could do.

But I don't think they're worried that someone is going to come in and cater to legacy customers. I think they're worried about breaking the spell they have over their customers. They, and Microsoft, have created this arsenal of FUD around backwards compatibility: you're supposed to feel safe using those products because the code you write today will always run until the end of time. It preys on customers' fears of losing their investments to obsolescence.

Instead of worrying about another vendor stealing their legacy customer base, though, they should worry about those being their only customers left. 3 old banks and a NASA computer tracking Voyager.
You know, I was just searching for whether they still supported 16-bit. :) And, I think your point makes sense. While it ties them in knots technically, backwards compatibility is likely a strong psychological factor.
 
You know, I was just searching for whether they still supported 16-bit. :) And, I think your point makes sense. While it ties them in knots technically, backwards compatibility is likely a strong psychological factor.

You can't remove the 16-bit instructions because they are part of the 32-bit instruction set. You can probably remove some of the 16-bit operating modes, but that likely wouldn't buy you much.
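
To see why, a minimal sketch (my illustration): in 32-bit and 64-bit code, the 16-bit and 32-bit forms of an instruction share the same opcode byte, and the 0x66 operand-size prefix selects between them, so the 16-bit forms are woven into the encoding rather than being a separable block. x86-64 GCC/Clang inline asm; the exact bytes depend on which registers the compiler picks.

```c
// The SAME ADD opcode (0x01) encodes both widths; the 16-bit version
// just gains the 0x66 operand-size prefix in front of it.
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint16_t a16 = 0xFFFF, b16 = 1;
    uint32_t a32 = 0xFFFFFFFFu, b32 = 1;

    __asm__("addw %1, %0" : "+r"(a16) : "r"(b16)); // e.g. 66 01 d8
    __asm__("addl %1, %0" : "+r"(a32) : "r"(b32)); // e.g.    01 d8

    printf("16-bit add wrapped to %u\n", a16); // 0
    printf("32-bit add wrapped to %u\n", a32); // 0
    return 0;
}
```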
 
You can't remove the 16-bit instructions because they are part of the 32-bit instruction set. You can probably remove some of the 16-bit operating modes, but that likely wouldn't buy you much.
Woooooooow, it's like a shambling pile held together mainly by its own will to exist! No telling how much cruft was implemented as a "good idea at the time" with no real expectation that it'd still be a critical part of the framework this far along!
 
I'm honestly not sure why people are so quick to discount Intel. When Intel is on top, they're slow, iterative, and reactive. Now that they're the underdog, I'm frankly impressed with the quick moves they're making right now. They're investing a hundred billion into new state-of-the-art fabs and pivoting towards becoming a viable contract manufacturer. They're setting aside massive amounts of money designated for the retention of their top talent as well as the acquisition of new talent. I see an Intel now firing on all cylinders, and although it takes time to turn a massive boat such as Intel, the right moves are in place. The things that set x64/x86 and ARM apart aren't as simple as RISC vs. CISC, and my bet is that with the right engineers the line will be blurred even further. Apple didn't pick ARM because it was more advanced; it picked ARM because they wanted to design their own chips, and x64 isn't licensed out, so it wasn't an option in the first place.
That is the entire problem with Intel. When AMD was suddenly beating them, you know how quickly we went from 4 cores on the top consumer CPU to at least 8 on the i9s? Now that Apple is doing a better job, they are suddenly releasing something that at least competes well on benchmarks?

This was like when my ISP suddenly improved tenfold because Google Fiber was about to come to our area, and we SUDDENLY got access to faster internet.

They just sit around keeping to the current products, and suddenly are able to produce better ones when something else comes by? Competition is great, but I get the feeling Intel really does not try hard unless they need to.

And I would LOVE to see Intel get much, much better, as I still use Windows PCs. So while I will never own a Windows laptop, I have to thank Apple for combating Intel in another area where they suddenly have a better product as a result.
 
Good stuff. I hope Apple moves back to Intel if it makes sense performance-wise. Being chip-agnostic is a strength, and something Apple has proven it can do in the past. There's no rational reason for users to support M1 chips as if they were a football team.
Not going to happen. The M1 is not just a processor. It's a Neural Engine, a GPU (that matches the RTX 30 series in the M1 Max), ProRes encoders/decoders, HEVC and H.264, a unified architecture for extremely fast memory bandwidth, and more.

It's not really fair to compare Intel to the M1. It would be fair to compare Intel + NVIDIA + DDR5 + PCIe 4 NVMe SSD as a package, but it still would not be a full comparison, as there is still stuff on the PC side that has not been moved off the CPU, where Macs now have a lot of stuff off-CPU.

I said it before: my M1 Max MacBook Pro now feels like two computers in one. Why? When I am exporting from After Effects or Final Cut Pro, there is a lot of stuff happening off the CPU and GPU now. I have performance left on the table, so I can work on other things while exporting. I have difficulty doing this on my older Intel Macs and my Intel Windows PCs, because the CPU is at 100% during these tasks (Premiere Pro on the Windows side).
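
For anyone who wants to check the dedicated-hardware claim on their own Mac: VideoToolbox exposes VTIsHardwareDecodeSupported() to ask whether a codec has a hardware decode path. A small C sketch of my own (decode only; querying hardware encode support is not a one-liner). Build with clang vt_probe.c -framework VideoToolbox.

```c
// Probe macOS hardware decode paths via VideoToolbox. On Apple silicon
// these report true because the SoC carries dedicated media blocks,
// which is why the CPU cores stay free during exports.
#include <VideoToolbox/VideoToolbox.h>
#include <stdio.h>

static void probe(const char *name, CMVideoCodecType codec) {
    printf("%-12s hardware decode: %s\n", name,
           VTIsHardwareDecodeSupported(codec) ? "yes" : "no");
}

int main(void) {
    probe("H.264", kCMVideoCodecType_H264);
    probe("HEVC", kCMVideoCodecType_HEVC);
    probe("ProRes 422", kCMVideoCodecType_AppleProRes422);
    return 0;
}
```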
 
Woooooooow, it's like a shambling pile held together mainly by its own will to exist! No telling how much cruft was implemented as a "good idea at the time" with no real expectation that it'd still be a critical part of the framework this far along!
When I was thinking about the new 64-bit instructions for integer operations, I wasn't thinking about what happens when we get 128-bit instructions, so I don't blame my ancestors :)
 