Time for Apple to start making their own chips in the state and not have to rely on others.
 
Time for Apple to start making their own chips in the state and not have to rely on others.
That sentiment has been uttered throughout the thread, and yet it completely misses the complexity and skill required to do this. If it were easy, anybody and everybody would be doing it.

Look at Intel: they've had fab facilities since day one, and yet they struggled for years to get beyond 10nm, and their 18A process has been painful as well. It's not as easy as just building a factory and hiring a few locals from the area.
 
For people thinking Apple can just "own" a fab, no.

It's not like buying a coffee machine. Most of the expensive, time-consuming, and risky part is the development work for new processes and nodes. You then need volumes that match up with those expensive nodes to pay off the NRE (non-recurring engineering) costs.
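To put a rough shape on that NRE-versus-volume point, here is a back-of-the-envelope sketch. The node development cost and per-wafer margin below are purely illustrative assumptions, not real TSMC or Intel figures.

```python
# Back-of-the-envelope: wafers needed for a new node to recoup its development
# cost (NRE). Both numbers are illustrative assumptions, not industry data.

node_nre_usd = 10_000_000_000    # assumed total development cost of one leading-edge node
margin_per_wafer_usd = 8_000     # assumed gross margin per finished wafer

breakeven_wafers = node_nre_usd / margin_per_wafer_usd
print(f"Wafers needed just to recoup NRE: {breakeven_wafers:,.0f}")
# -> 1,250,000 wafers; without anchor customers ordering at that kind of scale,
#    the node never pays for itself.
```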

Do some people think TSMC just buys ASML machines, runs them, and collects profits?
And TSMC is so far ahead of every other fab from a capability standpoint it's wild. Capability as in producing the best chips, not capacity.
 
Intel has an 'over-promise and under-deliver' trust hole to get out of. I'd be surprised if they are 'desperate', but they probably do know they have a trust problem to overcome.

If most of the customers Intel is after committed to 14A, then Intel wouldn't have enough capacity. Intel has to solicit more folks than it can get commitments from, but some of them walking away at the end isn't a huge negative, since they couldn't all have gotten slots anyway.

They need a 'Goldilocks zone' customer for a baseline 'load': some wafer demand that is not too big, not too small, and steady, so they can plan the other Intel usage around it, and committed far enough ahead of time not to disrupt the flow for any of the other potential customers.

What Intel needs for 18A is for external customers to come into the 18A family as Intel's own CPU/GPU products gradually move along to the next node (whether Intel 14A or TSMC A16). Looking only at the current leading edge misses the view a healthy fab vendor needs to have. They'll need 2nd- and 3rd-iteration customers on that 18A as well. So missing out on the customers willing to take the highest relative risk is only part of the issue (i.e., getting 14A customers to commit in the last 3-6 months). Anyone who hasn't committed to 14A at this point probably isn't a 'first iteration' customer.

TSMC is out getting new customers for N6 all the while they are rolling out N2 (e.g., the recently announced Rivian AI chip on N6). Intel needs that kind of breadth, but they are starting out with a relatively narrow set of options for customers. It is just going to take time. (And they aren't going to 'win' just with the name 'Intel'; they're going to have to earn it.)
I'll just add that Intel, for the first time in its history, has an actual chance of becoming a foundry, now that Pat is out of the way and Tan, a person with a fabbing mindset, is in. I wish them good luck, since they'll need it. But the industry clearly needs more advanced fabbing capacity.

Oh, while on the topic of Intel, they also have the chance of becoming a viable GPU vendor. Ironically, these are all things Pat was skeptical about.
 
Not surprising, considering all the chips needed for AI. Apple still has huge negotiating power, and I'm not expecting to see an immediate hike in device prices due to changes in component pricing.

Apple has people in South Korea right now trying to negotiate deals. With demand outstripping supply, Apple doesn't have huge power over supply chains. Samsung and SK hynix are reported to be refusing to enter anything other than short-term contracts with Apple, because prices are expected to continue rising for the next few years. This leaves Apple in the difficult position of being unable to control costs and retail pricing the way it does now.

Apple has been ruthless with suppliers in the past, so it now must face the prospect of some payback in the current market. This will be painful for Apple customers, but most people will happily absorb higher prices in order to remain in the ecosystems they love.
 
Watch all three CPU manufacturers' CES keynotes (Nvidia, Intel, and AMD). It was all AI and nothing consumer. In fact, AMD's Lisa Su, CEO of AMD and cousin of Nvidia's Jensen Huang, rolled out Trump's head of AI strategy for a cringeworthy back-and-forth onstage. It was all so pathetic. Gamers Nexus has three recap videos that are BRUTAL to all three companies' CES keynotes, in which NO consumer products were touted.

This is the end game, folks. Everything will be digitized, virtualized, tokenized, and controlled and monitored by AI in global, planetary, power-sucking data centers run by Oracle, Meta, Google, Microsoft, Apple, and Amazon. Entire governments and other military and intelligence apparatuses are now, or soon will be, nearly to wholly run on one or more of these companies' platforms.

“You’ll Own Nothing and Be Happy”.


I love that people think the WEF 2030 plan is funny or not sinister to the absolute core. The denial and complacency is borderline insanity.
 
I love that people think the WEF 2030 plan is funny or not sinister to the absolute core. The denial and complacency is borderline insanity.
Well since the pandemic, it seems like the ignorant/uneducated population of America have banded together to create a super cabal of stupidity, where basically nothing is true and conspiracy theories rule. I’m not sure if it was the constant rejection from traditional intellectual crowds online or if they just needed a leader to take advantage of their idiocy. The psychology of it is quite pathetic, as they’ve created this illusion of intelligence that will doom us all.
 
I'll just add that Intel, for the first time in its history, has an actual chance of becoming a foundry, now that Pat is out of the way and Tan, a person with a fabbing mindset, is in. I wish them good luck, since they'll need it. But the industry clearly needs more advanced fabbing capacity.

Was it really Gelsinger who was at the core of the anti-"fabbing mindset" problem?


The board of directors who ran Intel into the ground ... who had to go out and hire Pat to perhaps pull off a turnaround ... still had a cabal of dingdongs who were trying to simultaneously sell off the fab and underinvest in it at the same time, to goose the stock price higher. (Yes, Tan was on the board for a while, but there was also a faction of entrenched folks who just wanted money to flow out of the business, not the long-term health of the business.)



Oh, while on the topic of Intel, they also have the chance of becoming a viable GPU vendor. Ironically, these are all things Pat was skeptical about.

Skeptical, or a realist about Intel's approach to expanding in the GPU market? Pat Gelsinger was in charge of Larrabee (set the time machine to 2008).

".. In response to a question from me about where we should expect Larrabee to fall performance-wise with respect to high-end GPUs from AMD and NVIDIA, Gelsinger told me flat-out: "We want to win. So in the high-end space, running traditional gaming applications—DX, OpenGL, etc.—we will win. And when you take advantage of its unique characteristics, and some of the more advanced techniques that it will enable, it will shine." ..."

He thought applying x86 cores to the GPU space was a viable, 'winning' solution? That isn't skepticism; that is the haughty arrogance that put Intel in the ditch. The next round, which tried to do everything for everybody all at once, was deeply flawed in other ways, but arrogance was still a root-cause issue. He was talking the same "rah rah, Intel can do anything" cheerleader stuff decades back as well.

Only folks drinking Intel kool-aid ever thought Larrabee might work.

When Gelsinger got back, the GPU business was bleeding billions and the fab business was bleeding billions. Neither one was in very good shape, and Intel couldn't afford for both to burn money while the other product businesses were about to lose substantive market share. What wasn't 'on fire'? Intel had been a leader in fab. They had never been a leader in dGPUs (or in GPU revenue; they had unit volume in iGPUs that were bundled, but unbundled is a different value proposition).

A GPU vendor in the consumer market sense? Probably not.


The timeline to product being about three years out seems to indicate that this is Tan's call at least as much as Pat's. It could be one of Pat's last calls that Tan is riding with. IF Intel is going to lean on Nvidia to make M-series Pro/Max and AMD xxx Halo class top-end laptop 'all in one package' solutions, then the consumer discrete market is likely done.

An Intel inference "GPU" for the datacenter, since that is a 'print money' market for the foreseeable future? That will probably get green-lit by Tan. But the quagmire of endless gamer-optimized driver updates to chase quirks in a steady stream of new and old games? Probably not, with the limited amount of money Intel has now.

Some folks have spun the Intel-Nvidia deal as the total death of Intel iGPUs. Unless Intel is suicidal, that probably isn't true. Tacking a mid-to-upper-mid-range GPU die onto a CPU tile/chiplet in a single package probably will be a 'thing' for the laptop and smaller-desktop market going forward. However, that is mainly mapping discrete GPUs down into chiplets/tiles. Monolithic or small tile/chiplet CPU/GPU die designs are a space Intel did reasonably well in before taking a stab at dGPUs.

Intel has a chance if they carefully pick a subset of GPUs to do, where they can get some traction without tons of deep overhead. Once they get that stabilized, then perhaps incrementally broaden coverage. (E.g., if Nvidia gets even more 'bored' with consumer GPUs, go after the lowest end and stop. Subsume that, and then maybe move on to the next submarket.)

To get traction in GPUs, you need to have both hardware and software. Gelsinger's time at VMware didn't seem to give him better software insights (more of the same: run a monopoly player forward on a monopoly-player path). I think Tan is better at picking out what Intel does badly, because he didn't grow up inside Intel and has seen software, hardware, and blends of the two as companies.

P.S.
" ... . Eric Demers, who designed ATI's best GPUs and spearheaded almost all of Qualcomm's Adreno designs, has now joined Intel's GPU team "with a focus on AI." ..."

Linked off that article is an older one about how the head of AI / CTO at Intel had bolted to OpenAI. (He had been pulled over from Networking and Edge, so this 'replacement' is a hire with a more nuanced GPU background.)

Not sure how much Demers had to do with Qualcomm's inference card offerings, but that's likely a better tie-in than Tom's Hardware's article doing extensive reminiscing about old ATI/AMD cards (or trying to promote the notion that Nvidia is quaking in their boots. Probably not). More likely this is about him being experienced enough to pick and guide a team to do the work (e.g., filter 'smoke' from someone not delivering, or point out methods that probably won't pan out) rather than doing it himself.

Adreno has been all about lower-power, on-main-die iGPU solutions. Some 300-400W, power-sucking, mega-gamer bragging-rights GPU coming soon? Probably not.
 
I don't believe there's any evidence that that's true, and that "could" is doing a hell of a lot of work.
Interesting comment. What industry do you work in? To me the evidence is already overwhelming, still growing, and shows no signs of stopping its growth. And I say that with no financial stake in AI other than holding some S&P index funds, haha.
 
Interesting comment. What industry do you work in? To me the evidence is already overwhelming, still growing, and shows no signs of stopping its growth. And I say that with no financial stake in AI other than holding some S&P index funds, haha.

Your personal opinion isn't "evidence," is all I'm saying. You can't make a statement like "it's also true that AI is a genuinely transformative technology," back it up with the equivalent of "trust me, bro," and expect people to take that seriously.

And FWIW, I'm an attorney, and AI has done absolutely nothing for the field other than getting a lot of attorneys and law firms severely sanctioned for citing fictitious cases.
 
And FWIW, I'm an attorney, and AI has done absolutely nothing for the field other than getting a lot of attorneys and law firms severely sanctioned for citing fictitious cases.

That isn't true. AI isn't directly replacing lawyers or allowing them to sleepwalk through writing briefs for court. But running discovery on a huge pile of documents to find bits of relevant data faster? That is already being done.

There are two general approaches to using the new AI tools. First, to be even lazier (do less work and get paid the same or even more money), somewhat like leveraging unpaid, slave labor. There were bad, lazy lawyers before AI and there will be after.

Second, use it to augment work, perhaps by farming out some long, relatively boring, but necessary tasks. More affordable quality improvements, etc.

It's probably going to be a bit like law firms before and after electronic word processing: the ratio of aggregate personnel to tasks is substantively different. Lawyers are still present (not necessarily a decrease in junior associates, since those are typically bled away over time at law firms anyway), but the workload is distributed differently.



Will the basic structure of law firms change? Probably. But if productivity improvements come, the expectation from clients is going to be that they see some of those productivity improvements also. Corporations that have in-house counsel who are more productive probably are not going to outsource as much basic work.


Certainly there is the factor that lawyers tend to charge by the hour and don't want to be more efficient, in order to pad revenue. That likely means slow adoption, but "absolutely nothing" is also likely not universally true. A short-term, fantastical productivity boost that justifies spending money on data centers at drunken-sailor levels is overblown, but "no impact whatsoever" is ignoring the evidence also.
 
They could use Intel for the secondary chips that don’t require cutting-edge manufacturing.

Intel doesn't have lots of breadth in fab options for external parties. They revamped their legacy 14nm process into something that is being sold as 12nm now, but the public design kits they offer are much narrower than what the mature foundries have (you can design for 9-12 different fab processes at TSMC). IMHO, folks are way too myopically focused on some kind of 'exclusivity'-driven, 'cutting edge' manufacturing as opposed to just manufacturing.

18A is more advanced than N3B.


For the flagship chips, though, they will need TSMC.

Flagship in what sense? Some folks use 'flagship' for the most expensive system or the most expensive chip. The datacenter chip Apple is reportedly about to make would be the most expensive; the Ultras have been up to now.

Flagship in the strategic sense? The biggest-revenue chip, the iPhone's, probably won't go to Intel, because I don't think Intel has that kind of capacity flexibility: super-peak demand in the initial 3 months, plus other 3rd-party users of the process piling in in years 2-4 when the volume goes substantially down. Apple doesn't throw A-series chips in the trash can every 12 months at all.


PS: how are GlobalFoundries doing?

Tariffs are probably helping GlobalFoundries far more than they are helping Intel. By acquiring MIPS and other design IP (e.g., recently some RISC-V IP), they are drifting into a state similar to Intel, where design and fabrication are housed under one corporate umbrella. Nowhere near the same ratios, though. Healthy enough, though, that Intel doesn't get automatic wins with their 12nm solution.

They are adapting to continue to survive. However, if there is a silicon fab bust after this boom and the protections disappear, it could be a bumpy road. US sanctions against Chinese semi firms' entry into the EUV space mean things just get more competitive for folks left in the DUV space over the long term; Chinese firms have little choice but to get better in the DUV space, because the sanctions force them to stay there.
 
That isn't true. AI isn't directly replacing lawyers or allowing them to sleepwalk through writing briefs for court. But running discovery on a huge pile of documents to find bits of relevant data faster? That is already being done.

There are two general approaches to using the new AI tools. First, to be even lazier (do less work and get paid the same or even more money), somewhat like leveraging unpaid, slave labor. There were bad, lazy lawyers before AI and there will be after.

Second, use it to augment work, perhaps by farming out some long, relatively boring, but necessary tasks. More affordable quality improvements, etc.

It's probably going to be a bit like law firms before and after electronic word processing: the ratio of aggregate personnel to tasks is substantively different. Lawyers are still present (not necessarily a decrease in junior associates, since those are typically bled away over time at law firms anyway), but the workload is distributed differently.



Will the basic structure of law firms change? Probably. But if productivity improvements come, the expectation from clients is going to be that they see some of those productivity improvements also. Corporations that have in-house counsel who are more productive probably are not going to outsource as much basic work.


Certainly there is the factor that lawyers tend to charge by the hour and don't want to be more efficient, in order to pad revenue. That likely means slow adoption, but "absolutely nothing" is also likely not universally true. A short-term, fantastical productivity boost that justifies spending money on data centers at drunken-sailor levels is overblown, but "no impact whatsoever" is ignoring the evidence also.

Even taking all this as true, nothing here is "genuinely transformative" to the practice of law. At most it'll create a modest productivity boost, probably similar to the rise of word processors (which, of course, arrived in law like a decade after everywhere else). I'll never be able to type to ChatGPT and have it write an entire brief for me, which is what I spend the vast majority of my time doing.
 
Well since the pandemic, it seems like the ignorant/uneducated population of America have banded together to create a super cabal of stupidity, where basically nothing is true and conspiracy theories rule. I’m not sure if it was the constant rejection from traditional intellectual crowds online or if they just needed a leader to take advantage of their idiocy. The psychology of it is quite pathetic, as they’ve created this illusion of intelligence that will doom us all.
WEF 2030 isn’t conspiracy at all. They are very open about car-less cities, digital ID, own nothing and be happy. It doesn’t take “conspiracy theorists” or lack of listening to the “intellectual crowds” to see where that leads. As a physician I am utterly appalled at the lack of critical thought in my field. Covid and the abhorrent “guidelines” and “evidence-based” policies (we now know were routed in utter nonsense like social distancing 6 ft, masking outdoors, closing down gyms and beaches) etc. The “intellectuals” and “adults in-charge” did a real bang up job the past several years.
 
Was it really Gelsinger who was at the core of the anti-"fabbing mindset" problem?

Not so much an anti-fabbing mindset as an I-don't-know-how-to-get-this-thing-going mindset. Or at least that's the impression I got of him in that seat.

The board of directors who ran Intel into the ground ... who had to go out and hire Pat to perhaps pull off a turnaround ... still had a cabal of dingdongs who were trying to simultaneously sell off the fab and underinvest in it at the same time, to goose the stock price higher. (Yes, Tan was on the board for a while, but there was also a faction of entrenched folks who just wanted money to flow out of the business, not the long-term health of the business.)

I'm far from exonerating the board of their failures. But wrt the fabbing business, the "champion" they chose to get it going was about as likely to get the job done as a random pick off the street. IMO.

Skeptical, or a realist about Intel's approach to expanding in the GPU market? Pat Gelsinger was in charge of Larrabee (set the time machine to 2008).

Yes, I know, I was there. At first observing the Larrabee bubble, and subsequently with my skin in the Knights, erm, Corner (no pun intended), and the Knights Landing zone. As much as I have respect for many of the engineers who worked on the original MIC concept (mainly 3DLabs talent), the thing was so misguided it's not even funny from today's standpoint. It was "Transformers: The x86 Movie," with all the cringe and bravado Michael Bay's movies are known for. And Pat had a key role in this entire... disaster. For which he was banished. Many thought for life. Silly them.

When Gelsinger got back, the GPU business was bleeding billions and the fab business was bleeding billions. Neither one was in very good shape, and Intel couldn't afford for both to burn money while the other product businesses were about to lose substantive market share. What wasn't 'on fire'? Intel had been a leader in fab. They had never been a leader in dGPUs (or in GPU revenue; they had unit volume in iGPUs that were bundled, but unbundled is a different value proposition).

Pat's return was supposed to boost the troops' morale, to regroup the battle formation for the next major initiative. The once-banished general, the Steve Jobs of Intel... Erm, yeah. No.

A GPU vendor in the consumer market sense? Probably not.

I'd say Intel Arc has been surprisingly competent. Do keep an eye on that space.

The timeline to product being about three years out seems to indicate that this is Tan's call at least as much as Pat's. It could be one of Pat's last calls that Tan is riding with. IF Intel is going to lean on Nvidia to make M-series Pro/Max and AMD xxx Halo class top-end laptop 'all in one package' solutions, then the consumer discrete market is likely done.

An Intel inference "GPU" for the datacenter, since that is a 'print money' market for the foreseeable future? That will probably get green-lit by Tan. But the quagmire of endless gamer-optimized driver updates to chase quirks in a steady stream of new and old games? Probably not, with the limited amount of money Intel has now.

Some folks have spun the Intel-Nvidia deal as the total death of Intel iGPUs. Unless Intel is suicidal, that probably isn't true. Tacking a mid-to-upper-mid-range GPU die onto a CPU tile/chiplet in a single package probably will be a 'thing' for the laptop and smaller-desktop market going forward. However, that is mainly mapping discrete GPUs down into chiplets/tiles. Monolithic or small tile/chiplet CPU/GPU die designs are a space Intel did reasonably well in before taking a stab at dGPUs.

The thing is, Intel's Arc GPU is now in both their dGPU and iGPU lineups. They know they don't have the resources for more than one architectural line of GPUs, and they are focusing on that. Again, keep an eye out for them, as they have some world-class talent, not just Demers (who, as you noted, seems positioned more in the datacenter AI segment).
 
Your personal opinion isn't "evidence," is all I'm saying. You can't make a statement like "it's also true that AI is a genuinely transformative technology," back it up with the equivalent of "trust me, bro," and expect people to take that seriously.

And FWIW, I'm an attorney, and AI has done absolutely nothing for the field other than getting a lot of attorneys and law firms severely sanctioned for citing fictitious cases.
OK, that's fair. It's very interesting to hear your field has not seen any real 'disruption' yet. And I certainly understand the burden of proof would be on me to back up my claim. In my defense I was never asked to provide "hard" evidence, but I certainly could. (I sometimes assume everyone reads the same news that I do, which is a rookie mistake by me because that's almost never true on the internet, haha.) I don't think presenting sufficient quantities of hard evidence is worth my time over a forum like this, but please accept some of my informed perspective (with a few links for sanity) if you're curious...

I'm an engineering manager at a large public medical/research university. Here are the areas I touch closely every day:
Engineering - design of facilities and utilities infrastructure
Medical - top hospital system in the entire state, my primary stakeholder
Research - top 25 nationally in biomedical, agricultural, cancer, etc.
University - ~40k undergrad students
Administration - all this under a single (complicated) leadership umbrella

Engineering is being completely transformed, and it all starts at the computer science level (which flows into everything else). With AI, coding has gained an entirely new abstraction level for the first time in decades. Either this means staffing can be drastically reduced, or capabilities and throughput can be drastically increased. Agentic AI is now getting very good at solving difficult math/science problems as well. This reflects mostly in development and response speed, as things like rapid prototyping (manufacturing) and market/network forecasting are getting a huge shot in the arm, resulting in AI solutions being adopted by even typically late-mover legacy industries like utilities.

Medical fields are also markedly different already. Patients are literally watching their doctors type their symptoms or upload their imaging straight into an AI model for help with unbiased differential diagnosis, to double-check drug interactions, to aggregate results of the latest research trials, etc. (Heck, patients are skipping the real doctors and going straight to AI because it has pretty good manners/empathy and doesn't make you wait or feel rushed, lol.) This is information that was all previously available, but the speed and accuracy at which it can now be obtained has gone through the roof with AI.

I could go into the other three areas I mentioned at length, but this is getting verbose. I can attest first-hand that they are all experiencing tectonic shifts as a result of the things I have already described. Think of how Google, YouTube, Wikipedia, et al. changed education and research--this is just as big a shift, and still getting bigger. Not to mention, things like driverless cars and robotic drones are actually taking usable form at scale thanks to AI/ML... or that the best traditional hurricane prediction models are starting to get obsoleted by AI-based analysis.

Not me saying, "You don't have to take my word for it" (Reading Rainbow style!)... :D
 
How's that been going for Intel over the last 15-odd years? Why did AMD divest itself of its fab business? How are the fabs TSMC has been building in Arizona doing? In all cases: unsolvable delays, staffing problems, cost overruns, process failures.

Fabs are some of the most complex process-factories humankind has ever created. Consolidation is what makes them profitable, and what focuses the investments, knowledge, skilled labor, and risk mitigation into a functional organization. The alternative is setting great big piles of money on fire for zero gain.
And all that doesn't mean much if they can't get the chips at a reasonable cost, or at all. That said, Apple is a big enough customer that it can get to the front of the line. Also, they are already in the fab business: remember, they are investing in TSMC. And it's not necessarily to get the plant running; it's to secure a pipeline with TSMC by giving them dollars and to appease political pressure to spend manufacturing money in the US. I don't think the operational portion of the Arizona plant is the big picture here. I thought they were investing in 4-5 fabs in the US.
 