but rather switching from Intel to TSMC.
TSMC is the fabricator; Apple does not have the foundries to produce the actual CPUs. They design the A-series products in-house. Intel both designs and produces its chips, since it has the capability to do both.
 
Just look at the keynote. The same A12Z chip from the iPad Pro ran Tomb Raider smoothly in an unoptimised state, while an equivalent Intel chip in a laptop would run the same game at single-digit FPS.

What Apple silicon will likely offer is way better performance, at lower power consumption (longer battery life) while producing far less heat (which allows for thinner and lighter form factors, and better sustained performance).

It also gives Apple the freedom to customise the chip for whatever new feature they want to include in their laptops.

It's going to be better in every way.
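As an aside on that demo: the Tomb Raider port shown was reportedly x86_64 code running under Rosetta 2 translation, not a native build. Apple documents a sysctl (sysctl.proc_translated) that lets a process check whether it is being translated; a minimal Swift sketch of that check, just to illustrate the mechanism:

[CODE=swift]
import Darwin

// Ask the kernel whether this process is running under Rosetta 2
// translation. Returns true if translated, false if native, and nil
// if the sysctl doesn't exist (e.g. a system without Rosetta).
func isTranslatedByRosetta() -> Bool? {
    var translated: Int32 = 0
    var size = MemoryLayout<Int32>.size
    guard sysctlbyname("sysctl.proc_translated", &translated, &size, nil, 0) == 0 else {
        return nil
    }
    return translated == 1
}

if let translated = isTranslatedByRosetta() {
    print(translated ? "Running translated under Rosetta 2" : "Running natively")
} else {
    print("Translation status unavailable")
}
[/CODE]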

I will laugh if it's the mobile version of Tomb Raider that got featured at the keynote LoL
 
That would require just as much work. (A lot of pro Mac apps crash on AMD CPUs.)

What kind of nonsense is that statement? Apple has never made an AMD Mac, so anything running on that chip is a hack at best. There are millions of AMD computers humming away with software.
 
TSMC is the fabricator; Apple does not have the foundries to produce the actual CPUs. They design the A-series products in-house. Intel both designs and produces its chips, since it has the capability to do both.

I know. But you mentioned 14nm, which has to do with fabrication and isn't really related to the underlying architecture or design/layout. If process node is what's important to Apple, then the key to this transition is TSMC, not ARM or Apple's in-house designs.

I believe only Samsung and TSMC are currently fabbing large-scale 7nm chips, and TSMC is the only one nearly ready to fab 5nm chips at a good yield. No other fab is doing 7nm or 5nm at scale. So it doesn't really matter who is designing what chips - if TSMC isn't fabbing them, they're not going to be at a cutting edge process node.
 
I know. But you mentioned 14nm, which has to do with fabrication and isn't really related to the underlying architecture or design/layout. If process node is what's important to Apple, then the key to this transition is TSMC, not ARM or Apple's in-house designs.

I believe only Samsung and TSMC are currently fabbing large-scale 7nm chips, and TSMC is the only one nearly ready to fab 5nm chips at a good yield. No other fab is doing 7nm or 5nm at scale. So it doesn't really matter who is designing what chips - if TSMC isn't fabbing them, they're not going to be at a cutting edge process node.
FWIW, Intel's 10nm process is roughly equivalent to TSMC's 7nm process. Though the Intel 10nm chips don't seem that great, which tells me that Intel's designs are another problem.
 
Refresh my memory... which Macs use Skylake?

The original Skylake would be in the 2016 MBPs; Kaby Lake (2017 MBPs) and even the Coffee Lake CPUs (2018-2020 MBPs) still use Skylake-architecture cores. Even 10th Gen Comet Lake uses Skylake-based cores. All are on 14nm nodes. I suppose it didn't help that all the exploits and subsequent software/firmware patches nerfed performance, in some cases significantly depending on workload, with some not completely patchable.


Only the 10th Gen Ice Lake and Tiger Lake parts actually use a newer-generation architecture and a 10nm node. Due to low 10nm yields, it looks like that node is only being used for quad-core Ice Lake/Tiger Lake chips. Clock for clock these new chips are faster per core, but they run at lower clock speeds due to the immaturity of the 10nm node, so there isn't much net gain, depending on the benchmark/test.

They are supposed to release Rocket Lake 11th Gen 45W 8-core CPUs by the end of the year, backporting the Ice Lake or Tiger Lake architecture to 14nm. Now, with Macs moving away from Intel, I suppose it doesn't matter much for the next-gen 16" MBP and others.
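If you want to check which core family your own Mac actually has, the CPU brand string is exposed through sysctl. A minimal Swift sketch (the output shown in the comment is illustrative, not from any particular machine):

[CODE=swift]
import Darwin

// Read a string-valued sysctl: the first call reports the needed
// buffer size, the second call fills the buffer.
func sysctlString(_ name: String) -> String? {
    var size = 0
    guard sysctlbyname(name, nil, &size, nil, 0) == 0 else { return nil }
    var buffer = [CChar](repeating: 0, count: size)
    guard sysctlbyname(name, &buffer, &size, nil, 0) == 0 else { return nil }
    return String(cString: buffer)
}

// On an Intel Mac this prints something like
// "Intel(R) Core(TM) i7-8850H CPU @ 2.60GHz"; the generation can be
// read off the model number. Apple Silicon reports the Apple chip name.
print(sysctlString("machdep.cpu.brand_string") ?? "unknown CPU")
[/CODE]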
 
"When your customer starts finding almost as much bugs as you found yourself, you're not leading into the right place."

Back when I was a project manager (mostly for web pages and apps, but still), there were some projects where the customer reported lots and lots of small bugs and inconsistencies within weeks of delivery. Mostly stuff that the team - myself very much included - should have caught easily and fixed before deploying. Not my proudest moment(s).

One (of many) reasons I switched to teaching a few years back. ;)
Well hey good on ya for having the humility to take a step back and assess your career and realize that wasn't where you wanted to be! That's better than most people!
 
FWIW, Intel's 10nm process is roughly equivalent to TSMC's 7nm process. Though the Intel 10nm chips don't seem that great, which tells me that Intel's designs are another problem.

Setting aside equivalence, I think it's about timing. TSMC was mass-producing 7nm at good yields in early 2018, and was supplying Apple, AMD, Qualcomm, Huawei, and others by Q2/Q3 2018. Intel wasn't producing 10nm at high volume until late 2019. So TSMC had a solid ~1.5-year lead on Intel, if not more. Indeed, arguably Intel still isn't producing 10nm at high enough volume, as it's limited to only some of the latest mobile CPUs - not even all mobile CPUs, and no desktop/server CPUs.

As to the x64 market, this is why AMD Ryzen made such a big splash last year. Being 1-2 years behind is a huge deal in that industry.

Some experts are now saying Intel is years behind TSMC in fab technology. And Samsung isn't too far behind TSMC (their Austin TX fab is seriously impressive).
 
If x86_64 MacBooks are somehow discontinued, then I'll be forced to buy another laptop when that happens.
The main reason I chose a Mac in the first place is its multi-platform compatibility, NOT virtualization.

Also, casual gaming via Boot Camp will essentially be dead at that point...
 
If x86_64 MacBooks are somehow discontinued, then I'll be forced to buy another laptop when that happens.
The main reason I chose a Mac in the first place is its multi-platform compatibility, NOT virtualization.

Also, casual gaming via Boot Camp will essentially be dead at that point...

Well, they are definitely going to be discontinued, so get used to the idea of buying another laptop. Though I don't understand that argument, since you aren't going to find multi-platform compatibility that includes macOS if you buy a Dell or whatnot.
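For what it's worth, the transition path for Mac apps is universal binaries: the same source compiled into both an x86_64 slice and an arm64 slice in one bundle. A minimal Swift sketch of how architecture-specific code is isolated at compile time:

[CODE=swift]
// A universal (fat) binary contains both an x86_64 and an arm64 slice;
// the same source is compiled once per architecture. Conditional
// compilation isolates the architecture-specific parts.
#if arch(arm64)
let sliceArch = "arm64 (Apple Silicon)"
#elseif arch(x86_64)
let sliceArch = "x86_64 (Intel)"
#else
let sliceArch = "unknown"
#endif

print("This slice was compiled for \(sliceArch)")
[/CODE]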
 
From what I’ve seen in the last few days, Apple’s desire to move to their own silicon goes far beyond performance and power requirements. The features that Apple has been able to add into their chips give them control over the end product features. No more waiting to get beyond RAM limitations or Wifi version limitations or whatever it may be that they depended on Intel for. As has been mentioned elsewhere, ‘ARM’ is only a small part of the SoC story.

Intel may have been releasing iterations of their processors on an annual basis, but not only have the changes been negligible, they've been launched in dribs and drabs that only cover certain market segments. I've lost track of which generation is available for which segment (low-power laptop, mid-power, high-power, desktop, etc.).
 
Thanks for the good laugh...
Are you blind or just willfully ignorant?
And it's their perfect right to do so, as that has always worked out well for them and for their customers.
???
Folk constantly complain that Macs have become stagnant. When your timetable is set by a third party, that creates issues: you end up as the dog being wagged by someone else's tail.

Sure AMD are ahead today, but Apple don't look at today, they look at tomorrow, and if tomorrow looks anything like today then they will want to change it.

Welcome to the new tomorrow.
AMD's roadmap involves actual change and progression. Intel's roadmap has been vomiting out slightly different chips year after year and rebranding them.
There was a time when IBM was innovating, but that stopped. Intel was innovating for a while, then that stopped. AMD is innovating right now, but that too will eventually hit a wall. Apple has been at the mercy of three different chip manufacturers over its long existence and probably doesn't want to risk it again. By controlling its own processors, Apple will control its own destiny (for better or worse).

If it wanted to, Apple could probably buy AMD and absorb its tech. But I think Apple has a different vision of where they're going.
I get that they want to control their own chip progression. I just don't know why they've stuck with Intel for so long. Intel has had their heads up their butts for years.
Tile-based rendering has been supported on desktop GPUs since Nvidia's 900 series and AMD's Vega series. It has already been brought to the desktop.
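For anyone unfamiliar with the term: "tile-based" means geometry is first binned into small screen-space tiles, so each tile can then be shaded out of fast on-chip memory. A toy Swift sketch of just the binning step (illustrative only - not any vendor's actual implementation; the Triangle type and tile sizes are made up for the example):

[CODE=swift]
import Foundation

// Toy illustration of the binning step in a tile-based renderer: each
// triangle is assigned to every screen tile its bounding box touches,
// so shading can later run tile by tile in on-chip memory.
struct Triangle {
    let xs: [Double], ys: [Double]   // three vertices, pixel coordinates
}

let tileSize = 32
let screenWidth = 256, screenHeight = 256
let tilesX = screenWidth / tileSize, tilesY = screenHeight / tileSize

func bin(_ triangles: [Triangle]) -> [[Int]] {
    // One bucket of triangle indices per tile.
    var bins = Array(repeating: [Int](), count: tilesX * tilesY)
    for (i, tri) in triangles.enumerated() {
        // Conservative bounding-box overlap test, clamped to the screen.
        let minTX = max(Int(tri.xs.min()!) / tileSize, 0)
        let maxTX = min(Int(tri.xs.max()!) / tileSize, tilesX - 1)
        let minTY = max(Int(tri.ys.min()!) / tileSize, 0)
        let maxTY = min(Int(tri.ys.max()!) / tileSize, tilesY - 1)
        guard minTX <= maxTX, minTY <= maxTY else { continue }  // off-screen
        for ty in minTY...maxTY {
            for tx in minTX...maxTX {
                bins[ty * tilesX + tx].append(i)
            }
        }
    }
    return bins
}

let tri = Triangle(xs: [10, 90, 40], ys: [10, 20, 80])
let bins = bin([tri])
print("Triangle 0 touches \(bins.filter { !$0.isEmpty }.count) tiles")
[/CODE]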

Only in the CPU space. The GPUs have been lacking for years.
AMD is dominating the mid-tier GPU market right now. I wish they'd compete in the high end space, but I don't know if that's going to happen anytime soon. I wasn't referring to their GPUs though. It's irrelevant and Apple has been using nothing but AMD GPUs for some time now.
 
I wouldn't be surprised if it was Steve who, after the iPad was released, drove the A-series engineers to take it to the point of taking over ALL Apple products.

The move to Intel was necessary at the time.

But if I remember correctly Steve said (to Motorola): "I can't wait until I don't need you anymore." I don't think that was exclusive to them.

I believe Steve ALWAYS wanted to control EVERYTHING.

Apple is taking his vision to its logical place. And the iPad and iPad "Pro" were the test beds to show them: "Hey, we can do this. We can do this NOW."

Time will tell, but it looks like Apple's horses were pulling the right cart all along.

Man, the '20s are going to be (even more) INTERESTING.
 
Well hey good on ya for having the humility to take a step back and assess your career and realize that wasn't where you wanted to be! That's better than most people!
I know. The stress level as a teacher is better (not necessarily less, but much more manageable).

Ironically, I was quite adept at finding even the weirdest bugs back when I was "just" a developer; I think that maybe suited me better as a person. ;)

To keep this (slightly) on topic: I'm looking forward to seeing what Apple can do with their own Apple Silicon; I think maybe we're in for a treat... I might even consider ditching my (newish) 2019 MBP if the new machines are as good as I expect them to be.

Intel has disappointed over and over again the last couple of years, and this anecdotal evidence of (very) poor quality control only amplifies that.
 
Well, they are definitely going to be discontinued, so get used to the idea of buying another laptop. Though I don't understand that argument, since you aren't going to find multi-platform compatibility that includes macOS if you buy a Dell or whatnot.
This, turned up to 11.

Getting a computer is like getting a gaming console. You go to the platform of choice for its exclusives...

So if you need Windows, why buy a Mac at all? As a student I needed to integrate into different scenarios, so cross-platform compatibility was a necessity. The Mac made it so I could have just one machine. But Windows 10 is more than capable of handling any computing you need.

In the pro world, how many people BYOD? Usually you get issued what you need, right?

The dual-boot Mac thing was convenient, for sure, and it served its purpose when Microsoft was "the enemy", up until recently.

But it really doesn't make sense from Apple's perspective today. We have Office for Mac, iPad, you name it. Third-party apps of every kind are available on both platforms, not to mention iOS and iPadOS.

Things have changed considerably in recent years. The rise of iOS has been unprecedented (look at Big Sur!)

Even the iPad went from something I could not realistically use to my Macs' replacement, this year.

Maybe it's time to let the old ways die. ;)
 
If they switched to AMD, they'd be in the same position they were with Intel, IBM and Motorola; Apple would have to wait for the other company to innovate. By controlling their own silicon (which could only have happened as Apple's fortunes grew), Apple will now be able to set their own roadmap for innovation.
I don't think innovation is the driver. A walled garden with maximised profits that don't need to be shared with outside suppliers seems like more of a motivator to me.
 
I'd say it's hard to believe a company as old and established as Intel is shipping buggy CPUs, but then they've screwed up before, and if Boeing can forget how to both properly design and properly manufacture airplanes, Intel flubbing a CPU isn't a stretch.

My org had this exact experience on a vastly smaller scale: You're buying a product from a well-established vendor, then you start finding bugs in what's supposed to be a polished industrial device, and at some point you start thinking "I'll bet we could do better than this ourselves."

Then at some point, you come to the realization that not only could you, it just makes more sense to ditch them and go DIY. When you're Apple, that consists of ditching the best-known chipmaker on the planet for your own CPUs.
I love stories like this that demonstrate that while capitalism might be a terrible system, it is better than any other.
I don't think innovation is the driver. A walled garden with maximised profits that don't need to be shared with outside suppliers seems like more of a motivator to me.
I believe the correct terms are “vertical integration” and “control of the whole widget to control your own destiny”.
A “walled garden” concept is much more limited. Whether an open, generic system of hardware components dependent on the innovation of others is more or less profitable overall than an internally designed and built product is debatable, and no doubt dependent in the end on how good the product is.

And if it succeeds.

Doing it all yourself certainly carries much greater risk, though. You would hope that such an ambitious enterprise would bring greater reward if it can actually be pulled off.
 
I'll never forget just how bad my 2015 27" iMac with Skylake was... the machine would randomly restart at any given moment. I was looking forward to using this machine a lot since it was my first desktop, only to be let down by a vicious bug. I'm still grateful to this day that the Geniuses just traded it for a new one on the spot (of course, after I'd brought my Skylake Mac in 6 or 7 times for repair).
 
He's right, and Apple has also finally optimised the chips enough to show us a Mac mini running an iPad Pro chip. Imagine what they can do with a chip designed and optimised for the desktop!

But this is also about including AI, security, 5G, FPGAs, etc. in their computers, and that is just the future. It will become harder and harder to differentiate iPads from Macs, and they will be much more secure than Intel-based computers.
It's got an FPGA? What is it used for?
 
"Following Apple's announcement about its switch to custom silicon, Intel said it will continue supporting the Mac through its transition, but insisted that its processors are still the best option for developers."

Intel can only say this because their chips are more compatible worldwide. Over time, when all Macs have transitioned, even the Mac mini, Intel will lose its grip on Apple. I realize you can always blame the "next Intel CPU" as the reason you're leaving to make your own chips, but Apple wanted Skylake to deliver. The fact that they chose to stick with it for so long makes the decision to go to ARM all the more significant.
 