The real question is whether Apple can continue to deliver major improvements if the process node can't shrink below 3nm.
In their iPhone SoCs, Apple has devoted a fair amount of effort to adding custom coprocessors. Neural engines, audio processors, and various other functions get silicon allocations. Even if process improvements are slow to come, there are other performance enhancements to be made, and this is where closely integrated hardware/software becomes a real advantage. It makes it much easier to add and drop functionality. I'm pretty sure that Apple would have dropped the segmented memory models of x86 and the pure MMX instructions by now, for example, but Intel keeps that functionality generation after generation. If you're comfortable dropping functionality in the future, you can be more comfortable adding it now.
 
7nm is shipping about when they said it would

Sorry, is 7nm shipping already? No.

They are claiming that 7nm is 1 year behind. Right now. They were claiming 10nm was due in 2016. Until it is actually out, you can't say it is "shipping when they said it would".

When I say 10nm is "barely out" - I mean that it is barely capable of being used for low volume (like it or not, the Ice Lake parts are not the bulk of their sales, or even anywhere near the bulk of their laptop sales) and only able to produce 4-core parts, at clocks that are nowhere near what they can attain with 14nm++++. And this is 4 years after it was meant to be out.

They're putting these 10nm parts out because it's about all they can produce in order to claim to shareholders "see! we shipped 10nm 'in volume'!". After lying about the 10nm release date for 4+ years!

Never mind that if Ice Lake (the architecture) weren't 15-25% faster than Skylake (the architecture), the 10nm process would have made it SLOWER and less efficient than Intel's own 14nm parts.

This is why they're still using 14nm for the ultra low power stuff like Atom.
Intel is probably going to ship 7nm in late 2021. Just not in large volume for the mainstream market. High end GPU's ( Xe-HPC Ponte Vecchio ) and perhaps some other relatively very high priced FPGA models.

You might want to look up Intel's latest plans for Ponte Vecchio. They're outsourcing production of "some of the tiles" to other fabs. Given the release date, and given that Intel's own 7nm is likely not going to be available, either Intel is lying about the release date or, more likely, they're getting another fab to make the 7nm components of it.

Intel will likely make the 14nm IO die for it, but not the smaller process parts.
 
Much of the stuff out there focuses on CPU (and GPU) performance. My bigger concern is battery life and a lot of that is more dependent on speed and power shifting algorithms etc, not just paper TDP numbers. This is especially true for much of the business market where even previous generation Y class chips usually provide sufficient performance.

We shall see once the actual products come out but I get the impression that for this AMD is depending heavily on process advantage as opposed to design advantage.
Most reviews also concentrate on battery life.
AMD's Renoir APUs are around 50% more efficient than Intel's mobile chips. Their 25-35W 8-core APUs have similar power consumption to Intel's 15W chips in day-to-day, light tasks.

AMD's TDP numbers are very close to the actual power consumption of their chips, way closer than Intel's TDP numbers anyway.

AMD is not depending heavily on process advantage, their CPU and GPU architectures are also excellent.
 
Apple 1.5 trillion vs not cash rich AMD. Tough call.

So what? Money isn't everything.
AMD is a chip company first and foremost and they did accomplish some impressive things with a small budget.
Now that they have money I don't see them having problems competing with anybody.

Apple. Leading edge cpu and gpu design in a market that Intel, Nv, AMD can't compete in.

Apple fail at producing even more powerful A class chips? (Have they failed...with any of them?)

We don't know if they can't compete in mobile. AMD's CEO did say they wanted to concentrate on what they do best, which is high-performance server-grade CPUs. Realistically the ship has sailed, so there's no reason for AMD or Intel to throw money into the smartphone chip market.

AMD was a punch-drunk boxer who only got off the canvas because Intel sat on their process lead and then dropped the ball.
If Apple moves into traditional computer areas...Intel and AMD are going to have to compete...actually compete and up their game.
Azrael.

AMD a punch drunk boxer? Really? LoL
Actually AMD's architecture and CPU's still would have been great no matter what Intel would have done.
Apple is actually a small player in the traditional computer area, so they are the ones that will have to compete. Intel has way more influence than Apple in the computer industry, for example, so beating their CPUs in performance is not enough.
 
I suspect they were. There have been references to AMD chips in macOS beta builds for Catalina.

The catch is that AMD hasn’t exactly been as phenomenal when it comes to landing the products. Some issues plagued the 3000-series at launch, or may still impact them today. These are all things that I encountered with my gaming PC I built around the launch of the 3600:
  • There were issues with 3000-series chips not reaching advertised boost clocks. AMD had to issue a microcode fix.
  • There were issues with the 3000-series chips boosting under very low loads because it was set to be too sensitive.
  • AMD shipped a broken RDRAND instruction that had to be fixed by microcode. It didn’t break a lot of software, but it just looks very sloppy.
  • AMD’s turnaround on these issues wound up being on the order of 1-2 months to resolve the day one issues.
  • The 3600 in my case after all the fixes still idles at higher temps than a 2600 I also had and seems to boost with Windows doing not a whole lot.
The chips themselves are good performers, but when Apple’s likely upset with Intel’s QC, and wanting more power efficient chips... the 3000-series isn’t a great counter-example. If Apple was getting engineering samples from AMD, they may have made the final call which way to go based on that experience.

Enthusiasts are more willing to put up with this sort of thing to get cheaper, faster chips. But I can’t imagine Apple doing that.

Well, taking into consideration that Zen 2 was a new architecture on a completely new node, those problems aren't that big or that many.
"Apple's desktop silicon" can easily have more problems than that.
The chips themselves are good performers, but when Apple’s likely upset with Intel’s QC, and wanting more power efficient chips... the 3000-series isn’t a great counter-example. If Apple was getting engineering samples from AMD, they may have made the final call which way to go based on that experience.

Enthusiasts are more willing to put up with this sort of thing to get cheaper, faster chips. But I can’t imagine Apple doing that.

LoL AMD's 3000-series aren't a great counter-example for better efficiency? Even if the 16 core 3950X uses similar power to an unconstrained 9900k which is an 8 core CPU? Even if AMD Renoir 25W APU is as fast or faster than Intel's 54W chips?

Realistically AMD's CPUs don't have more problems than Intel's CPUs.
I would say Apple simply didn't want to take AMD into consideration because there's no way they wouldn't have been impressed by AMD's Renoir APUs or the latest Threadripper CPUs as they are a generation above Intel's CPUs.
To be clear though, Intel desktops are faster than AMD until you have an application using more than 8-10 cores (which not many really take advantage of yet). I have only been buying AMD lately (2 x 3900X and a 3700X), but Intel makes “the best” desktop chips if you measure on the most relevant performance. In addition, software has generally been optimized for Intel even before raw power is taken into account.

I’m excited to see if Apple beats Intel in absolute performance (though sadly it won’t be relevant for me since I only use Windows software in my work)

Intel will also be bringing big and little cores to Windows.

How much their power and heat inefficiency will offset this, and whether they stay relevant, is an unknown that is clearly worrying investors (raw processing power has largely seen diminishing returns over the past 5+ years, which is part of what allowed Intel to be lazy?).

For myself, virtually all computing latency is network related, and the benefit of CPU/GPU progress is portability and battery life.
They aren't really faster. Intel's CPUs can boost 400-500MHz higher, so in some situations, at the cost of higher heat and power consumption, they perform slightly better. Zen 2 CPUs actually have higher IPC than Intel's CPUs.
The only notable situation where Intel CPUs have a lead is gaming at 1080p with a very fast GPU. Other than that, they lose in most productivity workloads.
 
I would say Apple simply didn't want to take AMD into consideration because there's no way they wouldn't have been impressed by AMD's Renoir APUs or the latest Threadripper CPUs as they are a generation above Intel's CPUs.

Apple shifting to AMD wouldn't fix the major problem they have with intel at the moment
  • reliance on a third party for CPU development.
They've switched CPUs multiple times before, and sooner or later they'd get held up by a third party whose interests aren't directly aligned with where Apple wants to go. Additionally, building their own CPUs enables Apple to distinguish themselves in a competitive market with vertical integration that nobody else in the market is capable of.

So whilst AMD may be a good choice for some Macs today (and Zen is a great platform - today) - it isn't Apple's best plan for the long term. Which is why I think that yes, sure - there is AMD-based code in macOS (no doubt as a hedge; they evaluated all options) - but it isn't "plan A".

Plan A will be to build their own stuff for the reasons mentioned above.
 
Additionally, building their own CPUs enables apple to distinguish themselves in a competitive market with vertical integration that nobody else in the market is capable of.
I'm amazed at how Apple is managing to pull this off in a day and age when everyone is moving in the opposite direction -- i.e. getting rid of vertical integration in order to minimize upfront investment cost. They would not be doing it if it were not in their best interest economically speaking. It's testimony to their silicon engineering expertise that they are confident they can continue to out-innovate processors used in the greater market (i.e. Windows/Linux clients and servers).
 
Apple shifting to AMD wouldn't fix the major problem they have with intel at the moment
  • reliance on a third party for CPU development.
They've switched CPUs multiple times before, and sooner or later they'd get held up by a third party whose interests aren't directly aligned with where Apple wants to go. Additionally, building their own CPUs enables Apple to distinguish themselves in a competitive market with vertical integration that nobody else in the market is capable of.

So whilst AMD may be a good choice for some Macs today (and Zen is a great platform - today) - it isn't Apple's best plan for the long term. Which is why I think that yes, sure - there is AMD-based code in macOS (no doubt as a hedge; they evaluated all options) - but it isn't "plan A".

Plan A will be to build their own stuff for the reasons mentioned above.
I don't think that's a problem. Apple is a small player in the PC world after all.
 
I'm amazed at how Apple is managing to pull this off in a day and age when everyone is moving in the opposite direction -- i.e. getting rid of vertical integration in order to minimize upfront investment cost. They would not be doing it if it were not in their best interest economically speaking. It's testimony to their silicon engineering expertise that they are confident they can continue to out-innovate processors used in the greater market (i.e. Windows/Linux clients and servers).


I think Apple is run differently (and better) than most.

Most companies are run by accountants in a race to the bottom who are looking to get economy of scale for less cost so they can survive on razor thin margins competing with everybody else building essentially the same crap.

Apple are looking to build better integrated products and maintain 40% margins.

There's a clear mindset shift where Apple are attempting to build the things THEY WANT TO USE.

That may or may not align with everybody in the market, but if your desires line up then they have very attractive products.
I don't think that's a problem. Apple is a small player in the PC world after all.

This is exactly WHY it is a problem. Apple has essentially zero or minimal influence on Intel CPU design, so they get whatever Intel puts out. This was the case with Motorola and would no doubt be the same with AMD.

Bringing that in house they can build exactly what THEY want, not make do with what the other 90% of the market wants.
 
This is exactly WHY it is a problem. Apple has essentially zero or minimal influence on Intel CPU design, so they get whatever Intel puts out. This was the case with Motorola and would no doubt be the same with AMD.

Bringing that in house they can build exactly what THEY want, not make do with what the other 90% of the market wants.
They could influence AMD's designs (it's rumored that Apple did influence Vega's design, and they were among the first to use it in their computers). I mean, AMD does semi-custom chips (for example, for the Sony and Xbox consoles) and that would allow Apple to differentiate themselves. For example, the Mac Pro has a dual Vega GPU that can't be found anywhere else.

I honestly don't think Apple will be able or willing to compete on price with Windows OEMs after they move to their own silicon, so at first the potential customer base for their computers will be formed mainly around current Mac users, which isn't much, and it could prove to be a problem.
 
I honestly don't think Apple will be able or willing to compete on price with Windows OEMs after they move to their own silicon, so at first the potential customer base for their computers will be formed mainly around current Mac users, which isn't much, and it could prove to be a problem.

I suspect you're in for a surprise.

Apple will be on 7nm, then 5nm, and later 3nm. A smaller process means smaller/cheaper-to-produce processors once the process is mature. They are building custom hardware to run the macOS/iOS APIs specifically. They'll be on par with AMD without the burden of maintaining Intel x64/x86 compatibility. Even at equal production cost, they won't be paying Intel's margin.

This is going to be a big shift. You're going to see a total inability to compete in terms of performance per watt from the PC market. It will very much not surprise me if Microsoft tries to ramp up their ARM support a few years later, if nothing else for security on specific enterprise-focused devices. x64/x86 exploits/shell-code don't work on ARM.
 
I suspect you're in for a surprise.

I doubt it.

Apple will be on 7nm, then 5nm, and later 3nm. A smaller process means smaller/cheaper-to-produce processors once the process is mature. They are building custom hardware to run the macOS/iOS APIs specifically. They'll be on par with AMD without the burden of maintaining Intel x64/x86 compatibility. Even at equal production cost, they won't be paying Intel's margin.

The thing is, anybody can use TSMC's fabs, even Intel. Now, I know some Intel execs are against using outside fabs, but this could change at any time.
Also x64/x86 compatibility may be a burden but is also an advantage.
And that's the thing: Apple doesn't pay Intel the prices you see on Intel's site. OEMs always get big discounts from Intel, often paying less than half of the recommended price. This is how Intel has been staying in the game. AMD's parts are actually cheaper than Intel's, and it's still difficult for them to move chips, especially in pre-built computers.


This is going to be a big shift. You're going to see a total inability to compete in terms of performance per watt from the PC market. It will very much not surprise me if Microsoft tries to ramp up their ARM support a few years later, if nothing else for security on specific enterprise-focused devices.

I'm not convinced we will see "a total inability to compete in terms of performance per watt" from the PC market. And at the end of the day, the most important thing for computers is performance/dollar.
ARM is just a side project for Microsoft, which is mainly a software company after all.

x64/x86 exploits/shell-code don't work on ARM.

Some do work on ARM CPUs, but realistically this hasn't been a huge problem. AMD's x86 CPUs don't have the same vulnerabilities that exist in Intel x86 CPUs, so it's not an x86-specific problem.
 
I don't think that's a problem. Apple is a small player in the PC world after all.

Apple are about 45 times bigger than Dell.

'Small.'

Add in iPad sales alongside Mac sales and they're a very big player.

That's before you add the monolithic iPhone.

Which is just another 'PC' by any other name.

Apple is going to bury Intel and that cash-poor company AMD (who only just got off the canvas after Intel beat on them for decades, and who couldn't compete with Nvidia either, who pulled them through a hedge backwards).

Money isn't everything. But a few years of decent performance doesn't mean Apple should go with them.

That may be what you want.

But Apple have stated otherwise. Perhaps they know something you don't.

Or perhaps you weren't listening when their Chip lead designer said this was about Power and Efficiency.

Something neither Intel, AMD or Nvidia have been that great at. But that's what happens when you get little competition in the 'tiny' PC market.

Azrael.
 
This is not unique to AAPL. Back in the day, companies were more vertically integrated. Then hardware became more commoditized, the most efficient way to compete was on cost, and vertical integration went out of style. I started my career as a CPU designer around this time and worked for many of the major chip firms that you know. However, around 2005, vertical integration started increasing again, and most of my design work since has been with the large tech firms that have decided to have their own "in-house" custom silicon solutions.

You can only achieve so much with software and commoditized merchant silicon. For any given solution you want to implement, there are many opportunities to optimize throughout the HW/SW stack. If you only control a portion of the solution space, you are never going to be the best, as you leave a lot of performance, power, and cost on the table. This even applies to Intel and is one of the reasons they dominated CPUs for so long. Sure, Intel's TMG was always a bit ahead of the industry, but that wasn't the key. The important point is that their logic process was highly tuned and customized to exactly what the CPU design required. You can't do that with a fab like TSMC that creates a general-purpose process that appeals to hundreds of customers. Some of the larger firms I've worked with have been able to customize TSMC's process a bit, but nothing like Intel can do with their own fabs.

I have no doubt that Apple's silicon play will work well, but it's not just because of the silicon. It's the very tight HW/SW integration, allowing for a highly optimized solution, that will provide a competitive advantage. Certain critical functions can be offloaded to hard IP on the SoC. The tradeoffs in the chip uarch can be made with full knowledge of what macOS needs to be optimal. This in-house advantage is also being leveraged by Amazon, Google, Facebook, Baidu, Microsoft, Tesla, Samsung, Alibaba, etc.

The problem companies face with vertically integrating in-house silicon is that there are very few talented and experienced chip designers left. Many have left the industry, and most college students study software, not hardware engineering. Apple has spent many years recruiting the best architects, front-end, and back-end hardware engineers. Buying PA Semi, a team I know well, gave them a great head start in cultivating a top-notch chip team. Having a deep bench of scarce, talented hardware engineers contributes to AAPL's competitive advantage.

I'm amazed at how Apple is managing to pull this off in a day and age when everyone is moving in the opposite direction -- i.e. getting rid of vertical integration in order to minimize upfront investment cost. They would not be doing it if it were not in their best interest economically speaking. It's testimony to their silicon engineering expertise that they are confident they can continue to out-innovate processors used in the greater market (i.e. Windows/Linux clients and servers).
 
Apple are about 45 times bigger than Dell.

'Small.'
Yeah Apple is a small player in the computer market.
Dell ships more than 2x more computers per quarter than Apple.
That's before you add the monolithic iPhone.

Which is just another 'PC' by any other name.
LoL, OK, the iPhone is a "PC".

I have to say your posts are funny. It's obvious you don't know much about computers in general.
 
I think Apple is run differently (and better) than most.

Most companies are run by accountants in a race to the bottom who are looking to get economy of scale for less cost so they can survive on razor thin margins competing with everybody else building essentially the same crap.

Apple are looking to build better integrated products and maintain 40% margins.

There's a clear mindset shift where Apple are attempting to build the things THEY WANT TO USE.

That may or may not align with everybody in the market, but if your desires line up then they have very attractive products.
In addition, Apple has shown a remarkable ability to play the long game in technology development strategy. There are a series of acquisitions that seem to disappear into the Apple void long enough that I start to assume they’ve been disbanded— but then we see their expertise driving a major shift later on. Then there are these small shifts in hardware and software each year that seem almost arbitrary until a few years later when we realize they’ve been testing ideas at scale and slowly nudging developers along so they’re better positioned for a future release.

That’s not easy discipline. A lot of companies expect acquisitions to prove their value almost immediately and insist on much shorter timelines to realize returns on investment.
 
Yeah Apple is a small player in the computer market.
Dell ships more than 2x more computers per quarter than Apple.

LoL, OK, the iPhone is a "PC".

I have to say your posts are funny. It's obvious you don't know much about computers in general.

You're in denial. :)

Mobile is here and dwarfs the PC market. The PC is niche. And honkin' tower buyers uber niche.

Performance, Power and Efficiency are the way it's going whether you want it or not. Apple does not agree with you.

The PC is a tiny market. Dell can shift a lot of low-margin crap. It's small potatoes.

Azrael.
 
Also x64/x86 compatibility may be a burden but is also an advantage.
The only advantage is backwards compatibility. The sudden demand for COBOL programmers when Y2K started coming up was a testament to how much legacy code there is out there.

There is no raw benefit to this compatibility though. From a consumer perspective, nobody cares what chip they run on, they care about the ability to continue manipulating their data.

The x86 philosophy has been to never drop support for anything. Most of us haven’t been running in 16bit land for decades, but most Intel chipsets still include special handling for the addressing hacks of that era in case someone out there is still relying on them. At some point all that compounded technical debt overwhelms the short term benefits of backwards compatibility.

OEMs always get big discounts from Intel, often paying less than half of the recommended price.

And yet Intel has traditionally reported gross margins of 60% - not recommended gross margins, but after-the-deal, money-in-the-pocket gross margins. The latest quarter was 54%, but those figures are also company-wide, so we know the margin on their processors has to be higher than that if the storage business is losing $1.2B a year.

That means that if Apple Silicon costs twice as much to manufacture, Apple still breaks even and has the added bonus of being able to customize the device to their needs and roadmap.

It also means the cost of backwards compatibility to x86 is pretty high. How much silicon is being added to maintain compatibility and work around the limitations of an archaic instruction set? More silicon means fewer chips per wafer, it means lower yields, and all that added cost is multiplied by 1.5 or 1.6. How much silicon is being added by the larger process size? That added cost is multiplied by 1.5 or 1.6.

Itanium supposedly devoted more than 30% of its die to x86 compatibility. Even ignoring the hard-to-guess yield impacts, that's a die 1.3 times bigger; with a markup of 1.5 for margin, that's 1.3 × 1.5 = 1.95 times the cost. x86 compatibility essentially doubled the cost of Itanium.
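That die-area arithmetic is easy to check. Here's a toy model (the 30% overhead and 1.5x markup are the rough figures from this post; the Poisson yield model and the defect-density number are standard textbook illustrations, not real foundry data):

```python
# Toy model of how extra die area compounds into chip cost.
# The 30% area overhead and 1.5x margin markup are the rough
# numbers quoted in the thread, not real foundry data.
import math

def relative_cost(area_overhead, margin_markup):
    """Relative price of a die that is (1 + area_overhead) times
    larger, after the vendor's gross-margin markup is applied."""
    return (1 + area_overhead) * margin_markup

# Itanium example from above: ~30% of the die for x86 compatibility.
print(round(relative_cost(0.30, 1.5), 2))  # -> 1.95

# Yield makes it worse: under a simple Poisson defect model, the
# probability a die is defect-free falls exponentially with area.
def die_yield(area_cm2, defects_per_cm2):
    return math.exp(-area_cm2 * defects_per_cm2)

base_area, big_area = 1.0, 1.3   # cm^2, illustrative
d0 = 0.2                         # defects/cm^2, illustrative
# Cost per *good* die scales with area and inversely with yield.
cost_ratio = (big_area / base_area) * (die_yield(base_area, d0) / die_yield(big_area, d0))
print(round(cost_ratio, 2))      # -> 1.38: the 1.3x die costs ~1.38x per good die
```

The point of the second half is that "fewer chips per wafer" and "lower yields" compound: a 30% bigger die costs more than 30% more per good part, before any margin markup is applied.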

I'd hope the cost of keeping the underlying microcode engine of the x86 chips compatible with the x86 instruction set isn't a factor of two, but it's certainly not free. Being stuck at 14nm is certainly not free.

And at the end of the day the most important thing for computers is performance/dollar.

Performance per dollar is limited by the ability to bleed off heat. In consumer desktops, which are a shrinking market segment, you start to reach limits of the available and affordable cooling technologies. In laptops, that's compounded by the constrained form factor and the added requirement of extended battery life. In data centers, the cost of operation is dominated by electrical power which not only feeds the racks, but also the massive cooling systems.

ARM is just a side Project for Microsoft, which is mainly a software company after all.

As a software company, Microsoft will build for whatever platform people are buying. Apple Silicon started as a side project too, I'm sure. Windows on ARM was a hedge against this kind of shifting market, not a novelty.

Windows 10 made a big deal out of being an OS that scales from IoT to servers. If Apple Silicon shows that ARM is a processor platform that outscales x86 from wearable to workstation, don't you think MS will take notice?
 
The sudden demand for COBOL programmers when Y2K started coming up was a testament to how much legacy code there is out there.
Meanwhile, in 2020...
 
You're in denial. :)

Mobile is here and dwarfs the PC market. The PC is niche. And honkin' tower buyers uber niche.

Performance, Power and Efficiency are the way it's going whether you want it or not. Apple does not agree with you.

PC is tiny market. Dell can shift alot of low margin crap. It's small potatoes.

Azrael.

Intel doesn't care about the size of the mobile market, or the size of Apple's services business. They're not going to make a custom CPU for someone who sells 4% of the market, even if that person also sells a lot of other stuff.
 
Backwards compatibility is nice but not as critical as it used to be when the sacred x86 ISA was conceived.

1) The current SaaS trend is for more processing to shift from the client (aka your laptop, desktop, etc.) to the datacenter, where everything is virtualized. At the hyperscalers the code is highly customized, given the leverage factor. There are still loads that are optimized for x86, but there's plenty of room for alternative architectures. I was at a startup that created a CPU with a brand new custom ISA and architecture, along with the software ecosystem. Facebook and Google couldn't have cared less about the new ISA, as they saw some of their customized code gain 20x performance at the same power as x86. That's real $$$ at the scale at which they operate. At the datacenter they couldn't have cared less that our CPU didn't run legacy x86 apps like WordPerfect, CorelDRAW, or Lotus. They cared about tasks like DL training and inference, distributed databases, network intrusion detection, etc.

2) The software ecosystem (VMs, frameworks, tools, etc.) is a lot more robust than it used to be. Code is increasingly less tied to a specific ISA than before, as it's simply easier to adopt new ones. Apple has undoubtedly invested significant resources into their software ecosystem to facilitate adoption of their Arm SoC. This trend will continue.

3) There are a *lot* more software devs today than even 10 years ago. The labor market has exploded in line with the wealth being created. A company can more easily throw money (dirt cheap these days) and warm bodies at porting and optimizing code, as long as it makes business sense.

4) Apps created with contemporary languages and frameworks are more modular. A lot of the heavy lifting is done by off-the-shelf libs and the framework; not much is created from scratch these days. This modularity makes it much easier to port a current app than one written in COBOL 40 years ago.

In short the financial and resource barrier to porting an app from one ISA to another is much lower today than before. If it makes business sense to port something to Apple's SOC it will be done. Today that increasingly means addressable market instead of porting NRE. With the lower cost of a custom SOC Apple could certainly lower costs to increase marketshare if they wanted to.


The only advantage is backwards compatibility. The sudden demand for COBOL programmers when Y2K started coming up was a testament to how much legacy code there is out there.

There is no raw benefit to this compatibility though. From a consumer perspective, nobody cares what chip they run on, they care about the ability to continue manipulating their data.

The x86 philosophy has been to never drop support for anything. Most of us haven’t been running in 16bit land for decades, but most Intel chipsets still include special handling for the addressing hacks of that era in case someone out there is still relying on them. At some point all that compounded technical debt overwhelms the short term benefits of backwards compatibility...
 