"Apple should build a semi-custom processor with AMD."
No, they shouldn't. AMD is incapable of sustained success.
"I did look at that page and it shows nothing about Meltdown. It shows Spectre."
It says right there that Meltdown does not apply.
"…what?"
It's the majority of the article, but here are some highlights:
The debut of Intel's 10nm process has been a particular sore spot, with the forthcoming Whiskey Lake set to be the fifth new architecture to debut on the 14nm process. Prior to 14nm, Intel had maintained a two-architecture "tick-tock" strategy for its processors, where a new foundry node carrying a small architecture update over the previous processor was a "tick," and a more significant architectural evolution on a matured process was a "tock."
We first reported on the demise of the tick-tock strategy in 2016. Things have only grown worse for Intel since then as 10nm has faced further delays. To put this delay in perspective, Intel's original roadmaps had 10nm technology debuting in 2015. There are several reasons for the delay, but Intel CEO Brian Krzanich explained that some features in Intel's 10nm process require up to five or six multi-patterning steps, whereas competing foundries are known to use no more than four steps in their 10nm and 7nm processes.
This development has consequences for Intel, its customers, and its competitors. First, Intel has lost the technology advantage it once held over the rest of the semiconductor industry. While you cannot compare the dimension in a node's name directly across foundries, competitors such as TSMC, Samsung, and Global Foundries have largely reached parity with Intel's 10nm with their 7nm processes, with transistor densities besting Intel's own at 10nm. Intel used the transistor density metric to combat the marketing furor that the node names created, but it seems to have lost those bragging rights as well.
More importantly, Intel's competitors are starting to enter volume manufacturing on competing 7nm nodes. Where technology leadership once mattered to Intel only as an enabler of superior products, the company's relatively recent opening of its fabs to outside customers means its foundry business has lost some of its luster as a result of these developments.
In the earnings call, Intel also acknowledged that it expects to cede market share to rival AMD, which has enjoyed recent success thanks to the debut of new CPU architectures such as Zen that have begun to close the performance gap with Intel's own CPUs. AMD is expected to make significant gains in the server space thanks to recent developments, and, having spun off its own foundry as Global Foundries, has been using a mixture of its former in-house foundry and TSMC. AMD is expected to debut consumer products on the 7nm node in 2019.
"When you get down into single-digit nm it's pretty obvious Intel and TSMC are hitting the limits of silicon. We've already hit the gigahertz barrier; all that's left is to add more cores to the CPU and make architecture improvements. I doubt CPUs and SoCs are going to get much faster."
What barrier? Can you be more specific?
"Who knows what Apple has been doing with the ARM license. Apple is clearly ahead of the curve when designing processors."
I would think it would be rare for them to have introduced such a bug even if they modified the design somewhat.
But clearly, Apple wouldn't be patching iOS if their hardware weren't affected.
"No, they shouldn't. AMD is incapable of sustained success."
Say Nintendo, Sony, Microsoft, and now Intel.
None of the Intel CPUs are going past 5 GHz unless they're overclocked with liquid cooling. How are they going to solve that problem?
"https://www.geek.com/chips/intel-predicts-10ghz-chips-by-2011-564808/"
Use liquid cooling.
"Perfect example proving my point. AMD does one processor for a game console maker every 5 years or something. Big difference between that and producing a range of processors every 18 months or so."
Sometimes there's only a one-year gap. It's not AMD's fault if console cycles are long.
"Theo de Raadt was so right all those years ago..."
Maybe, but I don't think any of the bugs he referred to at the time were related to Spectre or Meltdown.
https://www.theregister.co.uk/2007/06/28/core_2_duo_errata/
"Since when? What console maker has updated their CPU twice in two years?"
One year MS, the next Nintendo.
I was an AMD CPU designer for 9+ years. You really don't want to go there.
"One year MS, the next Nintendo."
That's not what I said. That's different processors in the same generation - just different SoCs. Big difference between that and what Apple needs to provide for its lineup and refresh on a regular basis.
One year Sony, the next MS.
"That's not what I said. That's different processors in the same generation - just different SoCs. Big difference between that and what Apple needs to provide for its lineup and refresh on a regular basis."
AMD has been significantly refreshing their APUs basically every year, and they continue with Ryzen.
AMD is supplying Vega GPUs to Apple already. AMD is also providing semi-custom CPU+GPU silicon for Sony's PS4 and Microsoft's Xbox One.
Care to elaborate on how AMD can't supply whatever Apple may need?
You don't WANT to go past 5 GHz because Power = C·V²·f. Anything you can do to increase performance other than increasing f is a good idea, especially when you consider that the easiest way to increase f is to increase V.
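To put rough numbers on that relationship (a back-of-the-envelope sketch; the voltage and frequency figures are illustrative assumptions, not measurements of any real chip): pushing a core from 4 GHz at 1.1 V to 5 GHz at 1.3 V gives

\[
\frac{P_2}{P_1} = \left(\frac{V_2}{V_1}\right)^2 \cdot \frac{f_2}{f_1} = \left(\frac{1.3}{1.1}\right)^2 \cdot \frac{5}{4} \approx 1.75
\]

so roughly 75% more dynamic power for a 25% frequency gain, which is why spending the transistor budget on more cores or higher IPC is the better trade.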
Since you're an engineer, I have a question for you. Since GPUs are so much faster at certain tasks, such as cryptocurrency mining, 3D gaming, and graphics design, why are we using DDR4 instead of GDDR5/5X or HBM/HBM2? Wouldn't a GDDR5 motherboard give a substantial improvement over DDR4?
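For scale, here is the peak theoretical bandwidth gap the question is pointing at (the per-pin rates are standard published figures; the dual-channel and 256-bit bus configurations are typical examples, not any specific product):

\[
\text{DDR4-3200, dual channel: } \frac{2 \times 64\,\text{bit} \times 3200\,\text{MT/s}}{8\,\text{bit/byte}} = 51.2\ \text{GB/s}
\]
\[
\text{GDDR5, 256-bit bus at } 8\,\text{Gb/s per pin: } \frac{256\,\text{bit} \times 8\,\text{Gb/s}}{8\,\text{bit/byte}} = 256\ \text{GB/s}
\]

The usual counterweights are GDDR's looser latency timings and the fact that it has to be soldered close to the memory controller rather than delivered on socketed DIMMs.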
"Yes, and within a year or so they will be behind the game again. Just like when we were ahead of the game with Opteron/Athlon 64/K8. How long did that last?"
They don't have to pay for their own fabs anymore. There will not necessarily be another Bulldozer.