I thought they said Q1 2019 for release.
Then it would be nice if you provided a link to that statement.
Hint: You won't find it.
My guess is they've already chosen which Intel CPU they want, and are setting the release date based on that.
We'll see a prototype at WWDC 2019, but its actual release date will be based on when they can get the first-run CPUs from Intel, which could be... very variable.
But that would be why they'd want to be cagey on the date. They know they can show it in 2019 because Intel will have prototype CPUs. But they won't know exactly when they can release.
It's entirely reasonable to assume in 2017/2018 they picked an Intel CPU that then slipped into 2019 and will possibly slip into early 2020.
Charlie has been calling out Intel's problems with the 10 nm process for the past 2 years, and people were calling him a hater, idiot, fanboy, shill, etc.
Enjoy the hype? The problem with Demerjian is that sometimes he gets on the 'hater' hype train and goes too far.
Don't read clueless Bloggers.
I read a blog (I think it was either Ben Thompson or Ben Evans) where it was suggested that Intel should have repositioned itself as a foundry only and not bothered with designing processors.
Probably too late for that now.
I thought they said Q1 2019 for release.
I doubt a Q1 release. They were very forthcoming that the Mac Pro "would not be coming this year." With Apple's track record of actually delivering (2013 Mac Pro, AirPods, HomePod, AirPower), I wouldn't expect anything before WWDC, or NAB if they really had the **** to say they were serious about Pro users at a Pro venue.
Some "Pro" dog and pony show is absolutely not necessary. When they have something reasonably done (and have some relatively solid production ship timelines ) they should do a demo. Waiting for a fixed in time dog-and-pony show for some collection of other products is looney toons for a product this late. This product is soooooo late it is a spectacle all by itself. They don't need any more spectacle than they already have.
If Apple's shovel all Macs into the June - October window philosophy have worked extremely well over the last 4 years, that would one thing. In many respects it hasn't. Log jamming more systems in smaller parts of the calendar year has turned into a highly constipated (inhibited ) product flow over last 4 years.
Shifting Mac Pros away from the iPhone launch , macOS beta window, and masOS initial (probably buggy) release would be a good thing. If Intel's and AMD new speed-bumped product line ups match up to that time shift window then all the more so. ( 2013 Mac Pro release was about same Quarter new Xeons shipped and Thunderbolt 2 was ramping up in volume. And tease was about time Intel was throughly leaking info on Xeon bump; just not officially released. Ramped volume for MP 2013 was pragmatically Q1 2014. )
June for Intel mobile based products made sense because Intel has typically done April-June releases in that space. The Xeon workstation stuff has been in Q4 ish time frames over last several years. So picking a following Q1 would tend to back accurate even in the context of an Intel slide for a quarter. The safe pick is a quarter or so after what Intel is shooting for. What Apple sorely needed was a safe pick; not some yippie-ki-yay flyer.
The AirPods, HomePod, and AirPower were first-generation products. In some sense, those can't be "late" because they were never offered before (there was a promise, but Apple had a "gee, this is Buck Rogers stuff" excuse; nobody else had done them technically either). The Mac Pro is about 180 degrees opposite of that. It is horribly late. Some substantive fraction of the angry mob with pitchforks wants them simply to ship the old, old machine with "as minimal as possible" changes. So the whole "this is new and super fantastic Area 51-like technology" reality distortion field isn't going to fly.
To be blunt, Q4 2018 would have been painfully late. That they are talking 2019 is only indicative that they were screwing around, engaged in deeply unproductive (or just plain no) activities in 2015 and 2016. They were already on the clock. They've been on the clock since the MP 2013 was 2-3 years old. They have absolutely, explicitly been on the clock since April 2017, over a year ago. The AirPods, HomePod, and AirPower didn't come two to three years later.
Cherry on top is that they have a working iMac Pro. A working Xeon W and a current top-end AMD GPU can't possibly be some deep, dark mystery to get macOS up and running on.
There's no requirement for a dog and pony show... it is interesting that both the initial mea culpa and the follow-up were in April.
Let's be honest, Apple has f***** up in the pro space. The MacBook Pro had a press-release "release," and there is little precedent for Apple to do much.
They did preview the iMac Pro, but this Apple is something else.
I think we’ll be lucky to see something by NAB or WWDC, more than likely it’ll be closer to 2020. But Tim/Craig/Phil can try to prove me wrong.
Charlie has been calling out Intel's problems with the 10 nm process for the past 2 years, and people were calling him a hater, idiot, fanboy, shill, etc.
He was dead right from the ground up. What makes you think he is wrong this time, when in a previous article, https://semiaccurate.com/2018/07/17/amds-rome-is-going-to-be-a-monster-cpu/, he even pointed out the specific configuration Rome will have?
Don't read clueless Bloggers.
It's their foundry manufacturing group that is sinking Intel. It is directly because TMG is not able to push out 10 nm CPUs that Intel is going to struggle technologically for the next 2-4 years. If they ditch all of that technology, what will they do when they are stuck on the 14 nm process for the next couple of years while everybody else is on 7 nm? Intel made good decisions in searching for other markets to survive on until they can push out new architectures and new process nodes.
Everything at this moment points to a situation where the Ice Lake architecture will actually be competing with the Zen 3 architecture.
And remember, next year we will see the Zen 2 architecture. Zen 2 will be slightly faster per core than Skylake, but will offer double the cores and much better efficiency because of the 7 nm process.
Matisse/Rome is not what people think it is. When you see what AMD did with a 7 nm CPU, you will be amazed at the brilliant technical and business call Lisa Su decided to make.
Because that decision translated into efficiency and simplicity of the design, which allowed AMD to scale it perfectly.
That would have been a horrible idea.
Don't read clueless Bloggers.
Yeah. Charlie always loved to spin up doom-and-gloom scenarios for Nvidia and Intel before. Intel is definitely having some trouble with their 10nm and their future process, but let's not act like the sky is falling on them, lol. Intel will most likely counter Rome with an MCM solution and will not have a problem competing in terms of core count. Intel will no longer be able to retain their sweet margins on their Xeon line of processors, but they are more than capable of competing in terms of performance. Intel's 10nm is comparable to other foundries' 7nm high-performance nodes. (The first EPYC 2 with TSMC is going to be fabbed on 7nm LP.) Assuming they hit their target of 2019-2020, they shouldn't have much trouble competing against AMD.
The real question is whether Intel will have the same kind of problem with their 7nm process. If their 7nm EUV (comparable to other foundries' 5nm) comes as scheduled, then they will not have any problem. If not, then they will have some real trouble in the future (assuming other foundries' next nodes come without any delays).
Yeah.
A 350W Cooper Lake triple-die CPU, with low core clocks, that will cost an eye-watering $40-50k will be perfectly capable of competing with a 64-core, 180W-250W TDP CPU that will cost around $10,000.
Cascade Lake is just a refresh of Skylake-SP: 28 cores, with higher TDPs (200-250W), the same core configs. And that will cost $20,000.
Let's say that a 64-core/128-thread, 180W TDP CPU will cost $10,000.
How do you see Intel being competitive?
AMD will destroy Intel on price, performance, and power.
In all of the above by a factor of 2X!
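(A back-of-the-envelope sketch, purely illustrative: it takes the speculative prices and specs quoted above at face value and uses core count as a very crude performance proxy. The 225 W figure is just the midpoint of the 200-250W range above; real per-core performance and real street pricing will differ.)

```python
# Crude price/performance/power comparison using the speculative figures above.
# Core count stands in for performance, which ignores per-core speed entirely.
parts = {
    "28-core Cascade Lake (assumed ~225 W, ~$20,000)": {"cores": 28, "watts": 225, "price": 20_000},
    "64-core Rome-class   (assumed ~180 W, ~$10,000)": {"cores": 64, "watts": 180, "price": 10_000},
}

for name, p in parts.items():
    cores_per_k = p["cores"] / (p["price"] / 1000)  # cores per $1,000 of assumed list price
    cores_per_w = p["cores"] / p["watts"]           # cores per watt of assumed TDP
    print(f"{name}: {cores_per_k:.2f} cores/$1k, {cores_per_w:.2f} cores/W")
```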
You know perfectly well why Charlie is "wrong."
First, I didn't say anything about him being wrong about 10nm. I said he sometimes goes too far. He takes nuggets of truth and then starts to wrap histrionics around them. That Intel had/has a problem at 10nm wasn't, after year one, some super unique insight. That they have a problem isn't the real issue with his over-the-top commentary; it is the notion that this specific problem is going to implode or decimate the company somehow. That's the part that's lacking. It was surprising that Intel let this fester for so long.
Second, as for him being wrong now... go back and re-read my response. Everything he screwed up, I went through point by point.
As for his article about AMD... actually, that is illustrative of why he isn't to be solely trusted about the doom of upcoming processor products. To quote:
"
... If AMD stuck with it’s 2016 roadmaps, it would be in for a big fight with Intel next year. ...
....
This is the long way of saying two things, one we got the information right two years ago but that information changed and it is wrong now. ..."
"We were right, but it's wrong." Carefully claiming correctness and throwing the "wrong" into misdirection. That's the problem with his narratives. If he is right about something, he will mention it over and over again in every article he can stuff it into. And when he's not really on it, it is mainly misdirection. I get it... these open articles are sales pitches for his paid-for content. But sales pitches aren't objective analysis.
Roadmaps change over time. Intel's has changed, as has AMD's. His doom and gloom is wrapped up in Intel being rigidly stuck with their 2015-2016 roadmap. They aren't. Nor are they 100% stuck with their 2017-2018 pricing.
If they could make 10 nm products, we would already see them in massive quantities.
Intel can probably make stuff at 10nm. Part of Intel's problem is that they are hooked on super-fat margins like a crackhead. The problem is more likely that they can't make relatively large dies at super-fat margins at "normal" prices on 10nm. They aren't going to completely junk their TMG and start over. They do have to wait, though, to get the 7nm gear (which for some of their equipment is an upgrade... but you can't go down to the parts bin at Fry's and buy these; they have multi-year lead times to buy and deploy).
Intel's diversification moves have been slow and extremely expensive. The basic ideas were OK, but the execution has not been stellar. Some stuff, like McAfee, was pretty bad (bought for $7B in 2010 and spun out into a joint venture in 2016 at a revalued $4B; that was $3B down the drain... enough to build a modest foundry for 3rd-party customers). But not devastating losses (every investment can't be a winner).
Infineon (cell radios), Altera, and Mobileye have all been blockbuster acquisitions that haven't been huge home runs. They've often been relatively slow in getting those profitably merged into their own TMG fabs. (Nervana isn't quite blockbuster in cash outlay, but it too isn't deeply integrated, and it's creeping toward 3 years in. Not late yet, but certainly not early.)
Of course not. Intel changed a lot in the Ice Lake architecture along the way, and they wanted to do something different with it (a 15-20% IPC increase with lower clock speeds).
Yes, but is Ice Lake the same Ice Lake it was back in 2015?
Feb 2017:
" .. On speaking with Diane Bryant, the 'data center gets new nodes first' is going to be achieved by using multiple small dies on a single package. But rather than use a multi-chip package as in previous multi-core products, Intel will be using EMIB as demonstrated at ISSCC: an MCP/2.5D interposer-like design with an Embedded Multi-Die Interconnect Bridge (EMIB). ... "
https://www.anandtech.com/show/1111...n-core-on-14nm-data-center-first-to-new-nodes
Intel was shifting away from monolithic dies also. The data center stuff was supposed to be first to 10nm (but it looks like 10nm++ is probably going to be the real first, high-volume iteration).
At what yield % will AMD get a 120 mm2 die on a 7 nm process that is pretty mature right now?
Zen 2 has yet to ship in high volume. It probably will scale, but at what yield rates? Intel has a problem, but the notion that AMD is 100% problem-free needs a lot more completed phases behind it.
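(For what the "at what yield rates?" question is getting at, here is a minimal sketch of the classic Poisson die-yield estimate. The defect densities and the 700 mm2 comparison die are made-up placeholders, not published numbers for TSMC 7 nm or anyone else; the point is only that a ~120 mm2 chiplet tolerates a given defect density far better than a large monolithic die.)

```python
# Minimal Poisson die-yield sketch: Y = exp(-area * D0).
# D0 values and die sizes are hypothetical placeholders, not real foundry data.
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Expected fraction of defect-free dies for a given area and defect density."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

for label, area_mm2 in [("~120 mm^2 chiplet", 120), ("~700 mm^2 monolithic die", 700)]:
    for d0 in (0.2, 0.5):  # assumed defects per square centimetre
        print(f"{label} at D0={d0}/cm^2: yield ~ {poisson_yield(area_mm2, d0):.0%}")
```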
You know perfectly well why Charlie is "wrong."
It's because he writes about stuff 12-24 months before it happens, and a lot changes along the way. Why did he change his point of view on AMD?
Because in 2016 there was one design coming from AMD: a 16C/32T CPU, with EPYC 2 made from 4 dies plus something additional for IO on the package. The current designs for Matisse and Rome are MUCH different.
If they could make 10 nm products, we would already see them in massive quantities.
Here is a good article on the 10 nm launch Intel did earlier:
https://semiaccurate.com/2018/05/29/is-intels-upcoming-10nm-launch-real-or-a-pr-stunt/
The problem with the 10 nm process is not margins.
The 10 nm process essentially is not yielding at all, to the point where Intel would LOSE MONEY on each wafer if they sold products made on it, and those products would not be performance-competitive with Intel's own 14 nm++ and 14 nm+++ processes because of lower clock speeds.
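(Rough sketch of the "lose money on each wafer" arithmetic: the cost per good die is the wafer cost divided by the number of good dies, so a collapsing yield multiplies the effective cost. Every number below is an assumption for illustration only, not an actual Intel wafer cost, die count, or yield.)

```python
# Illustrative cost-per-good-die arithmetic with assumed (not real) numbers.
wafer_cost = 9_000   # assumed cost of one processed wafer, in dollars (illustrative)
gross_dies = 300     # assumed candidate dies per wafer for this die size (illustrative)

for yield_rate in (0.70, 0.30, 0.05):  # healthy, poor, and barely-yielding cases
    good_dies = gross_dies * yield_rate
    print(f"yield {yield_rate:.0%}: ~{good_dies:.0f} good dies, "
          f"~${wafer_cost / good_dies:,.0f} per good die")
```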
Of course not. Intel changed a lot in the Ice Lake architecture along the way, and they wanted to do something different with it (a 15-20% IPC increase with lower clock speeds).
BUT the 10 nm process, or rather the 12 nm process as it should be called right now, per https://semiaccurate.com/2018/08/02/intel-guts-10nm-to-get-it-out-the-door/, will bring a performance and density decrease from the initial performance/density targets for the 10 nm process.
At what yield % will AMD get a 120 mm2 die on a 7 nm process that is pretty mature right now?
No. It's because I paid him the money for the articles, and I can see that what he was writing in them was 90% correct, and just the cosmetics were not correct.
Christ, you are basing all your arguments on Charlie's articles? There is a reason his website is called SemiAccurate.
But I guess it doesn't matter as long as it says whatever you want to hear right?
The above and the below are not rationally consistent plans. Those are two excuses for exceedingly poor planning and product management.
WWDC is too late. Pragmatically, Apple will have to show in April, around that anniversary date they have now laid the groundwork for. Two reasons. Between April 2019 and June 2019, a giant hailstorm of hate is going to rain down on them. "The dog ate my homework, wait until June for only a tease" is equally as screwed up as "Well, we screwed up and it is now 2020" would be. Both are a Lucy-pulls-the-football move. Many folks are going to be pissed by the end of 2018. The anniversary date is only going to unleash the 'hate' in a giant wave. Apple, the vaporware king, is going to be a tech-porn press headline for months by WWDC.
Cosmetics? Chuckle. His articles' doom and gloom was about how Skylake-SP was grossly underperforming what Broadwell did, how that stalled and weakened the lineup, and how 10nm was just going to take it down from there...
This is Charlie's Twitter:
https://twitter.com/CDemerjian
Read his feed, because he was at Intel's conference, and look what he has to say about Intel, then read his latest articles.
Intel is lying and trying to spin things out so that analysts cannot see how murky the waters are (THE LAKES!) and that things are really bad.
And yes, they are. But you will find that out only in his articles, because he has solid data about the technology behind the 10 nm process.
P.S.
What if Apple cannot give you the Mac Pro because of Intel's 10nm fiasco? What if Apple wanted to do something interesting, but the lack of tech from Intel does not allow them to do so?
What if Apple was designing the MP with Ice Lake in mind?
What lack of tech? A workstation design should be flexible enough to update to whatever new tech comes out every year, be it PCIe 4.0, new Thunderbolt, etc.
If Apple can't release the Mac Pro because of Intel's 10nm fiasco, then it might as well not come out, because that means it might be another MacTrashCan 2.0, putting themselves in a thermal corner again.
Look up "TPM".The "boot" GPU problem would simply go away, if the Apple mMac Pro re-design were to include a PC-standard UEFI bios instead of their non-standard EFI scheme, which is only there for anti-Hackintosh purposes. Surely there is a better way for them to attempt to prevent the Hackintosh, while still retaining the ability to use standard and un-modified PC graphic cards (as the "boot" GPU)?
What if Apple was designing the MP with Ice Lake in mind?
That would simply mean that Apple is stupid. (Although not much about the MP6,1 or MP7,1 would argue against the point that "Apple is stupid.")