When do you guys think the new chips and their variants will start releasing? I just bought a 2017 i7 MB 12" and am really debating whether I will keep it or wait (been waiting for ages and do not want to wait anymore!). But if they update very soon, I will have immense buyer's remorse for sure.

No reply saying "there will always be new things around the corner" hahaha.
well... There will always be new things around the corner tho!
If the computer works for you, keep it. Potential AMD 2020 model will be even better, might as well wait for that one. Or potential 2022 ARM model.

fwiw, I just ordered a 2018 13" and a Mac mini. I could cancel and wait another half a year. Then maybe those will have issues, and Intel will announce a new set of chips which will also be better than Sunny Cove.

etc
 
well... There will always be new things around the corner tho!
If the computer works for you, keep it. Potential AMD 2020 model will be even better, might as well wait for that one. Or potential 2022 ARM model.

fwiw, I just ordered a 2018 13" and a Mac mini. I could cancel and wait another half a year. Then maybe those will have issues, and Intel will announce a new set of chips which will also be better than Sunny Cove.

etc
Dang! First comment of all things.

Yeah, it's a new thing around the corner, although is it a big jump compared to 2015-2018? Considering we've been on 14nm for three years now, and does Sunny Cove have integrated Thunderbolt support on-chip?
 
Can't wait till the AMD fan armada learns that physics works the same for all companies.

Actually, physics does work differently for different companies depending on the technology they have available. Intel is stuck on 14nm, with a 10nm process they keep postponing. AMD will have a chiplet design on 7nm next year, which will allow them to produce high-core-count, low-power processors much more cheaply than Intel can.
 
Actually, physics does work differently for different companies depending on the technology they have available. Intel is stuck on 14nm, with a 10nm process they keep postponing. AMD will have a chiplet design on 7nm next year, which will allow them to produce high-core-count, low-power processors much more cheaply than Intel can.

The laws of physics work the same on 14 nm, 10 nm and 7 nm, regardless of the company. It would be chaos if they didn't.
 
Actually, physics does work differently for different companies depending on the technology they have available. Intel is stuck on 14nm, with a 10nm process they keep postponing. AMD will have a chiplet design on 7nm next year, which will allow them to produce high-core-count, low-power processors much more cheaply than Intel can.
I believe you are mistaken

https://www.semiwiki.com/forum/content/7602-semicon-west-intel-10nm-gf-7nm-update.html

in short,
AMD 7nm == Intel 10nm...
So AMD won't be ahead.
 
Dang! First comment of all things.

Yeah, it's a new thing around the corner, although is it a big jump compared to 2015-2018? Considering we've been on 14nm for three years now, and does Sunny Cove have integrated Thunderbolt support on-chip?

I would encourage you to read this article - https://www.anandtech.com/show/13699/intel-architecture-day-2018-core-future-hybrid-x86/8

Until the first CPUs tape out and we can get some benchmarks, I would not hazard a guess as to how much of a jump beyond Sandy Bridge these will end up being. Intel's been doing process refinement for so long on 14nm and trying to get 10nm stable that I would be cautious about hoping for large gains.

Thunderbolt 3 is mentioned but, again, I would not get my hopes up until there are actual CPUs announced. Intel has continually disappointed with its roadmaps since Broadwell.

EDIT 11:16AM, 12/14/18: I have also not been able to find any reference to Intel including support for PCIe version 4 (or 5, even though it is not yet 100% finalized). Considering that AMD is at least starting to support PCIe v4.0 with the upcoming EPYC Rome CPU, I would think that Intel is either evaluating it or they are going to punt on it and wait for PCIe v5.0. I assume that these CPUs will rely on PCIe 3.0, as Intel would have most likely trumpeted PCIe v4.0 inclusion on die if it was going that route. Personally, I think this is still yet another parlor trick to give Intel time to fix their 10nm process issues. Meaning that Sunny Cove was on the map already, just not called Sunny Cove.
 



Intel today introduced Sunny Cove, its next-generation processor microarchitecture designed to increase performance and power efficiency.


Sunny Cove microarchitecture, built on a 10nm process, will be the basis for Intel's next-generation Core and Xeon processors later next year according to the company, making them appropriate for potential 2019 models of the MacBook, MacBook Air, MacBook Pro, iMac, iMac Pro, Mac Pro, and Mac mini.

Intel also unveiled new Gen11 integrated graphics with up to double the performance of its Gen9 graphics paired with Skylake-based processors. Gen11 graphics will support 4K video streams and 8K content creation in constrained power situations and feature Intel's Adaptive Sync technology for smoother gaming.

Intel did not provide a comparison between the new Gen11 graphics and the Gen10 graphics paired with Cannon Lake-based processors.

For those who are ever-confused by Intel's roadmap, it is believed that Sunny Cove processors paired with Gen11 graphics will be called Ice Lake, which succeeds Coffee Lake, Whiskey Lake, Amber Lake, and Cannon Lake.

Intel reaffirmed its plan to introduce a discrete graphics processor by 2020, providing Apple with another option beyond its current provider AMD and former provider Nvidia for future MacBook Pro, iMac, iMac Pro, and Mac Pro models.

Intel has essentially been iterating on its Skylake microarchitecture since 2015, so it is refreshing that the chipmaker is finally moving on to something new. But with rumors of Macs switching to custom ARM-based processors as early as 2020, it might not be long after Sunny Cove that Apple moves on too.

Article Link: Intel Unveils Next-Generation 'Sunny Cove' Processors and Graphics Appropriate for 2019 Macs
Unfortunately, Apple is well known for neglecting its desktop line. Judging from Apple's past, the desktop line isn't due for an update until 2021.
 
Actually, physics does work differently for different companies depending on the technology they have available. Intel is stuck on 14nm, with a 10nm process they keep postponing. AMD will have a chiplet design on 7nm next year, which will allow them to produce high-core-count, low-power processors much more cheaply than Intel can.

LOL. No.

Maxwell's equations and Schroedinger's equation apply no matter what process node you're on.
I believe you are mistaken

https://www.semiwiki.com/forum/content/7602-semicon-west-intel-10nm-gf-7nm-update.html

in short,
AMD 7nm == Intel 10nm...
So AMD won't be ahead.

Correct. TSMC 7nm has essentially identical feature sizes to Intel 10nm. Intel uses a different naming scheme.
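
To put rough numbers on that (these are the approximate peak logic-density figures publicly cited around 2018; ballpark only, since achievable density depends heavily on the cell library and the actual design):

\[
\frac{\rho_{\mathrm{TSMC\ 7nm}}}{\rho_{\mathrm{Intel\ 10nm}}} \approx \frac{\sim 91\ \mathrm{MTr/mm^2}}{\sim 100\ \mathrm{MTr/mm^2}} \approx 0.9
\]

In other words, the two nodes land within roughly 10% of each other on density, despite the different marketing names.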
 
I would encourage you to read this article - https://www.anandtech.com/show/13699/intel-architecture-day-2018-core-future-hybrid-x86/8

Until the first CPUs tape out and we can get some benchmarks, I would not hazard a guess as to how much of a jump beyond Sandy Bridge these will end up being.

On page 4
"...As part of the Architecture Event, Intel ran a number of demos on a chip that was supposedly based on the new Sunny Cove cores and Gen11 graphics. ..."

On page 7
" ...The demo system that Intel had on display looked similar to the previous Sunny Cove design, ... "

Technically, Intel is a bit past the tape-out stage. There are physical implementations of Sunny Cove and Gen11 actually running. That doesn't mean they are defect-free enough to ship in large-volume production numbers, but this is way past the theoretical and/or simulation-only stage. However, it is the wrong time to be definitive in bragging about performance that is "ready to ship". (And AMD hasn't released its end product for late 2019 either.)

They don't have instances of every possible CPU package they will build, but some small subset of the product lineup has a pretty decent chance of shipping in the second half of 2019... probably closer to the end of 2019 than Intel would like.

Intel's been doing process refinement for so long on 14nm and trying to get 10nm stable that I would be cautious about hoping for large gains.

Intel doesn't need large gains; they just need to get back onto the track of 10-20% gains (without just throwing higher power and clocks at it). The primary point of what they talked about is not the 10nm part but that they are getting out of the "pain" of trying to put 100% of the whole CPU package onto the same process.

Thunderbolt 3 is mentioned but, again, I would not get my hopes up until there are actual CPUs announced. Intel has continually disappointed with its roadmaps since Broadwell.


EDIT 11:16AM, 12/14/18: I have also not been able to find any reference to Intel including support for PCIe version 4 (or 5, even though it is not yet 100% finalized). Considering that AMD is at least starting to support PCIe v4.0 with the upcoming EPYC Rome CPU, I would think that Intel is either evaluating it or they are going to punt on it and wait for PCIe v5.0.

PCI-e v4.0 probably ran into the same problems as several other things. It was probably tied up in a "Cannon Lake" design family revision (perhaps "Ice Lake"), and that has been tied up in the 10nm process.
There is no particularly good reason to punt on PCI-e v4.0. If Intel's Xe discrete graphics is targeting 2020, then there is a pretty good chance that Intel will merge the CPU and GPU there. (Coming in 2020 means work is already well underway now, and PCI-e v4.0 would have been a relatively stable target to pick last year.)

The talk was an architecture "big picture" discussion. The objective was not to outline every last nitpick of the technical design.







I assume that these CPUs will rely on PCIe 3.0, as Intel would have most likely trumpeted PCIe v4.0 inclusion on die if it was going that route.

That is a bit dubious. Technically, PCI-e is part of the uncore (non-core). What this talk was about is the core (x86_64) microarchitecture. In the past, Intel has rolled out different PCI-e levels on the same microarchitecture (e.g., Xeon E5 v1 at PCI-e v3 and the Core i parts on the same microarchitecture baseline at PCI-e v2). The same thing could happen here, where PCI-e v4 capability is added to the higher-priced and/or data-center-centric options (e.g., folks with 100GbE, inter-rack cluster network fiber, etc. kinds of problems).

Personally, I think this is still yet another parlor trick to give Intel time to fix their 10nm process issues. Meaning that Sunny Cove was on the map already, just not called Sunny Cove.

I suspect not. At some point they decided to rework their logic library targeting. One of the problems probably was that they couldn't get a 10nm process that was good at 6-8 different things all at once.
They may have gone back to refactor their CPU microarchitecture a bit to support a narrower set of same-die coupled subsystems. (It also wouldn't be surprising if they changed tactics on how to add the iGPU to the CPU package.) 10nm is being tuned at the moment just for "high computation" CPU/GPU cores and the associated L1/L2 cache. Intel aimed for relatively high transistor density with 10nm, and there are some applications that don't need it quite that high. (They'll later have another variation of 10nm that is tuned differently.)
 
When do you guys think the new chips and their variants will start releasing? I just bought a 2017 i7 MB 12" and am really debating whether I will keep it or wait (been waiting for ages and do not want to wait anymore!). But if they update very soon, I will have immense buyer's remorse for sure.

No reply saying "there will always be new things around the corner" hahaha.

Apple can release a newer MacBook based on Amber Lake-Y any moment if they want to, but it barely seems worth the effort. It's only a few percent faster.
 
Brand perception is such BS, and Intel has duped people since the '80s into thinking their CPUs are better. At some points in history they have been, but on average they are no better or worse than others. I've been an AMD guy since the 486 days, and they have made some great CPUs (and some not so great, just like Intel).

Their current server chips are much better CPUs than Intel's - it's why many supercomputers (think Cray) are running AMD EPYC CPUs. The core count and performance are much better than Intel's.

In the desktop space, the increased core count and lower power of AMD's Ryzen and Threadripper beat Intel in Apple's "core" market of content creators. I've got a Ryzen 7 system (8 cores/16 threads) with 32GB, and I can leave 5-8 VMs running and not even notice it because of the extra cores. I can render video quite quickly because of the core count. It truly has been an awesome system and cost probably half what an 8-core/16-thread Intel box would have cost last year when I built it.

If anyone could do it, Apple could switch to AMD without a second thought. There are no incompatibilities, and they could tout some seriously fast computers.

In the server space, I can't think of any reason to use Xeon over EPYC at this time other than FUD or specific software issues. EPYC has much better performance at lower cost than nearly any Xeon.

I know, but that's on the high end, which I clearly mentioned. Dell uses Threadripper on some Alienware configs. EPYC is amazing performance for the money. Xeon Platinum costs a ******** more.

I'm partial to Intel because I grew up with their CPUs. I did have an AMD notebook I bought in 2008 with the dual-core Turion X2 Ultra ZM-80 @ 2.1 GHz. It wasn't as good as the comparable Core 2 Duo, but since it had the Radeon HD 3200 on board, that definitely made a huge difference. Battery life did suck of course, but I could run HL2 at 1280x800 with everything on high and Doom 3 maxed out.

Just 4 years earlier those games running at 1024x768 with everything on high pushed my ATI Radeon 9700 Pro to the max! I miss the days when ATI was king. AMD can do so much better in the GPU department. Their mobile solutions like Vega are much better than Iris will ever be but they can’t touch nVidia with their full size desktop GPUs.

nVidia was already far ahead with Pascal and now with Turing, AMD is about 1.5 to 2 generations behind which is a lot of ground to make up.
 
I'm partial to Intel because I grew up with their CPUs. I did have an AMD notebook I bought in 2008 with the dual-core Turion X2 Ultra ZM-80 @ 2.1 GHz. It wasn't as good as the comparable Core 2 Duo, but since it had the Radeon HD 3200 on board, that definitely made a huge difference. Battery life did suck of course, but I could run HL2 at 1280x800 with everything on high and Doom 3 maxed out.

I've run them both for years and Intel is nothing special. They have great marketing and slightly faster performance today in single threaded apps but I'll take more cores over that any day of the week. Most likely I'm done with Intel CPUs when the HP dies - it will be whatever AMD laptop I can get. The only outlier would be the Surface - love the form factor but hate the CPU.

I miss the days when ATI was king. AMD can do so much better in the GPU department. Their mobile solutions like Vega are much better than Iris will ever be but they can’t touch nVidia with their full size desktop GPUs.
LOL. The new AMD GPUs are quite competitive:
https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
 
"For those who are ever-confused by Intel's roadmap, it is believed that Sunny Cove processors paired with Gen11 graphics will be called Ice Lake, which succeeds Coffee Lake, Whiskey Lake, Amber Lake, and Cannon Lake."

Makes perfect sense.... You have to have another "..lake" name in there, otherwise it would throw it all out of whack.
 
Dang! First comment of all things.

Yeah, it's a new thing around the corner, although is it a big jump compared to 2015-2018? Considering we've been on 14nm for three years now, and does Sunny Cove have integrated Thunderbolt support on-chip?

Thunderbolt support on the chip is highly unlikely to be fully integrated. Intel says they have integrated support for Ethernet in the PCH I/O controller chips, but you still need a separate PHY chip to actually have an Ethernet socket on your system. The need for another chip isn't going to disappear. It may be a slightly smaller and very incrementally cheaper chip than what is required now, but there will still be a need to buy something else and place it on the logic board.

What may happen is that the complete implementation shrinks from being multiple chips (the power side and some of the alt-mode variants and backward-compatibility modes require more chips in addition to the Thunderbolt controller chip, which has only come from Intel up to the present). Intel may weave more of that into the PCH, and the PCH may be integrated into the CPU package more often (the "base" layer in this Foveros 3D packaging technology is more about integrating the PCH into the CPU package). Some of the video output switching and the gathering of data to put onto the Thunderbolt network could be done in the CPU package, with the iGPU and PCH interconnections internal to the CPU package. There would still be the basics of the Thunderbolt switch component of the TB controller, which would need to be located within a very limited distance of the physical socket. (You can't place the TB controllers just anywhere. Currently they are required to be within about an inch or so of the socket. That is a hang-up with trying to put 100% of the TB implementation inside the CPU or PCH... that would drag those closer to the edge of the system, and that isn't possible in many implementations.)


Integrating part of the TB controller into the base CPU package means that system builders will have to pay for at least part of the TB implementation whether they add a socket to the system or not. Deployment will rise because it will be a bit cheaper to implement, but also in part because they're going to have to pay for it anyway. Intel is stuffing base WiFi implementations into the PCH also. They have already done sound/audio. The move toward "system on a chip" (SoC), or perhaps closer to "system in the CPU package" (not just one chip but multiple chips collected into a single physical package), continues.
 
I am blown away by the iPad's graphics abilities and I am completely underwhelmed by the graphics abilities of the MacBook. I think Apple ditching Intel and moving to ARM can't happen soon enough. I welcome the day a slim, light MacBook not only performs well but has smoking graphics and an ability to play games. Intel is yesterday's technology. It is way past its prime.


Riiight.

I cannot wait for that day as well ... yet I think you're way ahead of yourself here.

Intel can ball too ... the m3/m5 CPUs not requiring a fan tells you Intel is going to come knocking on Arm Holdings' designs for ARM-based CPUs.

Also until 2018 you could not properly record 1080p at 60fps on Arm with good quality and no chop.
Can you transcode on Arm from 1080p 30/60/120fps to 4/5K ?

Until we see a mobile OS for the 2018 iPad (or later) that can handle, without limitations or workarounds to one's normal workflow, any task one does on macOS ... we're still waiting.
I wonder if these new chips will be a better match for Apple's current MacBook Pro case design. The heat thing really bugs me. I don't buy their software fix for the i9. Perhaps this new Intel chip will solve (or at least reduce) some of the thermal problems surrounding the MacBook Pro.

Apple executives NEED to be forced to use a MacBook Pro for their 8 to 10 hour shifts on their laptops. Just for a week! Until they do so, I don't believe Apple as a whole really believes in the thermal issues of high-productivity usage on end users' laps.

Seriously though Apple needs to solve this issue asap OR get rid of using metal on the bottom case altogether.

PS: Anyone know of a fabrication shop that will make custom and exacting specs for MBP 2018? ;)

The great CPU we have all been waiting for is Ryzen (coming from an Intel fan).

Personally, I recall all the hardships and pain HP had with AMD's last generation of mobile CPUs in their laptops 10 years ago. I'm not ever going to trust them as a full mobile computing platform until they can sincerely prove otherwise.

I'm currently sitting this one out, using a Surface Pro 6, and if it carries on working well I will keep it until at least 2020, when Apple is rumoured to put their ARM chips inside Macs.

SP? That machine with its lack of ports and connectivity seems a LOT more limiting than the 2018 iPad right now. Surface Pro 6:
Still uses Mini DisplayPort
Still does NOT have Thunderbolt (of any kind); yet the Asus clone of that product DOES!

I'm not so certain in a years time you'll feel the same way about it.
 
Riiight.
Apple executives NEED to be forced to use a MacBook Pro for their 8 to 10 hour shifts on their laptops.

Amusing - you think that any Apple executive has spent more than 20 continuous minutes on a MacBook Pro laptop - ever!:cool::D

Do you think Ive has spent the average hours a consumer spends on an iPhone or a prosumer spends on an iPad in a week - just to evaluate the user experience, blind to the design?
 
I mean, this was the first time I had posted about it in well over a month, but ok. ¯\_(ツ)_/¯
Look: We're ALL waiting with bated breath to see what the new Mac Pro will be.

But, if history is any indicator, Apple has never been a company that will "backtrack" to a previous design paradigm; so if you truly want "Cheesegrater 3.0", I would suggest you start gathering parts for a Hackintosh instead, because Apple is looking forward, not backward.

But, if like most of the rest of us, you are cautiously optimistic that Apple is coming up with something that will be the next "Game Changer", and finally get us past the "tower" configuration that most of the home computer industry is still stuck in for over half a century (!!!), then you will adopt a "wait and see" approach to all this.

Think about it: The first tower-PCs came out before the first Macintosh, when the Apple ][ was in its heyday. It just cannot be the only viable solution to an "Expandable and Upgrade-able" personal computer for all time!
 
finally get us past the "tower" configuration that most of the home computer industry is still stuck in for over half a century (!!!), then you will adopt a "wait and see" approach to all this.

Think about it: The first tower-PCs came out before the first Macintosh, when the Apple ][ was in its heyday.

Towers probably did exist in the early 1980s, but weren’t popular; the prevalent form factor was the desktop (which today means something different). The tower and later mini tower / midi tower didn’t become popular until the early 1990s.

Even assuming your claim, “over half a century” would be before 1968, and at that point, I just don’t know what you’re talking about.

Having said that, I agree with your sentiment: no, the Mac Pro probably won’t be another Cheesegrater. It just isn’t that interesting a design for Apple any more. Few people need four internal 3.5-inch drive bays (they may, however, like four socketed NVMe SSDs) or four internal PCIe slots.

Whether this (PC makers still being stuck in the world of midi towers) is an area that needs innovation is another matter.
 
Also until 2018 you could not properly record 1080p at 60fps on Arm with good quality and no chop.
Can you transcode on Arm from 1080p 30/60/120fps to 4/5K ?

The 2014 Galaxy S5 was recording 1080p/60 and 4K on an ARM chip (Snapdragon). And you could transcode from 4K to 1080p on the phone too. Apple was late to the party as usual.
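
Just to make concrete what that kind of job involves, here is a minimal sketch of a 1080p-to-4K upscale transcode, assuming ffmpeg is installed (ffmpeg ships ARM builds); the filenames, codec, and quality settings are purely illustrative, not anyone's actual shipping pipeline:

Code:
# Minimal sketch: upscale a 1080p clip to 4K (3840x2160) using ffmpeg from Python.
# Assumes ffmpeg is on PATH; filenames and encoder settings are hypothetical.
import subprocess

def upscale_to_4k(src: str, dst: str, fps: int = 60) -> None:
    cmd = [
        "ffmpeg",
        "-i", src,                                # 1080p source clip
        "-vf", "scale=3840:2160:flags=lanczos",   # upscale to 2160p
        "-r", str(fps),                           # target frame rate
        "-c:v", "libx264",                        # software H.264 encode (the CPU-heavy part)
        "-preset", "medium",
        "-crf", "18",                             # near-visually-lossless quality target
        "-c:a", "copy",                           # pass audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    upscale_to_4k("clip_1080p60.mp4", "clip_2160p60.mp4")

The encode step is the part that actually taxes the CPU/SoC, which is where the core-count and thermal arguments in this thread come in.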

Personally, I recall all the hardships and pain HP had with AMD's last generation of mobile CPUs in their laptops 10 years ago. I'm not ever going to trust them as a full mobile computing platform until they can sincerely prove otherwise.

Like what? I had a few AMD laptops and they were fine. Early reports from the current Ryzen Mobile users are positive. I'd trust them without reservation and prefer them over Intel's crap all day and twice on Sunday.

SP? That machine with its lack of ports and connectivity seems a LOT more limiting than the 2018 iPad right now. Surface Pro 6:
But the SP runs a full OS, all your apps with their native functionality, provides an easy connection to peripherals (USB) and supports them all. Lack of USB ports is solved with a $20 hub.
The iPad is a quite limiting device without filesystem access, runs gimped apps, and is not really the desktop replacement it's billed to be.
 

I don't see how having GPUs that still can't surpass the now almost two-year-old 1080 Ti is competitive. The 1080 Ti is still the benchmark for 4K gaming and Adobe use. Sure, they cost less, and that's probably the best thing they have going for them.

Yes, the Vega 64 has an advantage in some titles optimized for AMD, but overall the 1080 Ti is king, and now the RTX cards are just on another level. Their ranking scale does a poor job of differentiating these cards. You really need to see benchmarks at 4K with high settings and CUDA render times to see how much better nVidia's stuff is.

With their CPUs they’re actually ahead of Intel on paper and in real world use. That’s what I meant.
 
This will be my next processor. I just hope I'm not making the wrong choice in going with Intel right before Apple makes a huge leap to ARM.

Actually, this is a good reason to go for this one.
Second, I'm quite sure there'll be a transition period of 2-3 years during which one could buy both architectures.
 