Cue the "should I buy or wait" comments in 3.. 2.. 1..

These "should I buy or wait" comments for the 2017 iMac have been around since at least the end of last winter. Some people were expecting a 2018 iMac as early as springtime.

I posted back then that my thoughts were that Apple probably wouldn't have a 2018 iMac until really late, or maybe not until 2019. Anyone who is still waiting might still have a long wait ahead of them.
 
For those who are ever-confused by Intel's roadmap, it is believed that Sunny Cove processors paired with Gen11 graphics will be called Ice Lake, which succeeds Coffee Lake, Whiskey Lake, Amber Lake, and Cannon Lake.

This is one thing Apple really is getting right: at a glance you know where each generation of processors (A9, A10 etc) fits into the series. This "Lake" nonsense is really pointless.
 
This is welcome news, since Intel is so far behind the competition on iGPU performance that it's hard to consider them anymore. And hopefully their new dGPU is more successful than their last attempt, the i740, which I still have around here somewhere.
 
Wait, what? I missed out on Coffee Lake and Whiskey Lake?

The current MacBook Pros are Coffee Lake, with the exception of the non-Touch Bar one (which is still Kaby Lake, because Intel has yet to provide a suitable new CPU).

The new MacBook Air is Amber Lake-Y.

Whiskey Lake isn't very interesting to Apple right now.

True, but that looks to be a Coffee Lake processor on 14nm. What happened to the die shrink of Skylake/Kaby Lake/Coffee Lake that was going to be Cannon Lake?

Only a single Cannon Lake SKU shipped. It's probably dead.
 
> This is one thing Apple really is getting right: at a glance you know where each generation of processors (A9, A10 etc) fits into the series. This "Lake" nonsense is really pointless.

Intel and Apple do the same thing.

Intel: i7-8700K -> Coffee Lake
Apple: A10 Fusion -> Hurricane, Zephyr
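The point about model numbers versus codenames can be illustrated: the leading digits of an Intel Core SKU encode its generation, which is exactly the at-a-glance information the "Lake" names lack. A minimal sketch in Python (a hypothetical helper, assuming the standard `iN-XXXX`/`iN-XXXXX` naming used from roughly the 2nd through 10th generations):

```python
import re

def intel_core_generation(model: str) -> int:
    """Extract the generation from a Core-series model string.
    Assumption: 4-digit SKUs (gens 2-9) start with one generation digit,
    5-digit SKUs (gen 10+) start with two."""
    m = re.search(r"i[3579]-(\d{4,5})", model)
    if not m:
        raise ValueError(f"unrecognized model: {model}")
    digits = m.group(1)
    return int(digits[:2]) if len(digits) == 5 else int(digits[0])

print(intel_core_generation("i7-8700K"))   # 8 -> 8th gen (Coffee Lake)
print(intel_core_generation("i5-10600K"))  # 10 -> 10th gen
```

So `i7-8700K` tells you its place in the series at a glance, while the codename alone carries no ordering information.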
 
Don't buy a computer now when better things are about to come out.
(A motto that saves you a lot of money over the years.)
A motto which also keeps you from ever buying a computer. Which, to be fair, would definitely have saved me a lot of money over the years. ;)
> Only a single Cannon Lake SKU shipped. It's probably dead.
Wait, it really was just one??
 
I think Apple is more than happy to keep putting three-year-old chips in its machines.
Ha ha ha...if that were actually true, then we would all have a good laugh. Try again.

2017 MacBook - Kaby Lake-Y, introduced August 30, 2016 (i5 and i7), April 10, 2017 (m3)
2018 MacBook Air - Amber Lake-Y, introduced June 5th, 2018.
2018 MacBook Pro - Coffee Lake H-Series, introduced April 2nd, 2018.
2018 Mac mini - Coffee Lake S-Series, Core i3-8100 introduced September 24, 2017, Core i5-8500 and Core i7-8700 introduced September 24, 2017.
2017 iMac - Kaby Lake S-Series, introduced January 3rd, 2017.
2013 Mac Pro - Shortly after Methuselah was laid to rest...well, as they always say, you win some, you lose some.

Source: https://en.wikichip.org/wiki/WikiChip
 
I'm completely lost with Intel's roadmap. So many delays, changes etc. So what is the great CPU we are all waiting for? Wasn't 10nm meant to be flawed?
That processor is available today, from AMD. It's called Ryzen and Threadripper. Shame Apple won't use them.
 
Since so many of us here are enthusiasts, we tend to underappreciate how good integrated graphics have become. I've been using computers for over 20 years now. I remember when you needed a 3D accelerator for anything more than 2D. The CPU could render games like Quake and Tomb Raider in software, but they were a pixelated mess.

Today a discrete GPU is only necessary for the most demanding work and, of course, playing new high-end games. When Intel started including integrated 3D graphics I never thought it would amount to much, but now Iris and even HD Graphics have surpassed the large, beefy GPUs of yesteryear on a tiny die!
> That processor is available today, from AMD. It's called Ryzen and Threadripper. Shame Apple won't use them.

It largely has to do with brand perception. AMD is still considered the more affordable CPU option and most premium computer manufacturers use Intel. They will only use AMD CPUs for their high-end desktops like Alienware, etc.

Intel still does many things better than AMD. If you look at just the amount of cores and threads then of course AMD wins hands down.
 
Funny how almost everyone is missing the fact that they appear to have given Larrabee a jolt up the backside and kickstarted it back to life.

It did have promise, hopefully they can stop the stupid bickering and infighting from last time.
 
What makes me really sad is that those responses are getting so much attention while my comment only has ONE like... Oh dear. *facepalm*

This thread is not good evidence that Apple should listen to their consumers regarding product development.
 
I'm confused. Is this going to be 9th gen core architecture or 10th gen? I thought next in line was cannon lake. Are they skipping cannon lake altogether?

Cannon Lake was only released as an i3 so Intel could claim success with their 10nm process. They’ve basically admitted during an investor call that they’re going back to the drawing board with 10nm and should have something ready next year (although they’ve been saying “next year” since 2015 for 10nm so I’m hoping this time they’re really going to do it!)
 
> It largely has to do with brand perception. AMD is still considered the more affordable CPU option and most premium computer manufacturers use Intel. They will only use AMD CPUs for their high-end desktops like Alienware, etc.
>
> Intel still does many things better than AMD. If you look at just the amount of cores and threads then of course AMD wins hands down.

Brand perception is such BS, and Intel has duped people since the '80s into believing their CPUs are better. At some points in history they have been, but on average they are no better or worse than anyone else. I've been an AMD guy since the 486 days, and they have made some great CPUs (and some not-so-great ones, just like Intel).

Their current server chips are much better CPUs than Intel's - it's why many supercomputers (think Cray) are running AMD EPYC CPUs. The core count and performance are much better than Intel's.

In the desktop space, the increased core count at lower power of AMD's Ryzen and Threadripper beats Intel in Apple's "core" market of content creators. I've got a Ryzen 7 system (8 cores/16 threads) with 32GB, and I can leave 5-8 VMs running and not even notice because of the extra cores. I can render video quite quickly because of the core count. It truly has been an awesome system, and it cost probably half what an 8-core/16-thread Intel box would have cost last year when I built it.
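The core-count argument above comes down to this: for embarrassingly parallel work like video rendering, N independent tasks on W workers take roughly N/W times the per-task cost instead of N times. A toy sketch (purely illustrative; `render_frame` is a hypothetical stand-in for a CPU-bound frame render, not a real renderer):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def render_frame(i: int) -> int:
    # Stand-in for a CPU-bound render: burn some cycles, return a value.
    return sum(k * k for k in range(200_000)) + i

if __name__ == "__main__":
    frames = range(16)
    start = time.perf_counter()
    # Spread the 16 independent "frames" across 8 worker processes;
    # wall time approaches 1/8th of the sequential time, modulo overhead.
    with ProcessPoolExecutor(max_workers=8) as pool:
        results = list(pool.map(render_frame, frames))
    elapsed = time.perf_counter() - start
    print(f"rendered {len(results)} frames in {elapsed:.2f}s on 8 workers")
```

The same logic is why more cores help VMs: each VM is an independent stream of work that can land on its own core instead of time-slicing.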

If anyone could do it, Apple could switch to AMD without a second thought. There are no incompatibilities, and they could tout some seriously fast computers.

In the server space, I can't think of any reason to use Xeon over EPYC at this time other than FUD or specific software issues. EPYC offers much better performance at a lower cost than nearly any Xeon.
 
The new 28th gen i7 multi-core-xray coffee lake is TWICE as fast as the Intel LASER FACE ice lake processor but the new built in GPU of the Quaker oats microarchitecture of 10mn thin chip process of the 11th gen lego blue-blee-blox gaaah gaah goo goo is 50% faster than the previous blart blagh blox x86 steam engine roadmap however the biggle borf blappity boop has more bleeedo dorf sweet tiberium mined by the depth grovelers in it to allow for blarf output! I'm lost, it sounds faster so that's nice!
 
> What makes me really sad is that those responses are getting so much attention while my comment only has ONE like... Oh dear. *facepalm*

You broke a lot of people with your post. Guess it was more subtle than I thought.

Perhaps it's the mentality of people eager to get into disagreements online. They see what they perceive as someone making a mistake and are quicker to point it out than to wait a half-second for their brain to come up with other explanations for what they're seeing.
 
Let's predict:
10% performance improvement.
10% power savings.
Still 2/4/6 cores.


> Intel reaffirmed its plan to introduce a discrete graphics processor by 2020

It will be too late.
 