One thing I’ve learned as an Apple customer is that if you keep waiting for the next upgrade, you’ll never end up purchasing anything. The 2017 MBP just came out; go get one. It will last you at least a couple of years, if not more.

This might be a good strategy. Helps you save money.
 
[Attached image: Xxbxv3m.png]

The CPU on the right side of this image may be a desktop part, but if you compare single-core scores, you'll see that the new CPUs won't be much faster; they mostly add more cores, which benefits encoding and video work the most. The quad-core models in the 13" (1.8GHz is reported) will have much lower clock speeds than the ones you see in the 15", because there is less space available to dissipate the heat of a higher-end CPU.

The extra heat may cause problems in the first generation, and it would be wise to see how that pans out before spending a long time waiting. Six cores in a laptop will also likely run at a lower clock speed, and the thin chassis may cause the machine to throttle. Haswell in the mid 2015 model throttles and has problems that the 2016 and 2017 Skylake and Kaby Lake models don't: they can run at full capacity throughout a render or encode, whereas Haswell actually clocks down to dissipate the heat.

Since Coffee Lake will still be produced at 14nm, it probably won't run any cooler, and you will have to wait for Cannon Lake to see a six-core CPU run flawlessly in a laptop form factor. With aftermarket fans and a big chassis on a desktop, none of this matters, but without the 10nm production process it may pose a problem in computers with limited space to cool the internals.
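If you're curious, you can actually watch the OS report this throttling pressure yourself. Here is just a minimal command-line sketch using Foundation's ProcessInfo.thermalState (available since macOS 10.10.3), nothing MacBook-specific, and the wording in the strings is my own:

```swift
import Foundation

// Describe the thermal pressure macOS is currently reporting.
func describe(_ state: ProcessInfo.ThermalState) -> String {
    switch state {
    case .nominal:  return "nominal (no throttling)"
    case .fair:     return "fair (fans ramping up)"
    case .serious:  return "serious (expect reduced clocks)"
    case .critical: return "critical (heavy throttling)"
    @unknown default: return "unknown"
    }
}

print("Thermal state now:", describe(ProcessInfo.processInfo.thermalState))

// Log every change while a long encode or render is running.
_ = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    print("Thermal state changed:", describe(ProcessInfo.processInfo.thermalState))
}

RunLoop.main.run() // keep the tool alive so the notifications arrive
```

Run that in the background during an export and you can see exactly when a thin chassis starts clocking down.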

I guess you will be alright buying a computer right now. If you have the luxury to wait, buy something other than a computer. Travel, visit a Michelin-starred restaurant or buy a car. If you are still interested in a computer, I would buy the current generation and probably upgrade in 2020. Three years is a lot in computing.
 
Without having any way of knowing, I wouldn't be surprised if the announcement mentioned that the new chips have a way of working out whether your current application would benefit more from two cores running at a higher clock or four at a lower one. I believe I'm right in saying that's already how the chip decides whether to use a single core or all its cores for an application, just without the variable maximum clock speed?
 

It's how you code your applications that determines how the CPU utilises the cores and threads, and it will take a couple of years before the leading software developers fully integrate the new instructions and the extra cores into their code. Software that already uses multiple cores, like Premiere and Final Cut, will probably use the new cores and instructions from the start, but how that behaves inside a thin chassis like the MacBook Pro's is another question. Heat is the major obstacle, and it may be why video and 3D still demand a desktop. I have a desktop which is virtually silent even when gaming in 4K on ultra settings, thanks to proper cooling, but again, no one is planning to make laptops thicker to accommodate more power.
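As a toy example of what "how you code it" means in practice (the processFrame function and the numbers are made up purely for illustration): a plain loop keeps one core busy, while handing the same iterations to DispatchQueue.concurrentPerform lets libdispatch spread them across every core the machine has.

```swift
import Dispatch
import Foundation

// Stand-in for one chunk of an encode or render.
func processFrame(_ index: Int) -> Double {
    (0..<200_000).reduce(0.0) { $0 + sin(Double($1 + index)) }
}

let frames = 64

// Serial: one core does everything, the others sit idle.
var serialTotal = 0.0
for i in 0..<frames {
    serialTotal += processFrame(i)
}

// Parallel: libdispatch fans the iterations out over the available cores.
var parallelResults = [Double](repeating: 0, count: frames)
parallelResults.withUnsafeMutableBufferPointer { buffer in
    let out = buffer.baseAddress!   // each index is written by exactly one iteration
    DispatchQueue.concurrentPerform(iterations: frames) { i in
        out[i] = processFrame(i)
    }
}

print("serial:", serialTotal, "parallel:", parallelResults.reduce(0, +))
```

On a quad-core machine the parallel version typically finishes several times faster; whether a thin laptop can sustain that for a whole export is exactly the thermal question above.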
 
So would it be impossible to have the decision switched round (I'd guess it would be the OS, rather than the chip itself, doing the actual thinking)? I know software is written to use one core, or two, or four, etc., but is there any way for the OS to analyse core usage and adapt how the software sees the CPU (either as a higher-clocked dual core or a lower-clocked quad) to 'force' it to use the chip in the optimal way? Or would that be moot, since if it's already using more cores, that's because it benefits more from having many threads running simultaneously than fewer running faster?
 
The only question is: when do you need it?

If you need it now then buy it now, and if you can wait, wait. I guarantee that by the time the next model is released there will already be rumours of the one after that, so making a decision based on spec will lead to uncertainty and buyer's remorse.

If you want one right now, go for it. I can confirm that the MBP 2017 is great. :)
 

There are already plenty of mechanisms that analyse and adapt the power of the CPU to suit the software, both the OS and its applications, and it has been that way since the first dual-core chips became available to the public, around the same time Macs started shipping with Intel CPUs. So over the last ten years a lot of the groundwork has been put in place to handle different workloads, but a lot of software still lacks proper optimisation, simply because the hardware is already strong enough to handle it. Then, over time, web pages, applications and background processes start demanding more from the CPU, and the OS has to take that demand and hand out cores/threads to each of them in order to prevent lag, freezes, etc.

My computer is running 1,408 threads and 321 processes at this very moment. It is that optimisation, and its smooth coexistence with the CPU, memory and the rest, that makes everything feel instant. Computing started out as a single process; now trillions of them run at once across networks, servers, phones, laptops and so on.
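For anyone who wants to poke at a few of the same numbers on their own machine, Foundation exposes them directly through ProcessInfo; a trivial sketch (the output will obviously differ from machine to machine):

```swift
import Foundation

let info = ProcessInfo.processInfo

// Logical cores the scheduler can hand threads to
// (twice the physical core count on Hyper-Threaded chips).
print("Logical cores:", info.processorCount)
print("Active cores: ", info.activeProcessorCount)

// Installed RAM, reported in bytes.
print("Physical RAM: \(info.physicalMemory / 1_073_741_824) GB")

print("Running on:", info.operatingSystemVersionString)
```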

The dual-core CPUs released in 2006 don't have the instructions, optimisation and utilisation we have today, even though they may have the same clock speed as today's Skylake and Kaby Lake CPUs. It is a battle between hardware and software. Sometimes a new way of encoding and decoding video requires a new set of CPU instructions so it doesn't choke, and sometimes new hardware lets software developers add better lighting, shadows and hues to 3D models. A great example is 4K. 4K video is still hard, even on a desktop with a GTX 1080 Ti, but eventually owning a 4K monitor will become common, and that will make it easier to sell GPUs in larger numbers, making them cheaper to make and cheaper to sell.
 
So? 30% is a considerable increase. Imagine if it were your income. Or if your income went down by 30%.
I don’t know why everyone is fixated on the 30% figure; it will be different for every processor. This is a big upgrade that redefines every processor in the lineup, and there will be significant gains in multi-core performance.
 

How do you know? I think this is a quick-fix CPU from Intel, a signal to AMD that says "look out, we can add cores too". I'll wait for 10nm and the real deal, to avoid throttling and battery drain.
 
mmm. Tim Hortons. Wish we had those in California. I always stop at one of those when I am in Vancouver or Toronto. I love their little pastry items and strong coffee.

Drinks in Tim Hortons are cheaper than those in Starbucks.
 