Which in the real world is one snap of your fingers.
So? 30% is a considerable increase. Imagine if it was your income, or if your income went down by 30%.
One thing I’ve learned as an Apple customer is that if you keep waiting for the next upgrade, you’ll never end up purchasing a product. The 2017 MBP just came out; go get one. It will last you at least a couple of years, if not more.
Without having any way of knowing, I'd not be surprised if the announcement mentioned that the new chips have a way of working out whether your current application would benefit more from two cores running at a higher clock or four at a lower one. I believe I'm right in saying that is already the case with whether the chip uses a single core or all of its cores for an application, but without the variable maximum clock speed?
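To put rough numbers on the trade-off I mean (everything here is made up for illustration, not from any spec sheet), a sketch in Python:

```python
# Toy model of per-active-core turbo limits. All frequencies are
# invented illustrative numbers, not from any actual spec sheet.
TURBO_TABLE_GHZ = {1: 3.8, 2: 3.6, 3: 3.2, 4: 3.0}

def best_config(parallel_efficiency):
    """Pick the core count with the highest rough throughput, given how
    well the workload scales across cores (0 = not at all, 1 = perfectly)."""
    def throughput(cores):
        scaling = 1 + (cores - 1) * parallel_efficiency
        return TURBO_TABLE_GHZ[cores] * scaling
    best = max(TURBO_TABLE_GHZ, key=throughput)
    return best, throughput(best)

print(best_config(0.05))  # barely threaded app -> one fast core wins
print(best_config(0.90))  # well-threaded app -> all four slower cores win
```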
It may be a desktop CPU on the right side of that benchmark image, but if you compare single-core scores, you will see the new CPUs won't be much faster; they will only add more cores, which are best utilised in encoding and video work. The quad-core models in the 13" (1.8GHz is reported) will have much lower clock speeds than the ones in the 15", because there is less space available to dissipate the heat of a higher-end CPU.
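You can see why the extra cores mostly pay off in encoding and video with a quick Amdahl's-law estimate (the parallel fractions below are assumptions, purely for illustration):

```python
# Amdahl's law: speedup on n cores = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel.
def speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Assumed parallel fractions, just to make the point:
for name, p in [("video encode", 0.95), ("typical desktop app", 0.40)]:
    print(f"{name}: 2 cores -> {speedup(p, 2):.2f}x, "
          f"4 cores -> {speedup(p, 4):.2f}x")
```

For the highly parallel encode, doubling the cores nearly doubles the speedup (1.90x to 3.48x); for the typical app it barely moves (1.25x to 1.43x), and the quad's lower clocks eat into even that.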
The extra heat may cause problems in the first generation, and it would be wise to see how this pans out rather than spending a long time waiting for it. Six cores in a laptop may also run at a lower clock speed, and the thin chassis may cause the computer to throttle. The Haswell in the mid-2015 model throttles, and has problems that the Skylake and Kaby Lake in the 2016 and 2017 models don't have: those are capable of running at full capacity throughout a render or encode, whereas the Haswell actually clocks down to dissipate the heat.
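The throttling itself is easy to picture as a simple control loop; here's a crude sketch, with every constant invented for illustration:

```python
# Crude thermal-throttling loop. All constants are invented, purely
# to illustrate the behaviour described above.
TEMP_LIMIT_C = 100.0
BASE_GHZ, TURBO_GHZ = 2.9, 3.9

def next_clock(clock_ghz, temp_c):
    if temp_c >= TEMP_LIMIT_C:
        return max(BASE_GHZ, clock_ghz - 0.2)  # too hot: step the clock down
    return min(TURBO_GHZ, clock_ghz + 0.1)     # headroom: creep back up

# A thin chassis sheds less heat per step, so the clock settles lower.
temp, clock = 70.0, TURBO_GHZ
for _ in range(40):
    temp += clock * 2.0 - 6.0  # heating grows with clock, minus fixed cooling
    clock = next_clock(clock, temp)
print(f"settles around {clock:.1f} GHz at {temp:.0f} C")
```

With these numbers the chip ends up well below its 3.9GHz turbo, which is exactly the Haswell behaviour described above.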
Since Coffee Lake will still be produced at 14nm, it will probably not run any cooler, and you will have to wait for Cannon Lake to see a six-core CPU run flawlessly in a laptop form factor. With aftermarket fans and a big chassis on a desktop, none of this matters, but without the 10nm production process it may pose a problem in computers with limited space to cool the internals.
I guess you will be alright buying a computer right now. If you have the luxury of waiting, buy something other than a computer: travel, visit a Michelin-starred restaurant, or buy a car. If you are still interested in a computer, I would buy the current generation and probably upgrade in 2020. Three years is a lot in computing.
So would it be impossible to have the decision switched round (I'd guess to the OS, rather than the chip itself, actually thinking about it)? I know software will be made to use one core, or two, or four, etc., but is there any way for the OS to analyse core usage and adapt how the software sees the CPU (either as a higher-clocked dual or a lower-clocked quad) to 'force' it to use the chip in the way that's optimal? Or would that be moot, since if it's already using more cores, that's because it benefits more from having more threads running simultaneously than from fewer being done faster?

It's how you code your applications that determines how the CPU utilises the cores and threads, and it will take a couple of years before the leading software developers are able to fully integrate the new instructions, along with the extra cores, into their code. Software that already uses multiple cores, like Premiere and Final Cut, will probably use the new cores and instructions from the start, but how it performs within a thin chassis like the MacBook Pro's is another question. Heat is the major obstacle, and may be why video and 3D work still require a desktop. I have a desktop that is virtually silent even when gaming in 4K on ultra settings, thanks to proper cooling, but again, no one is planning to make laptops thicker to accommodate more power.
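As a concrete example of what "coding for the cores" means, here's a minimal sketch (the frame-rendering job is just a stand-in; whether the workers actually spread across all cores is up to the OS scheduler):

```python
import multiprocessing as mp

def render_chunk(frames):
    """Stand-in for an embarrassingly parallel job, like encoding frames."""
    return sum(f * f for f in frames)  # dummy CPU-bound work

if __name__ == "__main__":
    total_frames = 1_000_000
    n = mp.cpu_count()  # the application decides how many workers to spawn
    # Deal the frames out across n workers, one chunk per core.
    chunks = [range(i, total_frames, n) for i in range(n)]
    with mp.Pool(processes=n) as pool:
        results = pool.map(render_chunk, chunks)
    print(f"used {n} workers, checksum = {sum(results)}")
```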
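On the "can the OS force it" part: the OS can't make a single-threaded program parallel, but it can restrict which cores a process runs on. On Linux, for instance (this particular API isn't available on macOS):

```python
import os  # os.sched_setaffinity is Linux-only

pid = os.getpid()
print("allowed cores before:", os.sched_getaffinity(pid))

# Pin this process (and all its threads) to cores 0 and 1. The OS can
# restrict *where* threads run, but it cannot turn a single-threaded
# workload into a multi-threaded one.
os.sched_setaffinity(pid, {0, 1})
print("allowed cores after: ", os.sched_getaffinity(pid))
```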
I don’t know why everyone is fixated on the 30% figure; it will be different for every processor. This is a big upgrade that redefines every processor in the lineup, and there will be significant gains in multi-core performance.
mmm. Tim Hortons. Wish we had those in California. I always stop at one of those when I am in Vancouver or Toronto. I love their little pastry items and strong coffee.