With Broadwell CPUs ("U" & "H" series) now confirmed as delayed until Feb-July 2015, it looks like we won't see quad-core MBPs until at least mid-to-late Q3 2015. The recently rumored Haswell rMBP "spec-bump" would only seem to confirm this.
Must we now accept that both the technical and economic limitations of shrinking CPU die sizes mean that Moore's Law is now, if not dead, at least in terminal decline?
Should we mourn, or see this as the start of a new era in computing in which the CPU is no longer the focus of innovation?
Although there is a certain sadness in the realisation that some technologies may have almost reached their end-point, there are some advantages to be seen:
1) We may finally escape the addictive impulse to upgrade our hardware as soon as Apple (& others) dangle a new carrot in front of us.
2) You won't need to compulsively check MacRumors every day for news of the latest & greatest - once a month will do ;-)
3) We might start to consider what we actually "do" with our computers, rather than what the hardware spec is.
4) Following on from (3), we should see more focus on software development, rather than hardware. I think there is huge scope for improving the quality and efficiency of our applications - maybe a return to the "good old days" when you had to find novel solutions with very limited resources. Developing multi-threaded apps to run on larger arrays of "commodity" CPU cores should become the norm.
5) Manufacturers will look at other areas of the hardware to add value - we're already seeing big strides in iGPU performance. How about:
i) New materials to give us stronger & lighter mobile devices, e.g. flexible screens, liquid metal etc. We're always trading size/weight against computing power - let's have both!
ii) New form factors - I'd love to see a really well designed laptop/tablet hybrid that does both jobs excellently, rather than being a compromise.
iii) Batteries that give us days of usage in a small, lightweight package. The MBA is a great start, but the iPhone has a long way to go to get back to the week-long battery charge of my 2000-era cell phone.
iv) New user-interface options: "usable" voice control, gesture control, holographic displays and other "sci-fi" technologies.
As others have pointed out, the death of Moore's Law will probably not be due to technology (although there are clearly physical limits), but economics - it will just be too expensive to justify the investment in making things smaller. We're likely to see this after Skylake, if not sooner.
Personally, I think it would be great to be able to keep a computer for 10 years and not feel technologically "deprived". At the end of the day, the computer is a tool for productivity and communication. Provided it can keep up with human demands, it is "good enough".