I'm truly glad for the performance upgrades. But, in all fairness, it wasn't that big of a feat to outperform the previous model...
What's utterly fascinating is that the "old, tired" 2013 Mac Pro still beats the pants off everything but one 2017 iMac configuration.
How is that any different from any Mac or PC that has ever just launched? At least with Apple you get hardware and an ecosystem optimized for the OS.

These numbers look really good for those of us who still own a 2012 iMac, like myself. But Intel and AMD have both just launched new CPUs with 6-8 cores and much better performance at the same price point (or lower) as the current i5/i7 CPUs found in this iMac.
After seeing that news, I kinda feel like the 2017 iMac is already obsolete....
I'm waiting for some more benchmarks. The previous iMac GPUs all throttle down at least 15-20% under a moderate load due to thermal issues. It's still nice to see a GPU speedup like this, especially if you mostly use Apple's media apps. I'd really like a user-replaceable GPU though, even at a slightly greater cost or a bulkier design, but I realise this is good value for a consumer model.
... in synthetic multi-core / multi-GPU benchmarks that are not representative of real-world usage.
Most Mac applications won't take advantage of more than 4 cores or dual GPUs anywhere near as well as those benchmarks do.
Unless we're looking at heavily parallelized applications like video editing, single-core CPU performance and single-GPU performance matter more, and there the iMac totally beats the Mac Pro. Heck, even the Swift compiler is faster on a quad-core iMac than on a 12-core Mac Pro.
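To put a rough number on that, here's a toy timing sketch (my own example, not from any benchmark suite) comparing a serial loop against the same work split across cores with DispatchQueue.concurrentPerform. A workload this embarrassingly parallel is basically the best case for a 12-core machine; most real app code has a large serial portion, which is why single-core speed usually decides the race.

```swift
import Foundation

// Toy CPU-bound workload (hypothetical, purely for illustration):
// sum of square roots over a range.
func work(_ range: Range<Int>) -> Double {
    var total = 0.0
    for i in range { total += Double(i).squareRoot() }
    return total
}

let n = 50_000_000
let cores = ProcessInfo.processInfo.activeProcessorCount

// Serial run: limited purely by single-core speed.
var start = Date()
let serial = work(0..<n)
print("serial:   \(Date().timeIntervalSince(start))s  sum=\(serial)")

// Parallel run: one chunk per core. This only scales because the
// chunks are fully independent; most app code isn't shaped like this.
start = Date()
var partials = [Double](repeating: 0, count: cores)
partials.withUnsafeMutableBufferPointer { buf in
    DispatchQueue.concurrentPerform(iterations: cores) { i in
        buf[i] = work(i * n / cores ..< (i + 1) * n / cores)
    }
}
print("parallel: \(Date().timeIntervalSince(start))s  sum=\(partials.reduce(0, +))")
```

On something like a 12-core Mac Pro the parallel run should win big here; flip to a typical app where only a fraction of the work parallelizes, and Amdahl's law hands the win right back to the faster quad-core.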
"People wanted a redesign. It's been the same front design since October of 2009."

Do they? Why? It's a great design. Why change just for the sake of change?
And if they had put in Threadripper they would have leapfrogged into first place. Next year Thunderbolt goes royalty-free, and you can bet Zen 2 will have it built in.
"Putting this horsepower in that silly iMac form factor will make this a toaster."

Hey, Apple product designers, engineers and product managers -- stop the manufacturing line! This random guy on the forum found something you somehow all missed.
Ignoring AMD, since the likelihood of Apple going AMD for CPUs is fairly low (for now).
But the "X" series CPUs from Intel aren't really intended for this purpose. They're being touted as "enthusiast grade": they have far higher TDPs than Intel's mainstream line-up, require the new LGA 2066 socket, and would force a complete redesign of the chassis and cooling. The only people likely to really benefit from the new "X" Intel line-up are gamers (its main target).
So unless Intel decides to bridge the gap and allows CPUs with more than 4 cores on the current mainstream socket, the current chips are the right chips to be in the iMac.
"Still comes with spinning hard drives and bezels on a non-touch screen."

It has Fusion Drives. Unless they start making SSDs that can hold 3TB without costing as much as the computer itself, spinning drives are still necessary.
Not just the 2013, but even some of the old cheese grater Mac Pros hold up pretty well.
The claim has been made that with an iFixit kit and a little bravery the CPU and GPU are apparently replaceable in the 27" model... I haven't gone out of my way to confirm that, and I haven't yet had a chance to read through the teardown myself.
"People wanted a redesign. It's been the same front design since October of 2009."

As an industrial designer, I can tell you: redesign for the sake of redesign is never good.
IIRC the iFixit team found that the CPU isn't soldered in and is replaceable (just not easy to get to if you're not willing to rip the machine apart), but that the GPU wasn't replaceable. We'll have to wait and see for more detailed breakdowns. The problem with the iMac is that it's not designed to take generic PCI-E add-on cards, so even if the GPU itself were in a socket, there's no way for consumers to get chips to replace it. Previous iMacs used mobile parts that were not user-serviceable, since there's no room inside the iMac for a full-length PCI-E card.
But looking at the price point of these X-series CPUs from Intel, it's really hard to ignore...
"Feel like I'm playing spot the difference. Are those two charts identical?"

Haha, was thinking the same at first.
UPDATE: they fixed it now.