We predict the panel will use an LED phosphor material called KSF to notably boost color saturation.
Not going to happen until OS X supports 10-bit color output and applications are required to support color management. I'd put money on Apple sticking with sRGB displays for the foreseeable future.
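To put rough numbers on the bit-depth point, here's a quick sketch of how many distinct values each depth gives you per channel (the figures themselves are just powers of two, nothing display-specific):

```python
# Distinct levels per channel and total representable colors
# at 8-bit vs 10-bit output. Banding on wide-gamut panels comes
# from stretching the same number of levels over a larger gamut.
for bits in (8, 10):
    levels = 2 ** bits          # values per R/G/B channel
    colors = levels ** 3        # all R x G x B combinations
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")
# → 8-bit: 256 levels/channel, 16,777,216 colors
# → 10-bit: 1024 levels/channel, 1,073,741,824 colors
```

Four times the levels per channel is what keeps gradients smooth when the gamut gets wider than sRGB.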
Better display quality? I wonder how, since the current Retina iMac has the best display I have ever seen on any computer. It's perfect: colors, viewing angles, saturation...
It's an 8-bit sRGB glossy panel with a fixed refresh rate and no way of writing to the internal LUTs.
It's a nice display, but there are much better panels available from companies like Eizo, NEC, or even Dell, for example.
There are no OLED panels currently suitable for use as a computer monitor, even ignoring how expensive such a display would be.
There's no reason a 2015 Apple computer should have a spinning HD in it. They are slower, noisier, run hotter, and are more prone to mechanical failure. IMO, it's about time Apple did away with them.
If they are going to keep them around, then there should be a way for the user to replace them without voiding the warranty or having to remove the damn SCREEN to get to it.
The SSDs in my system run about as warm as the hard drives - right in the middle between the warmest and coolest drives, actually - and those are enterprise-grade 3.5" drives I'm comparing them to.
Today's high performance SSDs don't necessarily run any cooler than HDDs.
The problem with SSDs is capacity. It's still very expensive to buy high-capacity SSDs, when there are now 8TB HDDs that are relatively inexpensive.
And frankly I don't see why most people care that much about SSDs in desktop machines.
Buy enough RAM for your system and you should never be touching the disk to open applications - they'll already be in memory.
I'm not saying there is
no reason to use SSDs, but for most users' requirements, the only thing they benefit from is faster app launch times.
SSDs are more important in notebooks, where they physically save space inside the machine, draw less power, are far more reliable due to the lack of moving parts, and are much quicker to respond than the 5400 rpm (or slower) drives that spin down at every opportunity.
They'd better put some Nvidia GPUs in there. AMD GPUs are hot garbage for such poorly designed airflow, considering how much heat they put out. Worse yet, attempting to do anything at 4K on a mobile GPU is spitting in the face of consumers.
I expect that they will stick with AMD in the foreseeable future, due to the type of compute workloads Apple use.
I would prefer to see NVIDIA GPUs in there too.
But it's a common misconception that desktop Macs have insufficient cooling.
The issue is that Apple purposefully targets the upper thermal limits when designing their cooling solutions in order to keep the systems as small and quiet as possible.
I could set all the fans in my desktop PC to stay off below a certain temperature threshold - my GPU is rated for operation up to 98℃, so it could be completely passive/silent up to, say, 90℃ - but I prefer to have the fans in my case running at all times to prevent heat from building up, which extends the life of the components.
Instead of letting the heat build up to 90℃, the GPU fan is set to a minimum of 500 rpm, which is practically inaudible but keeps the card at a cool 25℃ at idle. Under load the fans don't spin up much higher than that, and keep the card around 50℃ (GTX 960).
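The fan-curve idea above can be sketched in a few lines. The rpm range and ramp endpoints here are assumptions for illustration, not my card's actual firmware values: hold a quiet floor at all times, then ramp linearly toward full speed as the GPU nears its limit.

```python
# Hypothetical fan curve: a constant 500 rpm floor instead of a
# passive/silent zone, ramping linearly to max speed near the
# thermal limit. All numbers are illustrative assumptions.
MIN_RPM, MAX_RPM = 500, 2200     # assumed fan speed range
RAMP_START, RAMP_END = 40, 90    # assumed ramp endpoints in ℃

def fan_rpm(temp_c: float) -> int:
    """Map a GPU temperature to a target fan speed."""
    if temp_c <= RAMP_START:
        return MIN_RPM           # the always-on floor: never fully off
    if temp_c >= RAMP_END:
        return MAX_RPM           # pinned at max near the limit
    frac = (temp_c - RAMP_START) / (RAMP_END - RAMP_START)
    return round(MIN_RPM + frac * (MAX_RPM - MIN_RPM))

print(fan_rpm(25))   # idle → 500 (the inaudible floor)
print(fan_rpm(50))   # light load → 840
print(fan_rpm(95))   # near the limit → 2200
```

The point of the floor is exactly what the post describes: a small amount of constant, inaudible airflow prevents heat soak, so the curve never has to chase a 90℃ buildup.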
The Mac Pro is meant for very specific applications (hi-res video editing, 3D rendering, etc.). It's not the best setup for typical use or even for gaming. Your typical user will get far more bang for the buck from an iMac.
Yes, anyone interested in gaming should build a PC. You're just throwing away money if you buy an upgraded Mac for gaming. You pay at least twice as much, and performance is nowhere near as good - especially if you stick to OS X instead of booting into Windows.