Any guesses on the size of the display accompanying the Mac Pro?
Will Apple go back to 30” as they did with the Cinema Display?
I can see them going a bit bigger, but not by much, given it's still for production and not a television set.
Apple has repeatedly said they will be releasing a new display and a next-gen Mac Pro... I doubt they will ever release any new display.
Most likely they will simply pick another LG monitor and mark up the price (if they really release a 7,1), and that's it. They don't really care about the Mac Pro line.
From a professional standpoint, as far as super-accurate colour monitors go, you don't see many over 31".
IIRC, EIZO's (full) 4K cinema reference monitor is 34".
I'm going to bet Apple's pro display is just an improved version of the panel Dell is using for their 32" 8K display.
Of course Apple is pouring resources into new and expensive displays coupled with weak GPUs, at the exact moment in history when VR headsets coupled with powerful GPUs are the emergent paradigm for usable "all day" work environments.
So many black swans in Apple's skies, they block out the sun.
I would be quite interested to hear of examples of workers doing their jobs in VR.
I have a question that hasn't been answered in my searches.
What's the difference (other than sharpness) if I run 2560x1440 HiDPI on a 4K panel vs 2560x1440 HiDPI on a 5K panel?
I've seen the two, and obviously the 5K panel is sharper, but I can't say that 4K is far behind. If I am sitting 2-3 feet away, I probably can't tell the difference.
However, is there a difference in GPU usage and smoothness when doing things such as Mission Control on these two types of configuration?
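Here's my rough understanding of the arithmetic each setup has to push - just a sketch, assuming macOS renders scaled HiDPI modes at 2x the "looks like" size and then resamples the result to the panel's native resolution:

```python
# Back-of-the-envelope HiDPI arithmetic (assumption: macOS renders at 2x
# the "looks like" resolution, then resamples to the panel's native size).

looks_like = (2560, 1440)
backing = (looks_like[0] * 2, looks_like[1] * 2)   # 5120x2880 framebuffer

for name, panel in [("4K", (3840, 2160)), ("5K", (5120, 2880))]:
    scale = backing[0] / panel[0]
    if scale == 1.0:
        print(f"{name} {panel}: render {backing}, shown 1:1 (no resample)")
    else:
        print(f"{name} {panel}: render {backing}, downscale {scale:.3f}x every frame")
```

If that assumption holds, both panels render the same 5120x2880 backing store, but the 4K one adds a downscale pass per frame - is that extra pass what would show up in GPU usage and smoothness?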
That is highly experimental (though awesome). Also extremely niche. I have played around with VR, and while my experience was less than an hour's time, it is not something I feel is at a point yet where I could immerse myself in it from a work perspective.
That particular video LOOKS highly experimental because it's an earlier build. Also, which GPU you're using is a huge differentiator - unless you've worked in a 1080 Ti-powered system, you haven't reached the minimum viable product for a work environment. From all the people I know using it, and from my own experience: you lose HOURS in immersion. Guys like SUTU regularly talk about just losing entire days immersed.
Again, you've got to cross that critical performance threshold: a 1080 Ti and a Vive with a good 3x3 m live environment.
I'd be willing to bet you would be unable to find a single professional 3D animator who wouldn't cash in their entire toolset to switch to VR tools the moment the precision features (things like measurements, Bézier curves for motion paths, etc.) become available.
My issue with VR is that I have no desire to put on the goggles and the hand controls.
The hand controls need to go; it needs to get to the point where you use your hands the same way you would in real life, with no handles and no buttons, and the system senses what you are doing. They also need to develop a way to keep you from walking into things, as I found movement in a small space to be a problem.
Prove it. I am not an animator; however, I would be greatly interested to know whether they would all like to throw away their Wacoms and put on VR headsets.
My problem, though, is naivety on my part: as someone not in that arena, I find it hard to grasp how a VR universe replaces a digital pencil.
My understanding is as follows. 5K is exactly 2x 1440p in each dimension, so the GPU doesn't need to interpolate anything; it can simply use four 5K pixels to represent one 1440p pixel. So for anything pixel-based, a 5K panel should be the better choice for displaying 1440p content. Now suppose you have a 1440p bitmap picture and open it full screen on a 4K display: the GPU has to work out how to use nine 4K pixels (3x3) to display every four 1440p pixels (2x2). In other words, errors may be introduced during the scaling process. Then:
If all four 1440p pixels in a 2x2 block are the same colour, nothing is lost, because the GPU can fill the 3x3 block of 4K pixels with that same single colour.
But if the four 1440p pixels all have different colours, then only the four corners of the 3x3 block of 4K pixels can be displayed correctly based on the original picture's information. The other five pixels have to be interpolated by the GPU.
The following picture shows one way the GPU might interpolate the signal. The leftmost image is the original 1440p data; the middle one shows how 5K can perfectly reproduce the 1440p image; the rightmost one is 1440p simulated on 4K. Five pixels are "wrong" - in fact, more than 55% of the pixels may be "wrong" on screen.
[Attachment 742382: original 1440p block (left), exact 5K reproduction (middle), interpolated 4K result (right)]
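To make the idea concrete, here's a toy sketch of the same comparison (the values and the bilinear maths are purely illustrative, not what any particular GPU or scaler actually does):

```python
# Toy model of the picture above: a 2x2 block of distinct 1440p pixel
# values, shown exactly on 5K (2x duplication) vs interpolated on 4K (3x3).

src = [[10, 20],
       [40, 80]]

# 5K case: integer 2x scaling - every source pixel just becomes a 2x2 block,
# so all 16 output pixels carry original colours.
five_k = [[src[r // 2][c // 2] for c in range(4)] for r in range(4)]

# 4K case: 2x2 -> 3x3 via bilinear interpolation. Corners map exactly;
# edge pixels blend two neighbours; the centre blends all four.
def bilinear_3x3(s):
    out = [[0.0] * 3 for _ in range(3)]
    for r in range(3):
        for c in range(3):
            y, x = r / 2, c / 2              # position in source coordinates
            y0, x0 = int(y), int(x)
            y1, x1 = min(y0 + 1, 1), min(x0 + 1, 1)
            fy, fx = y - y0, x - x0
            out[r][c] = (s[y0][x0] * (1 - fy) * (1 - fx)
                         + s[y0][x1] * (1 - fy) * fx
                         + s[y1][x0] * fy * (1 - fx)
                         + s[y1][x1] * fy * fx)
    return out

four_k = bilinear_3x3(src)
originals = {v for row in src for v in row}
wrong = sum(v not in originals for row in four_k for v in row)
print("5K 4x4:", five_k)   # every pixel is an original colour
print("4K 3x3:", four_k)   # only the 4 corners are original
print(f"{wrong} of 9 4K pixels are interpolated ({wrong / 9:.0%})")
```

That prints 5 of 9 (56%), which matches the "more than 55%" figure above.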
In some cases, the software may be designed to "stretch" the image by duplicating some pixels, rather than interpolating the colour of the "missing" pixels. In that case, the colours may look better and the image may look sharper; however, it is again no longer the original image.
[Attachment 742384: the same 1440p block scaled to 4K by duplicating ("stretching") pixels instead of interpolating]
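Continuing the toy example, a nearest-neighbour "stretch" might look like the sketch below - no new colours are introduced, but the source pixels get uneven coverage, which distorts the geometry of the original:

```python
# "Stretch" variant: duplicate the nearest source pixel instead of blending.

src = [[10, 20],
       [40, 80]]

def nearest_3x3(s):
    idx = [0, 0, 1]   # which source row/column each output row/column copies
    return [[s[idx[r]][idx[c]] for c in range(3)] for r in range(3)]

print(nearest_3x3(src))
# [[10, 10, 20], [10, 10, 20], [40, 40, 80]] - the top-left source pixel
# now covers 4 of the 9 output pixels, the bottom-right only 1.
```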
On the other hand, for "vector"-style data (e.g. fonts, lines, etc.), there should not be any big difference between 4K and 5K. The GPU has to render it in real time anyway, rather than taking a pre-defined pixel grid and "zooming" in. So, at a proper viewing distance, once you go beyond the angular resolution limit of the human eye, they should look virtually identical.
So, is 5K better than 4K? For displaying 1440p "pixel-style" data, yes. However, in the real world we usually see 1080p sources rather than 1440p sources, so IMO 4K may be better than 5K most of the time: when we display 1080p data full screen, it's now the 5K monitor that has to do all the interpolation (of course, this is only true when you insist on displaying the source full screen).
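The ratios make this concrete - only integer multiples avoid interpolation:

```python
# Horizontal scale factor for common full-screen sources on each panel.

panels = {"4K": (3840, 2160), "5K": (5120, 2880)}
sources = {"1080p": (1920, 1080), "1440p": (2560, 1440)}

for pname, (pw, ph) in panels.items():
    for sname, (sw, sh) in sources.items():
        scale = pw / sw
        note = "exact integer scaling" if scale.is_integer() else "needs interpolation"
        print(f"{sname} on {pname}: {scale:.3f}x -> {note}")
```

So 1080p lands exactly 2x on 4K but 2.667x on 5K, while 1440p is the reverse (1.5x on 4K, exactly 2x on 5K).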
IMO, there is no definitively "better" monitor between 4K and 5K; it really depends on the user's usage. However, considering how much less trouble it is to drive 4K nowadays, I personally prefer to go for 4K SST rather than 5K MST.