
satchmo

macrumors 603
Original poster
Aug 6, 2008
5,144
5,943
Canada
Any guesses on the size of display to accompany the Mac Pro?
Will Apple go back to 30” as they did with the Cinema Display?
I can see them going a bit bigger, but not by much, given it’s still for production and not a television set.
 

h9826790

macrumors P6
Apr 3, 2014
16,649
8,574
Hong Kong
Any guesses on the size of display to accompany the Mac Pro?
Will Apple go back to 30” as they did with the Cinema Display?
I can see them going a bit bigger, but not by much, given it’s still for production and not a television set.

I doubt they will ever release a new display.

Most likely they will simply pick another LG monitor and mark up the price (if they really release a 7,1), and that's it. They don't really care about the Mac Pro line.
 

flat five

macrumors 603
Feb 6, 2007
5,580
2,657
newyorkcity
I doubt they will ever release a new display.

Most likely they will simply pick another LG monitor and mark up the price (if they really release a 7,1), and that's it. They don't really care about the Mac Pro line.
Apple has repeatedly said they will be releasing a new display and a next-gen Mac Pro.
What do you make of these press announcements?
Lies?

https://www.apple.com/newsroom/2017/12/imac-pro-the-most-powerful-mac-ever-available-today/

In addition to the new iMac Pro, Apple is working on a completely redesigned, next-generation Mac Pro architected for pro customers who need the highest performance, high-throughput system in a modular, upgradeable design, as well as a new high-end pro display.
 

frou

macrumors 65816
Mar 14, 2009
1,360
1,916
They have to do something interesting to make it worthwhile. So I don't expect them to be constrained by the sizes and resolutions they've offered so far.
 

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
From a professional standpoint, as far as super-accurate colour monitors go, you don't see many over 31". There is a group of people who want a single huge monitor, but for most of the editors I have seen, whether video, photo, or design, multiple monitors are pretty standard.

I'm personally using one monitor at the moment. Honestly, some days I could almost use three. Yes, you can multitask on one big monitor, but I like the idea of dedicated monitors.
 

AndreeOnline

macrumors 6502a
Aug 15, 2014
700
494
Zürich
To me, there is no question:

They'll just need to take what they've learned from the 5k panel in the current iMac/iMac Pro and upgrade it to HDR.

We'll be looking at a 10-bit panel at 5K with DCI-P3 or better color, capable of around 1000 nits. The more of BT.2020 they can cover, the better. Many pros would appreciate LUT support and some calibration features. To this, just add the normal stuff like a few Thunderbolt ports, speakers, and a camera + mic. And make sure it can charge a MBP at full speed.

I would have been totally fine with the current 5K panel in standalone form, but then it would have been out already. If I have to wait for the Mac Pro to launch, I'll take the HDR version, thank you very much. We're looking at $1,499 to $1,995. I hope it's the former.

Dell's current 27" 4K HDR panel is $1,995. Same as a base iMac (which includes an excellent non-HDR display). But 4K is no good; we need a single-cable 5K solution for a 2560x1440 effective GUI in HiDPI mode.
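The arithmetic behind that "effective GUI" point can be sketched in a few lines of Python. This is only an illustration of the figures in this post, not anything Apple-specific: macOS HiDPI renders the UI at 2x, so a panel that is an exact integer multiple of the target 2560x1440 avoids fractional scaling entirely.

```python
# Effective "looks like" resolution in HiDPI mode is the panel resolution
# divided by the scale factor (2x on Retina-class displays).
panels = {
    "4K UHD": (3840, 2160),
    "5K":     (5120, 2880),
}
target = (2560, 1440)  # desired effective GUI resolution

for name, (w, h) in panels.items():
    # Scale factor per axis from the target GUI size up to the panel.
    sx, sy = w / target[0], h / target[1]
    exact = sx.is_integer() and sy.is_integer()
    print(f"{name}: {sx:g}x scale, exact integer mapping: {exact}")
```

A 5K panel comes out at exactly 2x (clean pixel doubling), while 4K lands at 1.5x, which forces some form of resampling.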

27" is the sweet spot for me since I need two monitors. I know I'm not alone here, as many pro app interfaces are built (at least as an option) around a dual-monitor setup. I can see how someone who has always used just one monitor would flirt with the idea of a larger, higher-resolution model, but I don't expect that, nor do I want it.

In short: the current 5k display with even better brightness and HDR compatibility.

I'd prefer "just good enough specs" and an aggressive price over "state of the art" for $3495.
 
  • Like
Reactions: orph

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
I wouldn't be surprised if it's more expensive. It depends on how "pro" they go. If they want to compete with Dell or whatever, then OK, maybe it will be in the $1,000 to $2,000 range. However, if they want to compete with EIZO and Flanders Scientific, offering the same features as they do, it could be $5,700 or more.
 

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
IIRC, EIZO's (full) 4K cinema reference monitor is 34".

I'm going to bet Apple's pro display is just an improved version of the panel Dell is using for their 32" 8K display.

If they even make an 8K display, that is; Dell's is really the only one out. Even the cinema reference monitor companies are not making 8K displays yet, unless there is something coming next year.

Oh, and just checking EIZO's site, I don't see anything about 31.1" in the ColorEdge range.
 

Dr. Stealth

macrumors 6502a
Sep 14, 2004
813
739
SoCal-Surf City USA
I think display size is a personal thing. Some are very happy with a 21" while others love 40"+.

I've settled on a single Dell 31.5" 4K, and I think with my desk setup it's about the max I will go. Before I got this display I thought 27" was my limit, but now that I've used it for a while, I love it. Not too big and not too small. It's the perfect size for me, at this point in time...
 

Raunien

macrumors 6502a
Aug 3, 2011
535
57
I have a question that hasn't been answered during my searches.

What's the difference (other than sharpness) if I run 2560x1440 HiDPI on a 4K panel vs. 2560x1440 HiDPI on a 5K panel?

I've seen the two, and obviously the 5K panel is sharper, but I can't say that 4K is far behind. If I am sitting 2-3 feet away, I probably can't tell the difference.

However, is there a difference in GPU usage and smoothness when doing things such as Mission Control on these two types of configurations?
 

mattspace

macrumors 68040
Jun 5, 2013
3,292
2,941
Australia
of course Apple is pouring resources into new and expensive displays coupled with weak GPUs, at the exact moment in history when VR Headsets coupled with powerful GPUs are the emergent paradigm for usable "all day" work environments.

So many black swans in Apple's skies, they block out the sun.
 

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
of course Apple is pouring resources into new and expensive displays coupled with weak GPUs, at the exact moment in history when VR Headsets coupled with powerful GPUs are the emergent paradigm for usable "all day" work environments.

So many black swans in Apple's skies, they block out the sun.

I would be quite interested to see some examples of workers doing their jobs in VR.
 

mattspace

macrumors 68040
Jun 5, 2013
3,292
2,941
Australia
I would be quite interested to see some examples of workers doing their jobs in VR.

Guys I know built this:

https://www.supermanual.io

It's a general-purpose VR environment. They're conducting all their company meetings in VR group environments, simply for the space they get to spread ideas out in.

Or, look at Tvori:

Tvori is an animation environment where you build and physically animate as if you were working with models and stop-frame figures, but you can move things around in real time and record those movements. The point is, VR is being used to make content for viewing on a 2D screen, not (as a lot of people, especially in the Apple-centric blogoverse, seem to think) 2D-screen computers being used to create content for VR.

Once you start researching this stuff, it becomes clear that macOS is just going to be a glorified EFI or BIOS: something that launches SteamVR (which most of the apps are being built on), and then you never interact with it.
 
Last edited:

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
That is highly experimental (though awesome). Also extremely niche. I have played around with VR, and while my experience was less than an hour's time, it is not something I feel is at a point yet where I could immerse myself in it from a work aspect.
 

h9826790

macrumors P6
Apr 3, 2014
16,649
8,574
Hong Kong
I have a question that hasn't been answered during my searches.

What's the difference (other than sharpness) if I run 2560x1440 HIDPI on a 4k panel vs 2560x1440 HIDPI on a 5k panel.

I've seen the two and obviously the 5k panel is sharper, but I can't say that 4k is far behind. If I am sitting 2-3 feet away, I probably can't tell the difference.

However, is there a difference in GPU usage and smoothness when doing things such as mission control on these two types of configurations?

My understanding is as follows. Theoretically, 5K is exactly 2x the 1440p resolution, so the GPU doesn't need to interpolate anything; it just uses four 5K pixels to represent one 1440p pixel. So for anything pixel-based, a 5K panel should be the better choice for displaying 1440p content. Suppose you have a 1440p BMP picture and open it on a 4K screen (full-screen mode): the GPU now has to work out how to use nine 4K pixels (3x3) to display four 1440p pixels (2x2). In other words, errors may be introduced during the "zoom" process.

If all four 1440p pixels in the 2x2 block are the same colour, nothing is lost, because the GPU can fill the 3x3 block of 4K pixels with that same single colour.

But if the four 1440p pixels all have different colours, then only the four corners of the 3x3 block of 4K pixels can be displayed correctly based on the original picture's data. The other five pixels have to be interpolated by the GPU.

The following picture shows one of the ways the GPU may interpolate the signal. The leftmost image is the original data in 1440p. 5K (middle) can represent the 1440p image perfectly. The rightmost is 1440p as simulated on 4K: there are five "wrong" pixels in this block, and in fact more than 55% of the pixels on screen may be "wrong" during display.
Screen Shot 2017-12-17 at 14.46.27.jpg

In some cases, the software may be designed to pick some pixels and "stretch" them, rather than interpolate the colour of the "missing" pixels. In that case the colours may look better and the image may look sharper, but again, it is no longer the original image.
Screen Shot 2017-12-17 at 14.46.27 copy copy.jpg

On the other hand, for "vector"-style data (e.g. fonts, lines, etc.), there shouldn't be any big difference between 4K and 5K. The GPU has to render it in real time anyway, rather than pre-defining pixels and then "zooming" in. So, at a proper viewing distance, once you go beyond the limits of the human eye's angular resolution, they should look virtually identical.

So, is 5K better than 4K? For displaying 1440p "pixel-style" data, yes. In the real world, however, we usually see 1080p sources rather than 1440p ones. Therefore, IMO, 4K may be better than 5K most of the time, because when we display 1080p content full-screen, it's the 5K monitor that has to do all the interpolation (of course, this is only true if you insist on displaying the source full-screen).

IMO, there is no definitive "better" monitor between 4K and 5K; it really depends on the user's usage. However, considering how much less trouble it is to drive 4K nowadays, I personally prefer to go for 4K SST rather than 5K MST.
 
  • Like
Reactions: Synchro3

mattspace

macrumors 68040
Jun 5, 2013
3,292
2,941
Australia
That is highly experimental (though awesome). Also extremely niche. I have played around with VR, and while my experience was less than an hour's time, it is not something I feel is at a point yet where I could immerse myself in it from a work aspect.

That particular video LOOKS highly experimental because it's an earlier build. Also, which GPU you're using is a huge differentiator: unless you've worked on a 1080 Ti-powered system, you haven't reached the minimum viable product for a work environment. All the people I know using it, and from my own experience, you lose HOURS in immersion. Guys like SUTU regularly talk about losing entire days immersed.

Again, you've got to cross that critical performance threshold, though: a 1080 Ti and Vive with a good 3x3 m live environment.

Also extremely niche.

I'd be willing to bet you would be unable to find a single professional 3D animator who wouldn't cash in their entire toolset to switch to VR tools the moment the precision (things like measurements, Bezier curves for motion paths, etc.) becomes available.
 

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
That particular video LOOKS highly experimental because it's an earlier build. Also, which GPU you're using is a huge differentiator: unless you've worked on a 1080 Ti-powered system, you haven't reached the minimum viable product for a work environment. All the people I know using it, and from my own experience, you lose HOURS in immersion. Guys like SUTU regularly talk about losing entire days immersed.

Again, you've got to cross that critical performance threshold, though: a 1080 Ti and Vive with a good 3x3 m live environment.

My issue with VR is that I have no desire to put on the goggles and the hand controls. I have used it, and I was wowed by it, but the hand controls aren't there yet. The goggles are great, as it is like your normal vision, just computer-generated and graphically lacking. The hand controls, however, need to go: it needs to be where you use your hands the same way you would in real life, no handles, no buttons; it needs to sense what you are doing. They also need to develop a way to keep you from walking into things, as I found movement in a small space a problem.

It has made a lot of progress, and I feel one day it will get to a point where it will be like the Matrix, where you don't know you are in that world.

For me, nothing I do currently gets any benefit from VR; it is not something that enters my mind just yet.

I'd be willing to bet you would be unable to find a single professional 3d animator who wouldn't cash in their entire toolset to switch to VR tools, the moment the precision (things like measurements, bezier curves for motion paths etc) become available.

I am not an animator; I would, however, be greatly interested to know whether they would all like to throw away their Wacoms and put on VR headsets.

My problem, though, is naivety on my part. As someone not in that arena, I find it hard to grasp how a VR universe replaces a digital pencil.
 
Last edited:

mattspace

macrumors 68040
Jun 5, 2013
3,292
2,941
Australia
My issues with VR is that I have no desire to put on the goggles and the hand controls.

Well, I find VR gear is a more comfortable and less bulky alternative to most of the safety gear I wear when working in a studio. Compared to gauntlets, a welding helmet, and the weight of a MIG welder gun and feed line, the Vive is a doddle :)

For me, the trigger controls for painting in something like Tilt Brush are no less direct than the button on an airbrush, or a good foot-long paintbrush in an outstretched arm. Right now, people are doing full glove controllers, where you strap a Vive tracker onto the forearm and you get all five fingers. Though I haven't used them, the Oculus controllers are also said to provide a form of finger tracking, and to be a lot better ergonomically than the Vive's.

There are also people using Leap Motion hand trackers. You may remember them launching a couple of years ago as a thing you put on your desk to do hand gestures; except these people strap them to their VR headsets to read their hands.

The hand controls, however, need to go: it needs to be where you use your hands the same way you would in real life, no handles, no buttons; it needs to sense what you are doing. They also need to develop a way to keep you from walking into things, as I found movement in a small space a problem.

I work with a mouse, an A3 Wacom tablet, and a trackpad at the same time. The Vive controllers were just a different set of buttons, and actually more like touch control, because you can grip and physically move things, scale by pulling your hands together or apart, etc.

I don't know if you experienced this, but Vives have camera passthrough so you can see objects within a certain proximity. There's also a warning grid which appears as you reach the edge of your tracked space, and you can calibrate that to avoid things, like a desk at one edge of the space, for example. One guy I know of just traces his controllers along the edges of objects in his workspace, which draws them into his work environment, and then he can walk around them.

Prove it. I am not an animator; I would, however, be greatly interested to know whether they would all like to throw away their Wacoms and put on VR headsets.

My problem, though, is naivety on my part. As someone not in that arena, I find it hard to grasp how a VR universe replaces a digital pencil.

Wacom tablets are a means to an end, and they're great for 2D art, but they're still an "imagine you had a z-axis" hack for 3D. All the people I know involved in VR are former 3D and game animators, and they're building this tech because they want to use it.

But we're out in the weeds off the monitor topic, fun as it is ;)
 

William Payne

macrumors 6502a
Jan 10, 2017
931
360
Wanganui, New Zealand.
VR is here to stay. I just like to see things with my own eyes and do things with my own hands.

I know that technically you are, with VR, but I just have not worked in animation, so I haven't experienced VR in any way that would benefit me.

If I were an animator, I would probably think differently.

Though when I think of animating, I would still, at this time, prefer the physical act of drawing. But as I said, I am not an animator.

I love futurism but have many retro preferences. For example, I avoid digital books: I have a lifelong love of libraries and physical books. I am only 29, and I hope to live as long as I can to see what the future brings.

But unfortunately the older I get the less I care about technology.
 

Raunien

macrumors 6502a
Aug 3, 2011
535
57
My understanding is as follows. Theoretically, 5K is exactly 2x the 1440p resolution, so the GPU doesn't need to interpolate anything; it just uses four 5K pixels to represent one 1440p pixel. So for anything pixel-based, a 5K panel should be the better choice for displaying 1440p content. Suppose you have a 1440p BMP picture and open it on a 4K screen (full-screen mode): the GPU now has to work out how to use nine 4K pixels (3x3) to display four 1440p pixels (2x2). In other words, errors may be introduced during the "zoom" process.

If all four 1440p pixels in the 2x2 block are the same colour, nothing is lost, because the GPU can fill the 3x3 block of 4K pixels with that same single colour.

But if the four 1440p pixels all have different colours, then only the four corners of the 3x3 block of 4K pixels can be displayed correctly based on the original picture's data. The other five pixels have to be interpolated by the GPU.

The following picture shows one of the ways the GPU may interpolate the signal. The leftmost image is the original data in 1440p. 5K (middle) can represent the 1440p image perfectly. The rightmost is 1440p as simulated on 4K: there are five "wrong" pixels in this block, and in fact more than 55% of the pixels on screen may be "wrong" during display.
View attachment 742382
In some cases, the software may be designed to pick some pixels and "stretch" them, rather than interpolate the colour of the "missing" pixels. In that case the colours may look better and the image may look sharper, but again, it is no longer the original image.
View attachment 742384
On the other hand, for "vector"-style data (e.g. fonts, lines, etc.), there shouldn't be any big difference between 4K and 5K. The GPU has to render it in real time anyway, rather than pre-defining pixels and then "zooming" in. So, at a proper viewing distance, once you go beyond the limits of the human eye's angular resolution, they should look virtually identical.

So, is 5K better than 4K? For displaying 1440p "pixel-style" data, yes. In the real world, however, we usually see 1080p sources rather than 1440p ones. Therefore, IMO, 4K may be better than 5K most of the time, because when we display 1080p content full-screen, it's the 5K monitor that has to do all the interpolation (of course, this is only true if you insist on displaying the source full-screen).

IMO, there is no definitive "better" monitor between 4K and 5K; it really depends on the user's usage. However, considering how much less trouble it is to drive 4K nowadays, I personally prefer to go for 4K SST rather than 5K MST.

Thanks! That makes a lot of sense.
 
  • Like
Reactions: h9826790