This is almost identical to my setup with 27" QHD displays... Main "work" display straight ahead with a slight offset to the right.
2nd "content" display on the left, angled toward me (30° I would say).
Well, he does not test any 3D apps; I guess the 2D apps don't push the GPU enough.
I believe most 4K monitors sold now have matte screens, including this year's UP series from LG. But, for instance, the LG UltraFine 23.7" 24MD4KL-B has a glossy screen; I don't like that for external monitors.
I'd say it depends on your workflow and the apps used. Here's a comprehensive test of some GPU-intensive apps, and the slowdown due to scaling is not problematic until you go to extremes (3 or 4 external 4K monitors). In general, it looks like Apple designed its ARM chips' GPUs with hardware-accelerated scaling, unlike all the older designs with AMD or Intel video chips, which cannot handle scaling that easily (the MBP 16" is included for comparison; it does much worse despite having a discrete GPU):
What's the difference between, like, 2560x1440 and 2560x1440 (low resolution)?
When using the 1440p low-resolution mode, it is actually outputting real 1440p on the 4K screen, and it looks horrible.
I think this is referring to retina/non-retina, so actually displaying 2560x1440 pixels rather than rendering at double PPI.
Thanks!
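For anyone wondering how to see both variants that macOS exposes, here is a minimal sketch (my own, not from the thread) using the public CoreGraphics API: the HiDPI ("Retina") entries report backing pixel dimensions larger than their point size, while the "(low resolution)" entries report the two as equal.

```swift
import CoreGraphics

// List all modes of the main display, including the hidden non-HiDPI duplicates.
let options = [kCGDisplayShowDuplicateLowResolutionModes as String: true] as CFDictionary
if let modes = CGDisplayCopyAllDisplayModes(CGMainDisplayID(), options) as? [CGDisplayMode] {
    for mode in modes {
        // width/height = "looks like" size in points; pixelWidth/pixelHeight = rendered pixels.
        let label = mode.pixelWidth > mode.width ? "HiDPI" : "low resolution"
        print("\(mode.width)x\(mode.height) points -> \(mode.pixelWidth)x\(mode.pixelHeight) pixels (\(label))")
    }
}
```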
I will share this, as that is what I discovered too, and I think he is right.
Thanks for this video; I've seen it before. I posted a link to the website mentioned in the video somewhere on a MacRumors forum.
I just tried a non-productive scenario with my Intel MacBook Pro (Intel Iris Plus 645 with native resolution 2560x1600).
Short version: No relevant difference in power consumption.
Playing a 1080p H.264 MKV with IINA in fullscreen and overlaying the iStat Menus GPU graph:
- CPU: In both cases IINA and WindowServer use the same amount.
- GPU:
  - [1] Using "looks like 1280x800" (retina 2x): 53% processing, 6% memory
  - [2] Using "looks like 1680x1050" (retina 3360x2100, then downsampled back to 2560x1600): 54% processing, 8% memory

edit: I know it's not 4K / 1080p, but it should give an idea of how many resources the additional downscaling process needs (not talking about the negative impact when doing GPU-intense tasks/rendering). By the way, the base model Mac Pro 2013 with 2 GB D300 GPUs is officially capable of driving three 5K displays at once. That means 44 megapixels @ 60 Hz with almost 9-year-old technology.
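To put the two modes from the test above side by side, here is a rough back-of-envelope sketch (my own, not from the thread). It assumes the cost simply scales with the number of pixels touched per frame, with the scaled mode paying for both the larger canvas and the downsample to the panel; real GPUs are obviously not this linear.

```swift
// Rough pixel-throughput comparison for the 2560x1600 panel scenario above.
let refreshRate = 60.0  // Hz

// "Looks like 1280x800" (retina 2x): render 2560x1600 directly, no extra pass.
let nativePixels = 2560.0 * 1600.0

// "Looks like 1680x1050": render a 3360x2100 canvas, then downsample it to 2560x1600.
let scaledCanvas = 3360.0 * 2100.0
let panelPixels = 2560.0 * 1600.0

print("retina 2x: ~\(Int(nativePixels * refreshRate / 1_000_000)) Mpix/s")                  // ~245 Mpix/s
print("scaled:    ~\(Int((scaledCanvas + panelPixels) * refreshRate / 1_000_000)) Mpix/s")  // ~669 Mpix/s
```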
2560x1440 is correct if you've got a 16:9 aspect ratio. (Also in reaction to L0stL0rd before.)
If I go with a dual-monitor setup, I would have both the 5K iMac (scaled to 1440p) and the 1440p-native Eizo.
By the way... the Eizo CG2700S is a WQHD monitor according to Eizo Global. It also states "WQHD (2560 x 1440)". Shouldn't that be 3440x1440 (being WQHD)?
Thanks. I know, but the Eizo isn't ultrawide.
3440x1440 is usually found on ultrawide screens, which have a 21:9 aspect ratio, referred to as UWQHD by some.
Yes, so unless I misunderstood, 2560x1440 is normal for WQHD.
2560x1440 is pretty standard for a 27-inch monitor.
Yet lots of Windows users tend to scale the UI to 150% for such a display, because 100% is too small for some people (including me) to see.
And maybe best for photo editing (for a 27" monitor). Eizo says the CG2700S is WQHD, which would translate to 3440x1440. So, not very clear.
Yes, the difference is that Windows scaling is way more efficient than macOS scaling, and the speed loss is not noticeable on Windows, even on my old 5700 XT.
A Google search suggests 2560x1440 is QHD/WQHD... for some reason.
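Since the naming keeps tripping people up here, a quick aspect-ratio check (my own note, not from any post): 2560x1440 is exactly 16:9, while 3440x1440 is roughly 21:9, which is why the latter is the ultrawide variant.

```swift
// 2560/1440 = 1.78 (16:9, QHD/WQHD); 3440/1440 ≈ 2.39 (marketed as 21:9, UWQHD).
print(2560.0 / 1440.0, 16.0 / 9.0)   // 1.777... vs 1.777...
print(3440.0 / 1440.0, 21.0 / 9.0)   // 2.388... vs 2.333...
```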
Yes, 3D apps can be a different beast. 2D mostly uses OpenCL for image-manipulation calculations, and while that taxes the GPU, it's probably not as heavy as a full-fledged 3D app using Metal.
I have not noticed any issues with 2D apps either, but 3D is another story.
Perhaps the 64-core Ultra would be fine, but I notice slowdowns even on a 32-core Max with 64 GB.
First of all, I wish this YouTuber every success. It's my first time seeing him, and he sounds very different from other entertainment-oriented, superficial techtubers.
While he put some thought into his experiments (on the penalty of scaled resolutions), I could find two flaws. Firstly, he seems to think that the number of pixels of the virtual canvas (e.g. a 5K virtual canvas for a 4K physical display in the 1440p scaled resolution) is a decisive factor inflicting the performance penalty. It's not; it's the extra step of down-sampling that's causing the performance hit. Secondly, the metric he uses to measure the performance penalty is the "responsiveness of the system." While it's intuitive and perhaps what most users care about, the metric is too coarse to catch any performance loss except when the penalty is really huge.
People seem to care more about "performance loss in scaled resolutions." We got more reports from fellow members; I saw one here in another thread. This appears to be an excellent counter-argument against running a 4K display in scaled resolutions, because the performance loss is yelling at the user every second he's using the display.
The essence is that at a 60 Hz refresh rate, most people might not notice sluggishness in the UI or other performance loss. At a 120/144 Hz refresh rate, the critical threshold seems easily reached, so you notice the sluggishness every second you're operating the computer. The explanation is simple: in a scaled resolution, the down-sampling goes from being performed 60 times per second to 120 or 144 times per second. So not only multiple displays but also higher refresh rates seem to exacerbate the performance penalty. That's in addition to the GPU workload, as pointed out by @l0stl0rd.

60 Hz is a pretty standard refresh rate and enough for most normal work; for gaming, ≥ 100 Hz is often desired nowadays. But gaming at 4K and high fps is still a challenge even for the best PC setups.
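To make the refresh-rate point above concrete, here is a small illustrative calculation (my own, under the simplifying assumption that the downsampler touches every pixel of the virtual canvas once per refresh): a 4K panel in the "looks like 1440p" scaled mode renders a 5120x2880 canvas, and raising the refresh rate raises how often that canvas has to be resampled.

```swift
// 4K panel in "looks like 2560x1440" HiDPI mode: 5120x2880 virtual canvas,
// resampled to 3840x2160 once per refresh. Purely illustrative arithmetic.
let canvasPixels = 5120.0 * 2880.0            // ≈ 14.7 Mpix per frame
for hz in [60.0, 120.0, 144.0] {
    print("\(Int(hz)) Hz: ~\(Int(canvasPixels * hz / 1_000_000)) Mpix/s through the downsampler")
}
```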
Most people I know rather use 125% scaling, which gives you the same UI size as on 1080p screens; setting 150% gives (for me at least) a huge UI, equivalent to around a 1706x960 resolution.
Scaling just the UI in Windows has the advantage that the GPU is not used as heavily; just a bunch of (usually older) software may have problems with it, treating the display as non-HiDPI and thus rendering tiny, tiny UI elements.
Yeah, I usually set Windows at 125% or 150% too, depending on the screen.
I thought you had the setup with the external 4K display(s) handy (as the illustrations in your previous posts may imply). That's what I was interested in, if you could quickly report idle power consumption as well. (...)
Lastly, I think using windowed video playback would be more applicable than full-screen playback in such tests. Not that it matters much to the result in your tests; I thought it would be interesting to point out, and people interested in such tests may watch out for it. The reasoning is the wishful thinking that Apple/macOS is intelligent enough not to do the redundant up-scaling/down-sampling to fit the full screen; instead, macOS could render the video to full screen using the native resolution in the first place. To avoid such doubt/uncertainty, I used windowed graphics/video in the tests I linked.

The 4K Mac is a MP5,1 with an RX 570 8 GB. As it is not a native GPU, I am not sure if the PowerPlay tables and power consumption are 100% representative. However, here they go:
4K "looks like 1080p": 21 W
4K "looks like 1440p": 26 W
That's why I overlaid the iStat Menus graph, to ensure it still renders at the selected (non-native) resolution. But that's unnecessary, as only M1 Macs are capable of switching the resolution without blacking out the screen in the process (which would be a straight giveaway on Intel machines). But you are correct, there are apps (like Kodi) that can actively change the resolution.

Actually, my M1 Mac blacks out the screen sometimes too when switching resolutions 🤷‍♂️
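If anyone wants to double-check what a display is actually outputting at a given moment (for example, to catch a player like Kodi switching modes), a minimal sketch (my own, using the public CoreGraphics call) would be:

```swift
import CoreGraphics

// Print the currently active mode of the main display.
if let mode = CGDisplayCopyDisplayMode(CGMainDisplayID()) {
    print("\(mode.width)x\(mode.height) points, \(mode.pixelWidth)x\(mode.pixelHeight) pixels, \(mode.refreshRate) Hz")
}
```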