I believe most 4K monitors sold now have matte screens, including this year's UP series from LG. But, for instance, the LG UltraFine 23.7" 24MD4KL-B has a glossy screen, which I don't like for an external monitor.


I'd say it depends on your workflow and the apps you use. Here's a comprehensive test of some GPU-intensive apps, and the slowdown due to scaling isn't problematic until you go to extremes (3 or 4 external 4K monitors). In general it looks like Apple designed its ARM chips' GPU with hardware-accelerated scaling, unlike all the older designs with AMD or Intel video chips, which can't handle scaling that easily (the MBP 16" is included for comparison; it does much worse despite having a discrete GPU):
Well, he does not test any 3D apps; I guess the 2D apps don't push the GPU enough.

I have not noticed any issues with 2D apps either, but 3D is another story.

Perhaps the 64-core Ultra would be fine, but I notice slowdowns even on a 32-core Max with 64 GB.
 
What's the difference between, say, 2560x1440 and 2560x1440 (low resolution)?
When using 1440p low-res, it actually outputs real 1440p to the 4K screen, and it looks horrible.

The scaled version, what Apple also calls 1440p HiDPI, renders a 5K image and scales it down to fit the 4K display.

That is why low-res 1440p runs fast (faster than 4K native) and 1440p HiDPI (the macOS default) runs slow (at about 1/3 of the speed).

That is what I noticed with 3D software on both a Dell and a BenQ 4K screen.

Also, the Dell defaults to 1080p, which is exactly half of 4K in each dimension, so "perfect scaling and speed". The BenQ defaults to 4K.
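To make the pixel maths concrete, here's a rough sketch (plain Python; the numbers assume a standard 3840x2160 panel, nothing measured) of what each mode asks the GPU to render:

[CODE]
# Rough pixel counts for a 3840x2160 (4K) panel - illustrative only.

def megapixels(w, h):
    return round(w * h / 1e6, 2)

# "2560x1440 (low resolution)": the Mac outputs a plain 1440p signal and
# the monitor upscales it itself, so the GPU only renders ~3.7 MP.
print(megapixels(2560, 1440))           # 3.69

# "Looks like 2560x1440" (HiDPI): macOS renders a 5120x2880 virtual canvas
# and then downsamples it to the panel's 8.3 MP every frame.
print(megapixels(5120, 2880))           # 14.75
print(megapixels(3840, 2160))           # 8.29

# "Looks like 1920x1080" (HiDPI): the 2x canvas is exactly 3840x2160,
# so no extra downsampling pass is needed - "perfect scaling".
print(megapixels(1920 * 2, 1080 * 2))   # 8.29
[/CODE]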

For me personally, I am sticking with either a 1440p 27" or a 5K 27" (there are just two options left). It seems there was a 5K Dell, but that is not being produced anymore.
 
Thank you!
If I go with a dual-monitor setup, I would have both the 5K iMac (scaled to 1440p) and the 1440p-native Eizo.
By the way, the Eizo CG2700S is a WQHD monitor according to Eizo Global. It also states "WQHD (2560 x 1440)". Shouldn't that be 3440x1440 (being WQHD)?
 
I just tried a non-productive scenario with my Intel MacBook Pro (Intel Iris Plus 645, native resolution 2560x1600).
Short version: No relevant difference in power consumption.

Playing a 1080p H.264 MKV with IINA in fullscreen and overlaying the iStat Menus GPU graph:
  • CPU: in both cases IINA and WindowServer use the same amount.
  • GPU:
    • [1] "Looks like 1280x800" (Retina 2x): 53% processing, 6% memory
    • [2] "Looks like 1680x1050" (rendered at 3360x2100, then downsampled back to 2560x1600): 54% processing, 8% memory
Edit: I know it's not 4K/1080p, but it should give an idea of how many resources the additional downscaling pass needs (not talking about the negative impact when doing GPU-intensive tasks/rendering). By the way, the base model Mac Pro 2013 with 2 GB D300 GPUs is officially capable of driving three 5K displays at once. That means 44 megapixels @ 60 Hz with almost 9-year-old technology.
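For reference, the arithmetic behind that 44-megapixel figure (just a quick sketch of the numbers quoted above):

[CODE]
# Three 5120x2880 (5K) displays: pixels the GPU has to feed per frame.
pixels_per_frame = 3 * 5120 * 2880
print(pixels_per_frame / 1e6)          # ~44.2 megapixels
print(pixels_per_frame * 60 / 1e9)     # ~2.65 gigapixels per second at 60 Hz
[/CODE]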

I thought you had a setup with external 4K display(s) handy (as the illustrations in your previous posts may imply). That's what I was interested in: whether you could quickly report idle power consumption as well. Nevertheless, thanks for your testing.

Back to your test result, there are two things worth noting IMO. Firstly, it seems that (at least at a 60Hz refresh rate) the extra power consumption is indeed zero or not detectable. My own testing in another thread also seems to indicate that. Secondly, if the application is not pushing the GPU hard enough, the penalty of scaled resolutions is not easily noticeable.

Lastly, I think windowed video playback is more suitable than full-screen playback for such tests. Not that it matters much to the result in your tests; I just thought it worth pointing out so that people interested in such tests can watch out for it. The reasoning is wishful thinking that Apple/macOS is intelligent enough not to do the redundant up-scaling/down-sampling for full-screen content; instead, macOS could render the video to the full screen at the panel's native resolution in the first place. To avoid such doubt/uncertainty, I used windowed graphics/video in the tests I linked.
 
What's the difference between, say, 2560x1440 and 2560x1440 (low resolution)?

(Note that if you jump through all the necessary hoops, it will be listed as "2560x1440 (low resolution)" in the menu.)

Short answer: it produces a regular, old-school 2560x1440 (or whatever) output even if you have a 4K or 5K display connected, and lets the monitor work out what to do with it. Most displays will, fairly crudely, scale it up to fill the screen. None of the "scaled mode" tricks (2x UI, intermediate 5K screen) apply.

I think it's actually more complicated than that, depending on what type of display you have and how it is connected, especially on a Mac with an internal display (a 5K Mac still reports full resolution in the System Report, but you get a slightly fuzzy image).

AFAIK these modes are primarily there for games etc. which take over the whole screen and set their own mode, and which may not support (or perform well at) full 4K or 5K.
 
(also in reaction to L0stL0rd before)
If I go with a dual-monitor setup, I would have both the 5K iMac (scaled to 1440p) and the 1440p-native Eizo.
By the way, the Eizo CG2700S is a WQHD monitor according to Eizo Global. It also states "WQHD (2560 x 1440)". Shouldn't that be 3440x1440 (being WQHD)?
 
I'd say it depends on your workflow and the apps you use. Here's a comprehensive test of some GPU-intensive apps, and the slowdown due to scaling isn't problematic until you go to extremes (3 or 4 external 4K monitors). In general it looks like Apple designed its ARM chips' GPU with hardware-accelerated scaling, unlike all the older designs with AMD or Intel video chips, which can't handle scaling that easily (the MBP 16" is included for comparison; it does much worse despite having a discrete GPU):

First of all, I wish this YouTuber every success. It's my first time seeing him, and he sounds very different from other entertainment-oriented/superficial techtubers.

While he gave some thought to his experiments (on the penalty of scaled resolutions), I could find two flaws. Firstly, he seems to think that the number of pixels of the virtual canvas (e.g. the 5K virtual canvas for a 4K physical display at the 1440p scaled resolution) is the decisive factor inflicting the performance penalty. It's not; it's the extra down-sampling step that causes the performance hit. Secondly, the metric he uses to measure the performance penalty is "responsiveness of the system." While that's intuitive and perhaps what most users care about, the metric is too coarse to catch any performance loss except when the penalty is really huge.

People seem to care more about "performance loss in scaled resolutions." We've got more reports from fellow members; I saw one in another thread. This appears to be an excellent counter-argument against running a 4K display in scaled resolutions, because the performance loss is yelling at the user every second he's using the display.

The essence is that at a 60Hz refresh rate, most people might not notice sluggishness in the UI or other performance loss. At a 120/144Hz refresh rate, the critical threshold is easily reached, and you notice the sluggishness every second you're operating the computer. The explanation is simple: in a scaled resolution, the down-sampling goes from being performed 60 times per second to 120 or 144 times per second. So not only multiple displays but also higher refresh rates seem to exacerbate the performance penalty. That's in addition to the GPU workload, as pointed out by @l0stl0rd.
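To put rough numbers on that, here's a back-of-the-envelope sketch (Python, assuming a single 4K panel in the "looks like 1440p" scaled mode; purely illustrative, not a measurement):

[CODE]
# The down-sampling cost grows linearly with refresh rate: every refresh,
# the 5120x2880 virtual canvas has to be resampled down to 3840x2160.
CANVAS_PX = 5120 * 2880   # ~14.7 MP read per refresh
PANEL_PX = 3840 * 2160    # ~8.3 MP written per refresh

for hz in (60, 120, 144):
    read_gpx = CANVAS_PX * hz / 1e9
    written_gpx = PANEL_PX * hz / 1e9
    print(f"{hz:>3} Hz: reads ~{read_gpx:.2f} Gpx/s, writes ~{written_gpx:.2f} Gpx/s")
# 60 Hz: reads ~0.88, 120 Hz: ~1.77, 144 Hz: ~2.12 - i.e. 2x/2.4x the work of 60 Hz
[/CODE]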
 
(also in reaction to L0stL0rd before)
If I go with a dual-monitor setup, I would have both the 5K iMac (scaled to 1440p) and the 1440p-native Eizo.
By the way, the Eizo CG2700S is a WQHD monitor according to Eizo Global. It also states "WQHD (2560 x 1440)". Shouldn't that be 3440x1440 (being WQHD)?
2560 x 1440 is correct if you've got a 16:9 aspect ratio.
3440 x 1440 is usually for ultra-wide screens, which have a 21:9 aspect ratio, referred to as UWQHD by some.
 
Thnxs. I know, but the Eizo isn't ultra-wide.
Yes, so unless I misunderstood, 2560x1440 is normal for WQHD.

Pretty sure it will be OK; I use the old CG 27" at work with macOS, where the sensor is still at the bottom.
 
Perhaps the 64-core Ultra would be fine, but I notice slowdowns even on a 32-core Max with 64 GB.
Thnxs for this video; I've seen it before. I posted a link to the website mentioned in the video somewhere on a MacRumors forum.

1440p displays are very, very usable (and were the bee's knees only a few years back), and there are perfectly sensible reasons for choosing one if you can live without pin-sharp 4K/5K text and line art. Maybe it's a cheaper way to get the colour calibration/gamut you want, or maybe you're mostly using 3D software that your machine struggles to run smoothly at 4K...

Personally, I've always had/wanted to do a bit of everything, and find that a 4K display gives the most flexibility. There are various workarounds and compromises to address the scaling issues - whereas a 1440p display can never display a full 4K image or give you different macOS UI scales without horrible degradation. But if you have a well-defined workflow (especially one that pays the rent) for which 1440p is optimum, getting a 1440p display is a no-brainer.

(I still don't get the problem with "1080p mode" in apps like Blender etc., which have their own fully scalable UI and a horde of display-quality settings that can be adjusted to whatever your hardware can cope with - and they're cross-platform apps which certainly weren't designed for 1440p iMacs - but I used Blender for about a week, years ago, so I'm not going to argue with people who use it daily.)

There's one point to consider with resolutions, before even considering the GPU load of rescaling:
  • 4K uses 4x the pixels of 1080p and 2.25x the pixels of 1440p
  • On top of that, 5K uses about 1.8x the pixels of 4K (and "looks like 1440p" means apps think they're running at 5K)
  • Those factors also apply to VRAM usage (whether it's discrete or shared with system RAM) and, remember, it's not just one extra buffer for the 5K virtual screen - potentially any bitmap data stored in VRAM will grow by that factor (a quick sketch of the arithmetic follows this list).
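Here's that sketch in plain Python (standard resolutions assumed; the 4-bytes-per-pixel buffer size is just an illustration, real compositing may use wider formats):

[CODE]
# Pixel-count ratios between common resolutions; the same factors apply to
# any full-screen buffer (or scaled-up bitmap) that has to live in VRAM.
res = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
    "5K":    5120 * 2880,
}

print(res["4K"] / res["1080p"])   # 4.0
print(res["4K"] / res["1440p"])   # 2.25
print(res["5K"] / res["4K"])      # ~1.78 (the "about 1.8x" above)

# Example: a single 32-bit (4 bytes/pixel) frame buffer at each size.
for name, px in res.items():
    print(name, round(px * 4 / 2**20, 1), "MiB")
# 1080p 7.9 / 1440p 14.1 / 4K 31.6 / 5K 56.2 MiB
[/CODE]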
As the video linked earlier says, an M1 isn't the ideal Blender machine (especially before the native Apple Silicon version and Metal support were out), so it's not entirely surprising if rescaling was the straw that broke the camel's back. (The video doesn't discuss what quality settings were used, or whether the problems were with editing smoothness or with the time taken for the final render.) There's a sort-of assumption at the end that scaling was the big problem and all would be rosy with a 5K or 6K display, but I'm not sure that's safe.

Back in 2019 it would have been unusual to even consider a MacBook Air, low-end "2 port" 13" MBP (with integrated GPU) or an Intel Mini (again: feeble integrated GPU) if you were going to plug in a high-def display or two and use it for 3D - or even "pro" photo editing. The M1 certainly greatly expanded how usable the Air/MBP/Mini were for such applications but I think that - combined with the long wait for the M1 Pro/Max/Ultra machines - led to a bit of hyperbole about how suitable these entry level machines were as replacements for "Pro" machines and how "unified memory" somehow made 16GB of system RAM a worthy upgrade from a Pro machine with 16GB of system RAM and 8GB of dedicated VRAM. The M1 Pro and Max didn't only increase the amount of unified RAM available, they increased the memory bandwidth as well. The M1's GPU is amazing for an ultra-low-power integrated GPU - but that's a fairly low bar.

So if you're looking at buying a new system then I really wouldn't consider any of the regular M1 machines as a plausible candidate for a GPU-heavy workload on high-definition display(s).
 
Yet lots of Windows users tend to scale the UI to 150% for such displays because 100% is too small for some people (including me) to see.
Yes, the difference is that Windows scaling is way more efficient than macOS scaling, and the speed loss is not noticeable on Windows, even on my old 5700 XT.

Yeah, I usually set Windows at 125% or 150% too, depending on the screen.
 
Well, he does not test any 3D apps; I guess the 2D apps don't push the GPU enough.

I have not noticed any issues with 2D apps either, but 3D is another story.

Perhaps the 64-core Ultra would be fine, but I notice slowdowns even on a 32-core Max with 64 GB.
Yes, 3D apps can be a different beast. 2D mostly uses OpenCL for image-manipulation calculations, and while that taxes the GPU, it's probably not as heavy as a full-fledged 3D app using Metal.

First of all, I wish this YouTuber every success. It's my first time seeing him, and he sounds very different from other entertainment-oriented/superficial techtubers.

While he gave some thought to his experiments (on the penalty of scaled resolutions), I could find two flaws. Firstly, he seems to think that the number of pixels of the virtual canvas (e.g. the 5K virtual canvas for a 4K physical display at the 1440p scaled resolution) is the decisive factor inflicting the performance penalty. It's not; it's the extra down-sampling step that causes the performance hit. Secondly, the metric he uses to measure the performance penalty is "responsiveness of the system." While that's intuitive and perhaps what most users care about, the metric is too coarse to catch any performance loss except when the penalty is really huge.

People seem to care more about "performance loss in scaled resolutions." We've got more reports from fellow members; I saw one in another thread. This appears to be an excellent counter-argument against running a 4K display in scaled resolutions, because the performance loss is yelling at the user every second he's using the display.

The essence is that at a 60Hz refresh rate, most people might not notice sluggishness in the UI or other performance loss. At a 120/144Hz refresh rate, the critical threshold is easily reached, and you notice the sluggishness every second you're operating the computer. The explanation is simple: in a scaled resolution, the down-sampling goes from being performed 60 times per second to 120 or 144 times per second. So not only multiple displays but also higher refresh rates seem to exacerbate the performance penalty. That's in addition to the GPU workload, as pointed out by @l0stl0rd.
60 Hz is pretty much a standard-enough refresh rate for most normal work; for gaming, ≥ 100 Hz is often desired nowadays. But gaming at 4K and high fps is still a challenge even for the best PC setups.

Yet lots of Windows users tend to scale the UI to 150% for such displays because 100% is too small for some people (including me) to see.
Most people I know rather use 125% scaling, which gives you the same UI size as on 1080p screens; setting 150% gives (for me at least) a huge UI, equivalent to around a 1706x960 resolution.
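For what it's worth, here's the simple maths behind those percentages (a sketch assuming a 2560x1440 desktop, which is where the ~1706x960 figure comes from):

[CODE]
# Windows DPI scaling only enlarges UI elements; the effective "UI real
# estate" is native_resolution / scale_factor, with no full-screen resample.
native_w, native_h = 2560, 1440

for scale in (1.00, 1.25, 1.50):
    print(f"{int(scale * 100)}%: works out to about "
          f"{round(native_w / scale)}x{round(native_h / scale)}")
# 100%: 2560x1440, 125%: 2048x1152, 150%: 1707x960 (the "around 1706x960" above)
[/CODE]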

Yes, the difference is that Windows scaling is way more efficient than macOS scaling, and the speed loss is not noticeable on Windows, even on my old 5700 XT.

Yeah, I usually set Windows at 125% or 150% too, depending on the screen.
Scaling just the UI in Windows has the advantage that the GPU is not used that heavily; just a bunch of (usually older) software may have problems with it, treating the display as non-HiDPI and thus rendering tiny, tiny UI elements.
 
I thought you had a setup with external 4K display(s) handy (as the illustrations in your previous posts may imply). That's what I was interested in: whether you could quickly report idle power consumption as well. (...)
The 4K Mac is a MP5,1 with an RX 570 8GB. As it is not a native GPU, I am not sure the PowerPlay tables and power consumption are 100% representative. However, here they are:
4K "looks like 1080p": 21 W
4K "looks like 1440p": 26 W
Lastly, I think windowed video playback is more suitable than full-screen playback for such tests. Not that it matters much to the result in your tests; I just thought it worth pointing out so that people interested in such tests can watch out for it. The reasoning is wishful thinking that Apple/macOS is intelligent enough not to do the redundant up-scaling/down-sampling for full-screen content; instead, macOS could render the video to the full screen at the panel's native resolution in the first place. To avoid such doubt/uncertainty, I used windowed graphics/video in the tests I linked.
That's why I overlaid the iStat Menus graph: to ensure it still renders at the selected (non-native) resolution.
But that's unnecessary, as only M1 Macs are capable of switching the resolution without blacking out the screen in the process (which would be a straight giveaway on Intel machines).
But you are correct, there are apps (like Kodi) that can actively change the resolution.
 
The 4K Mac is a MP5,1 with an RX 570 8GB. As it is not a native GPU, I am not sure the PowerPlay tables and power consumption are 100% representative. However, here they are:
4K "looks like 1080p": 21 W
4K "looks like 1440p": 26 W

That's why I overlaid the iStat Menus graph: to ensure it still renders at the selected (non-native) resolution.
But that's unnecessary, as only M1 Macs are capable of switching the resolution without blacking out the screen in the process (which would be a straight giveaway on Intel machines).
But you are correct, there are apps (like Kodi) that can actively change the resolution.
Actually, my M1 Mac blacks out the screen sometimes too when switching resolutions.
 
60 Hz is pretty much a standard-enough refresh rate for most normal work; for gaming, ≥ 100 Hz is often desired nowadays. But gaming at 4K and high fps is still a challenge even for the best PC setups.

I tend to agree that 60Hz is good enough, just like IMO a 5-year-old PC/Mac is enough for most people. At the same time, 60Hz has been with LCD displays for so long that people can't wait to embrace new technology. Higher refresh rates aren't just for gaming; specifically, adaptive refresh rate aka variable refresh rate seems to be the new tech for daily use. It offers users a "butter smooth experience." You don't have to look beyond Apple: hear what they say about their displays capable of ProMotion.

So I think what happened to @Lbond in this post is that, since Monterey started to support adaptive refresh rates, he gave it a spin on his brand-new 27-inch 4K LG display. I would bet that for most people the preference is set to the scaled resolution "looks like 2560x1440". Now, as I've discussed above, when the computer boosts to high refresh rates, the performance penalty of the scaled resolution kicks in.

End result: rather than a "butter smooth experience," people get amplified sluggishness and perhaps stuttering too.

The 4K Mac is a MP5,1 with an RX 570 8GB. As it is not a native GPU, I am not sure the PowerPlay tables and power consumption are 100% representative. However, here they are:
4K "looks like 1080p": 21 W
4K "looks like 1440p": 26 W

Brilliant. This proves a point I've been suspecting for a while: even at idle, GPU power consumption can be higher in a scaled resolution than in the default resolution on 4K displays.

Don't worry about the absolute wattage numbers here. Apple's Radeon drivers (at least for Polaris chips) don't seem to get the calibration right for 3rd-party dGPUs. I think if you divide the reported numbers by 2, you get a good estimate of the real power consumed.

So it's about 2.5 W of extra power required to do the down-sampling in a scaled resolution. If you use higher refresh rates, the extra power is likely to grow proportionally.
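Spelling the arithmetic out (a sketch only; the divide-by-two correction is the calibration guess from the previous paragraph, and the 120 Hz line simply assumes the cost scales linearly):

[CODE]
# Reported idle GPU power on the RX 570 in the two modes (watts).
reported = {"looks like 1080p": 21.0, "looks like 1440p": 26.0}

correction = 0.5  # assumed: reported Polaris numbers read roughly 2x too high
real = {mode: watts * correction for mode, watts in reported.items()}

extra_for_downsampling = real["looks like 1440p"] - real["looks like 1080p"]
print(extra_for_downsampling)              # 2.5 W at 60 Hz
print(extra_for_downsampling * 120 / 60)   # ~5.0 W if it scaled linearly to 120 Hz
[/CODE]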
 
I have slightly off topic scaling question:

If I watch a YouTube video on the built-in screen of a MacBook Air, which only has a 1280x800 native resolution, is there any point in selecting a video quality of 1080p or above?
 