Don't worry about the absolute wattage numbers here. Apple's Radeon drivers (at least for Polaris chips) seem not to get the calibration right for third-party dGPUs. I think if you divide the reported numbers by 2, you get a good estimate of the real power consumed.

So it's about 2.5 W of extra power required to do the down-sampling in a scaled resolution. If you use higher refresh rates, the extra power is likely to grow proportionally.
The 5 W difference was actually validated at the power outlet. (148 to 153 W? For a dual 5,1.)
 
If I watch a YouTube video on the built in screen on a MacBook Air which only has 1280x800 native resolution, is there any point in selecting the video quality at 1080p or above?

Most people cannot tell the difference between 720p and 1080p, and the difference is even harder to make out on smaller screens. So for the sake of energy use (and the climate), 720p is the way to go.

If you want to be nitpicky, perhaps set it to 1080p. Downscaling (compared to upscaling) usually results in better perceived picture quality, and sometimes less computation (depending on the algorithm used by the playback software).
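As a toy illustration of that downscale-vs-upscale point (pure Python on a made-up "image" of brightness values; no real video pipeline works this simply):

```python
# Toy model: downscaling a higher-res source by averaging keeps a trace of
# the fine detail, while upscaling a low-res source by duplication adds none.

def box_downscale_2x(img):
    """Average each 2x2 block of pixels into one pixel."""
    return [[(img[2*r][2*c] + img[2*r][2*c+1] +
              img[2*r+1][2*c] + img[2*r+1][2*c+1]) / 4
             for c in range(len(img[0]) // 2)]
            for r in range(len(img) // 2)]

def nearest_upscale_2x(img):
    """Duplicate every pixel into a 2x2 block (nearest-neighbour)."""
    out = []
    for row in img:
        wide = [p for p in row for _ in (0, 1)]
        out += [wide, wide[:]]
    return out

# 1-pixel vertical stripes: fine detail a low-res source never had.
hi = [[0, 255, 0, 255],
      [0, 255, 0, 255],
      [0, 255, 0, 255],
      [0, 255, 0, 255]]

print(box_downscale_2x(hi))        # stripes blend into mid-grey 127.5
print(nearest_upscale_2x([[127, 127], [127, 127]])[0])  # flat grey stays flat
```

The downscaled stripes at least retain the average brightness of the lost detail; the upscaled flat source can only repeat what it already has.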

The 5 W difference was actually validated at the power outlet. (148 to 153 W? For a dual 5,1.)

I concede that a 100% efficient power supply is not unheard of in a galaxy far, far away ^^
 
I tend to agree that 60 Hz is good enough, just like (IMO) a 5-year-old PC/Mac is enough for most people. At the same time, 60 Hz has been with LCD displays for so long that people can't wait to embrace new technology. Higher refresh rates aren't just for gaming. Specifically, adaptive refresh rate, aka variable refresh rate, seems to be new tech for daily use. It offers users a "butter smooth" experience. You don't have to look beyond Apple; just hear what they say about their ProMotion-capable displays.

So I think what happened to @Lbond in this post is this: since Monterey started to support adaptive refresh rates, he gave it a spin on his brand-new 27-inch 4K LG display. I would bet for most people the preference is set to the scaled resolution "looks like 2560x1440." Now, as I've discussed above, when the computer boosts to high refresh rates, the performance penalty of the scaled resolution kicks in.

End result: rather than a "butter smooth" experience, people get amplified sluggishness, and perhaps stuttering too.
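To put rough numbers on that penalty, here's a back-of-the-envelope sketch. It assumes the commonly observed behaviour of "looks like" modes (a 2x backing store rendered, then downsampled to the panel every frame); this is my own model, not anything Apple documents.

```python
# Rough cost model for a macOS HiDPI "scaled resolution".
# Assumption: "looks like W x H" renders a backing store at 2W x 2H,
# then the GPU downsamples it to the panel's native grid each refresh.

def scaled_mode_cost(looks_like, panel, refresh_hz):
    bw, bh = looks_like[0] * 2, looks_like[1] * 2    # 2x backing store
    backing_px = bw * bh
    native_px = panel[0] * panel[1]
    return {
        "backing_store": (bw, bh),
        "overdraw_ratio": backing_px / native_px,     # extra pixels per frame
        "resample_px_per_sec": backing_px * refresh_hz,  # work grows with Hz
    }

# "Looks like 2560x1440" on a 27" 4K panel:
at60 = scaled_mode_cost((2560, 1440), (3840, 2160), 60)
at144 = scaled_mode_cost((2560, 1440), (3840, 2160), 144)

print(at60["backing_store"])              # (5120, 2880) -- more than 4K itself
print(round(at60["overdraw_ratio"], 2))   # 1.78x the panel's native pixels
print(at144["resample_px_per_sec"] / at60["resample_px_per_sec"])  # 2.4x at 144 Hz
```

So a "looks like 1440p" desktop renders ~78% more pixels than the panel even has, and the per-second resampling work scales linearly with refresh rate, which fits the reported high-Hz sluggishness.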
Adaptive sync was actually created for gaming. The idea is simple: as soon as the refresh rate matches (or is a multiple of) a game's fps, there will be no screen tearing during motion. What's more, many 60 Hz monitors have this technology built in as well.
That said, you're absolutely right that it is even more taxing on the GPU, especially in scaled resolutions. But at least here we can still play games at 1920x1080 and 144 Hz (doable on mid-range GPUs, and I believe also on the M1-series iGPU), while using a 4K scaled desktop/apps at 60 Hz for work.
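A toy frame-pacing model of why matching refresh to fps matters (my own simplification; the function name and the small rounding guard are mine, and no real driver works exactly like this):

```python
import math

# With a fixed-Hz display and vsync, a finished frame waits for the next
# vblank, so a steady 48 fps game on a 60 Hz panel shows an uneven
# 16.7 ms / 33.3 ms cadence (judder). With adaptive sync the panel
# refreshes when the frame is ready, keeping an even ~20.8 ms cadence.

def presentation_times(frame_ms, n, fixed_hz=None):
    times, t = [], 0.0
    for _ in range(n):
        t += frame_ms                        # frame finishes rendering
        if fixed_hz is None:
            times.append(t)                  # VRR: scan out immediately
        else:
            vblank = 1000.0 / fixed_hz       # wait for the next vblank
            times.append(vblank * math.ceil(t / vblank - 1e-9))
    return times

frame_ms = 1000 / 48                         # a steady 48 fps game
fixed = presentation_times(frame_ms, 6, fixed_hz=60)
vrr = presentation_times(frame_ms, 6)
fixed_gaps = [round(b - a, 1) for a, b in zip(fixed, fixed[1:])]
vrr_gaps = [round(b - a, 1) for a, b in zip(vrr, vrr[1:])]

print(fixed_gaps)   # mix of 16.7 and 33.3 -- visible judder
print(vrr_gaps)     # [20.8, 20.8, 20.8, 20.8, 20.8] -- even pacing
```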
 
Scaling just the UI in Windows has the advantage that the GPU is not heavily used; it's just that a bunch of (usually older) software may have problems with it, treating the display as non-HiDPI and thus rendering tiny, tiny UI elements.
I think Windows has the theoretically better solution, but that functionality has been in place since at least Windows 3.1 (I distinctly recall writing Win3.1 code using OS calls to translate to/from device coordinates and to find the correct size/resolution for offscreen bitmaps), yet there are still "older" programs that stuff it up (it was extra work, so it was tempting to only bother for elements that you might want to export or print). Plus, it won't magically make all of a program's bitmap assets match the scaled resolution, so they'll still get fractionally scaled. I suspect that trying to "fix" this in macOS would cause a lot of short/medium-term chaos. I don't think I've ever seen the macOS system "stuff up," even with a mixed economy of 4K, 5K and standard-def displays, which was not the case last time I tried it with Windows (...and don't even ask about getting it working in Linux, although it can work...).
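For the curious, the logical-to-device coordinate mapping described above boils down to this kind of math (96 DPI being the classic 100% baseline; the function names here are mine for illustration, not actual Win32 APIs):

```python
# Windows-style logical -> device pixel mapping. A DPI-aware app draws in
# logical (96-DPI) units and converts; a DPI-unaware app skips this, so on
# a 192 DPI ("200%") panel its UI is drawn half-size -- the "tiny, tiny UI
# elements" problem described above.

BASE_DPI = 96

def to_device(logical_px, dpi):
    """Scale a logical (96-DPI) coordinate to physical pixels."""
    return round(logical_px * dpi / BASE_DPI)

def to_logical(device_px, dpi):
    """Inverse mapping, used e.g. when hit-testing mouse input."""
    return round(device_px * BASE_DPI / dpi)

# A 100 px button at common Windows scale factors (100%, 125%, 150%, 200%):
print([to_device(100, dpi) for dpi in (96, 120, 144, 192)])  # [100, 125, 150, 200]
```

Note the GPU never touches a full-screen resample here; only coordinates and assets are scaled, which is why the approach is cheap but fragile for apps that ignore it.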
 



Brilliant. This proves a point I've suspected for a while: even at idle, GPU power consumption can be higher in a scaled resolution than in the default resolution on 4K displays.


I actually tested variable refresh rate as well as different fixed refresh rates from 60 Hz to 144 Hz, and the performance was the same across the board in my various apps. When zooming in and out of projects in my audio apps, you can literally feel and see the choppy, broken frames (even at 60 Hz) when using the 1440p scaled resolution, so this issue isn't exclusive to high-refresh-rate displays. I'm guessing the only way around this would be for the devs of these apps to adopt the Metal API, but I can't see that happening due to them being cross-platform apps.

I tested the same apps in Windows at 4K and they were butter smooth with an NVIDIA GPU, though I know Windows handles scaling very differently from macOS and just resizes the UI.
 
Or Apple could offer an alternative scaling method, which probably will not happen. They need a reason to sell their displays ;)

Yes, I am a bit pissed since I discovered how “bad” scaling is on macOS.

I can live without a 4K screen, however.
But if we don't see certain things on Monday, or by the end of the year at the latest, I might just go back to Windows or Linux.
 

Thanks for filling in the details of the issues you experienced.


Apple's treatment of HiDPI displays is simple and cheap (as in computation) and suits their own purposes. It also creates a barrier for third-party display vendors to enter the competition. I believe it's unlikely to change for the foreseeable future.

As adaptive refresh rate (aka ProMotion) gradually makes its way into The Macintosh Desktop Experience, perhaps in the future most if not all third-party 4K displays (or whatever resolution is popular in the PC world that goes against Apple's) will be rendered unbearable.

For now and my personal preference, I agree it's either 5K or plain old 1440p for 27-ish inch displays.

If people go with 4K at this size, "looks like 2560x1440" (or any scaled resolution) is hit or miss. It consumes more power (even at idle), slows down your GPU-bound applications, and delivers an utterly sluggish experience if applications are poorly written, oblivious to Apple's primitive and brutal way of HiDPI scaling.

So testing it out in person is definitely necessary. Even if everything is okay for now, in five years' time I believe none of the current 27-ish-inch 4K displays will offer a good Macintosh Desktop Experience.
 
For now and my personal preference, I agree it's either 5K or plain old 1440p for 27-ish inch displays.
This is valid if you find the 1440p scaling optimal for your eyes on a 27" display. And in this scenario, a 5K is a must if you want clarity. Apple wins.

But for my eyes, the 1440p scaling is too small for general use; I get tired from everything being too small. I like the 1080p scaling much better for a 27" 4K display, and I quite like the way everything looks on the external display; everything is sharp enough not to bother my eyes.

UHD 4K is a standard too big for Apple to ignore. Even if they make 5K, 6K and even 7K monitors, there will always be a setting at which a 4K display will look good.
 
Support for 4K displays won't be removed from macOS overnight, or in the foreseeable future. Apple doesn't have to do anything extra to support it; it simply follows the primitive math in macOS's algorithms. So no worries here. And I didn't say macOS is proactively and explicitly going to make life more difficult for 4K displays. Apple is just going to design and optimize a better Macintosh Desktop Experience for their own displays. Of course, when third parties can sell displays with specs similar to Apple's in the future, you'll get the same good experience.

27-inch 4K @1080p is simply too big for my taste. My normal viewing distance is about 60 cm.

What Apple gets right about the 27-inch display is 2560x1440 at ~110 ppi. It's been proven golden, IMO, over the past 15 years or so.
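The ~110 ppi figure is easy to verify yourself; it's just geometry (ppi = diagonal pixels over diagonal inches):

```python
import math

# Pixel density of a display: diagonal resolution / diagonal size.
def ppi(w_px, h_px, diagonal_in):
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 -> the classic ~110 ppi 27"
print(round(ppi(3840, 2160, 27), 1))  # ~163.2 -> 4K 27", awkwardly in between
print(round(ppi(5120, 2880, 27), 1))  # ~217.6 -> 5K 27", exactly 2x the classic
```

The 5K panel landing at exactly twice the classic density is why "looks like 2560x1440" maps to it with clean integer scaling, while 4K at the same size does not.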
 
Yes, got my 1440p and was amazed how much snappier the interface feels (and both the new and the old screen are 60 Hz).

Yes, the text is a bit less nice, but I played a bit with the font smoothing settings and find it OK now.
 

I think Apple dropped sub-pixel font rendering from macOS once all supported hardware went full HiDPI. That's quite unfortunate.

If you have two 27-inch 1440p displays, I think a viewing distance of 80 cm is acceptable. That also reduces stress on your neck. And from 80 cm, you can hardly perceive the difference between 1440p non-HiDPI and 1440p HiDPI. ^^
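Some quick angular-resolution math backs this up (my own sketch; ~60 pixels per degree is the commonly cited 20/20-acuity ballpark, not a hard limit):

```python
import math

# Pixels-per-degree at a given viewing distance. Above roughly 60 ppd,
# individual pixels stop being resolvable for typical eyesight, which is
# why HiDPI matters less and less the farther back you sit.
def pixels_per_degree(ppi, distance_cm):
    px_mm = 25.4 / ppi                                        # size of one pixel
    deg_per_px = math.degrees(2 * math.atan(px_mm / (2 * distance_cm * 10)))
    return 1 / deg_per_px

# A 27" 1440p panel (~109 ppi):
print(round(pixels_per_degree(109, 60), 1))   # ~44.9 ppd at 60 cm -- pixels visible
print(round(pixels_per_degree(109, 80), 1))   # ~59.9 ppd at 80 cm -- at the acuity limit
```

At 80 cm the panel sits right around the ~60 ppd threshold, consistent with the claim that non-HiDPI 1440p looks fine from there.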
 
Just checked, 740mm (real units....😝), with two 1440 native.
But I am on Mojave........;)........and I am not a 'crystal clear snob'.

edit:
Oops...that's what I get for trying to be smart, it should be 740mm (not 74)......🤣
 
Basically, it seems one should take their most GPU-/system-heavy software and test the Mac's performance at the target scaled resolution before buying any 4K monitor. Some software will work better, some will be sluggish (perhaps depending on the frameworks used to build it; I guess all Apple-derived apps like Final Cut Pro or Logic should be OK). It's a bit of a lottery.
 
In addition to the earlier screenshots, I tried my best with real photos.
Unfortunately the resolution gets automatically scaled down, so here's an external link to the hi-res image:
https://abload.de/img/scaled.resolutionssak85.jpg
Or attached as zip.
 

Attachments

  • scaled.resolutions.jpg (634.8 KB)
  • scaled.resolutions.jpg.zip (4.5 MB)
Hi,

I finally went for a 27” 4K display with up to 160 Hz: the LG 27GP950-B, as there was a great offer at 550€.

Right now I have both the 27GP850-B (1440p) and the 27GP950 (4K) in front of me, hooked up to my iMac, and for some reason I can't get over 95 Hz on the 4K, while the 1440p can do 164 Hz. Any idea why I lose the option when the monitor is supposed to be able to drive those refresh rates?
 
(Just an update)... The Eizo CG2700S is next to my 27" iMac on my desk now. Love the Eizo screen. The 2K/5K differences are not a problem at all.
Sounds great.

I just picked up a low hours Eizo CG2730. At my viewing distance for photo editing 2560 x 1440 is perfect.

With the non-reflective screen, ColorNavigator 7 and built-in calibration, I couldn't be happier.
 
I have slightly off topic scaling question:

If I watch a YouTube video on the built in screen on a MacBook Air which only has 1280x800 native resolution, is there any point in selecting the video quality at 1080p or above?
Absolutely. Online videos tend to give you a higher bitrate along with the higher resolution setting, so even if the video just gets downscaled to something as low as 1280x800, it can still resolve more visual detail. If you can't see a difference, stick to the lower-res option, as it is less taxing on your system, and if you have data caps, it uses less data.
 
Congratulations! Fine monitor! Affordable I guess?
Yes, given its excellent condition and low use I was very happy with the deal.

The CG2730 isn't quite in the league of your CG2700S, but it's not too shabby. :)
 
I'm a firm believer that with all monitors lately, the most important thing to Apple users is the RGP model! Plus, for the best text, go with a 32-inch monitor!
Sorry, but I don't know what you mean.... RGP model? Which monitors do you mean?
 
That means the display is good for graphics and video production! Most artists and music guys like the Adobe test (which RGB is for)! If you were a Mac user, you should know this!
I know about RGB but not about RGP, and I still use a Mac. So I still don't know what RGP means. Thank you.
 