I tend to agree that 60Hz is good enough, just as, IMO, a 5-year-old PC/Mac is enough for most people. At the same time, 60Hz has been with LCD displays for so long that people can't wait to embrace new technology. Higher refresh rates aren't just for gaming. Specifically,
adaptive refresh rate, aka variable refresh rate, seems to be the new tech for daily use. It offers users a "butter smooth experience." You don't have to look beyond Apple; just hear what they say about their displays capable of ProMotion.
So I think what happened to @Lbond in this post is this: since Monterey started to support adaptive refresh rates, he gave it a spin on his brand-new 27-inch 4K LG display. I would bet that for most people the preference is set to the scaled resolution "looks like 2560x1440." Now, as I've discussed above, when the computer boosts to high refresh rates, the performance penalty of the scaled resolution kicks in.
End result: rather than a "butter smooth experience," people get amplified sluggishness and perhaps stuttering too.
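To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It assumes the usual HiDPI behavior: "looks like 2560x1440" renders a 5120x2880 backing buffer that is then down-sampled to the panel's native 3840x2160 every frame, so the pixel work per second scales with both the buffer sizes and the refresh rate. The exact factors are my simplification, not anything Apple documents.

```python
# Rough sketch: pixel throughput of native 4K vs. a scaled ("looks like
# 2560x1440") mode, at different refresh rates.

def pixels_per_second(width, height, hz):
    return width * height * hz

panel = (3840, 2160)            # native 4K output
scaled_backing = (5120, 2880)   # assumed 2x backing store for "looks like 2560x1440"

for hz in (60, 120, 144):
    native = pixels_per_second(*panel, hz)
    # The scaled mode has to render the larger backing store AND down-sample
    # it to the panel resolution every frame.
    scaled = pixels_per_second(*scaled_backing, hz) + pixels_per_second(*panel, hz)
    print(f"{hz:>3} Hz: native {native/1e9:.2f} Gpx/s, "
          f"scaled ~{scaled/1e9:.2f} Gpx/s ({scaled/native:.1f}x)")
```

By this crude measure the scaled mode pushes roughly 2.8x the pixels of the native mode, and the gap in absolute terms widens as the refresh rate climbs.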
Brilliant. This proves a point I've been suspecting for a while: even at idle, GPU power consumption can be higher in a Scaled Resolution than in the Default Resolution on 4K displays.
Don't worry about the absolute wattage numbers here. Apple's Radeon drivers (at least for Polaris chips) don't seem to get the calibration right for 3rd-party dGPUs. I think if you divide the reported numbers by 2, you get a good estimate of the real power consumed.
So it's about 2.5W of extra power required to do the down-sampling in a Scaled Resolution. If you use higher refresh rates, the extra power will likely grow proportionally.
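Here's a tiny sketch of that extrapolation, assuming the ~2.5W overhead at 60Hz from above and a roughly linear relationship between refresh rate and down-sampling cost (my guess, not a measurement):

```python
# Back-of-the-envelope only: take the ~2.5 W scaled-mode overhead observed at
# 60 Hz and assume the down-sampling cost grows roughly linearly with refresh
# rate. (Reported dGPU wattage is already halved per the calibration note above.)

overhead_at_60hz_watts = 2.5

for hz in (60, 75, 120, 144):
    extra = overhead_at_60hz_watts * hz / 60
    print(f"{hz:>3} Hz: estimated extra draw for a scaled mode ~{extra:.1f} W")
```

So at something like 120Hz or 144Hz you could plausibly be paying 5-6W just for the scaling, before any actual work gets done.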