Hello. I've read various contradictory things about this so I thought maybe someone here could shed some authoritative light.
What I want to know is whether there is any performance penalty for running a monitor at lower than its native resolution.
The scenario I am envisaging is as follows. Say you run the same game on two monitors. Monitor A is a 2560x1440 display, and you run the game at its native resolution. Monitor B is a 4K display, and you drop the resolution so the game runs at a non-native 2560x1440.
Will performance be identical in these two cases, all other things being equal? That is, do you sacrifice anything by forcing a 4K monitor to "think down"?
Supplementary, distantly related question: many monitors, when listing response time, offer some kind of "fast mode" with a slightly lower response time. What is the penalty for running a monitor in "fast mode", i.e. why wouldn't one have it on all the time?
Thanks for any enlightenment!