OS Neutral Gaming With A 4K Monitor

Huntn

macrumors demi-god
Original poster
May 5, 2008
18,054
18,508
The Misty Mountains
Then he would lose HDR, because the DisplayPort input is only version 1.2.

NVIDIA does not support FreeSync over HDMI 2.0 (it's an AMD-proprietary extension).

Time to try to return the monitor.

Or maybe one could use two cables and keep each feature permanently active on its own input.

Then DisplayPort for FreeSync and 10-bit SDR.

HDMI 2.0 for 4K 60 Hz 4:2:2 HDR.

But one would still miss 4K 60 Hz 4:4:4 HDR, which needs DP 1.4.
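A rough back-of-the-envelope check shows why (a sketch only; the link capacities are the commonly quoted effective data rates after 8b/10b line coding, and the 594 MHz pixel clock is the standard CTA-861 4K 60 Hz timing):

```python
# Rough link-bandwidth check for 4K 60 Hz (sketch; capacities are the
# commonly quoted effective data rates after 8b/10b line coding).
PIXEL_CLOCK = 4400 * 2250 * 60  # CTA-861 4K60 timing incl. blanking = 594 MHz

def needed_gbps(bits_per_pixel):
    """Raw video data rate the link has to carry, in Gbit/s."""
    return PIXEL_CLOCK * bits_per_pixel / 1e9

links = {"HDMI 2.0": 14.4, "DP 1.2 (HBR2)": 17.28, "DP 1.4 (HBR3)": 25.92}

for fmt, bpp in [("4:4:4 8-bit SDR", 24), ("4:4:4 10-bit HDR", 30)]:
    need = needed_gbps(bpp)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"{fmt}: needs {need:.2f} Gbit/s -> carried by {fits}")
```

At 8-bit, 4K 60 Hz 4:4:4 just squeezes into HDMI 2.0; at 10-bit it needs about 17.8 Gbit/s, which only HBR3 clears, hence DP 1.4.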

1100 euro to add adaptive sync to HDR with NVIDIA.
1100 euros, I don’t think so, not for my self-imposed budget. :D
 

cube

Suspended
May 10, 2004
16,983
4,962
1100 euros, I don’t think so, not for my self imposed budget. :D
That's why I thought just adding DP 1.4 for about the same price as now would be the best compromise.

Unless what you bought was a special deal.
 

Huntn

macrumors demi-god
Original poster
May 5, 2008
18,054
18,508
The Misty Mountains
@garnerx, @cube, thanks for all your help. I’ve reached a decision. Yesterday I used the DisplayPort cable and enabled FreeSync on the monitor and G-Sync on the NVIDIA RTX 2070. Then I downloaded an NVIDIA tool to test it (https://www.nvidia.com/coolstuff/demos). The monitor seems to be happy at 50 Hz with this demo.

But then I cranked up Red Dead Redemption 2 and it looked flat and lifeless without HDR. So I swapped back to the HDMI cable, and with HDR the magic came back. The difference is stark: the environmental effects HDR introduces are stunning, adding a depth that almost turns a flat 2D image into a 3D scene.

In addition, I turned on an in-game frame rate counter via GeForce Experience, and RDR2 runs consistently at about 35 fps without any tearing or screen defects. For this monitor, HDR is a slam dunk, and I’m very happy with the image it produces. Could I run into screen artifacts in the future? I suppose so, but at the $350 I paid for this XG3220, I have no reason to complain. It’s got a lovely picture. 😍
 

cube

Suspended
May 10, 2004
16,983
4,962
That would be $50 cheaper than the BenQ, and it would not be fair to return the ViewSonic if it was your own oversight not to check for DP 1.4. Besides, you would need to spend something like $1,000 to combine adaptive sync with HDR, so it seems OK to keep it.
 

garnerx

macrumors 6502
Nov 9, 2012
469
210
RDR2 runs consistently at about 35fps without any tearing or screen defects.
You shouldn’t see screen tearing at such a low frame rate, as I assume you have the game’s software vsync turned on. Also, G-Sync won’t do anything at all when the game is running that slowly; you’d need to get above 48 fps before it even activates.

It illustrates what I said earlier: some people are more sensitive to this stuff. Personally I would not want to play at 35 fps. You should try limiting it to 30 fps, as that will actually look smoother - every frame stays on screen for exactly two refresh cycles. The important thing is that it looks good to you; nobody else is going to see it!
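To see why 30 fps paces more evenly than 35 fps on a 60 Hz panel, here’s a tiny simulation (a sketch assuming plain vsync with adaptive sync inactive, since it doesn’t engage below ~48 Hz; the frame-release model is deliberately simplified):

```python
# Sketch: how long each frame stays on screen on a fixed 60 Hz panel with
# vsync, assuming adaptive sync is inactive (it only engages above ~48 fps).
REFRESH = 60  # panel refresh rate in Hz

def hold_cycles(fps):
    """Distinct numbers of refresh cycles a frame gets held for at a given fps."""
    # Simplified model: frame i starts being shown at refresh index
    # floor(i * REFRESH / fps); the gap between starts is its hold time.
    starts = [i * REFRESH // fps for i in range(fps + 1)]
    return sorted({b - a for a, b in zip(starts, starts[1:])})

print(hold_cycles(30))  # [2] -> every frame held exactly 2 cycles, even pacing
print(hold_cycles(35))  # [1, 2] -> mix of 1- and 2-cycle holds, visible judder
```

Capping at 30 fps keeps the cadence uniform, while 35 fps alternates between one and two refresh cycles per frame - exactly the stutter adaptive sync would otherwise smooth out.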
 

Huntn

macrumors demi-god
Original poster
May 5, 2008
18,054
18,508
The Misty Mountains
You shouldn’t see screen tearing at such a low frame rate, as I assume you’d have the software vsync turned on in the game. Also, g-sync won’t do anything at all when the game’s running that slowly. You’d need to get above 48 fps before it will even activate.

It illustrates what I said earlier, that some people are more sensitive to this stuff. Personally I would not want to play at 35 fps. You should try limiting it to 30 fps, as that’s going to be much smoother - every frame stays on screen for exactly two cycles. The important thing is that it looks good to you, nobody else is going to see it!
I’ll have to figure out how to do that. Hints? :)

For myself, and I imagine for many gamers, HDR as displayed on my monitor is a huge upgrade over graphics without it. Not a matter of sensitivity, but of having eyes in your head. ;)
 

garnerx

macrumors 6502
Nov 9, 2012
469
210
I’ll have to figure out how to do that. Hints? :)

For myself, and I imagine for many gamers the HDR as displayed on my monitor is a huge upgrade to graphics as compared to without it. Not a matter of sensitivity, but as of having eyes in your head. ;)
I mean sensitivity to frame rate. Personally I want it as high as possible, since it brings LCD screens a little closer to the smoothness of the old CRTs, but plenty of people aren’t bothered by it. To limit the frame rate: if there isn’t an option to target 30 fps in the game’s settings, you can use MSI Afterburner. It works for any game and is very useful.
 