I’ll try to clarify, as I realize my mentality is different from most. That's a contradiction, because if your FPS are in the 60s, your 1% lows certainly aren't.
It does make sense to have a buffer, because even though you might be fine for 80% of the game's content, there might be a couple of hours' worth of content that stresses the hardware more. For example, in The Witcher 3 you spend the entire beginning in woods and sparsely populated villages, only to eventually make your way to the city of Novigrad, which at the time was filled with more NPCs going about their day than was common for these games.
And if you run the hardware at its limits at 60 FPS already, then you might find it's not spec'd to the requirements of these games. For example, I currently have 5-year-old gaming hardware that works fine for virtually any modern 2025 game. Enable DLSS and call it a day; there isn't much of a visible quality downgrade, and with a 4K monitor the performance gains are absolutely worth it. I do get averages in the 60s with 1% lows somewhat below that, which would be fine. Unfortunately, I see stuttering every couple of minutes. Maybe it's the limited 10 GiB of VRAM on the 3080, or maybe it's the older AM4 Zen architecture, but the same games don't have this issue on my wife's 9800X3D build.
I stated “solid 60 FPS” and “solid 30 FPS.” My “solid” essentially equates to an FPS limit. More specifically, setting an FPS limit slightly below the lowest 1% low, or preferably the 0.1% low. If that range includes 30, 60, 90, 120, whatever, you can simply pick one of those, of course. Although, if we’re going for totally optimized, it would be a limit below the 0.1% lows at (one of) the monitor's native refresh rates. Ultimately, this should keep the frame pacing within +/- 2 FPS.
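If you want to derive that cap from an actual capture, here's a minimal sketch. It assumes a plain-text frametime log in milliseconds, one frame per line, exported from whatever capture tool you use; the file name and the "average of the worst frames" definition of 1%/0.1% lows are my assumptions, not a fixed standard:

```python
def percentile_low_fps(frametimes_ms, worst_fraction):
    """Average FPS over the worst `worst_fraction` of frames (e.g. 0.01 for 1% lows)."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest (longest) frames first
    n = max(1, int(len(worst) * worst_fraction))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Hypothetical capture file: one frametime in ms per line.
with open("frametimes_ms.txt") as f:
    frametimes = [float(line) for line in f if line.strip()]

low_1 = percentile_low_fps(frametimes, 0.01)    # 1% low
low_01 = percentile_low_fps(frametimes, 0.001)  # 0.1% low

# Cap a couple of FPS below the 0.1% low for some headroom.
cap = int(low_01) - 2
print(f"1% low: {low_1:.1f} FPS, 0.1% low: {low_01:.1f} FPS -> suggested cap: {cap} FPS")
```

Whether you compute the lows as "average of the worst frames" or as the 99th/99.9th percentile frametime, the resulting cap should land in roughly the same place; the point is just to park the limit under the dips rather than under the average.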
Occasionally, influencers do share genuinely useful, accurate information buried within their greedy ego clickbait.
Perhaps my mention of DLSS was the problem. I am not replacing my 3080 with a $1.3k card just for requirements to exceed what that card can do - no matter by how little. I can play my games with DLSS now, and it would be ridiculous to pay over a grand for a brand-new, latest-gen graphics card that ends up needing DLSS again just because I got a dedicated high-resolution gaming monitor. 4K isn't this crazy absurd thing anymore in 2025. And if a card as expensive as the 5080 can't do 4K in 2025 on the latest games, then it's not gonna get better years down the road. Am I supposed to replace the 5080 next year with a 6080, hoping that will finally last longer?
If you (re)watch the earlier portion:
You’ll see true UHD with lots of, er, “Ultra” eye candy uses ~15,300 MiB, or ~14.95 GiB, of VRAM, with the 1% lows dipping down to the mid-to-low 70s FPS. A 5080 will probably be close to 30 FPS, but VRAM usage isn’t going to increase. On something like a 5070 (Ti), the frame rate at that visual quality isn’t going to be sufficient, so you’ll go with lower texture and visual-element quality or a lower resolution, and thus decrease memory usage anyway. Basically, my point is that, despite the recent clamor, especially from channels such as Hardware Unboxed, VRAM is typically not going to be a bottleneck or a concern unless you create unlikely/unusual scenarios to shoehorn your bias.
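For a rough sense of the headroom, here's a quick back-of-the-envelope check. The ~15,300 MiB figure is the usage quoted above; the card capacities are the stock VRAM sizes and are listed purely for illustration:

```python
# Does the ~15,300 MiB "Ultra" UHD usage quoted above fit in common card capacities?
usage_gib = 15300 / 1024  # ~14.94 GiB

# Stock VRAM capacities in GiB (for illustration only).
cards = {"RTX 3080": 10, "RTX 5070": 12, "RTX 5070 Ti": 16, "RTX 5080": 16}

for name, capacity_gib in cards.items():
    headroom = capacity_gib - usage_gib
    verdict = "fits" if headroom > 0 else "doesn't fit"
    print(f"{name}: {verdict} ({headroom:+.2f} GiB headroom)")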