That's an oxymoron because if your fps are in the 60s your 1% lows certainly aren't.

It does make sense to have a buffer, because even though you might be fine for 80% of the game's content, there might be a couple hours' worth of content that stresses the hardware more. For example, in The Witcher 3 you spend all the time in the beginning in woods and sparsely populated villages, only to eventually make your way to the city of Novigrad, which at the time was filled with more NPCs going about their day than was common for these games.

And if you run the hardware at its limits at 60fps already, then you might find it's not spec'd to the requirements of these games. For example, I currently have 5 year old gaming hardware that works fine for virtually any modern 2025 game. Enable DLSS and call it a day; there isn't much of a visible quality downgrade, and with a 4k monitor the performance gains are absolutely worth it. I do get averages in the 60s with 1% lows somewhat lower, which would be fine. Unfortunately, I do see stuttering every couple of minutes. Maybe it's the limited 10GiB VRAM of the 3080, or maybe it's the older AM4 Zen architecture, but on the wife's 9800X3D build the same games don't have this issue.
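If anyone wants to check for the same pattern on their end, here's a rough way to flag those spikes from a frame-time log. Just a sketch, assuming a plain text file with one frame time in milliseconds per line; the file name and the 3x-median threshold are only my assumptions for what counts as a stutter:

```python
# Rough sketch: flag stutter spikes in a frame-time log.
# Assumes a plain text file with one frame time in milliseconds per line
# (the file name and the 3x-median threshold are placeholders/assumptions).
import statistics

def find_stutters(path, spike_factor=3.0):
    with open(path) as f:
        frametimes = [float(line) for line in f if line.strip()]
    median = statistics.median(frametimes)
    elapsed_ms = 0.0
    for ft in frametimes:
        elapsed_ms += ft
        if ft > spike_factor * median:
            print(f"spike at ~{elapsed_ms / 1000:.1f}s: {ft:.1f} ms (median {median:.1f} ms)")

find_stutters("frametimes.txt")
```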
I’ll try to clarify, as I realize my mentality is different from most people's.

I stated “solid 60 FPS” and “solid 30 FPS.” My “solid” essentially equates to an FPS limit. More specifically, setting an FPS limit slightly below the lowest 1%, or preferably 0.1%, low. If that lands on 30, 60, 90, 120, whatever, you can simply pick one of those, of course. Although, if we’re going for totally optimized, it would be a limit below the 0.1% lows at (one of) the monitor's supported refresh rates. Ultimately, this should keep the frame rate within +/- 2 FPS.
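In practice, here's roughly how I derive that number from a captured frame-time log. Just a sketch, assuming a plain text file with one frame time in milliseconds per line and treating the 1%/0.1% lows as the average FPS over the slowest frames; the 2 FPS margin and the file name are my own placeholders:

```python
# Sketch: derive an FPS cap just below the measured 1% / 0.1% lows.
# Assumes one frame time in milliseconds per line; "1% low" here means the
# average FPS over the slowest 1% of frames. The 2 FPS margin is my own habit.
def suggest_cap(path, margin_fps=2.0):
    with open(path) as f:
        frametimes = sorted(float(line) for line in f if line.strip())

    def low(pct):
        worst = frametimes[int(len(frametimes) * (1 - pct / 100)):]  # slowest pct% of frames
        return 1000.0 / (sum(worst) / len(worst))

    one, point_one = low(1.0), low(0.1)
    print(f"1% low: {one:.1f} FPS, 0.1% low: {point_one:.1f} FPS")
    print(f"suggested cap: ~{point_one - margin_fps:.0f} FPS")

suggest_cap("frametimes.txt")
```

From there I just plug the suggested number into whatever limiter is handy (RTSS, the driver, or an in-game cap).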

Occasionally, influencers do bury genuinely useful, accurate information within their greedy ego clickbait.

I am not replacing my 3080 with a $1.3k card just for requirements to exceed what that card can do, no matter by how little. I can play my games with DLSS now, and it would be ridiculous to pay over a grand for a brand new latest-gen graphics card that ends up needing DLSS again just because I got a dedicated high-resolution gaming monitor. 4k isn't some crazy, absurd thing anymore in 2025. And if a card as expensive as the 5080 can't do 4k on the latest games in 2025, it's not gonna get better years down the road. Am I supposed to replace the 5080 next year with a 6080 and hope that will finally last longer?
Perhaps my mention of DLSS was the problem.

If you (re)watch the earlier portion:

You’ll see that true UHD with lots of, err, “Ultra” eye candy uses ~15300 MB or ~14.95 GB of VRAM, with the 1% lows dipping down to the mid/lower 70s FPS. A 5080 will probably be close to 30 FPS, but VRAM usage isn’t going to increase. On something like a 5070 (Ti), the frame rate at that visual quality isn’t going to be sufficient. Therefore, you’ll go with lower texture and visual element quality or resolution and thus will decrease memory usage anyway. Basically, my point is despite the recent clamor, especially by channels such as Hardware Unboxed, VRAM is typically not going to be a bottleneck or concern unless you create unlikely/unusual scenarios to shoehorn your bias.
 
Yeah, it also helps to skip multiple generations when possible. You get a bigger performance improvement that way.

I am curious to see this Super refresh folks are talking about. I have no real need for more vram, so I guess I am going to sit on the sidelines for it.
I’ve been getting lucky between the crypto and AI booms. Got the 3090 for MSRP, sold it for $1600 right before the 4090 release. Got the 4090 FE for MSRP, just sold it for $1800, and scored a 5090 FE at MSRP. That’s a very minimal cost to keep up with the latest and greatest. If you hold on to the top end, you have the most to lose by holding on to your card too long.
 
If you hold on to the top end, you have the most to lose by holding on to your card too long.
It really seems like only the 90 class cards can get away with that level of "value retainment".
 
Eh I have lately found the sweet spot in the mid range Ti cards. I just got a 5060 Ti 16GB for $429 at Best Buy. That’ll last my wife quite a while at a very low price.
 
My “solid” essentially equates to an FPS limit. More specifically, setting an FPS limit slightly below the lowest 1%, or preferably 0.1%, low.
Then I don't know what the point was of your post claiming you don't understand why people want the higher framerates. You're getting the same high framerates. You're merely limiting them, which adds nothing and might make the experience worse if you have a fast OLED screen with adaptive sync that can actually display those additional frames. If your hardware is already capable of displaying them, then I don't know why you would set a limit of 120fps or especially 60fps. 60fps won't look as smooth as 120fps or 144fps on an OLED or a 144Hz LCD.

my point is despite the recent clamor, especially by channels such as Hardware Unboxed, VRAM is typically not going to be a bottleneck or concern unless you create unlikely/unusual scenarios to shoehorn your bias.
I understand this part, though I don't agree here either. Higher texture resolutions make a visible impact on quality, more so than shadow quality, anti-aliasing, and the like. If you've got a muddy texture somewhere, that's what sticks out. Well, it depends on whether you're racing past a muddy texture in a car or playing a walking-simulator type of game. As long as you've got the VRAM available, better texture quality is a "free" visual upgrade.

I am starting to see VRAM issues now on my older 10GiB 3080 with older titles as well. In one game I had to add custom VRAM texture streaming commands to the game's config so that it wouldn't crash to desktop within the first hour. Now it works on the highest textures with DLSS just fine, but it was quite a headache to figure out and fix in the first place. There was one other game last year where I did have to drop the texture quality two notches. I think it was a game in the DOOM series, probably the latest one from last year, whichever that was.
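For anyone running into the same thing: before touching config files, I'd first confirm that VRAM is actually the culprit. A minimal logger sketch I'd run alongside the game, assuming nvidia-smi is on the PATH and a single GPU (the 5 second interval is arbitrary):

```python
# Quick-and-dirty VRAM logger to check whether usage creeps toward the limit
# before a crash. Assumes nvidia-smi is on the PATH and a single GPU;
# the 5 second interval is arbitrary.
import subprocess, time

def log_vram(interval_s=5):
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip().splitlines()[0]  # first (only) GPU
        used, total = (int(x) for x in out.split(", "))
        print(f"{time.strftime('%H:%M:%S')}  VRAM {used}/{total} MiB")
        time.sleep(interval_s)

log_vram()
```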

my point is despite the recent clamor, especially by channels such as Hardware Unboxed, VRAM is typically not going to be a bottleneck
Hardware Unboxed does put it in relation to factors like pricing and the competition as well. The message is that Nvidia is nickel-and-diming customers, from the 8GiB 5060 Ti up to the more expensive cards. I had a cheap MSI RX 570 with 8GiB of VRAM a long time ago... I'd rather have too much VRAM for a couple bucks more than too little. But I guess that's too much to ask when even the previously included amperage load checking for the 12VHPWR connector was removed from the 5000 series to save a few cents per card in manufacturing. So we get melting power connectors, and manufacturers then blame the user with "you're plugging it in wrong!".

Anyways, I think Hardware Unboxed did show that it's a real issue:

Unless they screwed up their testing, it's clear that the same card performs very differently with 8GiB than with 16GiB.

From what I see with my own graphics card, the performance is still plenty with DLSS at 4k, yet the limited VRAM is what's starting to give me trouble in a select few games now. I don't want to make the same mistake again and buy a 16GiB card in 2025 that's definitely plenty for a long time to come, only to find out in a couple of years that games are even worse optimized, with VRAM leaks like the ones Kingdom Come 2 exhibits, and then run into issues because of it like I do now, where I have to look up forum posts and figure out how to optimize VRAM usage for a particular game.

I should probably have gotten a 24GiB 3090 instead of the 3080, and then I'd be set for many more years to come. Instead I now want to replace it, but I see nothing worth buying. I don't want an overpriced 5080 that has the same amount of VRAM as the cheaper 5070 Ti, and I'd really like a card that isn't known for risking a melting connector. The 5090 is overkill and too expensive anyway, but nothing else seems like a substantial upgrade worth dropping a grand or more on. The 9070 XT is a joke, and even though the older 7900 XTX ticks all of the boxes, it seems like a beefed-up version of an older architecture that doubles as a space heater. Maybe I should get that, since at least it won't risk a house fire.
 
Instead I now want to replace it, but I see nothing worth buying. I don't want an overpriced 5080 that has the same amount of VRAM as the cheaper 5070 Ti, and I'd really like a card that isn't known for risking a melting connector.
The Super cards are going to have more vram, so maybe you would benefit from waiting.
 
Nvidia RTX 50 SUPER GPU Specifications and Pricing Leak

Specifications and pricing for Nvidia’s planned RTX 50 SUPER series GPUs have leaked, unveiling notable performance uplifts and increased VRAM capacities. Furthermore, Moore’s Law is Dead has claimed that Nvidia plans to end RTX 5080 and RTX 5070 Ti production in October, paving the way for Nvidia’s enhanced SUPER series GPU models.

Isn't AMD releasing some new GPUs soon? If so, Nvidia leaking info isn't all that surprising, though we've been waiting for the Super series to land.
 

nVidia seems to get away with this every year: generate excitement towards the end of the year, rack up a ton of higher-than-MSRP sales, and then cut production as demand fades in the summer.

This show also covers AMD vs. nVidia market share, where nVidia is still killing it.
 