Yeah, I am going to wait for some actual real benchmarking. Claiming that performance gain based on AI frame generation is BS.

I agree that the sweet spot is probably the 5070ti. I am also currently using a 1080ti in my PC rig, which is fine, but there are some newer games coming out where it just isn't going to cut it.

Interested in what AMD is going to counter with.
 
Jensen didn't say that; he stated that the performance of the 5070 is comparable to the 4090, point blank - no qualifiers (at least, I didn't hear any).
Someone posted an article about it.


Spoiler: the 5070 is faster when using MFG...
 
Interested in what AMD is going to counter with.
AMD is still not sharing — or maybe hasn't decided on? — specific pricing, but...




I am going to wait for some actual real benchmarking.
Of course.
 
Someone posted an article about it.


Spoiler: the 5070 is faster when using MFG...
Yes, but for games that don't support the latest version of Nvidia's DLSS (and that's like 99% of available games), the 4090 will be faster.

I'm not knocking the 5070. I think for the price it's an excellent card, but it's no 4090 replacement. I doubt people owning a 4090 will be lining up to buy the 5070.
 
Yes, but for games that don't support the latest version of Nvidia's DLSS (and that's like 99% of available games), the 4090 will be faster.

I'm not knocking the 5070. I think for the price it's an excellent card, but it's no 4090 replacement. I doubt people owning a 4090 will be lining up to buy the 5070.
Me neither, though we laughed about it at work. I am concerned that generating more frames is going to be how Nvidia increases performance going forward.

I am also curious about the performance impact of switching from a CNN to a ViT for the older cards.
 
Me neither, though we laughed about it at work. I am concerned that generating more frames is going to be how Nvidia increases performance going forward.

I am also curious about the performance impact of switching from a CNN to a ViT for the older cards.
Exactly. If the only gains are through software and "AI", then why upgrade? Might as well switch to AMD when the time comes, especially since you'll get the synergy of an AMD CPU and AMD GPU.
 
Yeah, I am going to wait for some actual real benchmarking. Claiming that performance gain based on AI frame generation is BS.

I agree that the sweet spot is probably the 5070ti. I am also currently using a 1080ti in my PC rig, which is fine, but there are some newer games coming out where it just isn't going to cut it.

Interested in what AMD is going to counter with.
+1 for the 5070 Ti, in a 265K prebuild. Maybe only a slight fps bump over the 4070 Ti.
 
Preparing the M2 Air + 11" iPad for resale.
Taking a break from smaller screens, though keeping the iPhone 14.
Looking forward to a 5070 Ti/5080, whichever is achievable.
 
Folks are upset at the 15-20% RT gains over the 40-series; they were expecting the gains to be larger.
 
Folks are upset at the 15-20% RT gains over the 40-series; they were expecting the gains to be larger.
RT performance is anywhere from ~15% (i.e., “70” tier) to ~66% (i.e., “90” tier) better.

Tensor core (“AI”) performance is up to ~2.8x faster. The biggest uplift is actually at the “70” tier. The “90” class is ~2.54x higher.

At least, that's according to Nvidia's numbers.

Pinning down the shader (CUDA) core (i.e., floating point) performance took some doing. Nvidia still hasn’t published those values, simply stating “Blackwell” for “Shader Cores.” Fortunately, the TechPowerUp GPU database has been updated.

5090 vs. 4090 = +~27%
5080 vs. 4080 S = +~8%
5070 Ti vs. 4070 Ti S (AD103) = +~1%
5070 vs. 4070 S = -~13%
--- 5070 vs. 4070 (AD104) = +~6%

Their FP performance calculations are based on Nvidia’s promised core boost frequencies, which isn’t the wrong approach. But, as many of us know, the cards will push beyond those clock speeds in the vast majority of scenarios, and I wanted to compare the (closer to) actual (a.k.a. “real world”) performance difference. Of course, we don’t yet know any of the real-world frequency (and power consumption) values for the 50 series — benchmark embargoes lift starting Jan. 24. However, I was able to find a formula for FP performance.

s x 2 x c = g (GFLOPs)
g / 1000 = t (TFLOPs)

s = shader/CUDA core count
c = core (boost) frequency in GHz
The 2 accounts for the two floating-point operations a fused multiply-add (FMA) performs per clock.

For example: in my observations, the RTX 4080 averages a 2.8 GHz core frequency vs. the 2.51 GHz promised by Nvidia.
So, the default calculation is 9728 x 2 x 2.51 / 1000 = ~48.8 TFLOPs.
My RTX 4080 cards seemingly average up to ~54.5 TFLOPs: 9728 x 2 x 2.8 / 1000.
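The arithmetic above can be sketched in a few lines of Python; the function name is mine, and the core count and clocks are just the RTX 4080 values quoted above:

```python
def fp32_tflops(shader_cores: int, boost_ghz: float) -> float:
    """Peak FP32 throughput: cores x 2 ops/clock (FMA) x clock (GHz), in TFLOPs."""
    return shader_cores * 2 * boost_ghz / 1000

# RTX 4080: Nvidia's rated 2.51 GHz boost vs. an observed ~2.8 GHz average
rated = fp32_tflops(9728, 2.51)    # -> ~48.8 TFLOPs
observed = fp32_tflops(9728, 2.8)  # -> ~54.5 TFLOPs
print(f"rated {rated:.1f} TFLOPs, observed {observed:.1f} TFLOPs")
```

The same function with a 50-series card's core count and real-world clocks (once the embargoed reviews publish them) gives the apples-to-apples comparison the post is after.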

One more thing...

It was soon obvious this generation wasn’t going to be a huge uplift, generally speaking, because Nvidia wasn’t able to substantially increase the core frequency. From the 30 to the 40 series, core frequencies were ~50% greater across the board, which is a big improvement in itself.
 
I've been waiting (as per usual) for Apple to at least say something about an M4 Mac Studio Ultra, but these 5090 cards look great for video work and video rendering, and they now support the 4:2:2 H.264 format.
Super tempted... if I can grab one on launch day, as I hear stock is very limited.

First (and only) non gaming review I've seen yet.
 
I've been waiting (as per usual) for Apple to at least say something about an M4 Mac Studio Ultra, but these 5090 cards look great for video work and video rendering, and they now support the 4:2:2 H.264 format.
Super tempted... if I can grab one on launch day, as I hear stock is very limited.

First (and only) non gaming review I've seen yet.
Yeah, I forgot to post their video. There should be other non-gaming benchmarks in the coming days. The third-party (AIB card) reviews just came out, though for non-gaming use performance won't change much. GPU and Mem Die temps are lower on the AIB cards, maybe that matters.
 
I've been waiting (as per usual) for Apple to at least say something about an M4 Mac Studio Ultra, but these 5090 cards look great for video work and video rendering, and they now support the 4:2:2 H.264 format.
Super tempted... if I can grab one on launch day, as I hear stock is very limited.

First (and only) non gaming review I've seen yet.
The 5090 is definitely aimed primarily at ML, 3D rendering, and other productivity workloads. The only gamers who should reasonably want it are players who want to future-proof RT performance. Otherwise, it’s typically just bragging rights.

I’m tempted because the RTX 5090 is basically the raw performance of two RTX 4080s. However, if I can’t grab an FE model at MSRP then I’ll skip it without gripe.
 
GPU and Mem Die temps are lower on the AIB cards, maybe that matters.
Just FYI/reminder, GDDR6X ran hot:


Cooler capability aside, I assume GDDR7 produces more heat. Probably the same situation we’re seeing with faster and faster SSDs.
 
Completely torn between the 5080 FE (if I could get one) and waiting for the M4 Mac Studio.

A few software packages I'm interested in just aren't as capable (or even available) on the mac. But do I even have time to learn that crap? I'm old and TIRED.

Might just get the video card and use it as a render slave for a mac. Dunno. Torn. It's going to depend on what is actually available at launch.

575 watts for the 5090 - insane. The 5080 is still crazy power-wise, but not astronomical. If reviews show it to be crappy, I'll just wait.
 
Just FYI/reminder, GDDR6X ran hot:


Cooler capability aside, I assume GDDR7 produces more heat. Probably the same situation we’re seeing with faster and faster SSDs.
I noticed the memory overclocks are pretty tame as well. Curious to see the temps on the 5080 since it has faster chips, and if there is any OC headroom.
 
I've been waiting (as per usual) for Apple to at least say something about an M4 Mac Studio Ultra, but these 5090 cards look great for video work and video rendering, and they now support the 4:2:2 H.264 format.
Super tempted... if I can grab one on launch day, as I hear stock is very limited.

First (and only) non gaming review I've seen yet.
The 5090 benchmarks I have seen for Adobe Premiere and DaVinci Resolve have not been great. Adobe apps in particular are not taking advantage of the 5090 yet. Blender has seen a BIG jump in rendering performance, so I hope the Adobe apps can get their act together.
 
The 5090 benchmarks I have seen for Adobe Premiere and DaVinci Resolve have not been great. Adobe apps in particular are not taking advantage of the 5090 yet. Blender has seen a BIG jump in rendering performance, so I hope the Adobe apps can get their act together.
Yeah, Adobe doesn’t look great, but Resolve looks much more promising, especially with GPU effects.
Hoping Topaz Video is good, as that always gets left out of reviews.
Having the NVENC/NVDEC codec support will be big, though, and I’m very much looking forward to that.
What I’m not looking forward to is the price - rumour has it that the third-party, bigger and slightly faster cards could be over $2500, and I’m after two of them!
 
Yeah, Adobe doesn’t look great, but Resolve looks much more promising, especially with GPU effects.
Hoping Topaz Video is good, as that always gets left out of reviews.
Having the NVENC/NVDEC codec support will be big, though, and I’m very much looking forward to that.
What I’m not looking forward to is the price - rumour has it that the third-party, bigger and slightly faster cards could be over $2500, and I’m after two of them!
$2200-$2800 depending on the partner card. Add $$ to it due to reported stock shortages.
 
$2200-$2800 depending on the partner card. Add $$ to it due to reported stock shortages.
I got the 3090 and 4090 on launch day at RRP, so hoping for third time lucky 👌
Also hoping that at these big prices I’ll get decent money for my used 4090s, so over the 2+ years of use they won’t have actually cost me that much.
 