
Malus120

macrumors 6502a
Original poster
Jun 28, 2002
679
1,412
I’m creating this thread as a place for discussion and observation on Apple Silicon graphics performance with Metal FX upscaling.

(I’m posting here in the Apple Silicon forum, rather than the Mac & PC Gaming forum, because I believe the discussion, in the long run, is much broader than just how it performs in one or two specific games and has implications beyond gaming.)

As of right now, Resident Evil Village is the only application using MetalFX Upscaling, so it will be the point of reference until others arrive.

First a TLDR (as this is a long post!):

MetalFX Upscaling is pretty good for a 1.0 release, but it's currently a tale of two modes.
Quality Mode looks and runs great, giving a 40-50% performance boost at 4K with very little perceptible difference versus native resolution. It also looks very good at sub-4K resolutions, including both 1440P and 1080P.
Performance Mode, on the other hand, looks quite poor; although it delivers over 2x the performance of native resolution, it isn't quite ready for prime time. (In Village, Quality Mode at 1521P delivers a higher-quality resolve with slightly better performance.)

Now, on to my detailed initial observations:

Test setup:
14” M1 Max MacBook Pro
32 Core GPU
32GB Memory
42” LG C2 OLED

*Note on the Interlaced Mode in Resident Evil Village*
Interlaced mode = checkerboard rendering, a temporal reconstruction technique that debuted with the PS4 Pro in 2016. It renders at ~1512P internally while targeting 4K, and is the only temporal upscaling option available on PC at the moment.
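For anyone unfamiliar with checkerboard rendering, here's a toy sketch of the pixel pattern (my own illustration, not Capcom's actual implementation): each frame shades one diagonal of every 2x2 quad, and consecutive frames' patterns are exact complements, so each frame pays roughly half the shading cost while the missing half is reconstructed from frame history.

```python
# Toy sketch of checkerboard rendering's alternating pixel pattern.

def rendered_mask(width, height, frame):
    """True where this frame actually shades a pixel."""
    return [[(x + y + frame) % 2 == 0 for x in range(width)]
            for y in range(height)]

w, h = 8, 4
even, odd = rendered_mask(w, h, 0), rendered_mask(w, h, 1)

shaded_per_frame = sum(row.count(True) for row in even)
print(f"{shaded_per_frame} of {w * h} pixels shaded per frame")  # half

# Two consecutive frames cover every pixel exactly once between them:
complementary = all(even[y][x] != odd[y][x] for y in range(h) for x in range(w))
print(complementary)
```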

Resident Evil Village Benchmarking methodology:

Settings:

Preset: None (Custom settings based on Digital Foundry's optimized settings with minor tweaks)
Resolution: 4K
Scaling: 1.0
Mesh Quality: Max
Texture Quality: High (4GB) (tested this setting at various levels, High(8GB) gives performance within 1-2FPS but seems to give less consistent frametimes on my machine)
Shadow Quality: High (Max seems to perform basically the same but I didn't feel like retesting - also should benefit VRAM limited cards like the 3070 a bit)
Volumetric Lighting Quality: Mid (large performance uplift vs higher settings, basically same visuals as confirmed by DF)
SSAO: SSAO (CACAO seems to have an outsized performance hit on M1 Max)
Film Grain: Off
All other effects (Contact Shadows, Subsurface Scattering, Bloom, DoF, etc): On
*This exercise started out as a test to reconfirm DF's findings on settings and to see whether any particular settings have an unusually large impact on Mac. Therefore, I did not use a preconfigured settings preset. For those curious: CACAO seems more costly than on PC, while significant performance can be gained by disabling Bloom.

Scenes:
Title screen
Upstairs hallway near railing holding Rose

*Haven’t had time to play farther into the game at the moment, so testing is limited to initial areas (I’ve beaten the game previously on PS5)


Type of upscaling in RE Village: Temporal Antialiased Upscaling

Reasoning:

1. The images don’t show the kind of artifacts/over sharpening you’d expect to see with spatial upscaling.

2. When using the performance mode on a large high-res screen, you can see the characteristics of the image change in motion in ways that don’t happen at native res. Image stability both standing still and while in motion are “variable.”

3. The aliasing in Performance Mode tends to present in a way that, again, suggests it is being temporally reconstructed on the fly (it can flicker when resolving certain types of materials, objects, and edges).

4. Reconstruction quality seems to scale to some degree with framerate (Quality Mode at 30-45FPS has flickering artifacts that don't appear when the framerate is closer to 60).

Upscaling Quality at 4K (hallway & title screen):

Quality Mode:
Very impressive visual result almost indistinguishable from native as long as FPS is >50. Significantly higher quality resolve than Interlaced Mode with much less (almost no) artifacting/flicker.

Performance Mode: Very poor results even at 4K, delivering noticeably worse image quality than not only Interlaced Mode (let alone native) but also Quality Mode running at a significantly lower post-reconstruction resolution. Honestly speaking, from an image quality standpoint, this just isn't ready for prime time.

Interlaced mode: A reasonable 4K-ish image that resolves less detail, is more aliased, and is less stable in motion than MetalFX Quality Mode.

Upscaling Performance at 4K (hallway):

Quality Mode:
~44% performance uplift over native resolution

Performance mode: ~2.25x uplift over native resolution
*Quality Mode (1521P): ~2.35x performance uplift over native resolution (looks better than Performance Mode)

Interlaced Mode: ~38% performance uplift over native resolution (this isn’t using MetalFX it’s just for comparison)
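A quick note on how I'm reporting these numbers, since "X% uplift" and "Nx native" are easy to mix up. With hypothetical FPS values (illustrative only, not my actual measurements):

```python
def as_multiplier(native_fps, new_fps):
    """Framerate as a multiple of native ("2.25x native", etc.)."""
    return new_fps / native_fps

def as_uplift_pct(native_fps, new_fps):
    """Framerate gain over native as a percentage ("+44%", etc.)."""
    return (new_fps / native_fps - 1.0) * 100

native = 40.0       # hypothetical native-4K average FPS
quality = 57.6      # hypothetical Quality Mode average
performance = 90.0  # hypothetical Performance Mode average

print(f"Quality: {as_multiplier(native, quality):.2f}x native "
      f"= +{as_uplift_pct(native, quality):.0f}%")
print(f"Performance: {as_multiplier(native, performance):.2f}x native "
      f"= +{as_uplift_pct(native, performance):.0f}%")
# Note a "2.25x" result is a +125% uplift, not a +225% one.
```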

Initial thoughts on MetalFX Upscaling & internal resolution speculation:

Quality Mode:
Incredibly impressive, delivering near-native image quality while improving performance by >40%, with very little, if any, perceptible loss of detail or artifacting from what I have seen so far.

Performance Mode: Image quality is a bit of a dumpster fire at the moment, and thus I really don't feel this is ready for prime time, although the results are tolerable on a smaller screen.

Internal Resolutions: I'm not Digital Foundry, but after testing various resolutions and comparing performance to MetalFX Upscaling, I'd guess that at 4K, Quality Mode renders internally at ~1440P-1521P, while Performance Mode could be ~1080P (considering the frame-time cost of upscaling).
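For reference, DLSS and FSR 2 use roughly 1.5x per-axis scaling for their Quality modes and 2.0x for Performance. Assuming MetalFX follows similar factors (an assumption on my part; Apple doesn't document Village's factors), the implied internal resolutions from a 4K target line up with these guesses:

```python
# Implied internal render resolutions for a 3840x2160 target, assuming
# per-axis scale factors similar to DLSS / FSR 2 (speculative).
TARGET = (3840, 2160)

def internal_res(target, per_axis_scale):
    """Internal resolution implied by a per-axis upscale factor."""
    w, h = target
    return round(w / per_axis_scale), round(h / per_axis_scale)

for mode, scale in [("Quality (1.5x/axis)", 1.5), ("Performance (2.0x/axis)", 2.0)]:
    w, h = internal_res(TARGET, scale)
    frac = (w * h) / (TARGET[0] * TARGET[1])
    print(f"{mode}: {w}x{h} (~{frac:.0%} of target pixels)")
# Quality lands at 2560x1440 (~44% of the pixels), Performance at 1920x1080 (~25%).
```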

Honestly speaking, Performance Mode really needs more work, as upscaling from quarter resolution is where reconstruction helps most from a performance standpoint, and DLSS, FSR 2.1, and XeSS all deliver much more convincing quarter-res (1080P -> 4K) reconstruction (in other games).

At this point it'd be better if they just enabled the resolution scaler for Quality Mode (as it is for Normal and Interlaced modes), so that the UI renders at native resolution while the graphics render at X% of the target output resolution.

MetalFX Upscaling Tentative Conclusion:

Quality Mode is extremely impressive and, from what I've seen so far, competes well with other prominent upscaling tech (DLSS 2.x, FSR 2.x, XeSS, etc.), although I imagine closer examination may reveal aspects where it's still very "1.0." Performance Mode, however, feels more like a beta that needs more time in the oven (unless RE Village just has a bad implementation). Hopefully it will improve over time, but right now it can't hold a candle to other reconstruction techniques.
Overall, despite the Performance Mode stumbles, the tech is incredibly impressive, especially for a 1.0 release (from Apple, no less), and really opens the door for all Apple Silicon Macs (not just M1 Pro/Max) to have a long, bright future for graphics/gaming.

Cross platform comparison:

Comparison Platforms:
Windows PC: R9 3900X with RTX 3070
(*RTX 3070 @ 45% power (100W) is the lowest my GPU can go. This should be somewhat similar to the fastest RTX 3070 Laptop Edition, although those ALSO have ~15% fewer CUDA cores on top of being power limited, so they're likely a bit slower. @125W it should be similar to how a lot of 3080 Laptops perform.)
PlayStation 5 (I may add this later if time permits, but TLDR should be a locked Interlaced 4K60 without RT)



Important Observations about 14" M1 Max Power/Thermal Throttling in Resident Evil Village:

The GPU is basically 100% loaded here all the time. Running at high resolutions causes the 32-core M1 Max in the 14" MBP to throttle significantly under load. While the rated clock speed is 1292MHz, I've seen it briefly dip as low as 850MHz during actual gameplay at 4K, normally hovering around 950-1075MHz. It is quite likely that the 16" MBP and Mac Studio variants of the M1 Max could be 20% or more faster (although I have no way to test).

Of note, the GPU starts out fast and then slows down as it hits 99C and becomes heat-soaked; performance then drops until the temperature stabilizes around 88C (I'd recommend a custom fan curve here).
Interestingly, decreasing the internal rendering resolution results in noticeably higher GPU clock speeds.

On the title screen for example:
Native 4K: ~1000-1050MHz
Quality Mode 4K: 1050-1150MHz
Quality Mode 1440P: 1125-1225MHz
Quality Mode 1080P: 1225-1292MHz
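If performance scaled linearly with GPU clock (a rough upper bound; real scaling is sublinear), the throttled clocks I observed would cost roughly:

```python
RATED_MHZ = 1292  # rated M1 Max GPU clock

def clock_penalty(observed_mhz, rated_mhz=RATED_MHZ):
    """Fraction of rated performance lost at a throttled clock, assuming
    linear clock scaling (an upper bound; real scaling is sublinear)."""
    return 1.0 - observed_mhz / rated_mhz

for label, mhz in [("brief dip", 850), ("typical low", 950), ("typical high", 1075)]:
    print(f"{label}: {mhz} MHz -> up to {clock_penalty(mhz):.0%} below rated")
```

The typical 950-1075MHz range works out to roughly 17-26% below rated, which is consistent with the guess that a better-cooled 16" MBP or Mac Studio could be 20% or more faster.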

This is another reason it's REALLY important for Apple to improve Performance Mode. It could dramatically increase the performance of more power / thermally constrained Macs.

Thoughts on Apple's initial positioning of the M1 Max:

During the initial reveal of the M1 Max, Apple compared the GPU to the 3060 Laptop Edition, 3070 Laptop Edition, and 3080 Laptop Edition. It's important to note these ARE NOT the same as the desktop cards. The 3070 Laptop Edition is much slower than the desktop 3070, while the 3080 Laptop is still slower than a desktop 3070 (220W), albeit much closer at >125W and with more CUDA cores.

Assuming the M1 Max in the 16" MBP and Mac Studio are indeed ~20% faster, it is entirely possible that the M1 Max can, with appropriate cooling & power, manage to come close to or match a 3070 Laptop Edition in this game. In that case I think it's fair to say that performance is in line with the middle of the expectations (3070 Laptop) Apple set for the 16" MBP and Mac Studio, and likely similar to or slightly faster than the lower end of expectations (3060 Laptop) for the 14" MBP.

However, it's also possible this game will not run much faster on the 16" MBP / Mac Studio due to being limited by the TLB (tile buffer), the Tile Rendering architecture of the Apple Silicon GPU, some other aspect of the Apple Silicon GPU, drivers, or just being plain slower in this particular game, so more data is needed.

In summary, if performance does scale up with the 16" MBP and Mac Studio, I'd call it a very good result (~3070 Laptop equivalent) versus the expectations set, whereas if it doesn't, it's still an acceptable result, but at the lower end of the expectations Apple set and, honestly, a bit disappointing.
The comparison with the 3080 Laptop, however, is unlikely to hold up very well in this game without MetalFX Upscaling.

Nonetheless, particularly when making use of MetalFX upscaling all configurations of the M1 Max become very competitive against their PC counterparts.

Furthermore, the Mac can do all of this at just 22W total system power, so gaming on the go is actually viable on battery. You can even get a nice-looking 30 or even 60FPS experience using Low Power Mode at only 10W(!)
 

Attachments

  • Picture2.png (334.5 KB)
  • Picture3.png (338.5 KB)
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,158
2,467
OBX
Nice analysis. Did you happen to test the upscaling on the PC side for image quality and performance comparisons?
 

leman

macrumors Core
Oct 14, 2008
19,308
19,298
I wonder what the "quality" and "performance" modes actually translate to in terms of MetalFX. The API offers a spatial upscaler and a temporal upscaler; I haven't seen any additional quality settings. I suppose Village is using the temporal upscaler. Maybe "performance" dramatically lowers the rendered resolution?
 

diamond.g

macrumors G4
Mar 20, 2007
11,158
2,467
OBX
MetalFX Upscaling session at this year's WWDC.

What does the technology behind MetalFX Upscaling look more like: Nvidia's DLSS or AMD's FSR?
MetalFX can look like either one, presuming you are referring to FSR 1 and not FSR 2, which is equivalent to DLSS in terms of the kind of scaler it is.
 
  • Like
Reactions: Xiao_Xi

maflynn

macrumors Haswell
May 3, 2009
73,575
43,560
Cross platform comparison:

Comparison Platforms:
Windows PC: R9 3900X with RTX 3070
(*RTX 3070 @ 45% power (100W) is the lowest my GPU can go. This should be somewhat similar to the fastest RTX 3070 Laptop Edition, although those ALSO have ~15% fewer CUDA cores on top of being power limited, so they're likely a bit slower. @125W it should be similar to how a lot of 3080 Laptops perform.)
PlayStation 5 (I may add this later if time permits, but TLDR should be a locked Interlaced 4K60 without RT)
Great analysis and comparison.

Here's my take away.
The 14" M1 Max with 32 GPU cores and 32GB of ram goes about 3,100 dollars. If you want to compare that to a desktop, its probably 1,500 dollars more expensive but it shows itself holding it own

Compared against gaming laptops, the Asus ROG G14 is about $1,700 to $1,900, and the Razer is about $2,300 for an RTX 3070 Ti-equipped laptop. The advantage the Mac brings is its long battery life and cool running. I own a 15" Razer and tried a 14"; the 14"'s fans were constantly going since it's a thin laptop with a hot-running CPU/GPU. At the end of the day, I returned the 14" Razer and bought a 14" Mac.

The only downside I see, and it's a show-stopping downside, is the lack of games available. I don't do 3D modeling and I don't use apps that would take advantage of Metal, but I do play the occasional game. This is where the battery, energy, and cool-running advantages fail to move people to buy the Mac - at least the segment of people for whom gaming is important.
 
  • Like
Reactions: AxiomaticRubric

Malus120

macrumors 6502a
Original poster
Jun 28, 2002
679
1,412
Nice analysis. Did you happen to test the upscaling on the PC side for image quality and performance comparisons?
Briefly, I didn't have time to go in depth today. Maybe I'll add some more/change the graphs around if I have time (I have another 3070 that might be able to do a lower power limit.)
That said, the PC only has access to Interlaced Mode (checkerboard rendering) in this title, so Apple Silicon Macs actually have a significant advantage in image reconstruction here, as MetalFX Upscaling Quality Mode looks quite a bit better than Interlaced Mode (heck, it even looks better targeting a lower output resolution like 1700-1800P than Interlaced Mode does at 4K).

I wonder what the "quality" and "performance" modes actually translate to in terms of MetalFX. The API offers a spatial upscaler and a temporal upscaler; I haven't seen any additional quality settings. I suppose Village is using the temporal upscaler. Maybe "performance" dramatically lowers the rendered resolution?
As I said in my (admittedly rather long) analysis, I strongly believe (95% confidence) that they are using the temporal solution.
The physical makeup of the image looks like temporal reconstruction, and frankly speaking, the image quality in Quality Mode is far too high for it to be a spatial-only technique. Similarly, the artifacting in Performance Mode doesn't look like the kind of artifacting you get from something like FSR 1.0 (spatial upscaling).
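To illustrate why temporal reconstruction can resolve detail that spatial upscaling fundamentally can't, here's a toy 1-D sketch (my own illustration, nothing to do with MetalFX's or RE Engine's actual internals): each frame samples only half the pixel positions with an alternating jitter, yet blending those jittered samples into an accumulation buffer converges on the full-resolution signal, which no single half-res frame contains.

```python
# Toy 1-D temporal accumulation: half-res jittered samples converge on
# the full-res signal over time.

truth = [((i * 37) % 11) / 10 for i in range(8)]  # arbitrary full-res signal

buf = [0.0] * 8   # accumulation buffer (starts empty)
alpha = 0.2       # blend weight for each new sample

for frame in range(200):
    offset = frame % 2             # alternate sub-pixel offset each frame
    for i in range(offset, 8, 2):  # this frame shades only half the pixels
        buf[i] = (1 - alpha) * buf[i] + alpha * truth[i]

err = max(abs(b - t) for b, t in zip(buf, truth))
print(f"max error vs full-res signal after 200 jittered frames: {err:.1e}")
```

A spatial upscaler, by contrast, can only interpolate within the one frame it's given, so it can never recover detail between the samples; that's why the Quality Mode resolve being this good points to a temporal technique.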


Great analysis and comparison.

Here's my take away.
The 14" M1 Max with 32 GPU cores and 32GB of ram goes about 3,100 dollars. If you want to compare that to a desktop, its probably 1,500 dollars more expensive but it shows itself holding it own

Compared against gaming laptops, the Asus ROG G14 is about $1,700 to $1,900, and the Razer is about $2,300 for an RTX 3070 Ti-equipped laptop. The advantage the Mac brings is its long battery life and cool running. I own a 15" Razer and tried a 14"; the 14"'s fans were constantly going since it's a thin laptop with a hot-running CPU/GPU. At the end of the day, I returned the 14" Razer and bought a 14" Mac.

The only downside I see, and it's a show-stopping downside, is the lack of games available. I don't do 3D modeling and I don't use apps that would take advantage of Metal, but I do play the occasional game. This is where the battery, energy, and cool-running advantages fail to move people to buy the Mac - at least the segment of people for whom gaming is important.
I generally agree. I wouldn't recommend anyone buy a 14" MBP just for gaming, but I sure do love mine. That said, as Linus and others have said, if games were actually more available on the Mac, these would be some of the best gaming laptops on the market.
The point wasn't so much to judge it as a "gaming laptop" as to judge it against the claims Apple initially made when the chips were announced (== 3070/3080 Laptop).
 

diamond.g

macrumors G4
Mar 20, 2007
11,158
2,467
OBX
I thought DLSS and FSR use different technology because DLSS uses deep learning and FSR doesn't.
FSR 1.0 is spatial; FSR 2.0 is temporal. AFAIK DLSS 2/3 doesn't actually use ML (the original DLSS did, and the results were not consistent).
 
  • Like
Reactions: Xiao_Xi

maflynn

macrumors Haswell
May 3, 2009
73,575
43,560
The point wasn't so much to judge it as a "gaming laptop" as to judge it against the claims Apple initially made when the chips were announced (== 3070/3080 Laptop).
Oh I know, but at the moment, the only point of reference that I can use is games.

they'd be some of the best gaming laptops on the market
Perhaps, but for me, if the Fallout games, and specifically Fallout 76, ran on my Mac, then about 76% (see what I did there) of my requirements would be satisfied. The other 24% are work-related, i.e., using VPN, database, and enterprise tools for my job.
 

neinjohn

macrumors regular
Nov 9, 2020
107
70
Surely this is the first time a graphically intensive game released the same year as the hardware can run at 4K60 on battery for more than five minutes (if at all)?
 
  • Like
Reactions: AxiomaticRubric

mi7chy

macrumors G4
Oct 24, 2014
10,495
11,155

What graphics preset are you using for the individual results in the bar graph, Recommended, Prioritize Performance, Balanced, Prioritize Graphics or Max? That key piece of info can have significant performance difference.

Also need to see in-game screenshots to spot any differences in image quality. Several people have noticed that, for example, native resolution in the macOS version of the game isn't the same as native resolution in the PC version. On macOS, if you look at the missing parts of the tree branches, it looks like native resolution is downscaled to ~80% resolution, which would put less load on the GPU, so it's not a direct one-to-one comparison. To mimic that on PC, take screenshots with Image Quality set to both 0.8 (80%) and 1.0 (100%) for comparison.

MacOS 3024x1964 max preset

PC 3200x1800 max preset 80% resolution scaling

PC 3200x1800 max preset 100% resolution scaling

Furthermore the Mac can do all of this at just 22W total system power, so gaming on the go is actually viable on battery.

The M1 Max GPU has a max power consumption of close to 60W, not accounting for the CPU. You mentioned GPU load while gaming was at or near 100% with throttling, yet it's using only 22W total system power? That doesn't seem right. Additionally, 22W wouldn't even tax a single cooling fan, and the 14" MBP has two active cooling fans, which should be good for at least 80W if not more. Can you confirm by measuring with a wall watt meter?



Once you have MacOS graphics preset plus resolution you can use this chart to get an idea of what GPU it's equivalent to.

https://www.notebookcheck.net/Resident-Evil-Village-Performance-Analysis.538098.0.html
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,308
19,298
The M1 Max GPU has a max power consumption of close to 60W, not accounting for the CPU.

M1 Max peak GPU power draw is 40 watts exactly, at least according to the internal sensors. The only time I managed to hit that was running a custom compute shader designed to permanently saturate all GPU compute units. All the games I've tried to date have the GPU drawing 20-25 watts at most. Clearly Apple has some work to do with regard to shader occupancy.
 
  • Like
Reactions: killawat

Malus120

macrumors 6502a
Original poster
Jun 28, 2002
679
1,412
What graphics preset are you using for the individual results in the bar graph, Recommended, Prioritize Performance, Balanced, Prioritize Graphics or Max? That key piece of info can have significant performance difference.

Also need to see in-game screenshots to spot any differences in image quality. Several people have noticed that, for example, native resolution in the macOS version of the game isn't the same as native resolution in the PC version. On macOS, if you look at the missing parts of the tree branches, it looks like native resolution is downscaled to ~80% resolution, which would put less load on the GPU, so it's not a direct one-to-one comparison. To mimic that on PC, take screenshots with Image Quality set to both 0.8 (80%) and 1.0 (100%) for comparison.

MacOS 3024x1964 max preset

PC 3200x1800 max preset 80% resolution scaling

PC 3200x1800 max preset 100% resolution scaling



The M1 Max GPU has a max power consumption of close to 60W, not accounting for the CPU. You mentioned GPU load while gaming was at or near 100% with throttling, yet it's using only 22W total system power? That doesn't seem right. Additionally, 22W wouldn't even tax a single cooling fan, and the 14" MBP has two active cooling fans, which should be good for at least 80W if not more. Can you confirm by measuring with a wall watt meter?



Once you have MacOS graphics preset plus resolution you can use this chart to get an idea of what GPU it's equivalent to.

https://www.notebookcheck.net/Resident-Evil-Village-Performance-Analysis.538098.0.html
Thanks for the feedback.

I added the settings to the top post, but they are custom settings based on Digital Foundry's optimized PC settings. As I note in the top post, my goal when I started wasn't to create something easily comparable with the notebookcheck graph. I was initially looking primarily at settings impact and the quality of MetalFX Upscaling on Mac before deciding to do the comparison for fun.
As I'm only testing the title screen / hallway, and taking average framerate by eye, I don't feel it's fair to compare the results to a PC which has been through an actual stress-test run.
That said, if time allows I'll go back and try the presets, and once I get further into the Mac version I'll try to test it against the PC in more complex scenes. Unfortunately I only have the Gold Demo on PC (so only one hour of play in the Castle).

While I'm happy to provide screenshots (as time allows; I was in a hurry, so I just took pictures of the screen on PC and only have Mac screenshots ATM), and will happily look into the resolution question casually when I have some spare time, I'm not going to compare the Mac at native 4K to the PC at 80% resolution in depth (at least not at this time).

I feel like that kind of pixel counting work, similar to determining the exact reconstruction resolution, is a job best left for people like Digital Foundry who are experts in this kind of analysis (if such analysis is not forthcoming I'm willing to be part of a collaborative effort to look into it, but given DF's interest in reconstruction, and stated interest in Metal 3, I'm hopeful they'll take a look)

For example, it's entirely possible that the Mac isn't rendering at a lower resolution but is instead failing to render some elements of the scene correctly (or at all), which at first glance might appear to be a lower resolution when it's really more complicated than that.
 
Last edited:
  • Like
Reactions: leifp

diamond.g

macrumors G4
Mar 20, 2007
11,158
2,467
OBX
Thanks for the feedback.

I added the settings to the top post, but they are custom settings based on Digital Foundry's optimized PC settings. As I note in the top post, my goal when I started wasn't to create something easily comparable with the notebookcheck graph. I was initially looking primarily at settings impact and the quality of MetalFX Upscaling on Mac before deciding to do the comparison for fun.
As I'm only testing the title screen / hallway, and taking average framerate by eye, I don't feel it's fair to compare the results to a PC which has been through an actual stress-test run.
That said, if time allows I'll go back and try the presets, and once I get further into the Mac version I'll try to test it against the PC in more complex scenes. Unfortunately I only have the Gold Demo on PC (so only one hour of play in the Castle).

While I'm happy to provide screenshots (as time allows; I was in a hurry, so I just took pictures of the screen on PC and only have Mac screenshots ATM), and will happily look into the resolution question casually when I have some spare time, I'm not going to compare the Mac at native 4K to the PC at 80% resolution in depth (at least not at this time).

I feel like that kind of pixel counting work, similar to determining the exact reconstruction resolution, is a job best left for people like Digital Foundry who are experts in this kind of analysis (if such analysis is not forthcoming I'm willing to be part of a collaborative effort to look into it, but given DF's interest in reconstruction, and stated interest in Metal 3, I'm hopeful they'll take a look)

For example, it's entirely possible that the Mac isn't rendering at a lower resolution but is instead failing to render some elements of the scene correctly (or at all), which at first glance might appear to be a lower resolution when it's really more complicated than that.
DF has at least one person on the Beyond3D forum. It may be worthwhile to ask them about it. I am curious if @leman knows who it is.
 

leifp

macrumors 6502
Feb 8, 2008
368
355
Canada
Nice teaser for the (certainly) upcoming DF analysis! Got me thirsting a bit more for it. I have a desktop PC for gaming but dislike the way performance is often measured between systems of disparate components and operating systems.

(DF asked supporters if they’d like to see a Mac RE VIII deep dive and there were many requests, including my own)
 
  • Like
Reactions: Malus120

Malus120

macrumors 6502a
Original poster
Jun 28, 2002
679
1,412
Nice teaser for the (certainly) upcoming DF analysis! Got me thirsting a bit more for it. I have a desktop PC for gaming but dislike the way performance is often measured between systems of disparate components and operating systems.

(DF asked supporters if they’d like to see a Mac RE VIII deep dive and there were many requests, including my own)
Haha, that's part of the reason I was so desperate to get this post out as soon as possible once I started compiling data... I know that once Digital Foundry (hopefully) releases their video (which they'd better be working on), my (game-specific) analysis becomes much less interesting, to say the least.

That said I'm really excited for their video, there's so much to unpack here and it goes way beyond what myself and most normal people can do. I (or someone else) will link it to this thread when/if it's released as I think it should propel the discussion forward.

Anyway, it's still a bit unclear whether RE Village is a perfect port in terms of raw performance/image quality (although it's probably about as close to perfect as we're ever likely to get), or how well Apple Silicon can compete in these kinds of workloads, but nonetheless MetalFX Upscaling in Quality Mode looks great. Apple hit it out of the park, IMHO.
Now let's just hope they can get Performance Mode looking (much) better, and we'll have a very competitive upscaling solution on the Mac. Combined with Apple Silicon's baseline performance being quite high relative to the majority of the PC market (not just gaming PCs), that will hopefully lead to more ports (thanks to a much larger total addressable market for games than Apple has ever had before). It would also be interesting to see if they could translate this work to some production workloads... 🤔
 
  • Like
Reactions: leifp

mi7chy

macrumors G4
Oct 24, 2014
10,495
11,155
Haven't followed DF, but there are comprehensive comparisons out there for upscaling technologies, such as Techspot's review of DLSS, FSR, and XeSS. They just need to add MetalFX for completeness. In the end, upscaling is a last resort compared to true native resolution, since even DLSS has upscaling artifacts in my own testing.

https://www.techspot.com/article/2558-dlss-vs-xess-vs-fsr/
 

Malus120

macrumors 6502a
Original poster
Jun 28, 2002
679
1,412
Haven't followed DF, but there are comprehensive comparisons out there for upscaling technologies, such as Techspot's review of DLSS, FSR, and XeSS. They just need to add MetalFX for completeness. In the end, upscaling is a last resort compared to true native resolution, since even DLSS has upscaling artifacts in my own testing.

https://www.techspot.com/article/2558-dlss-vs-xess-vs-fsr/
Good to know!

That said, I really disagree with you on temporal upscaling being a last resort (and it's ok to disagree!)

High resolution displays are great, but what we've lost in terms of motion clarity/temporal resolution in the race to thinner, HD, 4K, and now 8K panels is honestly pretty shocking. My primary TV until 2020 was a 1080P plasma I bought in 2012, so I didn't realize how bad it was until I "upgraded" to a 4K OLED when the PS5 came out.

You say you don't like the artifacts temporal upscaling produces versus native resolution, but IMHO what's actually much worse is the degree of artifacting inherently present on almost all modern TVs/monitors in motion. On OLED it's jerkiness (instant pixel response), while LCDs blur (in some circumstances this can look "better," but both look bad).

The motion clarity of basically any modern screen is terrible at 24-30Hz, barely passable, but not great at 40-60Hz, and only really starts to become acceptable (but still nowhere near CRT or Plasma) at around 90Hz. Black Frame Insertion (BFI) helps to bring motion clarity back up to good levels, but TV and monitor manufacturers are inconsistent in their implementation (likely because the average consumer doesn't understand what it does and just sees a darker screen)
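To put rough numbers on the sample-and-hold problem: while your eye tracks a moving object, each frame is held on screen for the full refresh period, so the object smears across your retina by (speed x hold time) pixels. This is standard persistence arithmetic, not specific to any particular display:

```python
def persistence_blur_px(speed_px_per_s, refresh_hz, duty_cycle=1.0):
    """Perceived motion blur (in pixels) for eye-tracked motion on a
    sample-and-hold display. duty_cycle < 1 models BFI/strobing."""
    hold_time_s = duty_cycle / refresh_hz
    return speed_px_per_s * hold_time_s

speed = 1920  # px/s: an object crossing half a 4K-wide screen each second
for hz in (30, 60, 120):
    print(f"{hz} Hz: {persistence_blur_px(speed, hz):.0f} px of smear")
print(f"120 Hz + BFI (50% duty): {persistence_blur_px(speed, 120, 0.5):.0f} px")
```

Doubling the framerate halves the smear, which is exactly why trading a little reconstruction artifacting for a lot more FPS can be a net win for motion clarity.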

So... We need higher frame rates, but developers obviously want to continue pushing the boundaries of visual fidelity.
Not everyone wants an RTX 4090 (super expensive, hot, and isn't going to fit into a laptop or even SFF anytime soon) and even Nvidia is acknowledging via DLSS 2 & 3 that brute force alone isn't going to cut it.
How do we get higher framerates while still pursuing higher fidelity? Temporal reconstruction techniques like MetalFX Upscaling, DLSS, FSR 2.x, XeSS, TSR (UE5), and checkerboard rendering are currently the only option; otherwise, no matter how powerful GPUs get, compute resources are wasted merely rendering ever-higher resolutions.

Of course some people don't mind sample and hold blur (the artifacting in modern displays) and just like high native resolutions, if you're one of those people more power to you!

But for me personally? I'll take a convincing temporal upscaling technique with some minor artifacting any day of the week if it'll give me more performance (and thus more temporal (in motion) resolution and less sample and hold blur).
 
Last edited:

diamond.g

macrumors G4
Mar 20, 2007
11,158
2,467
OBX
Now that AMD has announced their version of Frame Generation, I assume it will be open source. With that said, I wonder how long till Apple adopts a similar solution.
 