Is NVENC faster than QuickSync? (As an aside, why in the world are you not using a Mac to do video editing?)

Ehh, you said it wasn't in the recommended listing, not the required listing. I think it and Cyberpunk 2077 are the only two games that currently recommend high-end GPUs for any meaningful ray tracing settings.

Don't know if it's faster. Adobe programs benefit from CUDA and NVIDIA, which is why I use them.

I do use a Mac. I have 7 systems in my workflow: Mac mini, 16" MacBook Pro, Mac Studio, Mac Pro (2010, still going strong), 27" iMac, and 2 Windows PCs.

But After Effects on M1 is still experiencing issues, so I use After Effects on Windows. For final production I use Final Cut Pro, as even my M1 Mac mini beats out my 3080 using Premiere Pro.

And a 2060 is recommended for Dying Light. A 2060 is not really a high-end GPU.
 
Don't know if it's faster. Adobe programs benefit from CUDA and NVIDIA, which is why I use them.

I do use a Mac. I have 7 systems in my workflow: Mac mini, 16" MacBook Pro, Mac Studio, Mac Pro (2010, still going strong), 27" iMac, and 2 Windows PCs.

But After Effects on M1 is still experiencing issues, so I use After Effects on Windows. For final production I use Final Cut Pro, as even my M1 Mac mini beats out my 3080 using Premiere Pro.
Ah. That is fair.
 
I think quality-wise, QuickSync is better. But it takes up CPU, so I just use NVENC for my screen recordings and gameplay videos. And recording at 50 Mbps, I don't notice a difference.
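For anyone who wants to test this rather than eyeball it, here is a minimal sketch of how you might encode the same clip with both hardware encoders through ffmpeg and compare time and file size. It assumes an ffmpeg build with both h264_nvenc and h264_qsv available, and the input file name is just a placeholder:

# Rough sketch: encode the same clip with NVENC and Quick Sync via ffmpeg,
# then compare how long each takes and how big the files end up.
import os
import subprocess
import time

SOURCE = "gameplay_capture.mkv"  # hypothetical input recording

def encode(video_codec, output):
    """Encode SOURCE at 50 Mbps with the given hardware encoder; return seconds taken."""
    start = time.time()
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-c:v", video_codec,   # "h264_nvenc" (NVIDIA) or "h264_qsv" (Intel Quick Sync)
         "-b:v", "50M",         # the 50 Mbps figure mentioned above
         "-c:a", "copy",
         output],
        check=True,
    )
    return time.time() - start

for codec in ("h264_nvenc", "h264_qsv"):
    out_file = f"test_{codec}.mp4"
    elapsed = encode(codec, out_file)
    size_mb = os.path.getsize(out_file) / 1e6
    print(f"{codec}: {elapsed:.1f} s, {size_mb:.0f} MB")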
My co-workers do a bunch of Twitch streaming while gaming. It isn't my cup of tea (well, that and my upload is garbage). I wonder if an M1 Mac would be a good Twitch streaming tool. Seems like it could let you get full frames on your main rig and send the video to the Mac for processing and upload for live broadcasts.
 
My co-workers do a bunch of Twitch streaming while gaming. It isn't my cup of tea (well, that and my upload is garbage). I wonder if an M1 Mac would be a good Twitch streaming tool. Seems like it could let you get full frames on your main rig and send the video to the Mac for processing and upload for live broadcasts.
I do know a lot of people have two systems while streaming: one for the gameplay and one for what you describe. I never tried it with a Mac. I wonder what capture cards work with Macs these days. As you have seen, I don't play highly demanding games. Even when I do, I have no problem lowering the quality (I have done that a few times with my 1080). So I just have everything on one computer, and I record rather than stream. I also do lectures and tutorials on various topics, so I still use OBS for that along with gameplay recordings.

I will say, the Mac did NOT like my AMD 5700 XT, which is why I went back to my 1080 and then finally upgraded to the 3080. Adobe on Windows was crashing a lot with the 5700 XT too, so I was just done with AMD. The Mac did not like OBS recordings that used AMD encoders, but NVENC recordings are just fine.
 
How pathetic it is for you. That is the problem with these gaming threads. Minecraft, WoW, Stardew Valley, Factorio, Terraria, Borderlands 2: every one of these is on Mac, and every one of these I have well over 300 hours in. The AAA gaming industry is getting far too ridiculous for my liking, with loot boxes, season passes, microtransactions, and half-*** releases where even my 3080 Ti struggles because they don't optimize well.

Minecraft? Stardew Valley? Seriously ... we may as well start listing titles like Words with Friends then too!

Nobody's saying you can't enjoy those games or spend hundreds of hours on them. But realistically, your argument against the "AAA gaming industry" is essentially one that says you're rejecting everything cutting edge that would motivate someone to spend more for a fast 3D GPU in the first place.

I hate microtransactions and the loot boxes they make you purchase, too. But those problems don't even affect me with my PC gaming. I've been playing a lot of Battlefield 2042 recently, for example. It started out full of bugs and not well liked by a lot of players... but they've been patching it regularly (a new patch just dropped for it last night), and they've added a new soldier class, new maps, and fixed a LOT of the bugs now. It's a pretty great multiplayer game. And nope... almost no chance they'll do a native Mac edition.

Before this, I played a lot of Fallout 76 and Overwatch. Again, it would be great if I could have played them on my Mac, but nope.
 
"While a lot of AAA games keep improving their graphics, their gameplay has stagnated" and "the Mac is a poor platform for AAA titles" can simultaneously be true.
 
Is NVENC faster than QuickSync? (As an aside, why in the world are you not using a Mac to do video editing?)
You should not only look at encoding speed but also quality and file size. Hardware encoders are usually not that great, as you can see with the M1.
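To put a rough number on the quality side of that trade-off, ffmpeg's ssim filter can score an encode against the original. A small sketch, assuming the placeholder file names from the example above; SSIM is only one metric, so treat the score as indicative rather than definitive:

# Compare a hardware-encoded file against the original recording using SSIM.
# Scores close to 1.0 mean the encode is visually close to the reference.
import subprocess

REFERENCE = "gameplay_capture.mkv"   # original recording (placeholder)
ENCODED = "test_h264_nvenc.mp4"      # hardware-encoded copy to grade (placeholder)

result = subprocess.run(
    ["ffmpeg", "-i", ENCODED, "-i", REFERENCE,
     "-lavfi", "[0:v][1:v]ssim", "-f", "null", "-"],
    capture_output=True, text=True,
)
# ffmpeg prints the SSIM summary to stderr; pull out the relevant lines.
for line in result.stderr.splitlines():
    if "SSIM" in line:
        print(line)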
 
Actually, the Nvidia GPU is the performance-per-watt king. A 70W mobile Nvidia 3060 is 3x faster than a MacBook Pro M1 Max 32-core GPU and 2x faster than a Mac Studio M1 Ultra 64-core GPU in Blender rendering.

Do they fit in an enclosure the size of the Mac Studio?

 
Do they fit in an enclosure the size of the Mac Studio?

The desired performance should dictate the case size and cooling, not the other way around. So what kind of question is that, if someone needs that performance?

Design follows function (and not vice versa).
 
I am typing this response from my 5K iMac, which I adore (huge, beautiful screen and a sleek, compact form factor), and it's frustrating how Apple seems to be the only company that knows how to make a decent AIO PC.
The desired performance should dictate the case size and cooling, not the other way around. So what kind of question is that, if someone needs that performance?

Design follows function (and not vice versa).
That's the engineering way of thinking, which is what led me down the path of embracing the Apple ecosystem in its entirety, because Apple is at its core a design-led company, with Apple designers calling the shots, searching for and having technology made to serve the product experience, rather than engineers being excited about hot new tech and then trying to turn it into a product.

Is there no value in being able to tuck your desktop neatly under your monitor, or in having it chug quietly away in the background, with no loud fan noise to distract you or interrupt your recording? And the Mac Studio still winds up being cheaper than an equivalently specced Windows PC, even after all those markups.

I can't quantify these the same way I can compare single-core benchmarks on a spreadsheet, but I will wager that they do matter to the creative market that Apple is evidently trying to court here. And can you imagine what an M1 Pro-equipped iMac would look like?

I guess my answer to your "so what" is that not everything which can be measured matters, just as not everything which matters can be measured. I am willing to concede that Intel may have an edge when it comes to absolute performance in a vacuum, but are we saying that it is the only thing which matters to the end user?
 
I am typing this response from my 5K iMac, which I adore (huge, beautiful screen and a sleek, compact form factor), and it's frustrating how Apple seems to be the only company that knows how to make a decent AIO PC.
I don't like AIOs (in my eyes, more disadvantages than advantages), and I like the fact that there is now a Mac Studio instead of a 27" iMac.

That's the engineering way of thinking, which is what led me down the path of embracing the Apple ecosystem in its entirety, because Apple is at its core a design-led company, with Apple designers calling the shots, searching for and having technology made to serve the product experience, rather than engineers being excited about hot new tech and then trying to turn it into a product.
But there were also some Macs with problems, or outright bad products, because they only looked at appearance. That's why I don't think Jony Ive is a good designer. Much of what Ive did was not design, but merely styling.
With Ive, however, "looking good" was the be-all and end-all, and functionality and ease of use were carelessly sacrificed whenever there was a conflict. This resulted in round mice that caused wrist inflammation, mice with the charging port on the bottom, and Macs with all ports on the back, even those meant to be plugged and unplugged constantly (USB, SD card slots). Some Macs also had poor cooling. With the Mac Studio, this was finally corrected (and even there, some things are suboptimal, like perhaps the small ventilation holes, but it is a step in the right direction).

Is there no value in being able to tuck your desktop neatly under your monitor, or in having it chug quietly away in the background, with no loud fan noise to distract you or interrupt your recording?
In principle, this is also possible with a PC, even though Apple has delivered a very good package here that sets a high standard.

And a tower isn't so bad either.
A good tower can be quieter than a Mac Studio: large heatsinks and large, slowly rotating fans can provide good, quiet cooling. A tower normally sits under the desk where you don't see it, so it's further away from your ears.

And the Mac Studio still winds up being cheaper than an equivalently specced Windows PC, even after all those markups.

I don't know current PC components and prices that well anymore, but I still think this is a general statement that is not generally true. For one thing, you can install a more powerful graphics card in a PC if you need it.


I can't quantify these the same way I can compare single-core benchmarks on a spreadsheet, but I will wager that they do matter to the creative market that Apple is evidently trying to court here.
That's just talk. If you need power, you need power. And that was also the original post:

"Actually, Nvidia GPU is the performance per watt king. 70W mobile Nvidia 3060 is 3x faster than Macbook Pro M1 Max 32GPU and 2x faster than Mac Studio M1 Ultra 64GPU on Blender rendering."
Your response: "Do they fit in an enclosure the size of the Mac Studio?"


Someone who really uses Blender a lot probably has corresponding priorities.

And can you imagine what an M1 Pro-equipped iMac would look like?
I don't care.

I guess my answer to your "so what" is that not everything which can be measured matters, just as not everything which matters can be measured. I am willing to concede that Intel may have an edge when it comes to absolute performance in a vacuum, but are we saying that it is the only thing which matters to the end user?
OK
 
Minecraft? Stardew Valley? Seriously ... we may as well start listing titles like Words with Friends then too!

Nobody's saying you can't enjoy those games or spend hundreds of hours on them. But realistically, your argument against the "AAA gaming industry" is essentially one that says you're rejecting everything cutting edge that would motivate someone to spend more for a fast 3D GPU in the first place.

I hate microtransactions and the loot boxes they make you purchase, too. But those problems don't even affect me with my PC gaming. I've been playing a lot of Battlefield 2042 recently, for example. It started out full of bugs and not well liked by a lot of players... but they've been patching it regularly (a new patch just dropped for it last night), and they've added a new soldier class, new maps, and fixed a LOT of the bugs now. It's a pretty great multiplayer game. And nope... almost no chance they'll do a native Mac edition.

Before this, I played a lot of Fallout 76 and Overwatch. Again, it would be great if I could have played them on my Mac, but nope.
So again, as always with these threads, it's "my game isn't on Mac, therefore Macs suck at gaming." Windows doesn't even have 100% of the console games (Persona 5 is the only reason I have a PlayStation). Does that mean Windows sucks at gaming since it doesn't have that game? And until August or so, Windows won't even have Spider-Man.

This has nothing to do with hardware, but with market share and business decisions. You think PC hardware is JUST NOW ready to play Spider-Man? Something I was able to play 4 years ago on my base PS4? No. Macs can play the games listed. Heck, some of the popular games made it to the Nintendo Switch, which had an outdated GPU even at launch.

So if it takes Windows 4 years to get some games, why would we expect anything faster for Mac?

And maybe we should start rejecting those mega pushes on GPUs. The fact that we are ALMOST at the point where a rumored 4090 (with other components) or the 50 series is almost too much for a single household outlet is crazy. I had to upgrade to a 1000W PSU for my 3080 Ti with all my other components.
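For a rough sense of the headroom involved, here is a back-of-envelope sketch. Every component wattage below is an illustrative guess, and the outlet math assumes a common US 15 A / 120 V circuit where continuous loads are usually kept to about 80% of the breaker rating:

# Back-of-envelope power budget; all component figures are illustrative guesses.
components_w = {
    "GPU (3080 Ti class, peak)": 350,
    "CPU (high-end desktop, boost)": 250,
    "Board, RAM, drives, fans": 100,
    "Monitor and peripherals": 100,
}
system_draw = sum(components_w.values())

outlet_w = 15 * 120                  # 15 A * 120 V US circuit = 1800 W
continuous_budget = 0.8 * outlet_w   # ~1440 W for a sustained load

print(f"Estimated system draw:    {system_draw} W")
print(f"Continuous outlet budget: {continuous_budget:.0f} W")
print(f"Headroom remaining:       {continuous_budget - system_draw:.0f} W")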
 
Do they fit in an enclosure the size of the Mac Studio?


What are you talking about? A mobile Nvidia 3060 dGPU goes into a 3.5 lb laptop. Is the Mac Studio 3.5 lb? No, it's 7.9 lb, so more than double the weight, double the power consumption, but half the performance.
 
The CPU + GPU + RAM of that MacBook Pro uses less power than your 70W for the GPU alone.

Nvidia is frequently performance king, but they're worse than Apple, AMD, or Intel at performance per watt.

Wrong again. Apple shows the MacBook Pro M1 Max uses ~32W for the CPU plus ~60W for the GPU. A 3060 laptop pulls ~100W total from the wall under full load, so they're comparable in power consumption, but the x64 CPU is slightly faster and the 3060 GPU is 2.6x faster.

CPU
3m55.81s - AMD 5800H base clock no-boost and no-PBO overclock (CPU Blender 3.0)
4m11s - M1 Pro (CPU Blender 3.1 alpha)

GPU
16.39s - Nvidia 3060 70W mobile (GPU OptiX Blender 3.0)
42.79s - M1 Max 32GPU (GPU Metal Blender 3.1 alpha)
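For anyone who wants to reproduce that kind of comparison, Blender can render headlessly from the command line, which makes it easy to time. A minimal sketch, assuming Blender is on the PATH and benchmark_scene.blend is a placeholder for whatever scene you are testing; the device string would be OPTIX on an NVIDIA card, METAL on Apple Silicon builds, or CPU:

# Time a single-frame Cycles render on different devices via Blender's CLI.
# "benchmark_scene.blend" is a placeholder; swap in your own scene file.
import subprocess
import time

SCENE = "benchmark_scene.blend"

def render_seconds(device):
    """Render frame 1 of SCENE in the background with Cycles on the given device."""
    start = time.time()
    subprocess.run(
        ["blender", "-b", SCENE, "-E", "CYCLES", "-f", "1",
         "--", "--cycles-device", device],
        check=True,
    )
    return time.time() - start

for device in ("CPU", "OPTIX"):   # use "METAL" instead of "OPTIX" on Apple Silicon
    print(device, f"{render_seconds(device):.1f} s")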
 
The desired performance should dictate the case size and cooling, not the other way around. So what kind of question is that, if someone needs that performance?

Design follows function (and not vice versa).
Yass! Make the Studio 2x taller. Bring back the Cube!!!!!* Resistance is futile.

*Multiple exclamation points. The sure signs of a diseased mind. Source: multiple Terry Pratchett books. 😁
 
Nothing requires a 3080 Ti. Even the popular Elden Ring works fine on my GTX 1080. I have never once even seen a high-end 20 series GPU in the recommended listing for games.

So, so vastly inaccurate. I bring my 3080 Ti to its knees easily, usually because I exceed Nvidia's idiotically low 12GB of VRAM. I very much want to give my $$$ to AMD next time around, since they actually include a reasonable amount of VRAM for the $$$, but VR issues remain with AMD as compared to Nvidia, so until VR industry support for AMD as a whole becomes as good as it is for Nvidia, I'm more or less stuck with Nvidia. I can also bring the 3080 Ti to its knees pushing pancake games into 3D+VR via various methods. For example, getting Cyberpunk to produce more than about 40 fps at 4K through vorpX, or Virtual Desktop + SuperDepthVR, or the LukeRoss VR mod is basically impossible if you also want RT. Even the 3090 Ti would not quite suffice for my needs. Here's hoping the next-gen cards ship in a timely manner and are actually available!
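If you want to confirm it really is the 12GB VRAM ceiling you are hitting rather than some other bottleneck, one quick way (NVIDIA-only) is to poll nvidia-smi while the game or VR mod is running. A rough sketch, assuming nvidia-smi is on the PATH and a single GPU:

# Print VRAM usage once per second while a game or VR mod is running (Ctrl+C to stop).
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used_mib, total_mib = (int(v) for v in out.splitlines()[0].split(","))
    print(f"VRAM: {used_mib} / {total_mib} MiB")
    time.sleep(1)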
 
So, so vastly inaccurate. I bring my 3080 Ti to its knees easily, usually because I exceed Nvidia's idiotically low 12GB of VRAM. I very much want to give my $$$ to AMD next time around, since they actually include a reasonable amount of VRAM for the $$$, but VR issues remain with AMD as compared to Nvidia, so until VR industry support for AMD as a whole becomes as good as it is for Nvidia, I'm more or less stuck with Nvidia. I can also bring the 3080 Ti to its knees pushing pancake games into 3D+VR via various methods. For example, getting Cyberpunk to produce more than about 40 fps at 4K through vorpX, or Virtual Desktop + SuperDepthVR, or the LukeRoss VR mod is basically impossible if you also want RT. Even the 3090 Ti would not quite suffice for my needs. Here's hoping the next-gen cards ship in a timely manner and are actually available!
Name one game that CANNOT RUN AT ALL without a 3080 Ti. This is a gaming topic. There might be some professional workload that requires a 3080 Ti, but that is not relevant for this topic.

Not one single game REQUIRES a 3080 Ti to function AT ALL. Running at 4K or 8K ULTRA settings... yes, but that is not a requirement.

This is the problem with topics like this, and we have been dealing with it for over a decade, since I joined the site with the 2010 Mac Pro. How the heck are the PS4 and Xbox One still getting new games with their outdated graphics? How the heck is the Nintendo Switch getting games with its horrible GPU (I mean, it is laughably bad)? It's all about optimization and business decisions, NOT hardware.
 
Name one game that CANNOT RUN AT ALL without a 3080 Ti. This is a gaming topic. There might be some professional workload that requires a 3080 Ti, but that is not relevant for this topic.

Not one single game REQUIRES a 3080 Ti to function AT ALL. Running at 4K or 8K ULTRA settings... yes, but that is not a requirement.

This is the problem with topics like this, and we have been dealing with it for over a decade, since I joined the site with the 2010 Mac Pro. How the heck are the PS4 and Xbox One still getting new games with their outdated graphics? How the heck is the Nintendo Switch getting games with its horrible GPU (I mean, it is laughably bad)? It's all about optimization and business decisions, NOT hardware.

Well, I did just provide one usage scenario in which even a 3080 Ti is woefully insufficient. But yes, you are correct that most games will run on vastly less powerful GPUs. I've got another game rig with an antiquated 4-core i7 and an AMD 580, and it's perfectly capable of running even the latest AAA games, with certain limitations or quality compromises.

And you are absolutely correct: a tremendous amount of it comes down to good programming. Those older consoles can get these games because each is a closed system; the devs know exactly how much they have to optimize their games to get them to work. And some (a very, very few) of them do a really, really good job of it. It's incredible, for example, how good something like GT5 still looks today, and that's a PS3 game (it even offered a stereoscopic mode!).

If devs could be bothered, we would all be playing with much better graphics on significantly lesser hardware, but outside that handful of excellent houses (like Polyphony Digital), it's rare to find truly well-optimized games.

Unfortunately, your hardware being 'good enough' just isn't any guarantee that things will really run well, because most of the devs are just that bad.
 
Well, I did just provide one usage scenario in which even a 3080 Ti is woefully insufficient. But yes, you are correct that most games will run on vastly less powerful GPUs. I've got another game rig with an antiquated 4-core i7 and an AMD 580, and it's perfectly capable of running even the latest AAA games, with certain limitations or quality compromises.

And you are absolutely correct: a tremendous amount of it comes down to good programming. Those older consoles can get these games because each is a closed system; the devs know exactly how much they have to optimize their games to get them to work. And some (a very, very few) of them do a really, really good job of it. It's incredible, for example, how good something like GT5 still looks today, and that's a PS3 game (it even offered a stereoscopic mode!).

If devs could be bothered, we would all be playing with much better graphics on significantly lesser hardware, but outside that handful of excellent houses (like Polyphony Digital), it's rare to find truly well-optimized games.

Unfortunately, your hardware being 'good enough' just isn't any guarantee that things will really run well, because most of the devs are just that bad.
Yep, I agree. Apple doesn't need to cater to the PC Master Race that needs everything maxed at 8K resolution with a water-cooled RGB mess. We can take the lesser-quality games on Mac, just like consoles do. It all comes down to business decisions and market share.

I have mentioned this in dozens of threads now, but as a game developer myself, I am not making my game for Mac purely due to Windows' market share. And my game works fine on Intel integrated graphics from years ago!
 
Yep, I agree. Apple doesn't need to cater to the PC Master Race that needs everything maxed at 8K resolution with a water-cooled RGB mess. We can take the lesser-quality games on Mac, just like consoles do. It all comes down to business decisions and market share.
If Apple had its way, we would still be stuck with old, slow hardware.

It's a fact that PC gaming pushes hardware innovation that benefits not just gamers but everyone in the tech industry.
 
If Apple had its way, we would still be stuck with old, slow hardware.

It's a fact that PC gaming pushes hardware innovation that benefits not just gamers but everyone in the tech industry.
This makes no sense. Apple Silicon isn't good for 4K gaming, but it's beating Intel and NVIDIA in work-related tasks. Video production requires more GPU than most games do. I have one colleague who has the top-end Quadro for their work. It plays games like crap, but it beats my 3080 Ti in what can be done in their software.
 
This makes no sense. Apple Silicon isn't good for 4K gaming, but it's beating Intel and NVIDIA in work-related tasks. Video production requires more GPU than most games do. I have one colleague who has the top-end Quadro for their work. It plays games like crap, but it beats my 3080 Ti in what can be done in their software.

Well, it does make sense in that the chips are optimized properly for specific tasks. This is the way the computing world used to work; ask any Amiga or Atari (ST/TT/Falcon) owner how much faster their computers were than Macs and PCs back in that era. It took Macs and PCs a good decade (not exaggerating) to catch up; yet those Atari and Amiga systems were built with CPUs of about the same raw power as the PCs and Macs. But the OS and the hardware of the Amiga/Atari systems were otherwise light-years ahead, especially in terms of how they were designed to work together with maximum efficiency. Apple has sort of revived this design philosophy, with chips and an OS designed specifically to make optimal use of each other. It's been a long time coming, but it's nice to see it return.
 
Well, it does make sense in that the chips are optimized properly for specific tasks. This is the way the computing world used to work; ask any Amiga or Atari (ST/TT/Falcon) owner how much faster their computers were than Macs and PCs back in that era. It took Macs and PCs a good decade (not exaggerating) to catch up; yet those Atari and Amiga systems were built with CPUs of about the same raw power as the PCs and Macs. But the OS and the hardware of the Amiga/Atari systems were otherwise light-years ahead, especially in terms of how they were designed to work together with maximum efficiency. Apple has sort of revived this design philosophy, with chips and an OS designed specifically to make optimal use of each other. It's been a long time coming, but it's nice to see it return.
I think consoles have the biggest impact on tech in games. With the 20 series and the initial launch of the 30 series GPUs, not many games used ray tracing (not that I care for it; I never enable it). But since the PS5 and Xbox Series consoles launched, we see it more and more. Same with DirectStorage and other features.

Why is it that new games continue to run on a GTX 1080, and run well at 1080p?
 