
UBS28 (original poster)
The RTX 4090 will apparently be 3 times faster than the RTX 3090.

And since the Mac Pro will be targeting video editors who benefit from the GPU, could it be that the M1 Max Quadro will be slower than PCs with a single RTX 4090?

Now the big advantage is that Apple has enough supply to combine 4 M1 Max chips into an M1 Max Quadro, while it will be impossible to get your hands on the RTX 4090 due to the current chip shortage.
 
I don’t think it makes a huge difference. The video market Apple will target is the Final Cut crowd, where the RTX 4090 will be irrelevant.

The true test would be RTX and Premiere vs. M-whatever and Final Cut.
 
The RTX 4090 will apparently be 3 times faster than the RTX 3090
How do you know that? Any benchmarks?

it will be impossible to get your hands on the RTX 4090 due to the current chip shortage.
It looks like Nvidia will have fewer supply problems with its upcoming GPUs, because it has reserved a lot of capacity at TSMC.
 
Unless the graphics card has much better video decode/encode engines and support for ProRes, I don't see a problem.

The single graphics card may be faster at rendering effects but will have trouble with multiple streams of ProRes 4K / 8K footage on a timeline.

The media engine on the M1 Max is already pretty great.
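To put a rough number on the multi-stream point above, here is a hedged back-of-envelope sketch of the aggregate bitrate a ProRes timeline implies. The per-stream rates are ballpark assumptions (actual ProRes data rates vary by codec flavor, frame rate, and content), not Apple's published figures.

```python
# Rough aggregate-bitrate sketch for a multi-stream ProRes timeline.
# Per-stream rates are ballpark assumptions, not measurements.
APPROX_MBPS = {
    "4K": 700,   # assumed ~700 Mb/s for ProRes 422 HQ at 4K/30
    "8K": 2800,  # assumed ~4x the 4K rate (4x the pixel count)
}

def timeline_gbps(streams):
    """streams: list of resolution labels playing on the timeline."""
    return sum(APPROX_MBPS[s] for s in streams) / 1000  # Gb/s

# Eight 4K streams -- the kind of multicam edit the post describes:
print(f"{timeline_gbps(['4K'] * 8):.1f} Gb/s")  # -> 5.6 Gb/s
```

Sustaining several gigabits per second of decode is exactly the workload where a dedicated media engine beats raw GPU shader throughput.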
 
I'm a PC gamer, and this 3090 is huge, hot, and draws a lot of power. It's loud under full gaming load, and the 4090 is rumored to be even more power-hungry for, yes, even more performance.

It's likely the Mac Pro will be silent, and maybe even more powerful, depending on the 4x M1 Max or whatever they are trying to do. I don't think anyone is in trouble; these things target different markets for different reasons.
 
In addition to all the great points mentioned above, there are simply too many unknowns to justify such a concern: the range of leaked performance estimates for the upcoming Nvidia cards is quite wide, and we don't even *know* that M1 Max dies (or something equivalent in GPU core count) will form the basis of the Mac Pro. Worrying whether one of two unreleased products will make the other obsolete, when we have so little information on either, stretches fun speculation a little too far. This may be a rumors forum, but it's best to wait for more concrete information before worrying.
 
Do Imagination's GPUs scale well?

In general, GPUs scale well. If you're specifically referring to Imagination's TBDR technique, which Apple's own is based on, then I don't see why it wouldn't. But Apple's Pro/Max GPUs are the first big modern GPUs I know of that use that technique.
 
The whole OP is just a bunch of wild guesses.

We don't know how powerful an RTX 4090 will really be, how much power it will draw, how much it will cost, or when it will be available.

We also don't know whether the "M1 Quadruple Max" is actually a thing, how it will perform, what system it will go into, what that will cost, or even whether it will be the absolute top of Apple's lineup.
 
We don't know what kind of hardware the Mac Pro will include. We don't even know it will be a tower computer like it is today. Apple may even go for something like the "trash can" Mac Pro (not saying they will, just that any kind of design is possible at this point). They could simply create a tier of its own rather than positioning the Mac Pro as a tower-PC alternative.

I have a custom mini-ITX PC (R9 5900X & RTX 3060 Ti), and I've realized I'm using it less and less since I got the M1 Max MacBook Pro, except for some work-specific software and games (for which I actually use my console more). If Apple introduces a Mac Pro that consumes less power than a graphics card alone, includes better video codecs and machine-learning capabilities, and we see more compatible apps showing how inefficient many PCs are to work with, I wouldn't care much how it compares to a custom PC that is bigger, louder, more power-hungry, and more prone to software bugs.
 
I don't think we should take this seriously at this point in time; we have too little information on the RTX 4090 for serious comparisons, and it's very difficult to say how it'll perform even with the rumored info. An M1 Max Quadro would likely multiply the current performance of the MacBook Pros... but there aren't many leaks pointing towards Nvidia's 40 series, and it will definitely take a long time for those cards to hit the market.
 
Should the Mac Pro GPU have ray tracing hardware?

How could Final Cut take advantage of ray tracing hardware?
 
Clickbait headlines like "In Trouble" or "Mac Killer" are so annoying. Tom's Hardware is infamous for them, IMHO. Saying an unreleased product is in trouble because of another unreleased product that will be expensive, power-hungry, and in great demand does this forum no justice.

I get it: there will be faster CPUs, there will be faster GPUs. Is that enough to make the switch? What about the effective speed of the codec? I'm not processing in 8K and don't plan to, and I can ill afford the 6K monitor that would let me edit above 4K resolution (I wish I could edit on a 5K monitor, but they're hard to get). For my photography work, the majority of what I do, unless I get to 100 MB image output I doubt I'll outgrow an M1 Max for a long time.

My 54 MB images can grow to near 500 MB when doing serious Photoshop work, and running that through an add-in can press the limits of many a system, but I'm not seeing any strain at all.
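For anyone wondering why a modest camera file balloons like that, here's a hedged back-of-envelope sketch; the resolution, bit depth, and layer count are illustrative assumptions, not this poster's actual files.

```python
# Why a "54 MB" compressed camera file balloons during editing:
# in memory it becomes uncompressed pixels, often at 16 bits/channel.
def layer_bytes(width, height, channels=3, bytes_per_channel=2):
    """Uncompressed size of one full-resolution layer."""
    return width * height * channels * bytes_per_channel

# An assumed ~50 MP image (8700 x 5800) decompressed to 16-bit RGB:
base = layer_bytes(8700, 5800)
print(f"one layer: {base / 1e6:.0f} MB")        # -> one layer: 303 MB

# A handful of full-resolution layers multiplies that:
print(f"five layers: {5 * base / 1e9:.1f} GB")  # -> five layers: 1.5 GB
```

So a working document several times the size of the source file is expected, which is why memory bandwidth matters more than headline GPU speed for this workload.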

Can we just stop the baiting?
 
The RTX 4090 is mainly for gaming, not workstations. Also, the Mac Pro can use multiple GPUs.

But on raw M1 GPU cores alone, it won't beat an RTX 4090, unless Apple adds specialized hardware such as ProRes decode/encode blocks and the NPU. This is why the M1 Max is a video-editing beast that can even outperform a high-end desktop in real life. Who knows if they'll add ray-tracing cores for 3D or more NPU cores for machine learning?
 
I personally am very worried about a card that isn't proven and apparently won't be possible to buy when it is released.

I see white flags replacing six colours in Cupertino.
 
the new Mac Pro on ARM will need:
at least 128 GB of RAM, but more likely a max of 512 GB / 1 TB or more.
storage blades up to 2-4 TB of Apple storage, and it really needs some SATA / M.2 (PCIe) ports.
PCIe slots are needed as well.

dual 10G networking: RJ45, plus maybe SFP+ (or better) ports as well.

USB-A ports
 
The performance of current Apple Silicon has been amazing and it has basically been confined to notebook computers. Imagine how powerful Apple Silicon could be without the power or cooling constraints of the current products.
 
I don’t see some slight theoretical performance advantage making a difference, because they are not running the same operating system. If one were three times faster, it might be worth changing operating systems, but unless that happens it’s irrelevant.

There are always improvements, so next year Apple will come out with their latest and greatest. You also have to look at supply-chain issues and whether you can even get these cards. Nvidia keeps claiming they’re going to fix the issue, but so far it hasn’t happened.
 
The performance of current Apple Silicon has been amazing and it has basically been confined to notebook computers. Imagine how powerful Apple Silicon could be without the power or cooling constraints of the current products.
but can it scale up to server / workstation levels?
What is the chip-to-chip I/O like?
How much of a hit will there be with, say, dual- or multi-socket setups?
What is the max RAM capacity?
What are the PCIe lanes like?
What is the max number of displays out?
 
A future Apple Silicon Mac Pro is not (and likely will not be) threatened by a future RTX 4090, even in the extremely unlikely event that it is actually 3x as fast as a 3090.

The reason I can say this without any doubt in my mind is simple.
An Apple Silicon Mac Pro will live or die based not on raw GPU, or even CPU performance, but on how well it succeeds as a platform.

How many of the important apps will have native Apple Silicon versions?
Can Metal make concrete progress in catching up to CUDA, Vulkan and other graphics APIs and begin to achieve real growth in adoption?
How well is Apple going to develop the capability to actually deploy the machine learning capabilities of its Macs?
Will professionals be willing to take a risk on a 2013 nMP like design with less expansion, OR will Apple dramatically expand the types of expansion options available on Apple Silicon and build a real replacement for the 2019 MP?
These and so many other questions are, IMHO what will define whether an Apple Silicon MP is "in trouble," or not.
Given all of those other hurdles, if we can make it to the point where it's actually "Do I want a PC with the 4090 or an Apple Silicon Mac Pro with an M2 Max Quad for rendering?" that in and of itself will represent a significant win in my book. As things stand right now, Apple Silicon could be, on paper, significantly faster than an RTX 4090 and you'd still have a lot of people saying "well, that's nice and all but my workflow relies on CUDA and doesn't have a Metal alternative or Metal support is still in beta and slow/unreliable"

None of this is to say raw speed isn't important, just that there's so much else in play that it doesn't make sense to speculate how well a future Mac Pro (which we know little to nothing about) compares, on paper no less, to an Nvidia GPU that... we also know very little about.
 
This clickbait post was successful: lots of replies :). On a more serious note, this is really a non-issue in macOS land. It doesn't matter how powerful the RTX 4090 will be, as it is unlikely you would be able to use it in a Mac anyway. On the flip side, it also doesn't matter how much power the RTX 4090 would use; with workstations, power draw is not typically a concern, just speed. The faster you can produce work, the more money you can make.



Rich S.
 
None of NVIDIA's performance-jump claims have ever been even remotely accurate. It's all fluff.
 
850 W TDP?!? https://wccftech.com/nvidia-geforce-rtx-40-ada-lovelace-gpus-september-launch-upto-850w-tdp-rumor/ - Surely they cannot be serious? That should be the wattage for an entire system, not a single GPU. Of course it will be faster: when you push that much power through the GPU, the frequencies go up, and so does the power draw. If NVIDIA is copying Intel's modus operandi of simply upping the power envelope and frequencies, then their stock price is simply not sustainable, as it sounds like they are done innovating.
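For scale, here's a hedged back-of-envelope on what a rumored 850 W part would mean in energy terms; the duty cycle and electricity price are assumptions, not facts from the article.

```python
# What 850 W of GPU draw means in energy terms.
gpu_watts = 850       # rumored TDP from the linked article
hours_per_day = 8     # assumed heavy-use workday
price_per_kwh = 0.15  # assumed $/kWh; varies widely by region

daily_kwh = gpu_watts * hours_per_day / 1000
yearly_cost = daily_kwh * price_per_kwh * 365
print(f"{daily_kwh:.1f} kWh/day, ~${yearly_cost:.0f}/year")
# -> 6.8 kWh/day, ~$372/year
```

That is the GPU alone, before the CPU and the extra air conditioning needed to remove 850 W of heat from the room.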
 