I suspect AI rendering engines will beat Nvidia's ray-tracing cores pretty soon… if not already?

Apple has shipped AI cores on the desktop since the M1, and for years and years before that with the A-series… I think Apple knows what's going on…
 
The issue is that much of the market for high-end GPUs is either tied to CUDA (/Linux) or to Windows.
Neither of which is available on Apple products, so it is really, really questionable whether it makes any sense for Apple to produce hardware to run software that simply doesn't exist on the platform. They have chosen to serve certain parts of the video market with dedicated hardware on their SoCs, so… how much remains to address, really?

Apple has a niche in producing nice, performant hardware with generally good ergonomics and modest dimensions. For the people who are OK with paying a bit more for "nice".

As someone who is interested in computer hardware architecture, I’d love to see Apple produce a new Mac Pro system!
Do I think it makes any kind of sense for them to do so? No.
That's because Apple's GPU sucks. Simple. Besides, their Metal API also sucks, so much so that a lot of software doesn't even use it unless it really has to, such as Adobe's.

Apple isn't trying to make a better GPU; it focuses on software aspects such as the Metal API, which is a huge problem. The hardware itself is slow, so the CUDA comparison is pointless. You'd better check the TFLOPS of Apple's GPUs against Nvidia's. Hell, they even advertised that the M1 Max = mobile 3080 and the M1 Ultra = 3090, and yet neither was able to reach that.

Apple is just not good at making GPUs, that's all. Nothing new, since the Mac was never known for great GPU performance.
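On the TFLOPS point, here is a back-of-envelope sketch of the on-paper FP32 numbers (a rough illustration only; the ALU counts and boost clocks below are approximate public specs, not measurements):

```python
# Rough FP32 throughput: ALUs x clock (GHz) x 2 FLOPs per FMA cycle, in TFLOPS.
# ALU counts and clocks are approximate public specs, not measured values.
def tflops(alus: int, ghz: float) -> float:
    return alus * ghz * 2 / 1000

for name, alus, ghz in [
    ("M1 Max (32-core GPU)",    4096, 1.30),  # Apple's quoted figure: ~10.4
    ("M1 Ultra (64-core GPU)",  8192, 1.30),  # Apple's quoted figure: ~21
    ("RTX 3080 Laptop",         6144, 1.55),  # varies widely with TGP
    ("RTX 3090",               10496, 1.70),  # Nvidia's quoted figure: ~35.6
]:
    print(f"{name:24} ~{tflops(alus, ghz):.1f} TFLOPS FP32")
```

On paper, at least, that is roughly the gap the marketing comparisons glossed over: raw throughput puts the M1 Ultra closer to a mobile 3080 than to a desktop 3090.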
 
What do you all think are the chances Apple discontinues the M1 Air?

I kind of like the design better than the M2's, and I'm not interested in a 15" laptop… wondering if I should order one before they're gone. But a 3+ year old computer sounds like not a great idea at this point. It would be great if they kept it around and dropped the price another couple hundred bucks…
 
The M3 all depends on when TSMC can work out its issues. The new fab in Arizona is having staffing issues as it tries to ramp up, and TSMC isn't getting the quantities it hoped for from its existing fabs. Apple is at the mercy of TSMC.
 
I suspect AI rendering engines will beat Nvidia's ray-tracing cores pretty soon… if not already?

Apple has shipped AI cores on the desktop since the M1, and for years and years before that with the A-series… I think Apple knows what's going on…
For certain tasks they are most definitely faster. Image editing using AI is one of them.
 
That's because Apple's GPU sucks. Simple. Besides, their Metal API also sucks, so much so that a lot of software doesn't even use it unless it really has to, such as Adobe's.

Apple isn't trying to make a better GPU; it focuses on software aspects such as the Metal API, which is a huge problem. The hardware itself is slow, so the CUDA comparison is pointless. You'd better check the TFLOPS of Apple's GPUs against Nvidia's. Hell, they even advertised that the M1 Max = mobile 3080 and the M1 Ultra = 3090, and yet neither was able to reach that.

Apple is just not good at making GPUs, that's all. Nothing new, since the Mac was never known for great GPU performance.
Sorry, this is opinion and hyperbole, not really factual. It depends on the task at hand; for some image-editing tasks in Adobe Photoshop and Lightroom, my M1 Ultra is on par with an Nvidia RTX 4080 and faster than any of the 30x0 series. This is in real-world timed tests. There are other packages where this is the same; DxO and Topaz both come to mind. Again, real-world timed tests. Their GPUs are very capable in these scenarios, and when the ANE is being leveraged they outperform the RTX cards on many tasks. They are not great for 3D work. But that's not every use case.
 
Sorry, this is opinion and hyperbole, not really factual. It depends on the task at hand; for some image-editing tasks in Adobe Photoshop and Lightroom, my M1 Ultra is on par with an Nvidia RTX 4080 and faster than any of the 30x0 series. This is in real-world timed tests. There are other packages where this is the same; DxO and Topaz both come to mind. Again, real-world timed tests. Their GPUs are very capable in these scenarios, and when the ANE is being leveraged they outperform the RTX cards on many tasks. They are not great for 3D work. But that's not every use case.
Not being good at 3D already means it's bad.

Adobe Photoshop and Lightroom are such poor examples, as they benefit from high CPU clock speeds, not GPU performance; they aren't even GPU-intensive software. If anyone tests an Apple GPU on 2D, non-GPU-intensive software, they have no idea what they are doing. Seriously, you only tested in a narrow area where the GPU doesn't really matter. Why do you even limit the test results to Adobe Photoshop, Lightroom, DxO, and Topaz? Those results don't represent 3D performance at all.

All you're doing is justifying its performance based on 2D fields, not 3D fields, which are far more GPU-intensive. Is this a joke or what? What you are claiming is that the Apple GPU is good with non-GPU-intensive software. So what about games, graphics, AI, and more? Apple didn't even invest in any of them.

If you really want to check an Apple GPU's performance, I would highly suggest testing it with GPU-intensive software, but then Apple Silicon Macs do NOT have or support GPU-intensive software, which already defeats the purpose. This is why the Mac is only well known for 2D, non-GPU-intensive software.

If you still want a real-life test, Resident Evil 8 is a great example: the M1 Max is not even close to a mobile RTX 3070.
 
Have you tried the latest Docker on Apple Silicon? There is an option to use Rosetta (in Docker 4.19) for emulation, and it speeds up the build process quite a bit. Even without this feature you can still target the x86/amd64 arch, but it was really slow.

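For anyone who wants to try it, a minimal sketch of the cross-arch build (assuming Docker Desktop 4.19+ with the "Use Rosetta for x86/amd64 emulation" setting enabled; the image tag is just a placeholder):

```bash
# Build an x86_64 image on an Apple Silicon host. With the Rosetta setting
# enabled, Docker Desktop runs the amd64 build under Rosetta 2 instead of
# QEMU, which is where the speedup comes from.
docker buildx build --platform linux/amd64 -t myapp:latest .
```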
Oh cool, I'll try it (though I'm not overly optimistic based on my past frustrations). The QEMU emulation used before would crash all the time, and it was like 1000x slower than my old Intel Mac (no exaggeration).
 
I disagree. The M2 SoC supports up to 24 GB of RAM and up to ten GPU cores, which offers a good performance increase over the M1 SoC. And because the iMac's case has two cooling fans, the M2 can run at full speed with far less chance of thermal throttling. That is, unless Apple wants to wait for the arrival of the M3 SoC, with its totally new CPU and GPU cores, in October 2023.
The M2 Max goes up to 96 GB of RAM.
 
Not being good at 3D already means it's bad.

Adobe Photoshop and Lightroom are such poor examples, as they benefit from high CPU clock speeds, not GPU performance; they aren't even GPU-intensive software. If anyone tests an Apple GPU on 2D, non-GPU-intensive software, they have no idea what they are doing. Seriously, you only tested in a narrow area where the GPU doesn't really matter. Why do you even limit the test results to Adobe Photoshop, Lightroom, DxO, and Topaz? Those results don't represent 3D performance at all.

All you're doing is justifying its performance based on 2D fields, not 3D fields, which are far more GPU-intensive. Is this a joke or what? What you are claiming is that the Apple GPU is good with non-GPU-intensive software. So what about games, graphics, AI, and more? Apple didn't even invest in any of them.

If you really want to check an Apple GPU's performance, I would highly suggest testing it with GPU-intensive software, but then Apple Silicon Macs do NOT have or support GPU-intensive software, which already defeats the purpose. This is why the Mac is only well known for 2D, non-GPU-intensive software.

If you still want a real-life test, Resident Evil 8 is a great example: the M1 Max is not even close to a mobile RTX 3070.
What an utterly ignorant response. Both of those applications leverage the GPU *extensively* for many tasks, and every release uses more and more GPU power (and GPU memory). They are also starting to use the Neural Engine cores. Tasks like AI-based masking, noise reduction, and generative AI-based image editing are *extremely* GPU-intensive and have absolutely nothing to do with 3D performance.

Your implication that a GPU only matters for 3D applications is incredibly narrow-minded and completely ignorant of how GPUs are broadly used in the market.
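To make that concrete, here is a minimal, hypothetical PyTorch sketch of the kind of per-pixel work AI masking and noise reduction involve (the tensor sizes are only illustrative); the same pass runs on Apple's Metal backend (MPS), on CUDA, or on the CPU:

```python
import torch

# Pick the best available accelerator: Apple's Metal backend (MPS) on
# Apple Silicon, CUDA on an Nvidia card, CPU as the fallback.
if torch.backends.mps.is_available():
    device = torch.device("mps")
elif torch.cuda.is_available():
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

# A denoising-style 3x3 convolution over a 4K RGB image: exactly the sort
# of per-pixel arithmetic that saturates a GPU, with no 3D involved at all.
image = torch.rand(1, 3, 2160, 3840, device=device)
conv = torch.nn.Conv2d(3, 3, kernel_size=3, padding=1).to(device)
with torch.no_grad():
    out = conv(image)
print(f"ran on {device}, output shape {tuple(out.shape)}")
```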
 
I'd like to see desktop iPadOS devices introduced to continue the drive away from macOS. Now that we have powerful tools and best-ever versions of Final Cut Pro and Logic Pro on iPadOS, it's time to go harder and faster in pursuit of the future.
Lol, "best ever" versions of Logic and Final Cut Pro? You're kidding, right? More like iMovie Pro and GarageBand Pro.
 
What an utterly ignorant response. Both of those applications leverage the GPU *extensively* for many tasks, and every release uses more and more GPU power (and GPU memory). They are also starting to use the Neural Engine cores. Tasks like AI-based masking, noise reduction, and generative AI-based image editing are *extremely* GPU-intensive and have absolutely nothing to do with 3D performance.

Your implication that a GPU only matters for 3D applications is incredibly narrow-minded and completely ignorant of how GPUs are broadly used in the market.
Do you even use them? Hell, it's nothing compared to GPU-intensive software. Besides, a PC can do both, so why wouldn't you compare both 2D and 3D instead of only 2D? You're just ignoring the major problem with Apple's GPU by ignoring my point. It's so typical that Mac users only praise the Mac based on 2D software.
 
Do you even use them? Hell, it's nothing compared to GPU-intensive software. Besides, a PC can do both, so why wouldn't you compare both 2D and 3D instead of only 2D? You're just ignoring the major problem with Apple's GPU by ignoring my point. It's so typical that Mac users only praise the Mac based on 2D software.
LOL, I make my living using both 2D and 3D software, and I buy the right tools for the right jobs. Hands down, the Apple Silicon versions of the Adobe products outperform the Windows versions right now, even on top-of-the-line Windows machines. I have both: Windows machines with a whole range of Nvidia RTX cards (from the RTX 4000s through the newest A6000) for 3D rendering, and a suite of M1 Max, M1 Ultra, and M2 Max Apple Silicon computers. I know exactly which performs better for which tasks. Hate to burst your bubble, but the top-of-the-line Nvidia cards just aren't great for certain tasks, even if they are the best for others, and Apple's new GPUs and ANE do extremely well at those other tasks, matching or besting the Nvidia cards.
 
LOL, I make my living using both 2D and 3D software, and I buy the right tools for the right jobs. Hands down, the Apple Silicon versions of the Adobe products outperform the Windows versions right now, even on top-of-the-line Windows machines. I have both: Windows machines with a whole range of Nvidia RTX cards (from the RTX 4000s through the newest A6000) for 3D rendering, and a suite of M1 Max, M1 Ultra, and M2 Max Apple Silicon computers. I know exactly which performs better for which tasks. Hate to burst your bubble, but the top-of-the-line Nvidia cards just aren't great for certain tasks, even if they are the best for others, and Apple's new GPUs and ANE do extremely well at those other tasks, matching or besting the Nvidia cards.
I also make a living using both 2D and 3D, and what you are saying is totally wrong. Since when does Adobe work better on the Mac, when its Mac optimization is still poor? Is that all you can say about Apple GPU performance, software that isn't even GPU-intensive? What a surprise. If you are talking about the GPU usage shown in a monitoring tool, then you are totally wrong about it. Since you can't even show real-life test results to prove your point, I'll take it as trolling.
 
I also make a living using both 2D and 3D, and what you are saying is totally wrong. Since when does Adobe work better on the Mac, when its Mac optimization is still poor? Is that all you can say about Apple GPU performance, software that isn't even GPU-intensive? What a surprise. If you are talking about the GPU usage shown in a monitoring tool, then you are totally wrong about it. Since you can't even show real-life test results to prove your point, I'll take it as trolling.
Nice try. I've referred to actual applications used and graphics cards I own. You've linked to a YouTube video and referred to a game as a benchmark. Sorry, I think it's clear who the troll is. And now for the Hide button.
 
Nice try. I've referred to actual applications used and graphics cards I own. You've linked to a YouTube video and referred to a game as a benchmark. Sorry, I think it's clear who the troll is. And now for the Hide button.
A game is a great example, since games usually use all of a GPU's power, unlike the Adobe Photoshop and Lightroom you mentioned. Besides, you don't even have benchmark results for those, do you?

Also, 2D performance does NOT represent 3D performance, and yet you just don't want to accept the poor GPU performance. Hell, the M1 Ultra is not even close to an RTX 3090, which makes PC users laugh at us all the time.

What you are saying is that the Apple GPU is powerful, without any proof beyond Photoshop and Lightroom, which aren't even GPU-intensive.

Stop dreaming.
 
They're revamping the Mac Pro into a completely replaceable, upgradeable system, probably starting with the M3, and that made the M2 Studio show up sooner, since it won't be killing the Mac Pro.

Which is absolutely the smartest way to go. It was a hard lesson learned: "Release no Mac Pro before its time, and it had better be completely upgradeable in every way." RAM, processor, everything.

But that Mac Pro would truly be a game changer.
A completely replaceable, upgradeable system, from Apple 🤣😆😂
 
Sorry, this is opinion and hyperbole, not really factual. It depends on the task at hand; for some image-editing tasks in Adobe Photoshop and Lightroom, my M1 Ultra is on par with an Nvidia RTX 4080 and faster than any of the 30x0 series. This is in real-world timed tests. There are other packages where this is the same; DxO and Topaz both come to mind. Again, real-world timed tests. Their GPUs are very capable in these scenarios, and when the ANE is being leveraged they outperform the RTX cards on many tasks. They are not great for 3D work. But that's not every use case.
Totally agree with you. I video edit with DaVinci Resolve on both a high-powered PC (RTX 4090), which I use for PC-only plug-ins, and on my MacBook Pro M2 Max. For editing, stability, and render times with ProRes or H.265/H.264, it's the MBP hands down. The PC is reserved for PC-only plug-ins, but video editing on it is just not as smooth, with crashes happening about twice as often. It is what it is: you choose the right tools for the job. For video editors it makes no sense to pay through the nose for the RTX 4090 when the overall workflow is simply smoother and quicker on the Mac. I can't speak to 3D rendering or Blender, but creators who are video editing can run circles around 30x0 PCs with a lighter, cheaper, more efficient M2 MacBook Air. I have yet to meet a video editor who doesn't prefer editing on an M2 Mac if they're using Resolve or Premiere.
 
You really think a student buying a $499 Mac mini (actually $697, since they'll need a mouse and keyboard for it) is going to drop $1499 on the Studio Display? It would make sense to offer a reasonably priced display for that type of buyer.
True. I was telling (selling) my cousin about his future first Mac, and I think he was a bit sticker-shocked.
 