
sunny5

macrumors 68000
Jun 11, 2021
1,712
1,581
I find it amusing. How would Apple start supporting nVidia again when they ditched even their long tradition of supporting AMD?
From my read on the situation, Apple is comfortable focusing research and development on their own chips, period. And it does not matter to Apple that part of the customer base relied on the graphical capabilities of other producers.

Even if I prefer the older Intel-era Macs, I can very well imagine that several years from now I too will be happy to purchase a Mac based on future versions of the Apple M-series.
Nvidia is just another Apple. They build their own ecosystem and don't open their system up to people.

That's why Apple ditched Nvidia in favor of AMD: AMD's GPUs can be modified/tuned for Apple to use with Metal. Nvidia's CUDA simply doesn't allow that.
 

StuAff

macrumors 6502
Aug 6, 2007
385
256
Portsmouth, UK
NVidia didn't want Apple's business enough, if at all. I got the impression their beta driver program was one or two people in a cubicle in some distant unloved corner. Huang expected Apple to 'throw them a bone' (he tweeted about it) without actually making much of an effort. He could have picked up the phone and actually asked about winning some business. Apple certainly is anything but blameless in this, but ATI/AMD were certainly more amenable. Both of them were in Steve Jobs' doghouse at different times, only one got out of it…
 
  • Wow
Reactions: gusmula

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,527
955
Apple is trying to attract machine learning researchers.
The Swift programming language has a lot of potential for machine learning research because it combines the ease of use and high-level syntax of a language like Python with the speed of a compiled language like C++.
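
As a rough illustration of that "Python-like but compiled" claim, here is a minimal sketch in plain Swift, using nothing beyond the standard library (no ML framework assumed): fitting y = w*x + b by gradient descent.

```swift
// Toy example: fit y = w*x + b by gradient descent in plain Swift.
let xs: [Double] = [1, 2, 3, 4, 5]
let ys: [Double] = xs.map { 2 * $0 + 1 }   // synthetic data: w = 2, b = 1

var w = 0.0
var b = 0.0
let learningRate = 0.01

for _ in 0..<5_000 {
    // Gradients of the mean-squared error with respect to w and b.
    var dw = 0.0
    var db = 0.0
    for (x, y) in zip(xs, ys) {
        let err = (w * x + b) - y
        dw += 2 * err * x / Double(xs.count)
        db += 2 * err / Double(xs.count)
    }
    w -= learningRate * dw
    b -= learningRate * db
}

print("w ≈ \(w), b ≈ \(b)")   // should approach w = 2, b = 1
```

The loop reads roughly like the equivalent Python, but compiles to native code with no interpreter overhead.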
 

sunny5

macrumors 68000
Jun 11, 2021
1,712
1,581
Apple is trying to attract machine learning researchers.

Nvidia has already been dominating machine learning and AI for a while, and Apple is just a latecomer. Besides, the Mac uses its own libraries and language, which is a huge disadvantage, and who even uses a Mac? The Mac is limited to 2D markets such as video, music, photo, illustration, and so on. If you're going to do 3D or other work, you'd better use a PC instead. Too bad Apple themselves are limiting their own market.
 
Last edited:

avkills

macrumors 65816
Jun 14, 2002
1,182
985
Apple Silicon itself has proven that it's not good for professionals who need workstation-class specs. The Mac Pro will die, and I don't think Apple is interested in professional markets at all. Sadly, that will affect the Mac's major markets, such as video and music, due to the hardware limitations.

Truth be told, Apple CAN create their own markets, but their GPU performance is dramatically poor and many software vendors aren't interested in the Mac at all. There are so many issues with macOS and the Mac itself, and as long as Apple is being stubborn, I don't think they're going to be solved. They are limiting themselves too much, and it won't work out the way iOS did for the iPhone.

At this point, the pro market will die slowly.
I disagree, as my M3 Max laptop easily trounces my 2019 Mac Pro in After Effects; actually, it trounces most PCs as well, if you believe the validity of Puget Systems' AE benchmark.

And everything to do with video editing is way faster on the M3 as well, thanks to the hardware encoders/decoders. Basically anyone using an Intel Mac will see immediately faster results on the M3. I say M3 because, in my eyes, it is the first system that checked 90% of the boxes for me.

It's kind of sad how much better this laptop is at After Effects; it just goes to show how much Intel's Xeon chips from that era sucked.
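
If you want to sanity-check the encoder/decoder point on your own machine, here is a minimal Swift sketch that asks VideoToolbox whether hardware decode is available for a few common codecs; the codec list is purely illustrative.

```swift
import VideoToolbox

// Minimal sketch: ask VideoToolbox whether this machine has hardware decode
// support for a few common codecs. On Apple Silicon the media engines handle
// these; on many Intel Macs some of them fall back to software.
let codecs: [(String, CMVideoCodecType)] = [
    ("H.264", kCMVideoCodecType_H264),
    ("HEVC", kCMVideoCodecType_HEVC),
    ("ProRes 422", kCMVideoCodecType_AppleProRes422),
]

for (name, codec) in codecs {
    let hw = VTIsHardwareDecodeSupported(codec)
    print("\(name): \(hw ? "hardware decode" : "software only")")
}
```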
 

sunny5

macrumors 68000
Jun 11, 2021
1,712
1,581
I disagree, as my M3 Max laptop easily trounces my 2019 Mac Pro in After Effects; actually, it trounces most PCs as well, if you believe the validity of Puget Systems' AE benchmark.

And everything to do with video editing is way faster on the M3 as well, thanks to the hardware encoders/decoders. Basically anyone using an Intel Mac will see immediately faster results on the M3. I say M3 because, in my eyes, it is the first system that checked 90% of the boxes for me.

It's kind of sad how much better this laptop is at After Effects; it just goes to show how much Intel's Xeon chips from that era sucked.
And yet you're only mentioning the video side, which only proves my point. Besides, benchmarks prove nothing compared to real-life work. Currently, the Mac is limited to a few markets, and even then the hardware specs, especially RAM, are a huge problem.
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
Nvidia is just another Apple. They build their own ecosystem and don't open their system up to people.

That's why Apple ditched Nvidia in favor of AMD: AMD's GPUs can be modified/tuned for Apple to use with Metal. Nvidia's CUDA simply doesn't allow that.
Umm, back when nVidia cards were still supported, you could use OpenCL, Metal, or CUDA for compute; at least, all those options were available in Adobe Creative Cloud apps.

Metal has gotten a lot better since then.

As far as 3D goes, Apple still commands the largest GPU RAM pool for a single card; that is very beneficial for modeling and such.

The fact that Blender, Octane, and Redshift are all embracing Metal on macOS for 3D GPU rendering is a good thing, and it goes to show that Apple must be serious; otherwise they would not bother.
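
For anyone curious what "Metal for compute" actually looks like outside an Adobe app, here is a minimal sketch in Swift that compiles a tiny kernel at runtime and dispatches it on the default GPU. The kernel name add_arrays and the buffer sizes are purely illustrative.

```swift
import Metal

// MSL source for a trivial element-wise add kernel, compiled at runtime.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void add_arrays(device const float *a   [[buffer(0)]],
                       device const float *b   [[buffer(1)]],
                       device float       *out [[buffer(2)]],
                       uint i [[thread_position_in_grid]]) {
    out[i] = a[i] + b[i];
}
"""

let device = MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(function: library.makeFunction(name: "add_arrays")!)

// Shared-storage buffers: CPU and GPU see the same memory on Apple Silicon.
let n = 1_000_000
let byteCount = n * MemoryLayout<Float>.stride
let a = [Float](repeating: 1, count: n)
let b = [Float](repeating: 2, count: n)
let bufA = device.makeBuffer(bytes: a, length: byteCount, options: .storageModeShared)!
let bufB = device.makeBuffer(bytes: b, length: byteCount, options: .storageModeShared)!
let bufOut = device.makeBuffer(length: byteCount, options: .storageModeShared)!

// Encode and dispatch one thread per element.
let queue = device.makeCommandQueue()!
let cmd = queue.makeCommandBuffer()!
let enc = cmd.makeComputeCommandEncoder()!
enc.setComputePipelineState(pipeline)
enc.setBuffer(bufA, offset: 0, index: 0)
enc.setBuffer(bufB, offset: 0, index: 1)
enc.setBuffer(bufOut, offset: 0, index: 2)
// dispatchThreads handles non-uniform threadgroup sizes; fine on Mac GPUs.
enc.dispatchThreads(MTLSize(width: n, height: 1, depth: 1),
                    threadsPerThreadgroup: MTLSize(width: pipeline.threadExecutionWidth, height: 1, depth: 1))
enc.endEncoding()
cmd.commit()
cmd.waitUntilCompleted()

let out = bufOut.contents().bindMemory(to: Float.self, capacity: n)
print(out[0], out[n - 1])   // 3.0 3.0
```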
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
And yet you're only mentioning the video side, which only proves my point. Besides, benchmarks prove nothing compared to real-life work. Currently, the Mac is limited to a few markets, and even then the hardware specs, especially RAM, are a huge problem.
I am telling you right now, in real-world usage the M3 Max is faster than my Mac Pro at everything. I have not found one thing that the Mac Pro is faster at. The only thing I could beat my M3 Max on would be GPU rendering, but then I would need to add another W6800X card, and why waste money on something like that?

If an M3 Ultra or Extreme is ever going to be a thing, it will be very good.
 

sunny5

macrumors 68000
Jun 11, 2021
1,712
1,581
I am telling you right now, in real-world usage the M3 Max is faster than my Mac Pro at everything. I have not found one thing that the Mac Pro is faster at. The only thing I could beat my M3 Max on would be GPU rendering, but then I would need to add another W6800X card, and why waste money on something like that?

If an M3 Ultra or Extreme is ever going to be a thing, it will be very good.
Because the AS Mac Pro sucks, and you are comparing the M3 Max to the M2 Ultra or the Intel Mac Pro, which is not a fair comparison. Even then, it will be limited to a few market uses, and it will never be great at 3D or the other things a PC is good at.
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
Because the AS Mac Pro sucks, and you are comparing the M3 Max to the M2 Ultra or the Intel Mac Pro, which is not a fair comparison. Even then, it will be limited to a few market uses, and it will never be great at 3D or the other things a PC is good at.
Why isn't it a fair comparison? In my opinion, the 2019 Mac Pro was the last "workstation"-class tower Apple released; that's pretty much why I bought one, only to be stabbed in the back by Apple with the Apple Silicon announcement.

If one *must* be in the Apple ecosystem or prefers macOS over Windows, then it is every bit a fair comparison.

And since you are kind of getting snotty, please name one nVidia card that has more VRAM than the top-spec M3 Max's GPU can address.
 

sunny5

macrumors 68000
Jun 11, 2021
1,712
1,581
Why isn't it a fair comparison? In my opinion, the 2019 Mac Pro was the last "workstation"-class tower Apple released; that's pretty much why I bought one, only to be stabbed in the back by Apple with the Apple Silicon announcement.

If one *must* be in the Apple ecosystem or prefers macOS over Windows, then it is every bit a fair comparison.

And since you are kind of getting snotty, please name one nVidia card that has more VRAM than the top-spec M3 Max's GPU can address.
You are comparing against 3-4-year-old tech. Such a poor comparison, after all.
 

avkills

macrumors 65816
Jun 14, 2002
1,182
985
You are comparing against 3-4-year-old tech. Such a poor comparison, after all.
I am not in charge of Apple's upgrade cycle; the 2019 should have been updated a month after the new Xeons were announced by Intel. But Apple Silicon, so that was a no-go for obvious Apple reasons.

Ummm, the last time I checked the Puget Systems After Effects benchmarks (last month), the M3 Max was beating everything.

After Effects is a huge part of my workflow, so performance in it is very important to me, but I also kind of hate Windows. I have a Windows box anyway.
 

RedTheReader

macrumors 6502a
Nov 18, 2019
503
1,223
Everything is old tech the minute it gets purchased. So what's your point?
You really don't think there's any validity to the idea that a modern computer shouldn't be compared to a 4-year-old one? I think the spirit of this thread is looking at Apple through the lens of modern x86 (Windows/Linux) offerings.
 
  • Like
Reactions: sunny5

sunny5

macrumors 68000
Jun 11, 2021
1,712
1,581
I am not in charge of Apple's upgrade cycle; the 2019 should have been updated a month after the new Xeons were announced by Intel. But Apple Silicon, so that was a no-go for obvious Apple reasons.

Ummm, the last time I checked the Puget Systems After Effects benchmarks (last month), the M3 Max was beating everything.

After Effects is a huge part of my workflow, so performance in it is very important to me, but I also kind of hate Windows. I have a Windows box anyway.
Everything is old tech the minute it gets purchased. So what's your point?
Wow, you really think that's how it works? Your logic already failed.
 
  • Angry
Reactions: hovscorpion12

avkills

macrumors 65816
Jun 14, 2002
1,182
985
You really don't think there's any validity to the idea that a modern computer shouldn't be compared to a 4-year-old one? I think the spirit of this thread is looking at Apple through the lens of modern x86 (Windows/Linux) offerings.
I think it would only be valid if macOS ran on those systems *easily* (it can be done, but for the most part it's too much of a headache).

If we set aside the fact that this is primarily a Mac-focused forum and we are purely comparing tech to tech, then yes, it is valid to compare everything to everything.

CPU-wise, Apple still holds its own, unless you want to do something stupid like compare it to a 96-core Threadripper and complain because the Threadripper is faster at multi-threaded workflows --- DUH!

GPU-wise, Apple still has a lot of work to do, but they also command the largest VRAM pool that can be allocated. Given a three-year timeline, I think they are doing pretty well. And Apple has not even attempted to make a power-slurping massive GPU. I doubt they ever will.
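
As a rough way to see that pool from code, here is a minimal Swift sketch that asks Metal for the GPU's recommended working-set size; on Apple Silicon the number reflects unified memory rather than a fixed slab of dedicated VRAM.

```swift
import Metal

// Minimal sketch: ask Metal how much memory the GPU can realistically address.
// On Apple Silicon this is carved out of the unified memory pool, so a
// high-RAM machine reports a much larger working set than a typical
// discrete card's dedicated VRAM.
if let device = MTLCreateSystemDefaultDevice() {
    let gib = device.recommendedMaxWorkingSetSize / (1 << 30)   // bytes -> GiB
    print("\(device.name): unified memory = \(device.hasUnifiedMemory), " +
          "recommended max working set ≈ \(gib) GiB")
}
```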

Nobody knows what Apple will do, because they never release roadmaps, which is something that bugs the crap out of me. But at the end of the day, I am very impressed with the M3 Max.
 

hurt97

macrumors member
May 13, 2022
40
68
This thread is calling for the entire industry to put all their eggs in the Nvidia basket because “number go up”?

This fundamental misuse of the marketing term “AI” is going to come crumbling down in a few quarters when people realize this is the same cycle that blockchain, “machine learning”, and crypto have gone down.

Apparently the world learns nothing from every bubble bursting….

AI has its place, but the decrees from above that this is going to transform everything are just a load of ****. There’s no intelligence here, just fancy applied statistics. It’s a great *tool* for certain tasks, but the claims coming from those *who have an economic interest in this being the Next Big Thing* are just that: short-term stock bumping.

Nearly every AI company is funded through SPACs, meaning they’re just a quick way to separate fools from their money.
Found the guy who thought NVDA was too expensive at 300; now he's doubling down.

"Nvidia's largest US customers are Amazon, Microsoft, Google, Meta, and Dell."

You see, they are selling to real companies, not SPAC-funded AI companies.
 
  • Love
Reactions: turbineseaplane

dmccloud

macrumors 68030
Sep 7, 2009
2,995
1,738
Anchorage, AK
What about the Apple R1 processor? Is that part of a new graphics path that Apple will focus on instead?

Specifics are few and far between, but I believe that R1 is the component responsible for managing the spatial aspects of Vision Pro in conjunction with the cameras and sensors built into the device. The graphics side is still left to the M2.
 

treacher

macrumors regular
Feb 16, 2024
186
317
I've seen other chips run with little difference between wattages. Apple's architecture is designed for industry-leading efficiency and will probably not use extra wattage.
 
Last edited:
  • Like
Reactions: heretiq