Interesting. It appears to be research applicable to Nvidia, since they don't mention Apple silicon in the published paper (I read it quickly).
Well, the original research had already been developed on and applied to Apple silicon. This is an expansion of that earlier work to NVIDIA.
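For anyone curious what that research actually does: assuming it is Apple's speculative-decoding work (a small, cheap draft model proposes a few tokens and the large target model verifies them in one batched pass), here is a toy sketch of the draft-and-verify loop. Every name in it is a hypothetical stand-in, not Apple's or NVIDIA's actual code.

```python
# Toy sketch of draft-and-verify speculative decoding (greedy variant).
# draft_step and target_greedy are hypothetical stand-ins, not real APIs.

def speculative_step(target_greedy, draft_step, prefix, k=4):
    """Propose k tokens with a cheap draft model, keep the longest prefix
    the big target model agrees with, and append one corrected token."""
    # 1. The cheap draft model guesses k tokens autoregressively.
    ctx, draft = list(prefix), []
    for _ in range(k):
        tok = draft_step(ctx)        # draft model's next-token guess
        draft.append(tok)
        ctx.append(tok)

    # 2. The target model checks all k+1 positions in ONE batched pass;
    #    target_greedy returns its own greedy choice at each position.
    verified = target_greedy(prefix, draft)   # length k+1

    # 3. Accept draft tokens while they match the target's choice;
    #    on the first mismatch, substitute the target's token and stop.
    out = []
    for i, tok in enumerate(draft):
        if tok == verified[i]:
            out.append(tok)
        else:
            out.append(verified[i])
            return out
    out.append(verified[k])   # bonus token when the whole draft is accepted
    return out


if __name__ == "__main__":
    # Dummy models: the draft always guesses token 1; the target agrees
    # twice, then wants token 2 instead.
    draft_step = lambda ctx: 1
    target_greedy = lambda prefix, draft: [1, 1, 2, 1, 1]
    print(speculative_step(target_greedy, draft_step, prefix=[0]))  # [1, 1, 2]
```

The speedup comes from step 2: the expensive model evaluates several proposed tokens in a single forward pass instead of one pass per token.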
 
Since Apple now produces its own GPUs there is no need for hell to freeze over.

Apple's own GPUs still can't compete with AMD GPUs from 2 or 3 years ago on their own OS running their own Metal API
 
  • Disagree
Reactions: eas
True. I still remember that I bought the last 13” MacBook Pro (2010) with an “integrated” Nvidia graphics chip, the GeForce 320M, whose performance remained strong for several generations (if I recall correctly, the Intel HD Graphics 3000 wasn’t as powerful).
I had the mid-2010 15" MacBook Pro with a discrete Nvidia GPU, and the GPU was one of many that were defective. It would cause random kernel panics when it was in use, to the point where I had to use a software tool to prevent the machine from ever switching to the discrete GPU to maintain stability. Apple eventually had to do a recall on the laptop to replace the defective chips, which is no doubt part of why the relationship soured.
 
I had the mid-2010 15" MacBook Pro with a discrete Nvidia GPU, and the GPU was one of many that were defective. It would cause random kernel panics when it was in use, to the point where I had to use a software tool to prevent the machine from ever switching to the discrete GPU to maintain stability. Apple eventually had to do a recall on the laptop to replace the defective chips, which is no doubt part of why the relationship soured.
Oh! I thought the one with the defective GPU was the 2011!
 
Oh! I thought the one with the defective GPU was the 2011!
It was both years! The mid-2010's defects weren't as widespread, and while they resulted in an extended repair program, I don't believe there was a class-action lawsuit like there was for the 2011 series. The 2011 model's issues were even worse and impacted far more units, so it got the majority of the negative press coverage.
 
  • Like
Reactions: Populus
This makes sense as Apple is likely running Nvidia chips in their LLM clusters in their data centers. Hopefully they also include these updates in MLX if possible to improve LLM performance on M-Series processors.
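If such speedups do land in MLX, they would presumably sit behind the same high-level calls people already use. A minimal sketch with the mlx_lm package (the model name is just an example of a 4-bit quantized model from the mlx-community hub, not anything from the article):

```python
# Minimal local LLM inference with MLX on an M-series Mac.
# Requires: pip install mlx-lm  (Apple silicon only).
# The model name is only an example from the mlx-community hub.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

text = generate(
    model,
    tokenizer,
    prompt="Summarize why on-device inference benefits from unified memory.",
    max_tokens=128,
)
print(text)
```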
 
I had the mid-2010 15" MacBook Pro with a discrete Nvidia GPU, and the GPU was one of many that were defective. It would cause random kernel panics when it was in use, to the point where I had to use a software tool to prevent the machine from ever switching to the discrete GPU to maintain stability. Apple eventually had to do a recall on the laptop to replace the defective chips, which is no doubt part of why the relationship soured.
The GeForce 8600M GT. A big stain on their partnership.
 
Nice that Apple is collaborating with Nvidia on making AI faster. My advice to Apple is to focus on their own AI offerings first, because Siri and the other offerings badly need some speed and accuracy.
 
Apple has realized that they have dropped the ball. They are probably desperate because if the market shifts to AI, it would not be good for Apple. They could be the next Blackberry.
 
  • Disagree
  • Like
Reactions: eas and v0lume4
You mean discrete rather than integrated

an integrated gpu is built into the cpu
Thus my quote-unquote. It was technically discrete, but back then most integrated GPUs were a separate chip from the CPU. It was quite a different time, but raw-power-wise, it was an “integrated” graphics chip.

I’m not going to argue here about the technical or literal meaning of a word. It was an integrated GPU and that’s how most people referred to it 15 years ago.
 
Thus my quote-unquote. It was technically discrete, but back then most integrated GPUs were a separate chip from the CPU. It was quite a different time, but raw-power-wise, it was an “integrated” graphics chip.

I’m not going to argue here about the technical or literal meaning of a word. It was an integrated GPU and that’s how most people referred to it 15 years ago.

I love when people make their argument and then say "I'm not going to argue about this"

in any case, I suppose we recollect differently. I would say 15 years ago most people referred to it as a discrete gpu as opposed to an integrated gpu (such as intel hd) even though it had integrated memory

also "back then" intel's integrated gpus were on the same package or die as the cpu, just like today. it was a different time to be sure, but no so much in this instance

anyway, I'm not going to argue here about the technical or literal meaning of a word ;)
 
  • Like
Reactions: vorkosigan1
Apple Intelligence is proof that they should outright take any help they can get in this area.
 
NVIDIA: “Strike me down and I’ll become more powerful than you can possibly imagine”
 
This whole thing is a total humiliation for Apple. What a disaster for a multi-trillion dollar tech company. They should be a leader in AI. Sorry, but this whole thing is a fiasco.

NVIDIA has always been superior to Apple at scientific computing. In fact, nobody is even close to NVIDIA at all.
 
I love when people make their argument and then say "I'm not going to argue about this"

in any case, I suppose we recollect differently. I would say 15 years ago most people referred to it as a discrete gpu as opposed to an integrated gpu (such as intel hd) even though it had integrated memory

also "back then" intel's integrated gpus were on the same package or die as the cpu, just like today. it was a different time to be sure, but no so much in this instance

anyway, I'm not going to argue here about the technical or literal meaning of a word ;)
Lol, I never saw a Core 2 Solo or Core 2 Duo (like the 13” MacBook Pro we’re talking about) with graphics on the same die as the CPU. It wasn’t until the first i3 and i5 that Intel started to integrate (literally, this time) their GPUs into the CPU itself. But we’re talking about chips prior to that era.

If anything, the “integrated graphics” expression used back then probably referred to the chip being integrated into the motherboard or the chipset.

I just did a quick search for my old 13” 2010 MacBook Pro, and the first review I found, from Notebookcheck no less, one of the most renowned tech review sites, shows how “back then” people referred to it as integrated graphics. Was it literally integrated into the CPU? No, because that came later, with the new generation of Intel i3 and i5. But, again, we’re talking about technology previous to that generation.


[Attached screenshot: Notebookcheck review referring to the 2010 13” MacBook Pro’s GeForce 320M as integrated graphics]


I don’t really understand why it’s such a big deal if I (and professional reviewers) want to call it “integrated graphics”, when I honestly think most of us understand what we mean, regardless of where the GPU is.
 
  • Like
Reactions: nathansz
Apple's own GPUs still can't compete with AMD GPUs from 2 or 3 years ago on their own OS running their own Metal API
This isn’t true.

M4 Max is competitive with mobile nvidia offerings right now.

M4 Ultra will roughly tie or beat 4090 desktop in many applications.

Apple is making huge gains every year and 3d rendering is right now at parity with most of what AMD sells across the board.

The 5090 will be another story, but that card alone draws 600W, which is two Mac Pros at full tilt. The fact that they are even competitive at the power they draw is already impressive. I expect parity within 3 years, sooner if we see a quad chip or chiplet designs that offer 4x the Max GPU cores.

The shared memory pool on Apple Silicon also enables very large local AI models to run. Token speed isn’t as fast, but that isn’t really a problem when running locally; the fact that you can run the highest-quality models at all is a huge benefit, and not something possible on most PCs that cost under $10,000.
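To put rough numbers on that claim (a back-of-the-envelope sketch; the 20% overhead factor for KV cache and runtime is my own crude assumption):

```python
# Back-of-the-envelope memory estimate for running a model locally.
# Rule of thumb: bytes ≈ parameters × bits_per_weight / 8, padded ~20%
# for KV cache and runtime overhead (a crude assumption, not a benchmark).

def approx_model_gb(params_billion: float, bits_per_weight: int) -> float:
    weights_gb = params_billion * bits_per_weight / 8  # 1e9 params ≈ 1 GB per 8 bits
    return weights_gb * 1.2

for params, bits in [(8, 4), (70, 4), (70, 8)]:
    print(f"{params}B @ {bits}-bit ≈ {approx_model_gb(params, bits):.0f} GB")

# 8B  @ 4-bit ≈  5 GB -> fits on almost anything
# 70B @ 4-bit ≈ 42 GB -> fits in a 64/128 GB unified-memory Mac,
#                        but not in a 24 GB consumer GPU's VRAM
# 70B @ 8-bit ≈ 84 GB -> needs a 96 GB+ machine
```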

I suggest you read up on this further to get a better idea of what Apple is doing; once they implement ECC and offer 256+ GB, there will be some really impressive possibilities for local model development and operation.


All that said, I’d still like to see Apple support external partners for Graphics Cards, but I doubt it will happen.
 
Interesting to see this partnership. I'm waiting to see what improvements this brings to Apple software in the future.
 
  • Like
Reactions: mganu
Lol, I never saw a Core 2 Solo or Core 2 Duo (like the 13” MacBook Pro we’re talking about) with graphics on the same die as the CPU. It wasn’t until the first i3 and i5 that Intel started to integrate (literally, this time) their GPUs into the CPU itself. But we’re talking about chips prior to that era.

If anything, the “integrated graphics” expression used back then probably referred to the chip being integrated into the motherboard or the chipset.

I just did a quick search for my old 13” 2010 MacBook Pro, and the first review I found, from Notebookcheck no less, one of the most renowned tech review sites, shows how “back then” people referred to it as integrated graphics. Was it literally integrated into the CPU? No, because that came later, with the new generation of Intel i3 and i5. But, again, we’re talking about technology previous to that generation.


[Attached screenshot: Notebookcheck review referring to the 2010 13” MacBook Pro’s GeForce 320M as integrated graphics]

I don’t really understand why it’s such a big deal if I (and professional reviewers) want to call it “integrated graphics”, when I honestly think most of us understand what we mean, regardless of where the GPU is.

well done

I concede!

I think age is starting to eat away at my memory!
 
  • Like
Reactions: Populus
True. I still remember that I bought the last 13” MacBook Pro (2010) with an “integrated” Nvidia graphics chip, the GeForce 320M, whose performance remained strong for several generations (if I recall correctly, the Intel HD Graphics 3000 wasn’t as powerful).

It was born with Snow Leopard, and with each new iteration of OS X first, and macOS later, it ran better each year (I replaced the HDD with an SSD during the Yosemite era, which gave it a second life).

It was sold running its last supported operating system, macOS High Sierra. It was a great machine! My first Mac.
Your story sounds so similar to mine. I had the same machine. My first Mac, used it for over 10 years as my daily driver. A RAM and SSD upgrade saved it after Apple crippled it with an OS update. Also a new battery. I did some good gaming on that machine via Boot Camp.

I’ll never own a computer that good again. It truly was from a different Apple than the one today.
 
Your story sounds so similar to mine. I had the same machine. My first Mac, used it for over 10 years as my daily driver. A RAM and SSD upgrade saved it after Apple crippled it with an OS update. Also a new battery. I did some good gaming on that machine via Boot Camp.

I’ll never own a computer that good again. It truly was from a different Apple than the one today.
In my case it was 8 years, I think, but it was also my daily driver. And despite the limitations of that machine, and how burning hot it got sometimes, I also played games such as L4D or Portal 2. Yeah, a great machine. I can only hope the next owner has treated it as well as I did, or even better.
 
  • Like
Reactions: v0lume4