
nph

macrumors 65816
Original poster
Title says it all, but also: if you have, how much memory did you get?
I just ordered a 13-inch M5 with 24GB and hope to run LLMs locally.

Thanks

/Peter
 
I do on my M4 with 24GB. Not for anything serious, but for learning, fun, and to be able to have select AI conversations stay local.

No problems at all.

M5 should be 15% to 25% quicker.
 
Yes. I have a new 24GB/1TB M5 13" MBA. I loaded qwen2.5-coder:14b for a trial and it was impressive. It ran as fast as my honking big AI PC with a Radeon 7900 XT 20GB. But... it almost instantly got hotter than I have ever felt a laptop reach, and I mean putting a big-fan i7 or i9 laptop to shame for heat. No doubt it immediately throttled, although I didn't check, but that was my last attempt at an LLM on that box. I didn't buy it for AI and have other equipment for that use. IMO, the Air is the wrong box for heavy-duty AI, even if it performs well, and I have no desire to burn up a really nice laptop that I need for other uses.

An M5 Mini or Mini Pro should be a super LLM box when they come out, just because of the presence of a fan.
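For scale, here's a rough back-of-envelope check of why a 14B model fits comfortably in 24GB of unified memory. The numbers are my own rule-of-thumb assumptions (roughly 0.5 bytes per parameter at 4-bit quantization, plus ~20% overhead for the KV cache and runtime buffers), not measurements from any specific runtime:

```python
# Rough rule-of-thumb estimate of RAM needed to run a quantized LLM.
# Assumptions (mine, not benchmarked): 4-bit (Q4) weights ~0.5 bytes/param,
# plus ~20% overhead for KV cache, activations, and runtime buffers.

def model_memory_gb(params_billions: float,
                    bytes_per_param: float = 0.5,
                    overhead: float = 0.20) -> float:
    """Approximate RAM needed to run a model, in GB."""
    weights_gb = params_billions * bytes_per_param  # 14B * 0.5 B/param ~= 7 GB
    return weights_gb * (1 + overhead)

# qwen2.5-coder:14b at Q4 on a 24 GB machine:
need = model_memory_gb(14)
print(f"~{need:.1f} GB needed; fits in 24 GB unified memory: {need < 24}")
```

By this estimate a Q4 14B model wants roughly 8-9GB, which is why it runs fine on a 24GB machine; the bottleneck on the Air is sustained thermals, not memory.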
 
Periodically I run local AI tasks on my M2 MBA when my MBP is busy with other things. It gets pretty toasty, and I imagine the M4 gets hotter. AI heats it up far faster than gaming. I have a chill pad with active fans for it, and sometimes mount a desktop fan to blow across the top, which helps a lot in moving heat away and keeping speeds up during long tasks. Without the chill pad or fan, I can game on it for 45-60 minutes before I see even a 10% FPS drop, but only 10-15 minutes when running AI models.
 