consider that if all your computers also had -zero- RAM you could go spend time in the mountains...

This is true.

I guess my point is that RAM is not the central factor in my equation.

Interestingly enough, I am going to spend time "in the mountains" next week, and I will also be taking my M4 MBA, along with a 4K LCD. (I am working through my Master's, so I need portability, as well as visibility for my ancient eyes.)

Though I really enjoy the volume of RAM that I currently (and technically) have available, I am fully confident that the 24GB of RAM in my MBA is more than sufficient to meet my demands.

To the thesis:

No: I do not really need 48GB of RAM.
 
So guys, 24GB won't be enough for my needs?
I would go for 48 because once you buy, you are stuck with whatever you chose.

If you end up needing more, then you have to junk that computer and get another one.

Good for Apple finances, not so much for you.
 
this is insane, the same people defending 8gigs of ram a few months ago are saying this guy needs 48gigs of ram lol

It's because he added a local LLM. Those things need huge amounts of RAM.

A project I had a little while back was to get a local LLM to read scenes from my story in turn, create an “image generation prompt”, send that to A1111 to generate the image, then insert that image into the scene document (“illustrating my story”). Running both LLM and StableDiffusion side-by-side wouldn’t have been practical on my 24GB MBP.

do you have a small app/script you wrote to do this or does your local LLM handle all of this?
 
I would go for 48 because once you buy, you are stuck with whatever you chose.

If you end up needing more, then you have to junk that computer and get another one.

Good for Apple finances, not so much for you.

there is a $400 price difference between the 16GB and 32GB models 🥲
 
do you have a small app/script you wrote to do this or does your local LLM handle all of this?
At the time, I knew very little (I come from a web/PHP background) and I wanted to learn more about LLMs and Python. So I was guided a lot by online AI (Grok and ChatGPT in the beginning; Claude and Kimi K2 later) as I kind of bumbled my way through figuring out what is possible and how to do it.

So it’s a Python script that handles all the legwork, but you need something like LM Studio and A1111 installed to do the heavy lifting.
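For anyone curious, the glue layer can be quite small. Here is a minimal sketch of that kind of script, assuming LM Studio is serving its OpenAI-compatible API on localhost:1234 and A1111 is running with `--api` on localhost:7860; the helper names and the system prompt are my own, not the poster's actual code:

```python
import base64
import json
import urllib.request

LLM_URL = "http://localhost:1234/v1/chat/completions"  # LM Studio's OpenAI-compatible endpoint
SD_URL = "http://127.0.0.1:7860/sdapi/v1/txt2img"      # A1111 launched with --api

def build_llm_payload(scene_text):
    """Ask the local LLM to turn a scene into an image-generation prompt."""
    return {
        "messages": [
            {"role": "system",
             "content": "Summarize the scene as a single image-generation prompt."},
            {"role": "user", "content": scene_text},
        ],
        "temperature": 0.7,
    }

def build_sd_payload(prompt):
    """Minimal txt2img request body for the A1111 web API."""
    return {"prompt": prompt, "steps": 25, "width": 512, "height": 512}

def post_json(url, payload):
    """POST a JSON payload and decode the JSON response."""
    req = urllib.request.Request(
        url, data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def illustrate_scene(scene_text, out_path):
    """Scene text -> LLM prompt -> A1111 image file (both servers must be running)."""
    reply = post_json(LLM_URL, build_llm_payload(scene_text))
    prompt = reply["choices"][0]["message"]["content"]
    images = post_json(SD_URL, build_sd_payload(prompt))["images"]
    with open(out_path, "wb") as f:
        f.write(base64.b64decode(images[0]))  # A1111 returns base64-encoded PNGs
    return prompt
```

Looping `illustrate_scene` over each scene in the story, then pasting the resulting image back into the document, is the whole workflow in miniature.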
 
A project I had a little while back was to get a local LLM to read scenes from my story in turn, create an “image generation prompt”, send that to A1111 to generate the image, then insert that image into the scene document (“illustrating my story”). Running both LLM and StableDiffusion side-by-side wouldn’t have been practical on my 24GB MBP.
I don't see why you'd need to run them side-by-side, when it should work just as well to run them sequentially.

The first pass reads the entire story and produces all the "image generation prompts" in a text file. It could be annotated, so later passes can easily parse the divisions between elements, and also be properly guided about where to put the image in the original story.

The second pass reads the prompts from the text and produces the images.

The third pass reads the annotations from the text and inserts the images into the story.

If there are enough resources, you can run them as a pipeline. If not, you can run them sequentially.
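The annotated intermediate file is the key to making the passes independent. A sketch of what pass one might emit and how later passes could parse it (the `### scene:` marker format is invented here purely for illustration):

```python
# Pass 1 writes one block per scene; passes 2 and 3 only ever read this file,
# so the LLM and Stable Diffusion never need to be loaded at the same time.
# Hypothetical format: a "### scene: <id>" marker line, then the prompt text.
SAMPLE = """\
### scene: 1
A lone lighthouse on a storm-battered cliff, oil painting.
### scene: 2
Two travellers sharing tea inside a canvas tent, warm lantern light.
"""

def parse_prompt_file(text):
    """Return a list of (scene_id, prompt) pairs from the annotated file."""
    entries = []
    scene_id, lines = None, []
    for line in text.splitlines():
        if line.startswith("### scene:"):
            if scene_id is not None:
                entries.append((scene_id, "\n".join(lines).strip()))
            scene_id, lines = int(line.split(":", 1)[1]), []
        else:
            lines.append(line)
    if scene_id is not None:
        entries.append((scene_id, "\n".join(lines).strip()))
    return entries
```

The scene IDs double as the annotations the third pass needs to know where each image goes back into the story.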
 
there is a $400 price difference between the 16GB and 32GB models 🥲
That’s a problem for us all at the moment - RAM for the Mac Pro is now quite expensive as well. I am not memory-limited at the moment, but I did want to bring one of my machines up to the maximum amount, and the price is shocking.

I would suggest you grit your teeth and just take that extra outlay in cost, because it might get worse later.
 
So I was guided a lot by online AI (like Grok, and ChatGPT in the beginning; Claude and Kimi K2 later) as I kind of bumbled my way through figuring out what is possible and how to do it.

I leave this here for your (and others') enjoyment:

 