(I realize this is a generic post without exact sizes / numbers - apologies).

I'm probably a medium user on my 13" MBP 2017. I did shell out extra for 16GB of RAM, but only because I plan on keeping this machine for 16 years. My 11" MBA 2015 had 4GB of RAM and I never got its memory pressure to yellow with my usage. My 2011 Mac Mini had 8GB of RAM and, same, even with tons of stuff open, never got to yellow or red - even with StarCraft running.

The cached RAM seems to vary; I've noticed gigs and gigs of stuff there even with light usage. My MBP 2017 always seems to have gigs used and gigs cached - but the memory pressure graph sits around 10%. I maybe got it to 30% with a ton of stuff opened intentionally.

The most RAM I saw used was when I was using Adobe Lightroom to work on some 150GB of photos and videos. I wish I had taken screenshots, but even then the pressure didn't get above 60%.

That has me convinced this laptop should easily last me, with my usage, for 10 years. I'd be happy with 8GB of RAM if I had to be.

16 years, did I read that right? o_O
 
Well, we were discussing in context of MacBook Pro and RAM, which makes your case the same as comparing an iMac Pro for video to a 60-machine render farm.

I'll reiterate: by the time you get to 3,000-5,000 voices, the CPU on a MacBook Pro is long overtasked (it's probably overtasked already at 2,000-4,000 voices).

5000 simultaneous voices? Even if you divide the orchestra into *every single instrument* and layer it 4 times, you're nowhere near that number. Every legato patch cuts and crossfades, and you usually have sections, not every single instrument separately (because that makes no sense). If someone needs 5000 stereo voices simultaneously, they may need to rethink their approach to orchestration - and "killing voices" is a thing (especially when the voices are way below the threshold of being heard over the newly played sample).

I know a composer working in Hollywood, and the workflows are vastly different. Crazy amount of RAM was very common a couple of years ago when disks were slow and fast disks were expensive. But audio needs didn't increase all that much, and disks got crazy fast in the last few years.

What do VSE or Bidule have to do with trivial or not? They're sample management/chaining tools. RAM usage in NI Kontakt (which is still one of the best sampler engines) differs enormously depending on whether you set the prebuffer to 16KB or to the 60KB default.

Most composers have all samples loaded to begin with (even if they don't use them), but now disk speeds are fast. NVMe SSD latency is in the microseconds; audio latency in such a system is never that low (it's not even possible).

And samples are streamed from disk. If you have 3 orchestras loaded, that's usually 350GB of data, so there's no way everything fits into RAM (a single piano alone is 50GB, and that's conservative - some are even 120GB). How much you fit into RAM and how much you stream from disk depends wholly on your drive's performance... And as drive speeds increase, the need for RAM in audio decreases.

So this is where you were wrong initially. Not only do you not NEED all that sample data in RAM, you never even have it there, because it's inefficient and nearly impossible for huge templates.
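To make the prebuffer point concrete, here's a back-of-envelope sketch of how a Kontakt-style preload buffer scales: the RAM cost is roughly the number of loaded sample zones times the per-zone prebuffer size. The 50,000-zone figure is a hypothetical round number for a large template, not a measurement of any real library.

```python
# Illustrative only: preload RAM for a streaming sampler scales with the
# number of loaded sample zones times the per-zone prebuffer size.
# The zone count below is a made-up round number, not a real template.

def preload_ram_mb(zones: int, prebuffer_kb: int) -> float:
    """RAM used just for per-zone preload buffers, in MB."""
    return zones * prebuffer_kb / 1024

zones = 50_000  # hypothetical large orchestral template
print(f"60KB prebuffer: ~{preload_ram_mb(zones, 60):.0f} MB")  # ~2930 MB
print(f"16KB prebuffer: ~{preload_ram_mb(zones, 16):.0f} MB")  # ~781 MB
```

The same template needs nearly 4x less preload RAM at the smaller setting, which is why the prebuffer value matters more than the library's size on disk.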

Well, I do have to say that your argument is fascinating, if a little 'assertive' and cantankerous!
It's clear to me, as an established long-term producer who has been involved in several major projects (films, video games and events) and has produced widely used music software, that no-one really has a clue exactly how much RAM is needed for large sample libraries. I use massive 100GB phrase-based libraries that aren't multisamples but phrases that need to be pitched, played, mic-positioned and altered in real time. Often they spike a single core on my machines with less than 32GB. Having said that, my RAM pressure is always in the green. Same project, same SSDs, same CPU, but more RAM in another studio and voila! No spikes.

I'm no coder, but my immediate network includes members of the Logic team in Cupertino, and they give me conflicting answers too!

I think it may be time to do an actual test with identical 2018 MBPs with varying RAM to see how much effect RAM has on these libraries. Irrespective, I can afford the extra 16GB, so I will sleep better at night : )
 
Hmmmm... You all are really making me second-guess getting 32 gigs... but I think it's needed. Here are my graphs from iStat for the last 30 days.
I have been traveling for the last 8-10 days or so and haven't been doing any hard work like I usually do.
What are your thoughts?
I don't really look at iStat as much as I should, but I can tell just from how the machine is behaving that I need to close this app or that app, etc.

[Attached: two iStat screenshots, 2018-07-25]
 
Well, I do have to say that your argument is fascinating, if a little 'assertive' and cantankerous!
It's clear to me, as an established long-term producer who has been involved in several major projects (films, video games and events) and has produced widely used music software, that no-one really has a clue exactly how much RAM is needed for large sample libraries. I use massive 100GB phrase-based libraries that aren't multisamples but phrases that need to be pitched, played, mic-positioned and altered in real time. Often they spike a single core on my machines with less than 32GB. Having said that, my RAM pressure is always in the green. Same project, same SSDs, same CPU, but more RAM in another studio and voila! No spikes.

I'm no coder, but my immediate network includes members of the Logic team in Cupertino, and they give me conflicting answers too!

I think it may be time to do an actual test with identical 2018 MBPs with varying RAM to see how much effect RAM has on these libraries. Irrespective, I can afford the extra 16GB, so I will sleep better at night : )
Thanks! Yeah I've been through this discussion a couple of times already and can come across a little feisty.

The problem is that it depends so much on storage that it's hard to rule anything out, and SSD performance changes as the drive fills up. But the "you need RAM" argument is as old as when RAM was only 3000MB/s fast. :p And libraries, while bigger, aren't 10 times as big.

Oh yeah, I will get 32GB as well.
 
If the nature of your work changes so much in two or three years that the lack of 32GB suddenly impedes your ability to function, you most likely will need to upgrade the other aspects such as CPU and storage as well. That means a new machine. Which you'll have more money for then if you don't waste it now on "maybes"... :D

Not necessarily. We are at an interesting time right now, but between 2011 and 2016-2017, CPU advancement was pretty stagnant.

Your high-end CPU from 2011 wasn't a lot different from the high end of 2016-2017. Maybe 25% different.
 
I still never have any issues with just 8GB of RAM. However, I just do some occasional Photoshop and such. I can't imagine a scenario where I would need 16GB of RAM. This varies very much from person to person.
 
RAM is such a misunderstood topic, with a lot of opinions dating back 10-20 years. What you need, what you want, what is nice to have, what is useful, are all different criteria.

If you open Activity Monitor and see it's using 14GB out of 16GB of RAM, you might assume you should get more. However, as has been stated, the system will use all the RAM it can so as not to waste resources. If it's using 12GB out of 16GB, then it is trying everything it can and still cannot use more than 12GB.

Memory pressure is the key to all of this; it is the metric for how stressed the RAM is becoming. Once it reaches 100%, that's it, but that's a pretty large headroom considering most people here are seeing 30-40% pressure. The scratch disk will always be used regardless of RAM; it's more efficient to store certain things there than to hold them in RAM. The system will always try to keep RAM available over everything else, so that when you open a new file or something, it is ready. This speeds up the entire system and is good.

I've got all sorts running on this iMac; it's got 16GB of RAM. Working on projects, I may have open Illustrator, Photoshop, InDesign, SketchUp, AutoCAD, Cinema 4D, KeyShot, ProtoPie, Keynote, PowerPoint, Word, dozens of tabs and other smaller apps. These tend to be large files for big design projects, and the most I've seen my memory pressure get to is 77%; usually it stays around 50%.

So that means I could use 32GB, as I am occasionally getting close to the limit. But in my case, all that hitting the limit means is that I'd have to close down a few applications, which is a far more efficient and correct way of working anyway. We're all lazy, and we all find it easier to have everything spread out on the desk in front of us rather than organized, but that's just the way it is. Realistically I'm only using about 8GB, but on 8GB this machine, and my work pattern, would struggle. 16GB is enough. I would like 32GB, but today it wouldn't make any significant difference to the use of the machine; possibly in a couple of years, but at that point we'll just upgrade it, as the CPU differences will be worth it.

To all those asking or getting confused by it all, literally just look at the memory pressure graph in Activity Monitor, try looking whilst in the middle of common tasks. If it is below 50%, you are absolutely fine. Do not look at the memory being used, swap, or anything of the sort - yes it is important, but far too complicated and unimportant when making RAM decisions unless you're a computational scientist or something. If your machine is currently 16GB and is on 40% memory pressure, then getting 32GB will just reduce that to 20% memory pressure, that's the simplest way of thinking about it.
 
I know a composer working in Hollywood, and the workflows are vastly different. Crazy amount of RAM was very common a couple of years ago when disks were slow and fast disks were expensive. But audio needs didn't increase all that much, and disks got crazy fast in the last few years.

I was discussing this with colleagues recently, we're all designers, graphics, video, audio pros.

Seems our aluminum Mac Pros all used to need 64GB+ to even function at a basic level, and would get pegged when working on huge Photoshop files with 100 layers and such.

When I first switched to MBPs around 2012, graphics apps felt slow for a time, but I've been doing quite fine doing all the same work on modern machines with 8-16GB.

That said, I ordered a 2018 15" with 32 GB for the extra few hundred bucks because... why not. I make money off the machine and there's no downside.
 
But 1,000 samples isn't an especially high number in certain use cases. For an entire orchestra, try 3,000-5,000 simultaneous voices. Then add continuous cross-fading between samples. The difference in latency between RAM and NVMe storage at low buffers starts to become obvious. And problematic.

Another thing to consider is that some Kontakt sample library makers such as Orchestral Tools have their own scripts to make real-time adjustments to samples (in this case called “Capsule”) and the script itself uses quite a bit of RAM per instance, in addition to the sample size. I suppose you could say “then don’t use those sample libraries that require so much RAM” and you’d probably be right.

What do VSE or Bidule have to do with trivial or not? They're sample management/chaining tools. RAM usage in NI Kontakt (which is still one of the best sampler engines) differs enormously depending on whether you set the prebuffer to 16KB or to the 60KB default.

To an extent, the amount of sample data loaded into RAM can be mitigated. I usually have my buffer in Kontakt set even lower than that, at 6KB. But there's no way I can load a full orchestra with all the necessary articulations for each instrument on my 2012 quad-core i7 Mac Mini with 16GB of RAM. Samples are streamed from an SSD in an external USB 3.0 enclosure. Logic Pro X can freeze tracks, which frees up processor usage, but it doesn't remove instruments from RAM. I've read that Cubase can do this, though.

However, the question now is could I fare much better with the same 16GB on the 2018 MBP? Or would I be better off springing for the 32GB?
 
....

To all those asking or getting confused by it all, literally just look at the memory pressure graph in Activity Monitor, try looking whilst in the middle of common tasks. If it is below 50%, you are absolutely fine. Do not look at the memory being used, swap, or anything of the sort - yes it is important, but far too complicated and unimportant when making RAM decisions unless you're a computational scientist or something. If your machine is currently 16GB and is on 40% memory pressure, then getting 32GB will just reduce that to 20% memory pressure, that's the simplest way of thinking about it.


It should be said that memory pressure is extremely non-linear with actual active memory usage. That bold statement there is wrong. If you're at 40% memory pressure on 16GB, you're probably running 14GB of active RAM, just kind of a guess, but approximately. 14GB of active RAM on a 32GB machine is ~0% memory pressure. And typically, what that memory pressure more closely represents is how much memory that would ideally be in RAM is actually being swapped. So once you start climbing to high pressures, the responsiveness of your computer is going to be tanking as IO gets saturated with swapping activities. Now, closing an app or two could help, but it depends. Are you already running a lean profile or not? If so, just closing safari isn't going to really do a ton. And what it means is that while your computer is working, you can't do anything else. That's not very useful.
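A toy model may help here. macOS doesn't publish how it computes memory pressure (it's derived from free pages, compression and swap activity), so the numbers below are invented purely to illustrate the shape of the curve: pressure stays near zero while there's headroom, then climbs fast once the working set approaches physical RAM.

```python
# Toy model only - NOT Apple's formula. The 25% "knee" is invented; the
# point is just that pressure is flat with lots of headroom and climbs
# sharply near the limit, so doubling RAM doesn't simply halve pressure.

def toy_pressure(active_gb: float, total_gb: float) -> float:
    """Hypothetical pressure (0.0-1.0): zero until free headroom drops
    below 25% of total RAM, then rising linearly."""
    headroom = max(total_gb - active_gb, 0.0) / total_gb
    return max(0.0, 1.0 - headroom / 0.25)

print(toy_pressure(14, 16))  # 0.5 - only 2GB headroom on a 16GB machine
print(toy_pressure(14, 32))  # 0.0 - the same load on 32GB: no pressure
```

The exact curve is made up, but the asymmetry is the point: the same 14GB working set reads as significant pressure on 16GB and as essentially none on 32GB.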
 
I run some large hardware simulations that can sometimes need over 16GB. That's the main reason I got the 32GB.

However, I do notice during normal operation that the MBP will consume all of the 32GB with cached files and that it does much less memory compression than last year's 16GB model. Does 32GB make the MBP feel faster? I think so, but it's hard to say how much and I'm too lazy to investigate.
 
How do you see percentage points of memory pressure, or are you just calculating with some figures?
 
A couple of points to add to the discussion:

  • People expect RAM usage to grow the same way as it has in the past, but apps are not really increasing the RAM requirements for the same tasks anymore. Same with the OS. When RAM requirements increase now it's because of data.
  • One recent reason why RAM requirements grew (slightly) was the switch from 32-bit to 64-bit apps. We've taken that hit now. It's not going to happen again within the lifetime of current MBPs.
  • Historically, one reason for increasing RAM requirements has been that CPUs kept getting faster at incredible rates. As they got faster, we were able to solve larger problems, and larger problems required more RAM. But the CPUs used in MBPs have not been getting significantly faster over the past 5-6 years. Somewhat, but nowhere near the rates seen previously.
  • People forget that RAM is slow. For more RAM to be really useful, it needs to get faster. Relative to the CPU, RAM is extremely slow. Currently, modern computers can access all of 16GB of RAM in well under a second. If the same computer had 128GB of RAM, it would take over 3 seconds to access all of it. For the extra RAM to become really useful, the CPU needs to be able to access it faster. But RAM isn't really getting faster right now. High-end desktop systems use four memory channels to get around this; servers use eight. The MBP is limited to two channels. It doesn't help to have more RAM if the CPU can't access it in time.
  • Comparing 16GB on a 13" to 16GB on a 15" is different because of the dGPU. The iGPU will use some of the system RAM for graphics, whereas the dGPU has its own dedicated RAM. The same could apply if you're comparing an old MBP against a new one.
  • A lot of apps are really inefficient with respect to RAM usage. This doesn't really help non-programmers, but I know from experience with a lot of algorithms from a lot of fields that the vast majority of them focus on gaining performance while either using more memory or keeping it constant. It's rare to see algorithms that sacrifice a small bit of performance in order to gain memory efficiency. There is a lot that app makers could do to make more efficient use of memory, or of system resources in general. It's possible that, with memory prices having been as high as they are for a few years now, app makers will actually start focusing on memory efficiency a bit more.
All of these points mean that RAM requirements should not be expected to grow in a linear fashion going forward. It's very reasonable to expect memory requirements to grow more slowly for a while, and for 16GB to actually last quite a bit longer than people generally expect. There are indeed use cases that require or benefit from 32GB (or more), and some have been presented here, but they are really a lot more rare than people think. And if you have one of those use cases, you generally know it already. Virtual machines are one example where it's somewhat hard to get around the memory requirements. I've understood music production to be another. Bioinformatics could be another, and there are more as well. But generally, if 16GB is good for you now and your work doesn't significantly change, then it's likely to be good in 2-3 years as well.
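The "RAM is slow" bullet above checks out with simple arithmetic. The ~40GB/s figure is an assumed ballpark for a dual-channel laptop memory system, not a spec for any particular MBP.

```python
# Back-of-envelope: time to stream through all of RAM once.
# 40 GB/s is an assumed dual-channel laptop ballpark, not a measured spec.

def scan_time_s(ram_gb: float, bandwidth_gb_per_s: float) -> float:
    """Seconds for the CPU to touch every byte of RAM at full bandwidth."""
    return ram_gb / bandwidth_gb_per_s

BW = 40.0  # GB/s, assumed
print(scan_time_s(16, BW))   # 0.4  -> "well under a second"
print(scan_time_s(128, BW))  # 3.2  -> "over 3 seconds"
```

With the same two memory channels, eight times the RAM simply takes eight times as long to sweep, which is the sense in which extra capacity without extra bandwidth has diminishing returns.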
 
  • But generally, if 16GB is good for you now and your work doesn't significantly change, then it's likely to be good in 2-3 years as well.

For the sake of discussion, what about 8 or 10 years down the road...? If the butterfly keyboard can survive even 8... :D
 
It should be said that memory pressure is extremely non-linear with actual active memory usage. That bold statement there is wrong. If you're at 40% memory pressure on 16GB, you're probably running 14GB of active RAM, just kind of a guess, but approximately. 14GB of active RAM on a 32GB machine is ~0% memory pressure. And typically, what that memory pressure more closely represents is how much memory that would ideally be in RAM is actually being swapped. So once you start climbing to high pressures, the responsiveness of your computer is going to be tanking as IO gets saturated with swapping activities. Now, closing an app or two could help, but it depends. Are you already running a lean profile or not? If so, just closing safari isn't going to really do a ton. And what it means is that while your computer is working, you can't do anything else. That's not very useful.

As noted in the part you didn't bold, it's a simple way of thinking about it. I believe I mentioned it was more complicated than I stated; however, to help people who don't understand, it's a simplistic way of achieving a relatively decent understanding of personal requirements. Of course they could go into frequency, IO, buses and all sorts, but my post was aimed at helping those who don't actually need all of that to make a purchasing decision.

16GB is still a lot of RAM! Just because "my 7-year-old machine had 16GB, therefore I need double" doesn't make it true. Your 7-year-old machine was actually sold with, say, 4GB of RAM, and you upgraded it to 16GB in the not-so-distant past - so today 16GB is fine. 32GB is a huge amount of RAM, and it's a great upgrade option for those who need it, but anyone even pondering whether they need it or not really doesn't need it.
 
For the sake of discussion, what about 8 or 10 years down the road...? If the butterfly keyboard can survive even 8... :D
:)

For the sake of discussion, if Apple RAM upgrade prices were half of what they currently are, and you're planning to keep the machine for 10 years, then I'd say get the 32GB and not worry about it. With prices as they are, you'd be much better off getting a 16GB machine now and upgrading to a 32GB machine after 5-6 years. If your need is 16GB now, it only really makes sense to get a 32GB machine now if you are likely to need the extra memory within the next 12 months.

Don't get me wrong though, I'm not generally against more RAM. I'm not generally against people wasting money on useless upgrades either. And for people with infinite money, just pile it on, no problem. It's only when those upgrades are massively overpriced, and price sensitive people ask in forums whether it's worthwhile to pay the asking price for it, that I start challenging the widespread idea that you need more than you really do.
As noted in the part you didn't bold, it's a simple way of thinking about it. I believe I mentioned it was more complicated than I stated; however, to help people who don't understand, it's a simplistic way of achieving a relatively decent understanding of personal requirements. Of course they could go into frequency, IO, buses and all sorts, but my post was aimed at helping those who don't actually need all of that to make a purchasing decision.
Well put. Memory in modern computers is far, far more complicated than people think. It's far more complicated than even the YouTubers who give reviews and advice to regular users realize. And frankly, computers in general are a lot more complex than people think. They are also much faster than people know. A current MBP has computing power similar to a top-end supercomputer from 15 years ago. We do know how to make use of a lot more computing power today than 15 years ago, but it's very questionable whether most users can make full use of a laptop supercomputer on their own.
 
People were defending 2 GB not long ago. Doesn't mean they're right to do so.

You may or may not need the RAM, but if you do in the future, you're boned.

Not that long ago? I'm sorry, but 15 years ago is a long time ago.

I got 16GB six years ago. All this time later, 16GB is still more than enough.

Hell, even gamers don't *need* more than 8GB. Applications simply aren't bloated and clunky enough to need that much memory, and we can only realistically do so many things at once. We aren't limited by RAM; the RAM is limited by humanity.
 
Another thing to consider is that some Kontakt sample library makers such as Orchestral Tools have their own scripts to make real-time adjustments to samples (in this case called “Capsule”) and the script itself uses quite a bit of RAM per instance, in addition to the sample size. I suppose you could say “then don’t use those sample libraries that require so much RAM” and you’d probably be right.



To an extent, the amount of sample data loaded into RAM can be mitigated. I usually have my buffer in Kontakt set even lower than that, at 6KB. But there's no way I can load a full orchestra with all the necessary articulations for each instrument on my 2012 quad-core i7 Mac Mini with 16GB of RAM. Samples are streamed from an SSD in an external USB 3.0 enclosure. Logic Pro X can freeze tracks, which frees up processor usage, but it doesn't remove instruments from RAM. I've read that Cubase can do this, though.

However, the question now is could I fare much better with the same 16GB on the 2018 MBP? Or would I be better off springing for the 32GB?

Kontakt scripts shouldn't increase RAM usage much; they load the CPU (and disk) instead. All the samples that are used are already loaded anyway, scripts just manipulate them, and the scripts themselves are pretty small.

USB 3.0 doesn't say much by itself, but you've already bottlenecked your drive to an extent. I said NVMe in my original post, which is what I literally meant: connected via TB3 or PCIe. Most if not all USB3 enclosures are SATA (which caps the drive at ~500MB/s), and USB has much more latency than TB3 or native PCIe, which means you need a bigger prebuffer (and audio buffer) because the drive cannot react as fast.
Good NVMe drives have latency well under a millisecond. SATA3/M.2, not so much, especially not via USB3.

I'm pretty certain that with a fast (2GB/s) NVMe drive connected via TB3 or PCIe, you could get away without loading anything into RAM at all (Samsung NVMe drives do ~14,000 IOPS for random R/W).
For the same cash that 128GB of RAM costs, you could get 4x 256GB NVMe drives.
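To put rough numbers on the prebuffer/latency relationship: the prebuffer determines how long a voice can keep sounding while it waits for the drive to deliver the next chunk, so low-latency storage is what makes small prebuffers viable. The stream rate below assumes 48kHz, 24-bit stereo samples - an illustrative assumption, not a Kontakt internal figure.

```python
# Sketch: how long a voice can play from its prebuffer alone.
# Assumes 48kHz, 3 bytes/sample (24-bit), 2 channels = 288,000 B/s/voice.

BYTES_PER_SEC = 48_000 * 3 * 2

def coverage_ms(prebuffer_kb: int) -> float:
    """Milliseconds of playback a per-voice prebuffer covers before the
    drive must deliver the next chunk."""
    return prebuffer_kb * 1024 / BYTES_PER_SEC * 1000

print(f"{coverage_ms(60):.0f} ms")  # 60KB default: generous headroom
print(f"{coverage_ms(16):.0f} ms")  # 16KB: needs a faster-reacting drive
print(f"{coverage_ms(6):.1f} ms")   # the 6KB setting mentioned elsewhere
```

A sub-millisecond NVMe drive leaves enormous margin even at 6KB, while a higher-latency USB/SATA path eats into it - which is why slower storage pushes you toward bigger prebuffers, and therefore more RAM.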

My point: not all SSDs are made equal, and I was pretty clear in my original post when I said NVMe, which excludes SATA and USB3. Since you can never load everything into RAM (because it's simply too much data), I think we can agree that how much RAM is needed is directly affected by the performance of the drive, which was my point from the beginning.

But nobody really experiments with that; samplers still work the same way they did 10 years ago, when drives did 70MB/s and 100 IOPS...

The question you have is simple: you probably would fare better (especially using the internal drive), but either way, get 32GB :D It's not that much more expensive, it's soldered, and in every scenario it will offer better performance.

You are correct about Cubase, and yeah, that's a mandatory function for efficiently running large templates.

First answer is a good analysis on the topic of SSDs:
https://pcpartpicker.com/forums/topic/243099-sata-iii-ssd-vs-nvme-ssd



I was discussing this with colleagues recently, we're all designers, graphics, video, audio pros.

Seems our aluminum Mac Pros all used to need 64GB+ to even function at a basic level, and would get pegged when working on huge Photoshop files with 100 layers and such.

When I first switched to MBPs around 2012, graphics apps felt slow for a time, but I've been doing quite fine doing all the same work on modern machines with 8-16GB.

That said, I ordered a 2018 15" with 32 GB for the extra few hundred bucks because... why not. I make money off the machine and there's no downside.
As far as I know, you can set where the scratch disk is for Adobe Photoshop, and that affects its performance a lot.
It depends on which year those Macs are: the 2008 models, for example, have DDR2-800, which in an ideal scenario gets about 6GB/s. (Newer ones are better, but all aluminum Mac Pros are hampered by old SATA drive bays.)

That said, getting 32GB is obviously a better option if you can afford it - soldered RAM is terrible.
 
  • People expect RAM usage to grow the same way as it has in the past, but apps are not really increasing the RAM requirements for the same tasks anymore. Same with the OS. When RAM requirements increase now it's because of data.

I couldn't disagree more. Many apps are moving towards an Electron model (essentially a standalone web browser for every app), which consumes memory like there's no tomorrow.
 
USB 3.0 doesn't say much by itself, but you've already bottlenecked your drive to an extent. I said NVMe in my original post, which is what I literally meant: connected via TB3 or PCIe. Most if not all USB3 enclosures are SATA (which caps the drive at ~500MB/s), and USB has much more latency than TB3 or native PCIe, which means you need a bigger prebuffer (and audio buffer) because the drive cannot react as fast.
Good NVMe drives have latency well under a millisecond. SATA3/M.2, not so much, especially not via USB3.
It would be nice - do you know of any product that lets you connect NVMe drives via TB3? I've never seen one. Technically, you also wouldn't saturate USB 3.1 unless you did NVMe RAID, which I absolutely haven't seen available in portable devices.

People do this in desktops, though. You can get 4-way NVMe RAID, which is starting to approach the limits of PCIe. Insanely fast data transfer rates, though probably no better for latency. This would technically be possible over TB3 as well. It would be kinda cool if Apple offered such a peripheral. But sadly they don't, and I don't know that anyone does.
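Rough numbers behind that. The per-drive throughput of ~3.5GB/s is an assumed PCIe 3.0 x4-class figure, and the link ceilings are nominal, not measured.

```python
# Ballpark figures only: per-drive rate assumes a PCIe 3.0 x4-class NVMe SSD.
PCIE3_X16_GBPS = 15.75  # nominal PCIe 3.0 x16 payload ceiling, GB/s
TB3_GBPS = 4.0          # ~32 Gbit/s of a TB3 link is usable for PCIe data

per_drive = 3.5         # GB/s, assumed per NVMe drive
raid_total = 4 * per_drive

print(raid_total)             # 14.0 - close to the x16 slot ceiling
print(raid_total > TB3_GBPS)  # True - a TB3 enclosure would cap well below this
```

So a 4-way array genuinely approaches the x16 slot's limit in a desktop, while the same drives behind TB3 would be throttled to the link rate (latency, though, stays roughly per-drive either way).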
 
I couldn't disagree more. Many apps are moving towards an Electron model (essentially a standalone web browser for every app), which consumes memory like there's no tomorrow.
True, though for me that falls under the point about programmers making inefficient use of memory. For the user it may not matter, though, if they don't have a choice between an efficient and an inefficient app that does the same thing. I'm not sure how it really affects total performance, since you're not really increasing your working set. Maybe you know.

At some point, I think programmers will need to start considering efficient use of memory. Right now the model is basically to use more memory in order to save coding time. Which is understandable, it's just not something I find appealing.
 
It's a fairly compelling argument to (almost) write once and (almost) run everywhere. I don't think this reasoning will change anytime soon.

That said, I'm not sure this direction is sustainable. Spotify taking 1.5GB of memory just to play a song from disk? Egregious.
 