Out of interest, are you running Minikube or a multi-node K8s cluster (i.e., with Vagrant) on your machine? And when running multiple VMs at the same time, how does the machine cope in terms of fan noise and thermals?

I have the same use case and really want to go for the i9 and 64GB, but I've held back because of all the talk about loud fan noise on this spec. Although I'm not intending to use a 4K external monitor, which seems to cause many of the complaints about fans, my main concern is having to shut down VMs to stop the machine sounding like a jet engine... which clearly won't be ideal when you pay for all those extra cores and RAM.
I’m actually using MicroK8s through Multipass. Basically, it’s Canonical’s K8s stack. It’s fantastic and I highly recommend it.

The VMs I tend to run at the same time are: Ubuntu (for MicroK8s), Windows 10 (for various things; VMware, Boot Camp partition), Docker's VM (it used to be called Docker Machine; now it doesn't have a name and is lower level), and another Ubuntu VM (with a desktop environment, also VMware) for various things. I run a single Kubernetes node locally because for the work I do, I don't care about the underlying implementation of how my pods are being provisioned, plus I don't need the same kind of redundancy and fault tolerance I would expect in a production environment.
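For anyone curious, a minimal sketch of that kind of setup, assuming Multipass is already installed. The VM name and resource sizes are arbitrary choices, and flag spellings vary between Multipass versions, so check `multipass launch --help` first:

```shell
# Launch an Ubuntu VM (name/sizes are illustrative, not recommendations).
multipass launch --name microk8s-vm --memory 8G --disk 40G

# Install MicroK8s inside the VM and wait for it to come up.
multipass exec microk8s-vm -- sudo snap install microk8s --classic
multipass exec microk8s-vm -- sudo microk8s status --wait-ready

# Verify the single-node cluster is registered.
multipass exec microk8s-vm -- sudo microk8s kubectl get nodes
```

These are environment-provisioning commands, so run them interactively rather than in a script the first time.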

If everything is idle, no fans turn on. But there is one exception: there is a known bug in Docker on macOS that puts HyperKit (the macOS hypervisor) into some kind of infinite loop, pegging a CPU core at 100%. Fans will turn on here, for sure, but that's a bug and you can always close Docker. I don't need Docker running unless I'm actively building images, which isn't the majority case.

Obviously, if I’m running large pipeline jobs, all resources are maxed out. So yes, CPU gets up to 100% on all cores and the fans turn on if it’s a long job. But I mean... duh? ;)

If I’m using the computer normally without any VMs (normal usage for me is iTerm, Firefox, Outlook, one or two JetBrains products, Apple Music, and Messages) then I get something around 10+ hours. It’s INSANE. In the beginning, I thought the battery indicator was broken because it was barely ticking down. But yea, I think I can get almost 11 hours if I really squeeze it, but 9 at a minimum is absolutely no problem at all.

Docker’s lower level VM has nearly zero impact on battery life while idle, other than if you hit the bug I mentioned earlier. I don’t feel it.

The Ubuntu Server VM running my Kubernetes cluster... I really have no idea about it bare, but while it's running my whole cluster, it knocks my battery life down to about 8 hours if idle, maybe 7 if I have to restart it often. That is, my apps are running, fully hydrated (memory has ballooned to 32GB), performing health checks, and Chaos Monkey is running. Startup is bad on battery because my software is extremely IO- and CPU-intensive at that time, but again, that's common sense.

The Windows and Ubuntu Desktop VMs eat about another hour of battery life at idle. It seems not to matter if I run one or both of them; together they eat about the same amount of battery as they do one at a time.

So basically, I can get 6-7 hours with absolutely everything running, but idle. That's positively insane. Keep in mind that I'm coming from a maxed-out 15-inch MBP, and it did the same thing, but I got about 4 hours.

However, real-world, if I’m running data sets through my code, there’s a 99% chance I’ll be plugged into a charger. So I have no idea what kind of battery life I’d get under those conditions, but it should be obvious that it’s bad. Also, real-world, I’m very likely not running the K8S cluster at all if I’m on battery, since I really don’t need the whole cluster for small tests. At that point, I just use the host macOS and bootstrapping/test-harness scripts I’ve written, including running dependency microservices directly on the host.

That’s just the way I work, though. Everyone is different.

But yea... No other machine will get you this battery life, at any price. And the fans stay silent when you’re not taxing the system, even if you’re running N virtual machines. Don’t worry about it.

The fans will turn on when it makes sense.

Hope this helps!
 
React Native development. Chrome uses a lot of memory. I also need Xcode, Android Studio, and WebStorm open, plus Docker and some Android emulators. I don't need 64. I could do, and have been able to work, with 16GB, but after continued use I definitely notice when 16GB is not enough and the machine slows down and I have to start micromanaging open apps. I could probably do 32GB, but why take the risk.
 
React Native development. Chrome uses a lot of memory. I also need Xcode, Android Studio, and WebStorm open, plus Docker and some Android emulators. I don't need 64. I could do, and have been able to work, with 16GB, but after continued use I definitely notice when 16GB is not enough and the machine slows down and I have to start micromanaging open apps. I could probably do 32GB, but why take the risk.
I often run RN projects in Xcode and Android Studio (WebStorm) and Docker as well, but not at the same time. I am not the main developer, I just manage the projects, and so far 16GB has been enough for me. What are the minimum tasks for you to notice that 16 is not enough?
 
My technical justification is for VMs. I sometimes run quite a few at a time in lab/test environments. However, I don't do this very often and can't really think of any situation where I would be running enough to use all 64GB.

Since we're all friends here (right?)... I'll lie down on the shrink couch and give you my real reason: it's FORO (fear of running out). When the company that makes my #1 favorite laptop releases a new model with a maximum of XX RAM, I will get that maximum amount because there could be, at some point, some sort of situation where I may need all of it.

Thank every God: I have no interest in the Mac Pro.
 
I often run RN projects in Xcode and Android Studio (WebStorm) and Docker as well, but not at the same time. I am not the main developer, I just manage the projects, and so far 16GB has been enough for me. What are the minimum tasks for you to notice that 16 is not enough?

I have never exactly counted, but I can tell when the machine is hitting its limit. I make heavy use of desktops and hate restarting my computer. I usually have about 5 or 6 desktops and about 100 tabs open. I am usually working with one or two projects for work plus a personal project. Android Studio and WebStorm use a lot of memory, and so does the Android emulator. I could close apps down, but I consider that micromanaging. I like to just switch desktops and resume where I left off.
 
I have never exactly counted, but I can tell when the machine is hitting its limit. I make heavy use of desktops and hate restarting my computer. I usually have about 5 or 6 desktops and about 100 tabs open. I am usually working with one or two projects for work plus a personal project. Android Studio and WebStorm use a lot of memory, and so does the Android emulator. I could close apps down, but I consider that micromanaging. I like to just switch desktops and resume where I left off.
I like using many desktops as well. This week I had to restart my computer several times because the terminal in WebStorm would not register the password I typed. Do you think that having 32 or 64 would have saved me from rebooting?
 
I don't get why there seem to be a few folk whose mission is to persuade everyone that anything more than 16GB of RAM is a waste except in specifically defined 'high end' use cases.

My previous 2017 MacBook Pro had 16GB and was terrible for running Windows 10 in VMware. My current 16-inch MacBook Pro has 64GB and runs it so well that I will likely delete the Boot Camp partition. It's a huge jump in performance and makes my work (which is basic, not fancy high-end video/graphics/compiling/research) much easier. Being able to run macOS and Windows on the same machine simultaneously and seamlessly has productivity benefits for me.

Maybe 32GB would have been enough. But the marginal price for 32GB more RAM was good enough value for me, and I don't know what the future RAM requirements for macOS and Windows will be.
 
My RAM use is separate from my CPU use. It's not because I want to have several VMs running with 8GB or more each, that those VMs necessarily will be doing a lot. I'm looking at simulating a server network before spending on the hardware. But the servers will work request-based, so when I'm checking my mail in between testing sessions, the servers are idle. No smoke here :)

You're a power user of sorts; most people don't do this. Load up on that RAM, you can't upgrade later.
 
React Native development. Chrome uses a lot of memory. I also need Xcode, Android Studio, and WebStorm open, plus Docker and some Android emulators. I don't need 64. I could do, and have been able to work, with 16GB, but after continued use I definitely notice when 16GB is not enough and the machine slows down and I have to start micromanaging open apps. I could probably do 32GB, but why take the risk.
People underestimate how much CPU and memory is used in web development and web-ish (I made up that word) development, such as React Native. Not only that, but it's storage IO intensive, so if you're exhausting real memory, virtual memory kicks in and slows literally everything down. Compilations (Babel, webpack, et al.) slow down due to misses/faults, but now the virtual memory is also slowing down storage access. At least these Macs have blisteringly fast storage, but still, it's a bad situation.

Or just pay the money for more RAM and it’s no longer a problem.

I like your style. You made the right decision.
 
Out of interest, are you running Minikube or a multi-node K8s cluster (i.e., with Vagrant) on your machine? And when running multiple VMs at the same time, how does the machine cope in terms of fan noise and thermals?

I have the same use case and really want to go for the i9 and 64GB, but I've held back because of all the talk about loud fan noise on this spec. Although I'm not intending to use a 4K external monitor, which seems to cause many of the complaints about fans, my main concern is having to shut down VMs to stop the machine sounding like a jet engine... which clearly won't be ideal when you pay for all those extra cores and RAM.
Zero fan noise from my unit. I’m not sure the fans have activated since I purchased the computer. You can’t use a small sample size of geeks or support forums, which attract negative responses, to gauge the scale of an issue. Consider the thousands of users not affected by the issue who aren’t posting. The i9/64GB/5500M spec has been wonderful and the fans probably will be activated less than my less powerful MacBook Pro 2012 and 2015 for the same workload anyway.
 
This week I had to restart my computer several times because the terminal in WebStorm would not register the password I typed.

I have found that JetBrains IDEs (WebStorm and IntelliJ being the ones I use most often) have a tendency to grow in memory over days of use. I try to quit and restart them every day or so, before typing slows down and the IDE lags behind.

I think more memory would let this go longer before it starts bothering you, but it hasn't been needed for our dev teams as long as we just quit the IDE every so often. (FWIW, many of our developers are still on 16GB.)
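For what it's worth, one common mitigation is raising the IDE's JVM heap ceiling in its `.vmoptions` file (recent JetBrains IDEs expose this via Help → Edit Custom VM Options). The values below are illustrative, not recommendations:

```
# webstorm.vmoptions (or idea.vmoptions) -- example values only
-Xms512m
-Xmx2048m
-XX:ReservedCodeCacheSize=512m
```

A higher `-Xmx` delays the point where the JVM starts thrashing the garbage collector, at the cost of letting the IDE hold more of your RAM.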
At least these Macs have blisteringly fast storage, but still, it’s a bad situation.

That reminds me of the 1st gen MBA I had with a spinning hard disk. Two whole gigabytes. When it ran out of memory, it was time to get coffee.
 
I definitely could not imagine 64GB being used by the average user, though I will definitely buy 32GB next time.
 
I have found that JetBrains IDEs (WebStorm and IntelliJ being the ones I use most often) have a tendency to grow in memory over days of use. I try to quit and restart them every day or so, before typing slows down and the IDE lags behind.

I use IDEA Ultimate (aka IntelliJ) and often leave it running for weeks at a time. It uses a chunk of memory, but I can't say I've seen it constantly grow with time, and I can't say I've seen it slow things down like that. But I am using 64GB, so possibly it grows to some arbitrary amount (i.e., the Java memory limit) which has a detrimental effect on your machines but isn't high enough to affect higher-memory machines.
 
I use IDEA Ultimate (aka IntelliJ) and often leave it running for weeks at a time. It uses a chunk of memory, but I can't say I've seen it constantly grow with time, and I can't say I've seen it slow things down like that. But I am using 64GB, so possibly it grows to some arbitrary amount (i.e., the Java memory limit) which has a detrimental effect on your machines but isn't high enough to affect higher-memory machines.

Interesting. That does make the case for more memory, or some fine-tuning of memory settings on lower-memory machines.
 
For the IntelliJ crowd,
I use IDEA Ultimate (aka IntelliJ) and often leave it running for weeks at a time. It uses a chunk of memory, but I can't say I've seen it constantly grow with time, and I can't say I've seen it slow things down like that. But I am using 64GB, so possibly it grows to some arbitrary amount (i.e., the Java memory limit) which has a detrimental effect on your machines but isn't high enough to affect higher-memory machines.

For the IDEA crowd: check out VSCode... though I'm not sure about Java support.
 
For the IDEA crowd: check out VSCode... though I'm not sure about Java support.

I have to say, switching to an Electron app to try to save memory seems like a stretch. But I have heard that VSCode is less bad at that than Atom or some others.
 
I have to say, switching to an Electron app to try to save memory seems like a stretch. But I have heard that VSCode is less bad at that than Atom or some others.

Not only would you likely not save a lot of memory, you’d almost certainly lose functionality.

I use IntelliJ in spite of it being JVM-based because of what it can do. Saving 10% memory usage (if any) to lose functionality isn't a worthwhile trade to me.
 
64GB on macOS, yes...

I would not buy 64GB of RAM on a Windows-based PC, knowing how poorly Windows manages memory. However, on macOS I would for sure, because it'll use whatever you have, so at least it's not going to waste. I have 32GB on my gaming PC which really doesn't get touched, but at $150 for 32GB of 3600MHz Corsair... I couldn't pass that up.
Next month I'll be picking up the 16-inch and I plan on getting it with 32GB: reason one being the above, and two, it can't be upgraded later. As well, I do run some Linux VMs.

My work Windows laptop runs a Linux development VM with 8GB, a test VM with 2GB, and a Windows program that gobbles up 2GB of RAM, plus Outlook and a browser, all on 16GB, but somehow people here need 64GB to do that on macOS.
 
I don't necessary need it all the time for what I do. I want it.

I don't think people are against anyone buying 64GB, it's your money to waste. The problem is more the following two:

1) People whip up a worst case scenario that they'll almost never encounter to justify it.
2) People argue almost everyone needs 32GB to future proof, "pushing" people who have no insight into their needs into upgrading.

If they didn't ask for a liver for each upgrade tier, we wouldn't have this discussion. I'd say if you don't know exactly why you need more than 16GB, chances are you'll never experience a scenario where you'll run into any major detrimental issues because of it.
 
I don't think people are pushing everyone to buy 32GB. That's an overstatement. The OP asks people why they would need 64GB, and everyone here has given their opinion and advice. There is no right or wrong justification.

if you don't know exactly why you need more than 16GB, chances are you'll never experience a scenario where you'll run into any major detrimental issues due to it.

IMHO, that's the worst possible answer. You're basically calling that person stupid, and it's not helpful. They don't know why they would need it; hence the question.
 
Perhaps just a modicum of explanation for a particular use of RAM: a small example case illustrating differing time complexities for alternative algorithms.

In the top graph below you see the discrete solution of a partial differential equation using four different methods: two different Gauss-Jordan eliminations, an iterative improvement, and a Monte Carlo (random) method. Note how each algorithm yields a more-or-less straight line on a log-log plot of time as a function of problem size, i.e., the time complexity is a power function (n = size of problem, t = time of execution). Each algorithm has its own slope (the exponent of the power function). For very small problems and short times, the graphs deviate somewhat from linearity due to the overhead of initialization and final result reporting, but for medium to large problems the plots are remarkably linear.
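The straight-line observation above is easy to check numerically: on a log-log plot a power law t = c·n^p becomes a line whose slope is the exponent p. A minimal sketch (the timings here are synthetic, generated from an assumed quadratic algorithm, not taken from the graphs):

```python
import math

# Hypothetical timings t = c * n**p with p = 2 (a quadratic algorithm).
sizes = [10, 100, 1000, 10000]
times = [3e-6 * n**2 for n in sizes]

# log t = log c + p * log n, so a least-squares fit of log t vs log n
# recovers the exponent p as the slope.
xs = [math.log10(n) for n in sizes]
ys = [math.log10(t) for t in times]
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(round(slope, 3))  # → 2.0
```

The same fit applied to real measured timings gives an empirical estimate of an algorithm's exponent, which is how the slopes in the graphs can be read off.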

[Attachment 1.png: log-log plot of execution time vs. problem size for the four algorithms]

In the bottom graph, for even larger and more complex problems, notice that each GE code, MATLAB vs. NumPy, switches lines at some point as the size of the problem gets larger (i.e., they have different slopes; NumPy switches at around log(n)=8.75 while MATLAB switches at around 9.25). We can debate why these slope switches occurred. But also notice that the MATLAB results go haywire beyond log(n)=10. This is perhaps a RAM-versus-SSD paging issue, as paging became prevalent above log(n)=10 for MATLAB. I should mention that MATLAB was being executed on an iMac while NumPy was run on a MBP. The iMac had both a faster CPU and four times the RAM of the MBP, yet MATLAB failed badly earlier than NumPy did. My point is not that NumPy scaled better than MATLAB; my point is that more RAM allows larger problems to be solved when performing memory-intensive number-crunching calculations -- notice how quickly the MATLAB runs are dislodged from their line for log(n)>10. (The reason for using different machines was that the MATLAB license was only available on the iMac, and I didn't have control over the iMac's use -- otherwise we would have run even larger problems. These tests were also performed 8 years ago, and both employed sparse matrix techniques.)

[Attachment 2.png: log-log plot for the larger problems, showing the MATLAB and NumPy slope switches]

In a nutshell, more RAM allows for slightly larger problems to be solved since when RAM limits are exceeded the algorithms fail catastrophically.

Solouki
 
Perhaps just a modicum of explanation for a particular use of RAM ... a small example case illustrating differing time complexities for alternative algorithms ...

.............



In a nutshell, more RAM allows for slightly larger problems to be solved since when RAM limits are exceeded the algorithms fail catastrophically.

Solouki
Thanks for the informative post.

  1. Is this a 64GB iMac vs. a 16GB MBP?
  2. Can you say how big the data sets were in both cases (small vs. large)?
 
I did. I use Houdini fairly often and it loves all the ram it can get. I also use Final Cut, Logic, and some other ram devouring programs.
 