Try diagonalising a 2^N by 2^N matrix to sufficient precision (e.g. to find the energy spectrum of a [sufficiently badly behaved] quantum system of N spin-1/2 particles). With 128GB of RAM you can go up to, say, 16 particles before the computation runs out of memory. With 256GB you can go up to 17 particles. Is the difference between your software crashing with no results and it finishing the computation with an actual result enough? ;-)
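A quick back-of-envelope sketch of why the memory wall hits around those particle counts. This assumes a real symmetric Hamiltonian stored densely in float64 (8 bytes per entry); a complex Hermitian matrix in complex128 doubles these figures, and the LAPACK eigensolver needs workspace and copies on top, which is why 128GB tops out around N=16 in practice:

```python
# Memory needed to hold ONE dense 2^N x 2^N matrix for N spin-1/2 particles.
# Assumption: real symmetric, float64 (8 bytes/entry); complex128 doubles this,
# and diagonalisation routines need additional workspace beyond the matrix itself.

def dense_matrix_gib(n_spins: int, bytes_per_entry: int = 8) -> float:
    """GiB required for a single dense 2^N x 2^N matrix."""
    dim = 2 ** n_spins          # Hilbert-space dimension for N spin-1/2 particles
    return dim * dim * bytes_per_entry / 2 ** 30

for n in (15, 16, 17, 18):
    print(f"N={n}: {dense_matrix_gib(n):,.0f} GiB")
# N=15:   8 GiB
# N=16:  32 GiB
# N=17: 128 GiB
# N=18: 512 GiB
```

Every extra particle quadruples the matrix size, so each additional spin demands 4x the RAM: 128GB vs. 256GB really is the difference between N=16 fitting comfortably and N=17 being possible at all.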
By the way, even the 256GB RAM ceiling on a 2019 "Pro" workstation is a really bad joke. For perspective: our old Mac Pro 5,1, used for numerical computations, supports (and has installed) 128GB of RAM. Ten years and $10k later, you can finally get something that beats that ancient hardware... Sigh...
I've never seen a research lab or similar running Macs; all the ones I've seen run PCs, either with Windows or with dedicated bespoke non-OS setups. I just can't see a lab going for Macs. Why would they? Genuine question.