MP 7,1 1.5TB of RAM? What kind of work needs that?

high heaven

macrumors regular
Original poster
Dec 7, 2017
231
45
I can't think of any workloads that use 1.5TB of RAM, especially on macOS. Can anyone explain which software and what kind of work actually needs that much RAM, up to 1.5TB, on macOS?
 

ZombiePhysicist

macrumors 6502a
May 22, 2014
978
671
One thing I wish is that there were some RAM disk tools.

SSDs can't handle super-fast random accesses the way RAM can. Their sequential throughput is great, but heavy random reads/writes are significantly slower than RAM.

So if you have a lot of random accesses, as in database work, having a giant RAM disk for that, with some kind of write-out caching to a fast SSD, would be pretty great.

It would also be cool to have a UPS-style battery backup for the motherboard itself.
 
  • Like
Reactions: Snow Tiger

barbu

macrumors 6502
Jul 8, 2013
493
554
wpg.mb.ca
Massive media projects, basically. 8K video with 7-channel audio? That's going to be a lot of huge files held in memory. Or a Logic project with hundreds of instruments and effects, and so on.
Simply put, if you work with huge files, you need RAM to match.
 
  • Like
Reactions: tdhurst

bxs

macrumors 65816
Oct 20, 2007
1,048
448
Seattle, WA
Some background reading helps to get a decent understanding of the reasons for equipping a computer with large amounts of memory.

In terms of speed (and speed is good), here's an ordered list, fastest to slowest (an incomplete list at best):

Processor performance
Processor cache (L1, L2, L3, etc)
Memory (RAM)
Fast i/o devices such as SSDs
Flash-based devices
Spinning disks (single spindles and multiple spindles using RAID)
Networks
Tape media

Big memory -> https://en.wikipedia.org/wiki/Big_memory

"Big memory computers are machines with a large amount of RAM (random-access memory). The computers are required for databases, graph analytics, or more generally, data science and big data.[1] Some database systems are designed to run mostly in memory, rarely if ever retrieving data from disk or flash memory. See list of in-memory databases.

The performance of big memory systems depends on how the CPU's or CPU cores access the memory, via a conventional memory controller or via NUMA (non-uniform memory access). Performance also depends on the size and design of the CPU cache.

Performance also depends on OS design. The "Huge pages" feature in Linux can improve the efficiency of virtual memory.[2] The new "Transparent huge pages" feature in Linux can offer better performance for some big-memory workloads.[3] The "Large-Page Support" in Microsoft Windows enables server applications to establish large-page memory regions which are typically three orders of magnitude larger than the native page size.[4]"


Scientists Create 'Universal' Computer Memory That Could Change The Way We Store Data
DAVID NIELD
24 JUN 2019


Also, file servers benefit tremendously from large amounts of memory (RAM): file data being served can be pre-fetched from slow storage devices into memory, so it can be delivered quickly to the file server's clients as required. This becomes more and more important as the client base scales to the hundreds and beyond.
 

AdamSeen

macrumors regular
Jun 5, 2013
157
122
One thing I wish is that there were some RAM disk tools.

SSDs can't handle super-fast random accesses the way RAM can. Their sequential throughput is great, but heavy random reads/writes are significantly slower than RAM.

So if you have a lot of random accesses, as in database work, having a giant RAM disk for that, with some kind of write-out caching to a fast SSD, would be pretty great.

It would also be cool to have a UPS-style battery backup for the motherboard itself.
There is - macOS comes with RAM disk utilities. For development you could keep a master database in RAM and replicate to another database on disk, intermittently copying out the changes, or dropping the data on restart - depending on how important the data is.
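A minimal sketch of that "master copy in RAM, intermittently copy changes out" pattern. Plain temporary directories stand in for the RAM disk and SSD volumes here, and the file names are made up; in real use the first directory would be a volume created with hdiutil/diskutil:

```shell
# Sketch: database working copy lives on fast, volatile storage;
# a periodic job mirrors it out to persistent storage.
ramdisk=$(mktemp -d)   # stand-in for a /Volumes/RAMdisk mount
ssd=$(mktemp -d)       # stand-in for a directory on the SSD

# The database writes to the volatile copy...
echo "row1" > "$ramdisk/table.db"

# ...and a periodic job (cron/launchd, every few minutes) copies it out.
# rsync -a --delete would be the usual mirroring tool; cp keeps this
# self-contained.
cp -R "$ramdisk/." "$ssd/"

cat "$ssd/table.db"   # prints: row1 - the change now survives a reboot
```

How often you run the copy-out step is the trade-off: shorter intervals lose less data on a crash, at the cost of more background I/O.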
 
  • Like
Reactions: ZombiePhysicist

ZombiePhysicist

macrumors 6502a
May 22, 2014
978
671
There is - macOS comes with RAM disk utilities. For development you could keep a master database in RAM and replicate to another database on disk, intermittently copying out the changes, or dropping the data on restart - depending on how important the data is.
What is the app, or is it some command-line tool?

I'd love an app that, on boot, copies an entire drive over and then boots and runs from RAM.
 

northernmunky

macrumors 6502a
Jan 19, 2007
780
202
London, Taipei
Adobe Premiere / After Effects / DaVinci Resolve
I work for a company that does lots of After Effects & Photoshop work, and our projects regularly eat up over 128GB of RAM to hold all the textures we work with; our work is small fry compared to bigger companies'. In fact, I've noticed GPU RAM getting eaten up more and more as these apps get better optimised for GPU acceleration.
 
  • Like
Reactions: OkiRun

AidenShaw

macrumors P6
Feb 8, 2003
18,489
4,489
The Peninsula
I'd love an app that, on boot, copies an entire drive over and then boots and runs from RAM.
I've had workflows that benchmark *slower* from a RAM disk.

The reason is that RAM disks are CPU heavy - the CPU is working to transfer each of the bytes.

Real disks (particularly PCIe disks) offload all of the data transfer work to the DMA engine - zero CPU involvement until the drive signals "done".

If your application is heavy on both CPU and "disk", the RAM disk can steal cycles from the CPU and slow the overall speed.
 

ZombiePhysicist

macrumors 6502a
May 22, 2014
978
671
I've had workflows that benchmark *slower* from a RAM disk.

The reason is that RAM disks are CPU heavy - the CPU is working to transfer each of the bytes.

Real disks (particularly PCIe disks) offload all of the data transfer work to the DMA engine - zero CPU involvement until the drive signals "done".

If your application is heavy on both CPU and "disk", the RAM disk can steal cycles from the CPU and slow the overall speed.
That's a great point. I wonder if it also depends somewhat on how many cores you have? How many full-time cores would it take to move the memory around? So perhaps there's less impact on a high-core-count machine - or, if each core has to manage its own memory, perhaps more. So yeah, it depends on the nature of the applications you're running, I suspect.
 

Snow Tiger

macrumors 6502a
Dec 18, 2019
581
363
One thing I wish is that there were some RAM disk tools.

SSDs can't handle super-fast random accesses the way RAM can. Their sequential throughput is great, but heavy random reads/writes are significantly slower than RAM.

So if you have a lot of random accesses, as in database work, having a giant RAM disk for that, with some kind of write-out caching to a fast SSD, would be pretty great.

It would also be cool to have a UPS-style battery backup for the motherboard itself.
Please... do not give me any more cool ideas. Just when I think I'm up to my eyeballs in projects, you had to remind me how wicked fast RAM disks are. My Apple IIGS had a RAM disk (with battery backup, too) and it was quite fast in ProDOS. I played around with RAM disks on the Nehalem Mac Pros, even though Apple thought they were unnecessary. I wasn't exactly thrilled, either, due to the small capacities involved (no more than 128GB).

Now, 1 or 2TB of fast main system memory in a MP7,1 just screams RAM disk. Calling Alex tsialex, make us a tool please :cool:.
- - Post merged: - -

In terms of speed (and speed is good) we have in an ordered list (an incomplete list at best)

Processor performance
Processor cache (L1, L2, L3, etc)
Memory (RAM)
Fast i/o devices such as SSDs
Flash-based devices
Spinning disks (single spindles and multiple spindles using RAID)
Networks
Tape media
Somewhere in between mechanical disks and tape, you might add Blu-ray discs. They are small-capacity but legitimate archival media. It's a shame we can't span ODDs in macOS.
 
Last edited:
  • Like
Reactions: ZombiePhysicist

s66

macrumors regular
Dec 12, 2016
103
58
What is the app, or is it some command-line tool?

I'd love an app that, on boot, copies an entire drive over and then boots and runs from RAM.
Command line, or a 3rd-party tool - but the command-line version is just one line.

See e.g. here:

More detailed information can be found in the man page of hdiutil. Type "man hdiutil" to see details and options.

You can combine it with diskutil into one command, so you only need a single line.
[Or you can use Apple's GUI Disk Utility, I suppose, but it's far more limited in what it can do than the command-line diskutil.]

diskutil again has its own man page: just type "man diskutil" on the command line (it'll take a while to read it all; it's a long one).

Anyway here's an example of how you use both combined to quickly create a RAM disk:
Code:
$ diskutil erasevolume HFS+ 'RAMdisk' `hdiutil attach -nomount ram://1048576`
Started erase on disk3
Unmounting disk
Erasing
Initialized /dev/rdisk3 as a 512 MB case-insensitive HFS Plus volume
Mounting disk
Finished erase on disk3 RAMdisk
$
That creates a RAM disk of 1048576 blocks (each 512 bytes long), half a gigabyte in total, and mounts it.
Ready to use.

Reboot and you lose it all.
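Since `ram://` takes a count of 512-byte blocks rather than a size, the arithmetic for other capacities is easy to get wrong. A small helper calculation (the 8 GiB figure is just an example):

```shell
# ram:// wants a 512-byte block count: blocks = GiB * 1024^3 / 512
size_gib=8
blocks=$(( size_gib * 1024 * 1024 * 1024 / 512 ))
echo "ram://$blocks"   # prints ram://16777216 for 8 GiB
```

Substitute the resulting `ram://` string into the hdiutil command above to get a RAM disk of the size you actually want.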
 

Snow Tiger

macrumors 6502a
Dec 18, 2019
581
363
Yes, the bane of RAM disks.
the other shoe always has to drop...

WAIT, I have it.

Meet my little friend:

high_res_nvdimm_32gb.jpg


Micron Nonvolatile Registered DIMM (NVRDIMM). ECC, 288-pin DDR4. Available in capacities as high as 32GB.

Evidently, it doesn't need a battery.

Spec sheet attached below.

ass36c4gx72xf1zjpeg.jpg
 

Attachments

Last edited:
  • Like
Reactions: OkiRun and bxs

AidenShaw

macrumors P6
Feb 8, 2003
18,489
4,489
The Peninsula
That's a great point. I wonder if it may depend somewhat on how many cores you have perhaps too? How many full time cores would it take to move memory around? So perhaps less impact on a high core machine? Or if each core has to arrange it's own memory, then perhaps more. So yea, depends on the nature of the applications you're running, I suspect.
Core count is certainly something that could affect things. If you have lots of cores, losing one to RAM disk data copies wouldn't be significant.

On the other hand, are typical RAM disk drivers single-threaded? NVMe drives support massive parallelism - so it would be interesting to see whether single-threaded CPU-based RAM disks are even faster than massively parallel NVMe DMA transfers.

Evidently, it doesn't need a battery.
It does seem to need the super-capacitor or an alternate power source to keep data intact after unexpected power failures.

optane.jpg
 

AdamSeen

macrumors regular
Jun 5, 2013
157
122
The best use case for RAM is latency: whilst a database in memory may or may not have higher throughput (although I would expect it to), the access times will be substantially quicker. Ideal for workloads with lots of small transactions against OLTP databases, like Postgres.

Another good use case is Docker containers, because often you don't want them to persist, and you should get better performance. Have a look at tmpfs for that use case, or a RAM disk. Not sure if there'll be any difference between them, but build times should be quicker with a RAM disk.
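For the container case, Docker's `--tmpfs` flag mounts a RAM-backed filesystem straight into the container, no separate RAM disk needed. The mount point, size, and image below are just examples:

```shell
# Mount a 2 GiB tmpfs at /scratch inside the container; everything
# written there lives in RAM and is discarded when the container exits.
docker run --rm --tmpfs /scratch:rw,size=2g alpine \
    sh -c 'echo fast > /scratch/note.txt && cat /scratch/note.txt'
```

Since the data is gone when the container stops, this suits build scratch space and throwaway test databases, not anything you need to keep.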

One thing to bear in mind: I've noticed the OS itself caches up to half of the RAM quite aggressively and then starts paging out. So lots of files will get cached automatically on use.
 
Last edited:
  • Like
Reactions: OkiRun

th0masp

macrumors 6502
Mar 16, 2015
410
210
germany
This all seems like a really long-winded way of stating that there are no obvious reasons to install that kind of memory in a machine running the typical video-, photo- or audio-applications as advertised by Apple and generally associated with the platform. 😇
 

Snow Tiger

macrumors 6502a
Dec 18, 2019
581
363
This all seems like a really long-winded way of stating that there are no obvious reasons to install that kind of memory in a machine running the typical video-, photo- or audio-applications as advertised by Apple and generally associated with the platform. 😇
Still-image and audio editing workflows had their bandwidth needs met a long time ago on Mac systems. There's a good reason why all those souped-up ten-year-old Mac Pros are still widely used.

If these types of apps are not running fast, or not fully, it's because of a poorly written program (e.g. Photoshop) or apps whose code isn't optimized for the hardware they run on (e.g. Apple's Logic).

The big wildcard is video editing. It keeps getting more sophisticated (e.g. 8K). If there is any latency, an option like a sizable RAM disk might be quite valuable.
 

astrorider

macrumors 6502a
Sep 25, 2008
504
34
This all seems like a really long-winded way of stating that there are no obvious reasons to install that kind of memory in a machine running the typical video-, photo- or audio-applications as advertised by Apple and generally associated with the platform. 😇
The way this and the original post are written seems to doubt that Macs have applications outside video/audio that benefit from lots of RAM, even with the examples from users above. Apple's website actually advertises Mac Pro performance for MATLAB, Mathematica, and development build times in addition to video/audio/photo. That shouldn't be surprising: Macs have long been popular in many scientific and technical fields, even before Apple switched to a Unix-based OS, which gained mainstream application support and made Macs a no-brainer. The latest Mac Pro just means there's now less need to use additional machines (cloud or local) for some high-memory jobs.
 

bxs

macrumors 65816
Oct 20, 2007
1,048
448
Seattle, WA
When I first laid my hands on a Cray-1 supercomputer back in the late 1970s, it had just 8 MB of memory: one million 64-bit words, in Cray terms. At first we could only afford to rent 1/4 of this 8 MB while we developed additional code for the OS, and over several months we rented more and more until we accepted the Cray-1 with all of its 8 MB of memory. It cost our company in the region of 12 million dollars.

Today, with clusters and their distributed memory, the amount of memory available for tough, complex problems is simply enormous, and a lot less expensive.
 
  • Like
Reactions: ZombiePhysicist

now i see it

macrumors 601
Jan 2, 2002
4,514
8,910
I think the 1.5TB capacity was designed into the system to future-proof it for the next 15 years.
It wasn't long ago that a professional high-end workstation maxed out at 128MB of RAM.
Photoshop ran in 8MB of RAM. The standard RAM config for a mainstream workstation was 16MB.

So things change with time.