I have a script that regularly has to duplicate large files, up to 1 GB, on the same disk. That means it's reading and writing to the same drive at the same time. Sometimes this runs on an SSD, but usually it's on hard drives, and all that extra seek time really bogs down the process.
It occurred to me that with the amount of RAM in computers these days, it would be easy to queue up the file copies and do them one at a time: read one whole file into RAM, then write the whole file back out to disk. That way you'd be maxing out throughput in each direction instead of seeking all over the place.
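For illustration, here's the kind of thing I mean, sketched in Python (my actual script isn't Python, and the paths are just placeholders): read the whole file into memory in one pass, then write it all back out in a second pass.

```python
from pathlib import Path

def copy_via_ram(src, dst):
    # Read the entire source file into memory in one sequential pass...
    data = Path(src).read_bytes()
    # ...then write it all back out in a second sequential pass, so the
    # drive never has to alternate between reading and writing.
    Path(dst).write_bytes(data)

# Placeholder paths, just to show usage.
copy_via_ram("/path/to/source.bin", "/path/to/copy.bin")
```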
Is there an "easy" way to do copies like this?
The only way I can think to do it is to have my script create a RAM disk, copy from the drive to the RAM disk, then copy back from the RAM disk to the drive. That should work, and it should be faster, but I've worked with RAM disks in macOS scripts before and they tend to be buggy and poorly supported. It would be great if there were a command like `cp` that natively supported copying this way.
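In case it helps, this is roughly what that RAM disk workaround looks like, again sketched in Python with placeholder paths (`hdiutil` and `diskutil` are the standard macOS tools for this; 2097152 sectors of 512 bytes gives a 1 GiB disk):

```python
import shutil
import subprocess

# Placeholder paths, just to show the workflow.
SRC = "/path/to/source.bin"
DST = "/path/to/copy.bin"

# Create an unmounted 1 GiB RAM-backed device, then format and mount it
# as a volume named "RAMDisk" (it shows up under /Volumes/RAMDisk).
dev = subprocess.run(
    ["hdiutil", "attach", "-nomount", "ram://2097152"],
    capture_output=True, text=True, check=True,
).stdout.strip()
subprocess.run(["diskutil", "erasevolume", "HFS+", "RAMDisk", dev], check=True)

try:
    # Stage the copy through the RAM disk: one pure read pass off the
    # drive, then one pure write pass back onto it.
    shutil.copy(SRC, "/Volumes/RAMDisk/staged.bin")
    shutil.copy("/Volumes/RAMDisk/staged.bin", DST)
finally:
    # Unmount and destroy the RAM disk.
    subprocess.run(["hdiutil", "detach", dev], check=True)
```

It works, but it's exactly the kind of fiddly setup and teardown I was hoping some existing tool would handle for me.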
I don't even know what to call it, hence the weird name of this post.