I've been having a strange issue, and I haven't had much luck investigating it through Google etc.
I have been writing a program in C for a university assignment, using the MPI framework (for writing distributed-processing apps). When you test/run the application, the framework creates as many processes as you request. Through my testing, I might have 20-30 instances of the same program running on the one PC, and through bugs in the code they get stuck at 100% CPU or wait indefinitely for an event. Looking in Activity Monitor, I see all the stuck processes, each using about 10MB of RAM / 40MB of virtual memory. I have 4GB of RAM in my system, and I've had at least 2GB free at any point throughout.
However, what I've noticed is that when these processes are running, my VM Size shoots up from 180GB or so to around 400GB, and about 1GB of my hard disk space disappears (I use iStat, which is how I realised it in the first place). I've been letting a friend, who's doing the same assignment, use my Mac through SSH. He was working on it overnight, and when I woke up in the morning I had only 74MB of HDD space left and errors popping up all over the place, although I still had plenty of RAM free. Also, only 200MB of actual swap space was in use. After rebooting, I had 12GB free again.
Why is this happening? In the most extreme case, my friend could've run 200 processes using up 1GB of RAM between them. I've been reading about VM size, which apparently has something to do with reserving memory pages for use later, resulting in some insanely large VM figures which can supposedly be ignored. So why is OS X seemingly allocating huge chunks of my HDD?
Thanks for any help in advance
Didn't know whether to put this in here or under Programming...