OK, so my MC experience is a little old: I invented a modeling language called VSL that was used to build variation-simulation models for a software product called VSA. I think it (at least VSA) still lives on in some form as part of Siemens NX. (Prior to that, models had to be written in Fortran!)
It wound up being used by pretty much all the auto manufacturers, aircraft manufacturers, disk drive makers, etc. That was 30 years ago, though. But still, I probably have some insight into what your needs might be.
FWIW it was a HUGE breakthrough when, as a result of using VSA, GM was able to produce a Corvette where the hood could be assembled to the body using holes, not slots. Plop a bolt through a hole, tighten it, done! A really big deal at the time! The Corvette was the first car where that was possible. Of course, now it's standard.
-------
If you're hitting a 100MB/sec limit, it sounds like the hard drive is the bottleneck - that's a pretty typical SATA hard drive transfer rate now. As others have suggested, you may simply not have enough RAM, forcing the software to fall back on disk storage or swap space to process your large datasets.
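If you want to confirm it really is the disk, a quick-and-dirty sequential-read benchmark like the sketch below will tell you what the drive actually delivers. The file path is just a placeholder - point it at one of your real multi-GB files, ideally one bigger than RAM so the OS page cache doesn't fool you.

```python
# Crude sequential-read benchmark: what does the drive actually deliver?
import time

BIG_FILE = "/data/dataset.bin"    # placeholder - point at one of your real files
CHUNK = 64 * 1024 * 1024          # read in 64MB chunks

total = 0
start = time.time()
with open(BIG_FILE, "rb") as f:
    while True:
        buf = f.read(CHUNK)
        if not buf:
            break
        total += len(buf)
elapsed = time.time() - start
print(f"read {total / 1e6:.0f} MB in {elapsed:.1f} s = {total / 1e6 / elapsed:.0f} MB/sec")
```

If that comes back around 100MB/sec while your simulation is chugging, you're disk-bound; if it's much faster than what you see during a run, look elsewhere (RAM, CPU).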
Your first concern is what platform(s) your software runs on, and second, if it runs on several, which one it runs best on.
You may find Linux a better platform than either OS X or Windows, but that depends very much on the vendor's support for each platform.
I would get something based on a high-memory-capacity server board. That could range from build-it-yourself, to the mid-tier, low-cost commodity servers used for web and database work, to nice name-brand servers (I used to like IBM, but I don't know whether those are Lenovo now or how the quality holds up).
You will need a 64-bit OS to take advantage of the memory. 16GB? 32GB? 64GB?
Somebody more familiar with your particular software should be able to give you some guidance. You want to try to get the entire dataset and whatever working storage is needed into RAM!
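As a rough, Linux-only sanity check, you can compare the dataset size against what /proc/meminfo says is actually available - something like the sketch below. The 2x working-storage factor is just a guess; your software may need more or less, and the dataset path is a placeholder.

```python
# Linux-only sanity check: dataset size vs. RAM actually available.
import os

def mem_available_gb():
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1]) * 1024 / 1e9   # kB -> GB
    return None  # very old kernels don't report MemAvailable

dataset_gb = os.path.getsize("/data/dataset.bin") / 1e9    # placeholder path
avail = mem_available_gb()
if avail is None:
    print("MemAvailable not reported by this kernel")
elif avail < 2 * dataset_gb:   # 2x for working storage - just a guess
    print(f"{dataset_gb:.1f} GB dataset vs {avail:.1f} GB available: expect spills to disk/swap")
else:
    print(f"{dataset_gb:.1f} GB dataset should fit in the {avail:.1f} GB available")
```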
Use a RAMDisk of some sort if the software INSISTS on using temporary disk storage. On Linux, use a tmpfs filesystem for any temp files. (It's in RAM, but backed by swap should you run out of RAM.)
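For example, on most Linux distros /dev/shm is already a tmpfs mount, so if you can steer the temp files there (from a wrapper script, or via TMPDIR if the vendor's software honors it - that part is vendor-specific), they never touch the platter. A minimal sketch:

```python
# Steer temp files into tmpfs so "disk" temporaries actually live in RAM.
import tempfile

# /dev/shm is a tmpfs mount on most Linux distros; a dedicated mount
# ("mount -t tmpfs -o size=32g tmpfs /mnt/scratch", run as root) works too.
with tempfile.NamedTemporaryFile(dir="/dev/shm") as tmp:
    tmp.write(b"intermediate results land in RAM, not on the platter")
    tmp.flush()

# Many programs also honor the TMPDIR environment variable, e.g.:
#   TMPDIR=/dev/shm ./run_simulation
# (whether the vendor's software does is another question)
```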
I would get the basic box, with a standard disk drive, and increase RAM until you see no more improvement. At that point, you can experiment with faster disks.
If you really want fast, reliable drives, go SAS, not SATA. SAS uses the same connectors as SATA - you can actually plug SATA drives into a SAS backplane (though not the other way around).
The best/fastest enterprise drives are only produced in SAS versions.
Consider RAID 0 and/or flash drive(s) if you do need to increase disk throughput.
Let's say you have this huge dataset that your software has to read, that doesn't change between runs (or doesn't change much), and the read time is a significant portion of the processing. (I don't think that would really be the case, though.) Flash would be perfect for that. Even cheap SATA flash drives now typically do 200MB/sec. Throw a couple in a RAID 0 array for a cheap 400MB/sec. (There are higher-priced flash cards that plug directly into the motherboard bus and will do 800+ MB/sec, but they are quite costly, and driver support seems limited to Windows.)
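Quick back-of-the-envelope on whether that's worth it - the 50GB dataset size here is made up, plug in yours:

```python
# How long does reading the dataset take at various sustained throughputs?
dataset_gb = 50   # made-up size - use yours
for name, mb_per_s in [("single SATA HDD", 100),
                       ("single SATA SSD", 200),
                       ("2x SSD, RAID 0", 400),
                       ("PCIe flash card", 800)]:
    minutes = dataset_gb * 1000 / mb_per_s / 60
    print(f"{name:>16}: ~{minutes:.1f} min to read {dataset_gb} GB")
```

If the read only happens once per run and the run takes hours, shaving those minutes won't matter; if the software re-reads the data constantly, it will.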
Consider add-on array-processing boards, such as those from Microway, if your software supports that. (Microway has been doing this for many, many years. I see they have moved to using GPUs; they used to use custom array-calculation chips.) They also offer complete workstations, which might make a lot of sense - their workstations will take as much as 256GB of RAM. If this would help you, you will of course have to consider which OS platform(s) it's supported on. Even a consumer Nvidia card, with software that can take advantage of the GPU for array processing, could speed things up considerably.
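As a rough illustration of what the GPU buys you on pure array math - this sketch assumes an Nvidia card with CUDA plus the CuPy library installed, and it only matters if your simulation package (or your own code) can actually hand its math off to the GPU:

```python
# CPU (NumPy) vs GPU (CuPy) on one big matrix multiply.
import time
import numpy as np
import cupy as cp   # assumes an Nvidia GPU with CUDA and cupy installed

n = 4000
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

t0 = time.time()
np.matmul(a, b)
print(f"CPU: {time.time() - t0:.2f} s")

a_gpu, b_gpu = cp.asarray(a), cp.asarray(b)
cp.cuda.Device(0).synchronize()
t0 = time.time()
cp.matmul(a_gpu, b_gpu)
cp.cuda.Device(0).synchronize()   # wait for the GPU to actually finish
print(f"GPU: {time.time() - t0:.2f} s")
```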
It depends on what you are doing in your simulation. For the kinds of applications I was involved in, it would definitely make a difference, because variation simulation randomly varies parameters in a model. For example, for mechanical assemblies, it's varying hole sizes, lengths, etc. to simulate real-world manufacturing variation. So for each simulation there can be a ton of math to re-calculate the model, plugging parts together into an assembly - lots of 3D translations. Generating random numbers under some distribution is a trivial amount of computing compared to running the model.
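Just to illustrate the shape of the computation - this is a toy 1D stack-up with made-up dimensions, nothing like a real VSA model, which re-solves full 3D assembly geometry on every trial:

```python
# Toy 1D stack-up: vary part lengths and the opening, look at the gap spread.
import numpy as np

rng = np.random.default_rng(42)
N = 100_000                                    # simulated assemblies

parts_nom = np.array([30.0, 45.0, 25.0])       # nominal part lengths (mm)
parts_sd  = np.array([0.05, 0.08, 0.05])       # manufacturing std devs (mm)
opening_nom, opening_sd = 100.5, 0.10          # the opening they stack into (mm)

# In a real model, the expensive part is re-solving the assembly geometry
# (all those 3D transforms), not drawing the random numbers.
parts   = rng.normal(parts_nom, parts_sd, size=(N, 3))
opening = rng.normal(opening_nom, opening_sd, size=N)
gap = opening - parts.sum(axis=1)

print(f"gap: mean {gap.mean():.3f} mm, std dev {gap.std():.3f} mm")
print(f"fraction with interference (gap < 0): {(gap < 0).mean():.4%}")
```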