I've attached an example data file that the program needs to read (it is actually a comma-separated file, but I had to change the extension). The thing is, the program needs to read data like this from multiple streams and process it at rates of up to one piece per second. It will be working discretely, calculating things such as a moving average, and it must also cope with network latency and with data being temporarily unavailable, without hanging the computer.

I'm not really looking for programming help as such, since I can handle that part; I'm more asking how best to approach the problem. What do I need to look out for? What would the best approach be? Does anyone have experience with reading in large data sets and processing them, especially when the data is being streamed to the computer from the internet in real time?

I would imagine it would be handy to write a helper application that just receives the data from the net and stores it in a file, so that the main program can process it at its leisure and does not need to worry about the network side of things. I guess it also helps with security, as you can be sure the data you are reading is in the correct format.

All this will be done in C, although I'm starting to think Python may be a better alternative, as writing the file-handling part is unnecessarily complex in C.