
AdamA9

macrumors 65816
Original poster
Feb 2, 2010
1,232
355
hey guys,

I couldn't think of a good place to put this, so I thought here is as good as any.

I need some help. Once a month I take a download of data and currently just paste it into Excel; all my calculations run and my graphs update, which is great because it means I don't have to do much work when presenting this. However, I've noticed that as the data grows, the calculations take longer and longer to run, and it's now got to the point of crashing.

Is there something better than Excel, that I can use to just paste my data in, and then have it produce tables and graphs showing the data as I need it?

Thanks Guys.

:)
 
Okay, thanks. I was hoping it would be a simple move to something that could just pick up my data, query it, and display it all nicely for me.
 
Yes, I have come to realize, as stated above, that Excel is the best app for this. The amount of stuff it can do is absurd. Maybe upgrading your Mac would help matters, like upping the RAM.
 

I'm actually doing this on a Windows PC. I'm at work, you see, with just 2GB of RAM. There are quite a lot of calculations running on these sheets, with various look-ups.
 
If you are willing to go into Windows, then Spotfire Decision Suite is the way to go. It's very expensive and takes some training, but it is by far the best way to analyse and present large multivariate data sets.
 
One other approach is to have running updates. That is, you import the data into a massive database (a worksheet in Excel), then set up other worksheets which drive the data, then others for the display, and still others for the graphs. I set up such a system with a direct download from our mainframe. It used to take them 10-15 hours per week. By rearranging and automating, the entire process took 25 minutes, 15 of which was downloading from the mainframe. Most of the other 10 minutes went on updating the PowerPoint displays that were automatically linked to the Excel displays. (All done using XL2003 on Windows XP.)

You might want to check out Excel User. Charley Kyd is one of the best at training in this area; see especially Dashboards.

Worth a look anyway.
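Just to illustrate the layered idea above outside of Excel: the raw data lives in one layer, an aggregation layer is computed once per import, and the display layer only reads precomputed totals. Here's a minimal Python sketch of that separation (the column names and rows are made up for the example, not from the actual workbook):

```python
from collections import defaultdict

# "Data" layer: the raw monthly download (hypothetical rows for illustration).
rows = [
    {"team": "Workflow", "status": "Closed"},
    {"team": "Workflow", "status": "Open"},
    {"team": "EDMS", "status": "Closed"},
]

# "Driver" layer: one pass over the data, run only when new data is imported.
totals = defaultdict(int)
for row in rows:
    totals[(row["team"], row["status"])] += 1

# "Display" layer: cheap lookups against the precomputed totals,
# with no rescanning of the raw data on every refresh.
print(totals[("Workflow", "Closed")])  # prints 1
```

The point is the structure, not the language: each layer only touches the layer below it, so a change to the display never forces a rescan of the raw data.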
 

You know that actually makes sense, I don't know why I didn't think of that! Thanks :)
 

Well, it sure sounds like your setup. The part I didn't cover was the running totals (each worksheet can have its own length of time, etc.), so that you can set how much is displayed (how many days, weeks, months, etc.) by changing only one cell in the entire workbook.

Thus, a workbook that was originally 90 MB and took an average of 25 minutes of recalculation time was reduced to just under 10 MB, with an average 30-second recalculation time. It is not the amount of data, but how you access that data. Some functions are very processor-intensive and bog the workbook down; i.e. they recalculate every time you make any change in the workbook. By changing the setup and the types of functions and formulas, you can significantly reduce the time it takes.

But, hey, just offering another thought, from an old codger who has been there and done it.
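The cost difference described above is easy to see outside Excel, too. Here's a rough Python analogy (not Excel itself, and the data is invented): a formula that scans the full column does a fresh pass on every recalculation, while counting everything once into an index makes each subsequent query a cheap lookup.

```python
from collections import Counter

# Made-up stand-in for a data column with 4,000 rows.
data = ["A", "B", "A", "C"] * 1000

# "Processor-intensive" style: a full scan of the column, repeated
# every time the value is needed (like a whole-column formula).
def count_scan(value):
    return sum(1 for v in data if v == value)

# "Rearranged setup" style: count everything in one pass at import
# time, then answer every later query with a dictionary lookup.
index = Counter(data)

assert count_scan("A") == index["A"] == 2000
```

Both give the same answer; the second only pays the scanning cost once per import instead of once per query.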
 


It sounds like exactly the setup I need, so thanks for your thoughts; it will be a great help if it reduces the stress of running this sheet. The problem I was having was that every time I made an adjustment a calculation ran, and that was slowing me down. I thought about turning off auto-calculation, but I'm going to give your suggestion a go next week. I'm running rows and rows of calculations such as:

=SUMPRODUCT(--(ISNUMBER(SEARCH(D24,Data!AT:AT))),--(Data!K:K="IT/Software - Europe/E-L/iPA/Workflow / AEP"),--(Data!E:E>=$V$9),--(Data!E:E<=$W$9))

and

=SUM((Data!AM:AM="iPA Workbench error")*(Data!K:K="IT/Software - Europe/E-L/EDMS/Unlock Batch"))

And I don't know whether these COUNTIF and SUMPRODUCT calculations are excessively resource-hungry, though. :confused:
 
You might want to check out this as well: Decision Models discussion of volatile functions, especially the fine distinction involving SUMIF examples.

SUMPRODUCT is very powerful but carries a large calculation overhead. By rearranging the data setup in the data worksheet, you can use simpler functions (and many more of them) while drastically reducing calculation times.


BTW both of your examples have the same overhead.
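To see why both formulas cost the same, here's a rough Python model (not Excel, with made-up stand-ins for the Data!AM and Data!K columns) of what they both compute: each condition produces a row of 0/1 flags, the flags are multiplied row by row, and the products are summed. Whether you write it as SUMPRODUCT(--(...),--(...)) or SUM((...)*(...)), the per-row work is identical.

```python
# Hypothetical stand-ins for the two columns referenced in the formula.
col_am = ["iPA Workbench error", "other", "iPA Workbench error"]
col_k = [
    "IT/Software - Europe/E-L/EDMS/Unlock Batch",
    "something else",
    "IT/Software - Europe/E-L/EDMS/Unlock Batch",
]

# Each comparison yields True/False (Excel's TRUE/FALSE, coerced to 1/0);
# multiplying the flags ANDs the conditions, and summing counts the rows
# where every condition holds.
total = sum(
    (am == "iPA Workbench error")
    * (k == "IT/Software - Europe/E-L/EDMS/Unlock Batch")
    for am, k in zip(col_am, col_k)
)
print(total)  # prints 2
```

Either spelling walks every row of both columns once, which is why restricting the ranges (rather than using whole columns like AM:AM) matters more than the choice between the two forms.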
 