I have just started using VBA and my knowledge is limited to a couple of books (VBA For Dummies, for example), so I am looking for some help to speed up an Excel workbook I have created using a number of functions I have programmed in VBA.
Basically, my spreadsheet is a summary of production activity across 10 machines, each machine having 12 'measures', so I have defined 120 functions in VBA. On top of this, the summary table has a set of parameters such as period, product type, etc., so the data can be recalculated dynamically whenever these values change.
Each measure is calculated by looping through a table in a worksheet, adding to its total whenever a row meets that function's criteria. So far I have four months' worth of data, which is 200,000 rows, and each request on the summary sheet takes about 120 seconds to calculate.
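For illustration, here is a minimal sketch of one such measure function with the usual VBA speedup applied: the table is read into a Variant array in one bulk operation and the loop then runs entirely in memory, since cell-by-cell Range access is normally the dominant cost in loops like this. The sheet name and column positions are hypothetical:

    ' Hypothetical sketch: one of the 120 measure functions, rewritten to
    ' loop over an in-memory copy of the data table.
    Public Function MeasureTotal(machineID As String, period As String) As Double
        Dim data As Variant
        Dim r As Long
        Dim total As Double

        ' One bulk read; after this the worksheet is never touched again.
        data = ThisWorkbook.Worksheets("Data").Range("A1").CurrentRegion.Value

        For r = 2 To UBound(data, 1)    ' row 1 is assumed to be the header
            ' Column positions are assumptions: 1 = machine, 2 = period,
            ' 4 = the quantity this measure accumulates.
            If data(r, 1) = machineID And data(r, 2) = period Then
                total = total + data(r, 4)
            End If
        Next r

        MeasureTotal = total
    End Function

With 120 functions each scanning 200,000 rows, the sheet would otherwise be read tens of millions of times per recalculation; caching the array in a module-level variable refreshed once by the update button, rather than re-reading it in every function, is essentially idea (a) below.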
My question is: is there a better way to handle the dataset/table to speed up the process further? My initial thoughts are:
a) On selecting the parameters of the summary report and clicking the update button, filter the table by those criteria and load it into an array that each function then loops through (essentially the cached-array approach sketched above). That way the dataset is reduced, speeding up the looping process?
b) Holding the dataset in a database and referring to the data there rather than in Excel? (Although I would like to have a drill-down facility so users can see the detail behind the summarised data.) A sketch of a related approach follows below.
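To make idea (b) concrete, here is a rough sketch of a possible middle ground: querying the worksheet in place with SQL via ADO, which gives database-style aggregation without moving the data out of Excel. The connection string assumes the ACE OLEDB provider, and the sheet and column names are hypothetical; this is a sketch, not production code:

    ' Hypothetical sketch: aggregate with SQL against the worksheet via ADO
    ' (late-bound, so no ADO reference needs to be set in the VBA editor).
    Public Function MeasureTotalSQL(machineID As String, period As String) As Double
        Dim cn As Object, rs As Object

        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=Microsoft.ACE.OLEDB.12.0;" & _
                "Data Source=" & ThisWorkbook.FullName & ";" & _
                "Extended Properties=""Excel 12.0;HDR=Yes"""

        ' Table and column names are assumptions about the data sheet.
        Set rs = cn.Execute( _
            "SELECT SUM(Quantity) FROM [Data$] " & _
            "WHERE Machine = '" & machineID & "' AND Period = '" & period & "'")

        If Not IsNull(rs.Fields(0).Value) Then MeasureTotalSQL = rs.Fields(0).Value

        rs.Close
        cn.Close
    End Function

The same query would work largely unchanged against Access or SQL Server if the data were moved out of the workbook, and a drill-down could then be a second query returning the matching detail rows.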
Any thoughts?