MikkelDuif
New Member
- Joined: Nov 26, 2016
- Messages: 9
Hello,
I am currently building a dashboard in Excel 2016, which might also be used on earlier versions of Excel.
Right now, the dashboard's speed is acceptable, but I would like to know about any possible performance enhancements, as more data may be added later on, or it may be run on a slower computer.
The dashboard consists of 16 dashboard sheets, 3 data sheets, and 1 sheet that collects data from the data sheets and feeds the charts used in the frontend.
The sizes of data sheets are as follows:
Data 1: 16,185 rows, 15 columns (242,775 cells total)
Data 2: 86 rows, 53 columns (4,558 cells total)
Data 3: 426 rows, 44 columns (18,744 cells total)
The collection sheet is mainly based on =VLOOKUP functions combined with =MATCH, and the lookup values are changed with a dropdown in the frontend, so the data shown updates dynamically to what the user wants to see.
E.g., the Data 1 sheet has 5 different variables (e.g. a time period, a product, etc.). Columns B, C, D, and E hold 4 of the variables, and I use column A as a concatenated key (=B&C&D&E), so I can =VLOOKUP the data and use =MATCH on the header row, which stores the last variable, to get the column index.
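For concreteness, a minimal sketch of that pattern (the cell references $H$1:$L$1 for the dropdown selections are illustrative, not my actual layout):

```
'Data 1'!A2:       =B2&C2&D2&E2
                   (helper key, filled down the whole table)

Collection sheet:  =VLOOKUP($H$1&$I$1&$J$1&$K$1, 'Data 1'!$A:$O,
                            MATCH($L$1, 'Data 1'!$1:$1, 0), FALSE)
```

The first formula builds the combined key for the 4 variables, and the =MATCH on the header row picks the column for the fifth variable selected in the dropdown.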
I have tried searching the internet for the calculation performance of different functions, e.g. =VLOOKUP, =INDEX, =MATCH, =GETPIVOTDATA, etc.
Does anybody know if I would see any performance enhancements if I changed my approach to getting the data? Or is that mainly something I would experience on older versions?
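For reference, the main alternative I have read about is replacing =VLOOKUP with =INDEX/=MATCH, where the =MATCH result is stored once in a helper cell and reused by several =INDEX formulas so the search is only done once per dropdown change. A sketch, with $M$1 as a hypothetical helper cell:

```
$M$1 (helper):     =MATCH($H$1&$I$1&$J$1&$K$1, 'Data 1'!$A:$A, 0)

Lookup cells:      =INDEX('Data 1'!$B:$B, $M$1)
                   =INDEX('Data 1'!$C:$C, $M$1)
```

I do not know how much this would gain in practice on my data sizes, which is part of what I am asking.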
The collection sheet currently has 2,500 cells filled in, of which around half are =VLOOKUP functions.
The workbook does not use any macros except for 2, which are only run on request.