I'll start with my question: what operations tend to bog Excel down, and how can I speed up handling of very large data sets?
My circumstances are as follows:
I have several sheets, each upwards of 10K rows and 20 columns, and one of which is over 60K rows.
These are reports from various sources. I use a "Lookup" sheet that does an index/match against all those sheets to get a combined view of all the reports: enter a unique ID, and it populates with whatever information every sheet provides (or doesn't) on that ID. It also does a number of quality-of-life tasks, like combining first/last names, stripping prefixes or suffixes from usernames to recover real first/last names (to compare the various tracking sources and make sure they agree), and doing some math on dates, e.g. to see if the last login was a long time ago.
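To give a sense of it, each cell on the lookup row is roughly of this shape (the sheet names and column letters here are placeholders, not my real layout):

    one field:     =INDEX(Report1!C:C, MATCH($A2, Report1!$A:$A, 0))
    stale login?   =TODAY() - INDEX(Report1!F:F, MATCH($A2, Report1!$A:$A, 0)) > 90

with one formula like that per column, per source sheet.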
Is there anything that might help? For instance, could I use helper columns to do a single match per unique ID, then have each cell in that row reference the result directly, instead of running a full index/match in every single cell? That is, instead of index/match everywhere, do one match() and then use indirect() to go to A-(match result), B-(match result), C-(match result), and so on?
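Concretely, something like this (placeholder references again): one hidden helper cell per row stores the match result once,

    Z2:  =MATCH($A2, Report1!$A:$A, 0)

and then every data cell in the row reuses that stored row number instead of re-running the search:

    B2:  =INDEX(Report1!B:B, $Z2)
    C2:  =INDEX(Report1!C:C, $Z2)

or, with indirect():

    B2:  =INDIRECT("Report1!B" & $Z2)

(I gather indirect() is volatile and recalculates on every change, so maybe the plain index() form is the safer variant; that's part of what I'm asking.)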
I use a lot of index and match, of course, and I also use a lot of right(), left(), len(), and concatenate() to reformat fields or emails into full names (our emails and usernames are full-name-based). But I suspect the biggest problem is simply the amount of data, and I also suspect that nothing will solve that.
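The name reformatting is along these lines, plus find() to locate the separators (assuming a first.last@domain pattern purely for illustration; the real rules are messier):

    first name:  =LEFT(A2, FIND(".", A2) - 1)
    last name:   =RIGHT(LEFT(A2, FIND("@", A2) - 1), FIND("@", A2) - FIND(".", A2) - 1)
    full name:   =CONCATENATE(B2, " ", C2)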
EDIT: I should add that I also use a fair amount of conditional formatting, but only on the lookup sheet, which typically stays under 200 rows and never exceeds 1K.