Hey!
Hoping someone could answer this for me!
I have a data set made up of two sheets of around 500,000 rows each in a single workbook.
The two sheets are the same report, both containing a unique identifier in column A and a corresponding status value in column B (which can only be one of four different values); however, one sheet holds old data and the other new data. The purpose is to highlight changes in status between the old data and the new. The new data is exported automatically each day from an SAP query tool, which overwrites the previous export.
So, I have a macro built into a macro-enabled workbook template. It removes the old data sheet, then takes the previous ‘new’ data and moves it to the ‘old’. It then imports the new data from the SAP export and creates a new ‘new data’ sheet. In other words, it is continually cycling data from new to old.
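To give an idea, the cycling part is roughly like this (the sheet names are placeholders, not the exact ones in my template):

    Sub CycleSheets()
        ' Delete last run's 'old' sheet, then rename yesterday's 'new' sheet as today's 'old'
        Application.DisplayAlerts = False
        ThisWorkbook.Worksheets("Old Data").Delete
        Application.DisplayAlerts = True
        ThisWorkbook.Worksheets("New Data").Name = "Old Data"
    End Sub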
At the outset I set up the workbook using data connections, so when the new sheet is added the data is pulled in from the new extract, ready for the next step, which is the comparison. The trouble with this method is that, because the data set is so large, importing it is very slow, so I opted instead for a quicker option of instructing the workbook to open the new data file and simply move the sheet into the template.
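The import step now looks something like this (the file path here is just illustrative):

    Dim wbExport As Workbook
    ' Open the daily SAP export (overwritten each day) and pull its sheet into the template
    Set wbExport = Workbooks.Open("C:\Exports\SAP_Export.xlsx")
    wbExport.Worksheets(1).Move After:=ThisWorkbook.Worksheets(ThisWorkbook.Worksheets.Count)
    ' Moving the only sheet out also closes the export file; rename it ready for the comparison
    ThisWorkbook.Worksheets(ThisWorkbook.Worksheets.Count).Name = "New Data"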
Once the new data is in place, the code drops a VLOOKUP (nested within an IF statement) into cell C2 on the ‘old’ data sheet, then a fill-down command copies the formula down for all rows.
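The exact IF logic is a bit longer than this, but the shape of that step is along these lines (column letters and the fallback text are placeholders):

    Dim lastRow As Long
    With ThisWorkbook.Worksheets("Old Data")
        lastRow = .Cells(.Rows.Count, "A").End(xlUp).Row
        ' IF wrapped around a VLOOKUP against the new data sheet
        .Range("C2").Formula = _
            "=IF(ISNA(VLOOKUP(A2,'New Data'!A:B,2,FALSE)),""Not found"",VLOOKUP(A2,'New Data'!A:B,2,FALSE))"
        ' Fill the formula down to the last used row (~500,000 rows)
        .Range("C2").AutoFill Destination:=.Range("C2:C" & lastRow)
    End With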
This is the problem: the code works fine, except that it takes a very long time to process, around 10 minutes or so.
My question is whether using a fill down is the most efficient way of applying a VLOOKUP to such a large range, or whether there is an alternative that could shave a few minutes off the processing time.
The PC has an i5 processor with 16 GB of RAM, so it should be able to handle this, and I'd expect some delay with a data set this large, but I just wondered if it could be sped up using an alternative method?
Any ideas?
Thanks so much!