JeffGrant
Well-known Member
- Joined
- Apr 7, 2021
- Messages
- 558
- Office Version
- 365
- Platform
- Windows
Hi All,
My workbook is currently over 50MB, made up of:
45 sheets. Some are very wide but not so deep, whilst others are small but convenient because the data starts at A1.
Many modules of VBA, with several thousand lines of code. The primary VBA processing routine has 58 ElseIfs in it.
Many PQ queries
and around 50 tables and 50 pivot tables.
I have one table that gets added to every day. It is only 5 columns wide; however, the number of records added each day ranges between 500 during the week and 4000 on Fri/Sat/Sun. At the moment, that table is sitting at 147k records and growing for at least another 70 days, so I expect it to reach well over 200k records.
I have a quad-core PC with 40GB of DDR4 RAM.
The PQ queries pull data from the web and massage it; the VBA code then works on the resultant PQ tables (roughly the flow sketched below).
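To give a concrete picture of the refresh-then-process flow (an illustrative sketch, not my actual code; the connection, sheet, and table names are made up), the idea is to force each query to refresh in the foreground so the VBA never reads a half-loaded table:

Sub RefreshAndProcess()
    ' Illustrative sketch only - connection/table names are made up.
    Dim cn As WorkbookConnection
    For Each cn In ThisWorkbook.Connections
        ' Force a foreground refresh; with BackgroundQuery = True the
        ' VBA below could race ahead of the query and read stale data.
        If cn.Type = xlConnectionTypeOLEDB Then
            cn.OLEDBConnection.BackgroundQuery = False
        End If
        cn.Refresh
    Next cn
    ' Only now is it safe to work on the resultant PQ tables.
    Debug.Print ThisWorkbook.Worksheets("Data").ListObjects("tblDaily") _
        .DataBodyRange.Rows.Count & " rows loaded"
End Sub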
As much as possible, I have converted computations to arrays.
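By that I mean the usual pattern of reading a block into a Variant array, computing in memory, and writing back in one hit, e.g. (sheet and range names are made up):

Sub ArrayExample()
    ' Made-up sheet/range names; shows the read-compute-write pattern.
    Dim v As Variant, r As Long
    v = Worksheets("Wide").Range("A1:E1000").Value2   ' one read
    For r = 1 To UBound(v, 1)
        v(r, 5) = v(r, 2) * v(r, 3)                   ' compute in memory
    Next r
    Worksheets("Wide").Range("A1:E1000").Value2 = v   ' one write
End Sub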
I don't have any conditional formatting, but I do have many blocks of data colour-coded for easy viewing.
I recently removed all of the charts, and all that did was reduce the size of the workbook by a negligible 1MB.
I have added an extra 20GB of RAM, which sped the code up quite a bit, and the workbook stopped crashing.
The problems I have are these:
1. As the workbook gets bigger, the computer slows down; that is expected.
2. Various parts of the VBA code are being skipped more and more often. For example, I can run the same data through the code twice and get two different results.
3. Pivot tables are not being refreshed by the VBA code and I have to refresh them manually (the refresh step is essentially the sketch after this list), and
4. I regularly save the workbook without any errors, but the next day when I open it, I get corrupted-workbook messages and have to use the Excel Open & Repair feature. If I do get it open, I then run the Clean feature from MZ-Tools over the VBA code; otherwise I have to revert to a copy that is 2 or 3 days old and lose some data.
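For item 3, the kind of refresh call I mean is the standard loop below (an illustrative sketch, not my exact code), yet some pivots still show stale data until I refresh them by hand:

Sub RefreshAllPivots()
    ' Illustrative sketch - refresh every pivot via its cache.
    Dim ws As Worksheet, pt As PivotTable
    For Each ws In ThisWorkbook.Worksheets
        For Each pt In ws.PivotTables
            pt.PivotCache.Refresh    ' refresh the underlying cache
        Next pt
    Next ws
End Sub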
As much as I read that Excel should handle millions of rows and many complex formulae, I am wondering whether I am reaching the limits of Excel and whether the sales blurbs about its power are just that: exaggerated sales blurbs.
I don't understand why sequential VBA code is being skipped.
I don't understand why the saved workbooks are getting corrupted so regularly without any error messages.
Would it make much difference if I pulled the growing table out and put it in another workbook?
Could my primary VBA sub with 58 ElseIfs be the culprit? I have tried various other structures using Select Case and Do loops, and those have not made any substantial difference.
I have tried pulling the ElseIfs out into their own subs, but that just added many more lines of code for no real benefit. Most of the ElseIfs simply check for a Yes/No value in a cell; each criterion either passes or fails, and each is a one-line statement that either terminates the routine and moves on, or moves to the next criteria test (essentially the pattern sketched below). So mini-subs just seem to be a waste of time.
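To illustrate the pattern (the sheet name and cell addresses are made up):

Sub CheckCriteria()
    ' Illustrative fragment of the criteria chain - made-up names.
    With ThisWorkbook.Worksheets("Checks")
        If .Range("B2").Value = "No" Then
            Exit Sub                 ' criterion failed: stop, move on
        ElseIf .Range("B3").Value = "No" Then
            Exit Sub                 ' next criterion, same one-line test
        ' ... 50-odd more Yes/No checks in the real routine ...
        End If
    End With
    ' All criteria passed: the main processing continues from here.
End Sub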
Any and all suggestions are gratefully received, as this model has been in the making for over two years, and I now seem to be running into a volume/size-related problem.
As always, thanks for all of your contributions.
Jeff.