largeselection
Active Member
- Joined: Aug 4, 2008
- Messages: 358
Quick Summary: Is it possible to simulate closing and reopening Excel from within a workbook using a macro?
Some Details: Basically I have a workbook that runs a macro, and when it finishes I see that Excel's memory use (in the Ctrl+Alt+Del task list) is at 1,250MB. At that point it won't let me update a pivot table in the workbook because it says the system is out of resources. I noticed that if I close Excel and re-open the file, Excel sits at only about 270MB and everything works fine, so it appears to be the code I'm running that causes the issue. So is it possible to simulate closing Excel without actually closing it?
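To illustrate what I mean, the only mechanical way I can think of is literally restarting Excel from the macro, roughly like the sketch below. The temp-file path, the 5-second delay, and saving first are all just assumptions, and it presumes no other unsaved workbooks are open. I'd much rather find a cleaner way.

Sub RestartExcelWithThisWorkbook()
    ' Rough sketch: write a tiny VBScript that waits a few seconds,
    ' relaunches Excel with this file, then quit the current instance
    ' (which is what actually releases the memory).
    Dim vbsPath As String, f As Integer
    vbsPath = Environ$("TEMP") & "\reopen_wb.vbs"   ' placeholder location
    f = FreeFile
    Open vbsPath For Output As #f
    Print #f, "WScript.Sleep 5000"
    Print #f, "Set xl = CreateObject(""Excel.Application"")"
    Print #f, "xl.Visible = True"
    Print #f, "xl.Workbooks.Open """ & ThisWorkbook.FullName & """"
    Close #f
    CreateObject("WScript.Shell").Run """" & vbsPath & """", 0, False
    ThisWorkbook.Save
    Application.Quit   ' the rest of this session's memory goes with it
End Sub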
More Details: Here's how the workbook works now. I receive a large CSV data file, and the macro then does the following (rough VBA sketches of the key steps are below):
1. Creates an ODBC connection to that text file and pulls the data in (about 300K rows in one example).
2. Adds about 35 columns of formulas to the imported table and drags them down to the bottom.
3. Copies/pastes those columns as values, since there are a ton of formulas over a lot of cells and valuing them out helps keep the size down.
4. Copies that worksheet (the original data plus the 35 formula columns) into a new workbook, which it saves as a new CSV file.
5. Deletes the data (and the query) from the original workbook.
6. Changes the external data connection for a pivot table (again ODBC to a CSV file) to point at the name/location of the new CSV file it just created, and refreshes it to update the data.
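In case it helps to see it concretely, the heart of the macro does roughly what's below (steps 1-4 above). The sheet name, file paths, connection string, formula, and column range are all simplified placeholders, not my real ones.

Sub ImportAndValueOut()
    Dim ws As Worksheet
    Set ws = ThisWorkbook.Worksheets("Import")   ' placeholder sheet name

    ' 1. Pull the big CSV in through an ODBC text query
    ' (in this simplified sketch the raw data is assumed to land in column A only)
    With ws.QueryTables.Add( _
            Connection:="ODBC;DSN=Text Files;DBQ=C:\Data", _
            Destination:=ws.Range("A1"))
        .CommandText = "SELECT * FROM bigfile.csv"
        .Refresh BackgroundQuery:=False
    End With

    ' 2. Add the formula columns and drag them down to the bottom of the data
    Dim lastRow As Long
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
    ws.Range("B2:AJ2").Formula = "=$A2"   ' stand-in for the real 35 formulas
    ws.Range("B2:AJ2").AutoFill Destination:=ws.Range("B2:AJ" & lastRow)

    ' 3. Replace the formulas with their values to keep the size down
    With ws.Range("B2:AJ" & lastRow)
        .Value = .Value
    End With

    ' 4. Copy the sheet to a new workbook and save it as a new CSV
    ws.Copy
    ActiveWorkbook.SaveAs Filename:="C:\Data\withformulas.csv", FileFormat:=xlCSV
    ActiveWorkbook.Close SaveChanges:=False
End Sub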
Everything seems to run fine until I try to refresh the pivot table. At that point I get "ODBC exceeds system resources", and the task list shows Excel at 1,250MB. But by then the pivot table's connection has already been repointed, so if I close the workbook and re-open it, Excel sits at only about 280MB, the pivot table/connection refreshes automatically on open, and (since the connection was already changed) the updated data shows up in the table.
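The step that actually blows up is basically this (again, the connection, sheet, and pivot names are placeholders, and it assumes the pivot is fed by a workbook-level ODBC connection):

Sub RepointAndRefreshPivot()
    ' Rough sketch of step 6: repoint the pivot's ODBC connection at the
    ' new CSV and refresh it.
    Dim conn As WorkbookConnection
    Set conn = ThisWorkbook.Connections("PivotCSV")   ' placeholder connection name
    conn.ODBCConnection.Connection = "ODBC;DSN=Text Files;DBQ=C:\Data"
    conn.ODBCConnection.CommandText = "SELECT * FROM withformulas.csv"
    ' "ODBC exceeds system resources" is raised on this refresh:
    ThisWorkbook.Worksheets("Report").PivotTables("MainPivot").PivotCache.Refresh
End Sub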
So does this seem like a crazy process? Do you have suggestions on how I could accomplish this in a less "system-hoggy" way?
THANKS!