ClimoC
I have two CSVs, one 30 MB and the other 150 MB, which I open as ADODB recordsets.
Once they're recordsets, the comparison code begins: values from the 30 MB CSV are appended to the extra columns I added to the recordset built from the 150 MB file.
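For context, the comparison loop is roughly shaped like this (a minimal sketch only; the key and column names are made up for illustration, and it assumes both recordsets are opened client-side so they sit entirely in memory):

Code:
Dim rsSmall As ADODB.Recordset   ' recordset from the 30 MB CSV
Dim rsLarge As ADODB.Recordset   ' recordset from the 150 MB CSV, with extra columns added

' ... both recordsets already opened with CursorLocation = adUseClient ...

rsLarge.MoveFirst
Do While Not rsLarge.EOF
    rsSmall.MoveFirst
    Do While Not rsSmall.EOF
        ' Match on a key field, then copy the value into one of the added columns
        If rsLarge.Fields("KeyField").Value = rsSmall.Fields("KeyField").Value Then
            rsLarge.Fields("ExtraCol1").Value = rsSmall.Fields("SourceCol1").Value
            rsLarge.Update
            Exit Do
        End If
        rsSmall.MoveNext
    Loop
    rsLarge.MoveNext
Loop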
It's been running for ages now (there are a lot of comparisons to do), and with each passing moment the RAM taken up to hold the recordsets in memory keeps growing... it's now up to 522,000 KB.
My question is why? I'd understand 20-50 MB for Excel and 180 MB for the records (though I'd hope for less, since it's data held in memory, not API stuff).
In my head, the total would be Excel's own memory, plus both workbooks loaded into memory, plus both workbooks as recordsets in memory, plus the 30 MB on top of the 150 MB once they're merged in memory via the script.
That would still only come to 450-480 MB of RAM by my reckoning... so why am I (at time of writing) now up to 531,000 KB?
PS: there's no ADODB memory leak in my code - no infinite loops, no unhandled errors.