ShogunStealth
New Member
- Joined
- Nov 6, 2021
- Messages
- 23
- Office Version
- 2019
- Platform
- Windows
I am working with a large amount of data: over 600K rows, out to column K. The raw data is a series of columns pulled from SharePoint websites, used to work out numerous metrics such as:
- For each website, what file types are present, per website and across the entire spreadsheet; e.g. .docx, .xlsx, .pdf, and many others, up to 150 different types, assuming anything after the last "." is a file type.
- How many of each file type there are, for each website and for the entire sheet; e.g. docx 7, pdf 13, etc.
- The total size of each of those file types, for each website and for the entire spreadsheet; e.g. docx 140 MB, pdf 2,123 MB, etc.
- When any file in each website was last updated; e.g. site 1: 2 years, 5 months, 16 days ago.
- The age of each file type across six date ranges: 0-14 days, 15-30 days, 31-90 days, 91-180 days, 181-365 days, and more than 365 days.
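To make the last bullet concrete, here is a minimal sketch of the six age buckets (Python is purely for illustration, as is the function name; only the cut-offs come from the ranges above):

```python
def age_bucket(days_old: int) -> str:
    """Map a file's age in whole days to one of the six ranges."""
    if days_old <= 14:
        return "0-14 days"
    if days_old <= 30:
        return "15-30 days"
    if days_old <= 90:
        return "31-90 days"
    if days_old <= 180:
        return "91-180 days"
    if days_old <= 365:
        return "181-365 days"
    return "more than 365 days"
```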
Largely it is a series of summations: mainly several complex COUNTIFS and SUMIFS with fill-down and fill-right type statements, plus numerous other formatting features. I don't need help with any of the formulas, because it all works. The advice I am seeking is how to pause the computation, or break it into blocks of effort, because the processing load is horrendous, especially when I don't filter the list of websites down to reduce the total of 600K+ line items. My i7-10510U CPU @ 1.80 GHz (4 cores) ran at 100% for over 2 hours while completing the processing. It was interesting that memory never went much over 50%, or 8 GB.
Are there techniques or strategies to do any of the following:
- throttle the CPU so it doesn't max out at 100% and risk damaging my laptop hardware
- stage or phase the fill-down or fill-right computations when some of the tables are 250 (x) and 150
- or apply something similar to the SUMIFS or COUNTIFS calculations
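The "blocks of effort" idea can also be prototyped outside the workbook. Here is a stdlib-Python sketch, assuming the raw data is exported to CSV, that aggregates one fixed-size block of rows at a time, so only a single block is ever in memory; the file path, block size, and function name are my illustration, and only the `File Extension` heading comes from the raw data:

```python
import csv
from collections import Counter
from itertools import islice

def count_extensions_in_blocks(path: str, block_size: int = 50_000) -> Counter:
    """Tally file-extension counts one block of rows at a time.

    Each block is an independent unit of work, so a 600K-row file is
    processed as a series of small passes whose results merge cheaply.
    """
    totals = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        while True:
            block = list(islice(reader, block_size))
            if not block:
                break
            totals.update(row["File Extension"].lower() for row in block)
    return totals
```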
This is a copy of the headings in the raw data:
Filename | List Name | Location | File Extension | Current Version Size KB | Total version Size KB | Site address | Created By | Last Modified By | Last Modified Date | Created Date |
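Given those headings, the rule mentioned above, that anything after the last "." is the file type, amounts to this (a hypothetical helper for illustration, not something already in the workbook):

```python
def file_type(filename: str) -> str:
    """Return everything after the last '.', lowercased; '' if no dot."""
    parts = filename.rsplit(".", 1)
    return parts[1].lower() if len(parts) == 2 else ""
```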