Hi all,
I have a housing cost model that is set up entirely with dynamic arrays, and in a few instances (7-8 cells) I had to use LAMBDA helper functions, for example SCAN, BYROW and MAP, to perform certain calculations. On a final sheet I have a data table that performs the same calculation for 25 different cases.
Once I change an input, the data table takes about 2-3 seconds to recalculate and then shows the results as intended. However, I noticed that if I interrupt the calculation, or change too many inputs before those 2-3 seconds are over, the data table gets corrupted and shows the same output for all 25 cases (I can replicate the error every time). I spent quite some time debugging the sheet and did the following:
1. Got rid of all errors, data validation warnings and format inconsistencies, i.e. "cleaned up" the sheets = no impact.
2. Ran a timing macro on every sheet, identified all the cells that took more than 0 ms to calculate, and hardcoded those formulas = the data table recalculated slightly faster (around 0-2 seconds, with wrong results because of the hardcoding), but I could still replicate the corruption.
3. Reduced the data table to only 3 cases = calculation time dropped significantly, but I could still corrupt the table.
4. Removed all SCAN functions (which I believe are the most computation-heavy) = calculation time also dropped significantly, but I could still corrupt the table.
5. Finally, removed all LAMBDA functions from the sheet = this is the only case in which the data table does not get corrupted.
So, in conclusion, my debugging suggests that data tables + LAMBDA + dynamic arrays is too much for Excel, but what can I do with this information? I need those LAMBDA functions to achieve my goal; without them I could not keep everything dynamic, which would defeat the purpose of the entire cost model. So although I could rethink some of the SCAN logic, simple helper LAMBDAs like LAMBDA(r;SUM(r)) are impossible to avoid.
The only solution (not really a solution) I could find is to switch to manual calculation mode, modify the values, and then trigger a recalculation or switch calculation back to automatic. This works, of course, but the sheet is intended to be used by many people, and this extra step makes it far less user friendly. I could also add a button that does this, but I want to avoid VBA entirely due to compatibility issues (I will try the same thing with Office Scripts, but the underlying issue remains).
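In case it helps, here is a minimal Office Scripts sketch of that button idea (untested, and assuming the script can be attached to a workbook button via the Automate tab): it keeps the workbook in manual calculation and triggers one full, uninterrupted recalculation when run.

function main(workbook: ExcelScript.Workbook) {
  const app = workbook.getApplication();

  // Keep the workbook in manual calculation so users can change
  // several inputs without triggering (and interrupting) a recalc.
  if (app.getCalculationMode() !== ExcelScript.CalculationMode.manual) {
    app.setCalculationMode(ExcelScript.CalculationMode.manual);
  }

  // Run a single full recalculation on demand.
  app.calculate(ExcelScript.CalculationType.full);
}

The idea would be to leave the workbook in manual mode and label the button something like "Recalculate", so users change all their inputs first and then trigger exactly one recalculation, which should avoid the interrupted-calculation state that corrupts the data table.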
Has anyone else encountered this type of issue and found a workaround? I would share some formulas, but that would not help much since the sheet is quite heavy.
Thanks!