LET ME SAY IT AGAIN UPFRONT - OUR IT DEPARTMENT WILL NOT ALLOW US TO USE VBA !!!
After fighting with our IT Department - and losing - I now understand my company's policy of not allowing VBA macros inside our company network.
The engineering department in particular handles proprietary material. Our IT Department believes that opening the window to VBA creates too great a chance for malicious code to sneak in inside a macro.
So...moving on. The only option left is to write more complex formulas "the old way" to make Excel do what we need it to do.
We know Excel can do it, but my department is too busy and a little too rusty to know how to crank out this answer.
My challenge: We use Data Acquisition Systems to gather test data. The test data is saved as Excel files.
The hard part comes because the test can be run on one of three possible test stands, and each stand produces a data file with its own quirks. All three test stands produce a certain amount of garbage data. For example, all values less than zero at the beginning of a test can be thrown away; we have actually thought about raising the lower threshold to a larger number like 100. Then, after building to a peak, all data after the peak value is...questionable and can probably be thrown out as well.
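To make the simple end of the cleanup concrete, a per-row formula along these lines handles the lower threshold (the column letter and the threshold value are just placeholders for illustration):

=IF(B2<100, "", B2)

copied down the column, so any reading below the threshold is blanked out rather than deleted.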
Furthermore, the test takes a different amount of time depending on which test stand it is run on.
The final challenge is that the technicians performing the test have limited technical training.
That is why we want the technician to be able to "copy and paste" the Excel file into another Excel report template file. Inside the report template the junk data will automatically be removed, the Pass/Fail equation will be applied, and the final test report will populate its results by itself.
We have "parts" of the report template already working. But the initial part that must accept different / random length data files and discard the junk data before converting into different units of measure is slowing down the project. Throwing out all values below a lower threshold is already done. But throwing out all data points "X" seconds - AFTER the peak loading is stumping us.
Can anyone suggest examples or particular subjects to study to help find the solution?
Sincere thanks for your input.