RockandGrohl
Well-known Member
- Joined
- Aug 1, 2018
- Messages
- 801
- Office Version
- 365
- Platform
- Windows
Hi all,
I'm looking to replace a legacy system that selects adverts from a database using some pretty beefy VBA code with a new system that uses criteria to pull the information from the same database.
My question is whether autofilters are a good idea, or whether there's some inherent fault with them that makes them unreliable.
As an example, if we had an advert in the Worcester Advertiser newspaper which picks up in Worcester, Droitwich, Bromsgrove and Pershore, then the paper information would be held as a variable, the database opened, and there would be various loops down the 28,000 rows on this database, for example:
If criteria X, Y and Z are selected, loop down until I find a row in the database where the date is >= tdate, then from there check whether the pickups for the tour match the paper, check that this matches that, check this, etc. etc.
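To illustrate, the existing approach boils down to something like this (sheet name, column letters and the variable names `tdate` and `paperName` are my assumptions, not the real workbook's):

```vba
' Rough sketch of the legacy loop-and-check pattern described above.
Sub LoopAdverts()
    Dim ws As Worksheet, r As Long, lastRow As Long
    Dim tdate As Date, paperName As String

    Set ws = ThisWorkbook.Worksheets("Database")   ' assumed sheet name
    tdate = Date                                   ' assumed: the tour date to match
    paperName = "Worcester Advertiser"             ' assumed: the selected paper

    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow
        If ws.Cells(r, "A").Value >= tdate Then            ' date check
            If ws.Cells(r, "B").Value = paperName Then     ' paper check
                ' ...further nested checks for pickups etc.
                ' row r is a match; collect or process it here
            End If
        End If
    Next r
End Sub
```

In the real code this nesting repeats for every criterion, which is where the 180 lines come from.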
Alternatively, I'm thinking of a radical new solution which just uses filters, to stop Excel wasting resources by continually looping and resetting, looping and resetting. I realise that looping down a column checking one criterion is extremely fast - we're doing thousands of rows per second on these machines - but the code behind it is over 180 lines long and it's impossible to debug any errors.
From stepping through, it really does just boil down to "loop down and check for this, loop down and check for that". That is so much easier to read and write with autofilter: the criteria can be easily adjusted, and you can read over the lines and actually parse them.
Seriously, 180 lines to do one function, and the If blocks are all nested inside each other, so the code stretches down and across the VBE. I reckon I could do the same job in about 30-40 lines max!
As long as there isn't a problem with autofilter, of course.
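For comparison, the AutoFilter version I have in mind would look roughly like this (again, the sheet name, field numbers and criteria values are assumptions for illustration):

```vba
' Rough sketch of the filter-based replacement.
Sub FilterAdverts()
    Dim ws As Worksheet
    Dim tdate As Date
    Dim matches As Range

    Set ws = ThisWorkbook.Worksheets("Database")   ' assumed sheet name
    tdate = Date                                   ' assumed tour date

    With ws.Range("A1").CurrentRegion
        .AutoFilter Field:=1, Criteria1:=">=" & CLng(tdate)      ' date column
        .AutoFilter Field:=2, Criteria1:="Worcester Advertiser"  ' paper column
        .AutoFilter Field:=3, Criteria1:=Array("Worcester", "Droitwich", _
            "Bromsgrove", "Pershore"), Operator:=xlFilterValues  ' pickup column

        ' The visible rows are now exactly the matching adverts.
        On Error Resume Next   ' SpecialCells errors if nothing is visible
        Set matches = .Offset(1).Resize(.Rows.Count - 1) _
            .SpecialCells(xlCellTypeVisible)
        On Error GoTo 0
    End With

    If Not matches Is Nothing Then
        ' process the matching rows here
    End If

    ws.AutoFilterMode = False   ' reset the filter when done
End Sub
```

Each criterion is one readable line, and adding or removing a criterion is just adding or removing an `.AutoFilter` call. Note the `CLng(tdate)` conversion: filtering dates by their serial number is a common workaround because AutoFilter date criteria can be sensitive to regional date formats.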