Speed Up VBA

VbaHell

Hello All

The code below looks in column "A" for a match on "27009" or "27003", copies those rows to a second sheet and then deletes the original rows. Is there a way to speed this up, please? There are thousands of records each day and at the moment this is slow.

Sub Proforma()
Dim LR As Long
Dim i As Long

Application.ScreenUpdating = False

LR = Range("A" & Rows.Count).End(xlUp).Row

For i = LR To 2 Step -1
    Select Case Cells(i, 1).Value
        Case "27009", "27003"
            Cells(i, 1).EntireRow.Copy Sheets("Output").Cells(Rows.Count, "A").End(xlUp).Offset(1, 0)
            Cells(i, 1).EntireRow.Delete
    End Select
Next i

Application.ScreenUpdating = True

End Sub
 
Ah, so you didn't want the blank rows that were created by deleting the 27003 and 27009 data.
That is at odds with your original code. :confused:
 

The other thing I noticed, not sure why it is happening or if I've misread the question, is that I tested mirabeau's code on 100,000 rows and 2 columns of data (no blanks): the original sheet ended up with 6,645 rows and the Output sheet ended up with 19,930 rows, so I don't know what happened to the other 73,425 rows. :confused:
Hey, that's a good observation Peter. Very valid.

Seems that I had my b's and c's in the wrong place. Here's a corrected version, which checks out OK in my testing.
Code:
Sub delrowsfastercorrected()

Dim n&, m&, k&, l&
Dim a(), b()
Dim i&, j&
n = Cells.Find("*", searchorder:=xlByRows, searchdirection:=xlPrevious).Row
m = Cells.Find("*", searchorder:=xlByColumns, searchdirection:=xlPrevious).Column
a = Range("A1").Resize(n, m)
ReDim b(1 To n, 1 To m)
For i = 2 To n
    If a(i, 1) <> "27009" And a(i, 1) <> "27003" Then
        k = k + 1
        For j = 1 To m
            a(k, j) = a(i, j)
        Next j
    Else
        l = l + 1
        For j = 1 To m
            b(l, j) = a(i, j)
        Next j
    End If
Next i
Range("A2").Resize(n, m).ClearContents
Range("A2").Resize(k + 1, m) = a
With Sheets("output")
    .Cells(Rows.Count, 1).End(3).Offset(1).Resize(l, m) = b
End With

End Sub
 
Peter_SSs,

I also tried out your 2 codes on some test data, which was generated by the code below.

Your post#5 code seemed to work OK for up to about 45k rows. But for 100,000 rows it transferred all of the data to the output sheet, including those that it shouldn't have, leaving none on the start sheet (sheet1).

Your post#9 code gave me a 1004 error with that dataset. I didn't look further for the reason.

Am I somehow misreading or misusing your codes?
Code:
Sub testdata()
Const rws& = 100000
Const cls& = 3
With Sheets("sheet1")
    .Activate
    .UsedRange.Delete
    With .Cells.Resize(rws, cls)
        .Cells = "=char(int(rand()*26)+65)"
        .Resize(, 1) = "=27003+int(rand()*7)"
        .Resize(1, cls) = "=char(column()+64)"
        .Value = .Value
        .Resize(, 1).NumberFormat = "@"
    End With
End With
With Sheets("output")
    .UsedRange.ClearContents
    Sheets("sheet1").Cells.Resize(1, cls).Copy .Cells(1)
End With
Sheets("sheet1").Activate
End Sub
 
No, you are not misreading or misusing.
For a long time I was not able to reproduce the problem you described (using Excel 2010) but eventually remembered that there had been an issue with Excel 2007 AutoFiltering. So I'm guessing that you are using Excel 2007 and have re-discovered the bug discussed here. It is something that I thought would probably have been addressed in a subsequent update, but apparently not.

In case Excel 2007 is being used, I have re-thought my solution and moved to Advanced Filter, which appears to be quicker anyway and automatically preserves the original data order, something that was an issue with my earlier code.
With 100,000 rows this code has tested faster than your array method once the column count goes above 3 or 4. Yours is slightly faster for fewer columns, and I suspect for fewer rows, though I haven't bothered testing at this stage as the times are not significant anyway (unless there are a lot of columns, I guess).

My new code..
Code:
Sub CopyAndDelete_v2()
  Dim c As Long
  Dim rCrit As Range

  Application.ScreenUpdating = False
  With ActiveSheet.UsedRange
    c = .Column + .Columns.Count   'first column to the right of the used range
    Set rCrit = Cells(1, c).Resize(3)
    'criteria block 1: heading with 27003 OR 27009 below it -> copy the matching rows to Output
    rCrit.Value = Application.Transpose(Array(.Cells(1, 1).Value, "27003", "27009"))
    .AdvancedFilter Action:=xlFilterCopy, CriteriaRange:=rCrit, CopyToRange:= _
      Sheets("Output").Range("A" & Rows.Count).End(xlUp).Offset(1), Unique:=False
    'criteria block 2: <>27003 AND <>27009 on one row -> copy the rows to keep further right
    rCrit.Resize(2).Value = Application.Transpose(Array(.Cells(1, 1).Value, "<>27003"))
    rCrit.Offset(, 1).Resize(2).Value = Application.Transpose(Array(.Cells(1, 1).Value, "<>27009"))
    .AdvancedFilter Action:=xlFilterCopy, CriteriaRange:=rCrit.Resize(2, 2), _
        CopyToRange:=Cells(1, c + 2), Unique:=False
    'delete the original data and criteria columns, leaving only the kept copy
    .Resize(, c + 1).EntireColumn.Delete
  End With
  Application.ScreenUpdating = True
End Sub

I believe that you still require a tweak of your code in post #12.
Try creating test data with your code in post #13 with the added line
Code:
.Resize(1, cls) = "=char(column()+64)"
.Offset(, cls - 1).Resize(, 1).Value = Evaluate("ROW(" & .Address & ")")   'the added line
.Value = .Value
This just records the row that the original data was on.
After running your new code over this data there was always a rogue duplicate data row, out of order, at the bottom of the original sheet.
The total number of data rows (that is, excluding headings) in the two sheets was 100,000, which presumably means there was also one row too few on the Output sheet.
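
For what it's worth, a guess at the tweak being referred to: the write-back in the post #12 code, Range("A2").Resize(k + 1, m) = a, restores one row more than was actually kept, so the stale row at index k + 1 of the array is left, out of order, at the bottom of the sheet. A minimal sketch of the change, assuming that is the cause (variable names as in post #12, untested here):
Code:
'Sketch only: write back just the k kept rows rather than k + 1, and guard the empty cases
Range("A2").Resize(n, m).ClearContents
If k > 0 Then Range("A2").Resize(k, m) = a
If l > 0 Then Sheets("output").Cells(Rows.Count, 1).End(3).Offset(1).Resize(l, m) = b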
 
Peter,

Thanks for the clarifications.

Yes, I am using Excel 2007, and yes, on my machine your recent code seemed to do everything it should.

On relative speeds etc. we seem to have had this sort of discussion previously, notably in the thread
http://www.mrexcel.com/forum/excel-questions/576769-there-faster-way-do-3.html
where it seemed that, inter alia, relative speeds may be reversed on different computers.

I agree that an array-based code, dealing as it does with individual elements, is likely to slow significantly as the length of row increases.

So I ran another code (as below), which deals with entire rows, on test data with 10,000 rows and 20 columns (maybe typical magnitudes for this sort of problem?). For what it's worth, on the same dataset it ran more than twice as fast as your advanced filter code. But again maybe that's just a specific machine thing.
Code:
Sub transf_and_del()
Dim n&, m&, k&
Sheets("sheet1").Activate
n = Cells.Find("*", searchorder:=xlByRows, searchdirection:=xlPrevious).Row
m = Cells.Find("*", searchorder:=xlByColumns, searchdirection:=xlPrevious).Column

With Range("A2:A" & n)
    'helper column: blank where column A is 27003 or 27009, 1 where the row is to be kept
    .Offset(, m) = "=if(or(" & .Address & "=27003," & .Address & "=27009),"""",1)"
    .Offset(, m).Value = .Offset(, m).Value
End With

'sort on the helper column so the kept rows come first
Range("A1").Resize(n, m + 1).Sort Cells(1, m + 1), Header:=xlYes

'k = header row plus number of kept rows; everything below that goes to Output
k = Application.Sum(Cells(1, m + 1).Resize(n)) + 1
If n - k >= 1 Then _
    Cells(k + 1, 1).Resize(n - k, m).Cut Sheets("output").Cells(Rows.Count, 1).End(3)(2)
Cells(1, m + 1).Resize(n).ClearContents

End Sub
 
Yes, I remember that discussion but had lost track of the thread, thanks for the link back to it.

And yes, I think we have the same situation again. With the codes from posts #14 and #15 and 100,000 x 20 data my average result times were as shown below.

It is common for board members to claim 'faster' code. Does it mean nobody should make such claims?

Times (seconds)

             Excel 2007   Excel 2010
mirabeau        1.791        1.734
Peter_SSs       1.234        1.125
 
It all makes for interesting discussion.

You seem to have a fairly hi-spec computer.

So how would this one go on your machine with 100,000 x 20 data?
Code:
Sub transf_and_del()

Dim n&, m&, c&, i&
Dim a, u(), b(27003 To 27009) As Boolean
Sheets("sheet1").Activate
n = Cells.Find("*", searchorder:=xlByRows, searchdirection:=xlPrevious).Row
m = Cells.Find("*", searchorder:=xlByColumns, searchdirection:=xlPrevious).Column

a = Range("A1:A" & n)
ReDim u(1 To n, 1 To 1)
b(27003) = True: b(27009) = True
On Error Resume Next   'values that error in the lookup (e.g. the text heading) just stay unflagged
For i = 1 To n
    If Not b(a(i, 1)) Then u(i, 1) = 1   'flag the rows to keep
Next i
On Error GoTo 0
Cells(1, m + 1).Resize(n) = u
Range("A1").Resize(n, m + 1).Sort Cells(1, m + 1), Header:=xlYes   'kept rows first, 27003/27009 rows last
c = Application.Count(Cells(1, m + 1).Resize(n))   'number of kept data rows
If n - c - 1 >= 1 Then _
    Cells(c + 2, 1).Resize(n - c - 1, m).Cut Sheets("output").Cells(2, 1)
Cells(1, m + 1).Resize(n).ClearContents

End Sub
 
Excel has a very fast built-in "Find" feature. It pops right to the searched-for text anywhere in a sheet. Might be worth giving it a speed try.
Code:
Sub SearchFindQuick()

   'Type "Find Me" in any cell, then run this sub

   If WorksheetFunction.CountIf(Cells, "Find Me") = 0 Then
      MsgBox "You didn't type 'Find Me'"
      Exit Sub
   End If

   'This will now use Excel's built in very fast function "Find"

   Cells.Find(What:="Find Me", After:=[A1], LookIn:=xlFormulas, _
      LookAt:=xlPart, SearchOrder:=xlByRows, _
      SearchDirection:=xlNext, MatchCase:=False).Activate

End Sub
 
It can also find multiple occurrences of the searched-for text in a sheet if you change the following:

Change "After:=[A1]" to "After:=ActiveCell" and it will find the next instance in that sheet each time the macro is run.
 
So how would this one go on your machine with 100,000 x 20 data?
I still had the identical data I tested for those times I reported earlier, so I used that again. For me, this code pretty much matches my Advanced Filter method.
Excel 2007: 1.383
Excel 2010: 1.176



Excel has a very fast built in "Find" feature. Pops right to the searched for text anywhere in a sheet. Might be worth giving it a speed try.
Why not give it a try yourself and report your results?
Test data can easily be produced using the code in post #13.
To be comparable to our recently reported times, change the cls constant to 20 and use a timer like this to time your code:
Code:
Sub chuckchuckit()
    Dim t As Single
    t = Timer
    
    'Your code here
    
    MsgBox "Code took " & Format(Timer - t, "0.000 secs")
End Sub
 
