Speed of multiple URL web scraping

johnnyL · Well-known Member · Joined Nov 7, 2011 · Messages: 4,546 · Office Version: 2007 · Platform: Windows
Need some help with this code.

1) The current code leaves a bunch of IE windows open.
2) The current code eventually raises Run-time error '-2147437259 (80004005)'.
3) It takes forever to run. Hopefully someone can assist me in converting it to use MSXML2.XMLHTTP60, for example; I've heard that runs faster.
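For reference, a minimal XMLHTTP fetch looks something like the sketch below. It assumes references to "Microsoft XML, v6.0" and "Microsoft HTML Object Library" are set, and MSFT is just a placeholder symbol. One caveat: XMLHTTP returns the raw HTML without running any JavaScript, so elements that Yahoo builds with script may not be present in the parsed document.

VBA Code:
```vba
' Sketch only: fetch a page with MSXML2.XMLHTTP60 instead of automating Internet Explorer.
' Requires references to "Microsoft XML, v6.0" and "Microsoft HTML Object Library".
Sub FetchWithXMLHTTP()
    Dim Http As MSXML2.XMLHTTP60
    Dim Doc As MSHTML.HTMLDocument

    Set Http = New MSXML2.XMLHTTP60
    Http.Open "GET", "https://finance.yahoo.com/quote/MSFT", False  ' synchronous request
    Http.send

    If Http.Status = 200 Then
        Set Doc = New MSHTML.HTMLDocument
        Doc.body.innerHTML = Http.responseText      ' parse the HTML without opening a browser window
        Debug.Print Doc.getElementsByTagName("td").Length
    End If
End Sub
```

Because no browser window is ever opened, there is nothing to leave behind when the Sub exits, which also addresses problem 1) above.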


Code:
'
'-----------------------------------------------------
'   Run-time error '-2147437259 (80004005)':    ' This Error Occurs, eventually, in the 'Yahoo_One_Year_Estimates_Scrape_Error' section \/ \/ \/
'                                                   Also many internet explorer windows are left open that should have been closed
'
'   Automation Error
'   Unspecified Error
'-----------------------------------------------------
'
'
'   Global Variables That will be used
'
    Public Doc                                      As HTMLDocument
'
    Public StockMainPageURL                         As String       ' This will be the main portion of the URL that we send to Internet Explorer
    Public TotalURL                                 As String       ' This will be the complete URL that we send to Internet Explorer
'
    Public CellCounter                              As Integer      ' This will be used to adjust left to right on web site cells
    Public RowCounter                               As Integer      ' This adjusts the offset from the top of the spreadsheet to the start of the columns
    Public StockCount                               As Integer      ' This counts the actual stocks being analyzed currently
    Public TotalStocksToLoad                        As Integer      ' This counts the stocks that should be analyzed right now
'
    Public PageLoadAttempt                          As Long         ' This counts the number of times we have tried to load a page
'
'-------------------------------------------------------------------------------------------------------------------------------
'
Private Sub RefreshEntireDocument_Click()
'
'   This will Clear certain cell values in the spreadsheet when the $B$1 'Refresh' cell is clicked
'
    Range("$B$5:$K$254").Select                                 ' Select the range of $B$5 thru $K$254
    Selection.ClearContents                                     ' Delete the contents of this range
'
'
' -------------------------------------------------------------------------------------------------------------------------
'
'   Scrape stocks to consider looking into further from 1st URL page
'
    RowCounter = 5                                              ' Start loading stock values received into the 5th row of Excel
    MaxYahooDelay = 0                                           ' Initialize MaxYahooDelay = 0
'
    CellCounter = 0                                             ' Left to right cell counter
    PageLoadAttempt = 0                                         ' Initialize PageLoadAttempt = 0
    TotalStocksToLoad = 100                                     ' we will Scrape this amount of stocks from the 1st loaded page of stocks
'
    Call Scrape_BarChart_Stock_Page_1                           ' Scrape the amount of TotalStocksToLoad into excel
'
' -------------------------------------------------------------------------------------------------------------------------
'
'   Scrape stocks to consider looking into further from 2nd URL page
'
    CellCounter = 0                                             ' Left to right cell counter
    PageLoadAttempt = 0                                         ' Initialize PageLoadAttempt = 0
    TotalStocksToLoad = 100                                     ' we will Scrape this amount of stocks from the 2nd loaded page of stocks
'
    Call Scrape_BarChart_Stock_Page_2                           ' Scrape the amount of TotalStocksToLoad into excel
'
' -------------------------------------------------------------------------------------------------------------------------
'
'   Scrape stocks to consider looking into further from 3rd URL page
'
    CellCounter = 0                                             ' Left to right cell counter
    PageLoadAttempt = 0                                         ' Initialize PageLoadAttempt = 0
    TotalStocksToLoad = 50                                      ' we will Scrape this amount of stocks from the 3rd loaded page of stocks
'
    Call Scrape_BarChart_Stock_Page_3                           ' Scrape the amount of TotalStocksToLoad into excel
'
' -------------------------------------------------------------------------------------------------------------------------
' -------------------------------------------------------------------------------------------------------------------------
'
'   Scrape values from Yahoo to Update the one year estimates from previous pages of stocks scraped
'
    RowCounter = 5                                              ' Start loading stock values received into the 5th row of Excel
    PageLoadAttempt = 0                                         ' Initialize PageLoadAttempt = 0
    TotalYahooDelay = 0                                         ' Initialize TotalYahooDelay = 0
    TotalYahooPageAttempts = 0                                  ' Initialize TotalYahooPageAttempts = 0
    TotalStocksToLoad = 250                                     ' we will Scrape one year estimates for this amount of stocks from Yahoo

    Call Scrape_Yahoo_One_Year_Estimates                        ' Scrape the amount of TotalStocksToLoad into excel
'
' -------------------------------------------------------------------------------------------------------------------------
'
'   Display some final results in the status bar
    Application.StatusBar = "Spreadsheet Refreshing Complete :)" ' & "    Avg Yahoo Delay = " & AvgYahooDelay & "     Avg Yahoo Page Attempts = " & AvgYahooPageAttempts
'
End Sub
'-------------------------------------------------------------------------------------------------------------------------------
'-------------------------------------------------------------------------------------------------------------------------------
'
Private Sub Scrape_Yahoo_One_Year_Estimates()                       ' *** Good up to here ***
'
'
    For StockCount = 1 To TotalStocksToLoad                         ' Grab One Year stock price estimate
'
'
ReloadScrape_Yahoo_One_Year_Estimates:
'
'       Load all of the Update one year estimates
        DelaySeconds = 0                                            '   Initialize DelaySeconds to zero
        PageLoadAttempt = PageLoadAttempt + 1                       '   Add 1 to our PageLoadAttempt counter
''''        TotalYahooPageAttempts = TotalYahooPageAttempts + 1         '   This will be the total yahoo Page Attempts
'
        StockMainPageURL = "finance.yahoo.com/quote/"               '   This will be the main portion of the URL that we send to Internet Explorer
        CurrentStockSymbol = Trim(Range("B" & RowCounter).Value)    '   This is the stock symbol that we will be addressing
'
'       Setup and Load the Internet Explorer Page ...
''''        Dim IE As New SHDocVw.InternetExplorer  ' This works
        Dim IE As New InternetExplorer
''      Dim IE As MSXML2.XMLHTTP60
''      Set IE = New MSXML2.XMLHTTP60
'
        TotalURL = "https://" & StockMainPageURL & CurrentStockSymbol   ' This will be the complete URL that we send to Internet Explorer
'
        If CurrentStockSymbol = "" Then                                                             ' If no stock symbol found @ $B?  then ... (comparing the String to 0 can raise a type mismatch)
            PageLoadAttempt = 0                                                                         '   Reset PageLoadAttempt = 0
            StockCount = TotalStocksToLoad                                                              '   Indicate no more stocks to load
'
            IE.Quit                                                                                     '   Close Internet Explorer Window
            Set IE = Nothing                                                                            '   Clear Internet Explorer Memory
'
            Exit Sub                                                                                    '   Exit this sub
        Else
'
            On Error GoTo Yahoo_One_Year_Estimates_Scrape_Error                                         '   If Error occurs then goto Yahoo_One_Year_Estimates_Scrape_Error
'
            Set IE = New InternetExplorer                                                               '   Open Internet Explorer Browser
'
'           Browser address that we will be scraping values from
            IE.navigate TotalURL                                                                        '   Load the Internet Explorer URL
'
'           Make the Browser window, that we will be scraping values from, visible
            IE.Visible = True                                           '   Make Internet Explorer Windows Visible
'
'           Allow mouse clicks and such while browser window is loading ... Loop until browser window is fully loaded, i.e. READYSTATE_COMPLETE
            Do While IE.readyState <> 4 And DelaySeconds <= 19                                          '   Loop while IE is still loading and <= 19 seconds delayed
''              Application.Wait DateAdd("s", 1, Now)
                Application.Wait (Now + TimeValue("00:00:01"))                                          '   Delay for 1 second
                DoEvents                                                                                '   Enable Mouse Clicks
'
'               Update status bar to inform the user of what is occurring
                Application.StatusBar = "Loading website … " & TotalURL & "    Stock # " & (RowCounter - 4) ''''& _
''''                                "   Delay Seconds =  " & DelaySeconds & "    Page Load Attempts = " & PageLoadAttempt & _
''''                                "   Avg Yahoo Delay = " & AvgYahooDelay & "     AvgYahooPageAttempts = " & AvgYahooPageAttempts
'
                DelaySeconds = DelaySeconds + 1                         '   Add 1 to our DelaySeconds Counter
'
''''                If DelaySeconds > MaxYahooDelay Then MaxYahooDelay = DelaySeconds   '   Save the MaxYahooDelay
''                  TotalYahooDelay = TotalYahooDelay + 1
'
            Loop                                                        ' Loop back
'
'           Allow mouse clicks and such while browser window is loading ... Loop until browser window is fully loaded, i.e. READYSTATE_COMPLETE
            Do While IE.Busy And DelaySeconds <= 19 ' Or IE.readyState <> 4 And DelaySeconds <= 19  ' Loop while IE is still loading and <= 19 seconds delayed
''              Application.Wait DateAdd("s", 1, Now)
                Application.Wait (Now + TimeValue("00:00:01"))          '   Delay for 1 second
                DoEvents                                                '   Enable Mouse Clicks
'
'               Update status bar to inform the user of what is occurring
                Application.StatusBar = "Loading website … " & TotalURL & "    Stock # " & (RowCounter - 4) ''''& _
''''                                "   Delay Seconds =  " & DelaySeconds & "    Page Load Attempts = " & PageLoadAttempt & _
''''                                "   Avg Yahoo Delay = " & AvgYahooDelay & "     AvgYahooPageAttempts = " & AvgYahooPageAttempts
'
                DelaySeconds = DelaySeconds + 1                         '   Add 1 to our DelaySeconds Counter
'
''''                If DelaySeconds > MaxYahooDelay Then MaxYahooDelay = DelaySeconds   '   Save the MaxYahooDelay
            Loop                                                        ' Loop back
'
'
            If DelaySeconds > 19 Then                                   ' If we have delayed for > 19 seconds to allow the page to load then ...
                IE.Quit                                                 '   Close Internet Explorer Window
'
                If PageLoadAttempt <= 4 Then GoTo ReloadScrape_Yahoo_One_Year_Estimates '   If we haven't tried 4 reloads of this page then reload page again
            End If                                                      ' End If
'
            If PageLoadAttempt > 4 Then                                 ' If we have tried 4 reloads of the URL page then Display a message box & Exit program
                MsgBox "We've reloaded the same web page  " & PageLoadAttempt & " times without success so we're going to pause the program" & _
                " so you can investigate.", , "Multiple errors detected"
'
                PageLoadAttempt = 0                                     '   Reset PageLoadAttempt = 0
'
                Stop                                                    '   Stop this Excel program!
            End If
'
            Set Doc = IE.document
'
        End If
'
'
''''        TotalYahooDelay = TotalYahooDelay + DelaySeconds
''''        AvgYahooDelay = TotalYahooDelay / (RowCounter - 4)
''''        AvgYahooPageAttempts = TotalYahooPageAttempts / (RowCounter - 4)
'
'       Update status bar to inform the user of what is occurring
        Application.StatusBar = "Gathering Data from website … " & TotalURL & "    Stock # " & (RowCounter - 4) ''''& _
''''                                "   Delay Seconds =  " & DelaySeconds & "    Page Load Attempts = " & PageLoadAttempt & _
''''                                "   Avg Yahoo Delay = " & AvgYahooDelay & "    AvgYahooPageAttempts = " & AvgYahooPageAttempts
'
        Range("J" & RowCounter).Value = Doc.getElementsByTagName("td")(11).innerText        '   Scrape the Yahoo 52 Week Price Range
        Range("K" & RowCounter).Value = Doc.getElementsByTagName("td")(31).innerText        '   Scrape the Yahoo One Year Price Estimate
'
        On Error GoTo 0                                                                     '   Clear Errors & Set Excel Error handling to Default
'
        RowCounter = RowCounter + 1                                                         '   Advance to next row in Excel sheet
'
        IE.Quit                                                                             '   Close Internet Explorer Window
        Set IE = Nothing                                                                    '   Clear Internet Explorer Memory
'
        PageLoadAttempt = 0                                                                 '   Reset PageLoadAttempt = 0
'
    Next                                                                                    '   Load next stock until all are loaded
'
    Exit Sub                                                                                ' Exit this Sub
'
Yahoo_One_Year_Estimates_Scrape_Error:
'
'   Tried this solution from google \/ \/ to solve errors, No luck :(                       ' Shut down all Internet Explorer windows
''    Dim wsh As Object
''    Dim windowStyle As Integer: windowStyle = 1
''    Dim waitOnReturn As Boolean: waitOnReturn = True
'
''    Set wsh = VBA.CreateObject("Wscript.Shell")
''    wsh.Run "taskkill /F /IM iexplore.exe", windowStyle, waitOnReturn
'
'
'
''    IE.Quit                                                                             '   Close Internet Explorer Window
    Set IE = Nothing                                                                    '   Clear Internet Explorer Memory
'
'   This works somewhat
    Set IE = New InternetExplorer                                                           ' Open Internet Explorer Browser
'
'
    Resume Next                                                                             ' Go back to the next line after the previous error occurred
'
End Sub
'________________________________________________________________________________________________________________________________________________________
 
lrobbo314 said:
Just another thing to consider. You are writing values to the sheet over and over. If you store the results in memory, e.g., array, dictionary, etc., and write the results in one go at the end, you should see time improvements.

I will look into the array approach again. I have not done that in many years, so I will have to google it to refresh myself. Thank you for the idea. :)
 
Without going through the code and trying to show exactly how you would implement it, here is a simplified example. Each of these bits of code will do the same thing, but the second one will be faster, especially considering how many times you seem to write to the worksheet.

Either way, just something to consider.

VBA Code:
Sub LongWay()
For i = 1 To 20
    Range("A" & i).Value = i
Next i
End Sub

Sub ShortWay()
Dim AR(1 To 20) As Integer

For i = 1 To 20
    AR(i) = i
Next i

Range("A1").Resize(UBound(AR), 1).Value = Application.Transpose(AR)
End Sub
 
lrobbo314 said:
Without going through the code and trying to show exactly how you would implement it, here is a simplified example. Each of these bits of code will do the same thing, but the second one will be faster, especially considering how many times you seem to write to the worksheet.

Either way, just something to consider.

Thank you lrobbo314 !

I did some googling today to try and relearn the array stuff.

I found how to write the array vertically as your code demonstrates, and also how to get it to write the array horizontally.

What I did not find is how to write out the array both horizontally and vertically. :(

What I would need to do is write out the array 4 columns wide at a time ... $O1 thru $R1 ... then $O2 thru $R2, etc.

The following is an example of some code that I whipped up to start with based on your coding example ...
VBA Code:
Dim ar(1 To 1000) As Variant    ' 250 web pages, 4 scraped values per page (Variant, since some of the scraped values are text)

PagesToScrape = 250    ' Number of web pages to load/scrape from, we will scrape 4 values from each web page that is loaded
ArraySlotNumber = 0    ' Reset ArraySlotNumber to zero

For i = 1 to PagesToScrape
'
'    Load Webpage to scrape from
'    Do some code ...

'
''    Range("O" & RowCounter).Value = num(Doc.getElementsByClassName("right-border-separator")(1).innerText)  ' Avg Analyst 1 yr price
''    Range("P" & RowCounter).Value = Doc.getElementsByClassName("block__colored-header")(3).innerText        ' Analyst stock strength
''    Range("Q" & RowCounter).Value = Doc.getElementsByClassName("block__average_value")(3).innerText         ' Analyst rating 1 - 5
''    Range("R" & RowCounter).Value = Doc.getElementsByClassName("bold")(3).innerText                         ' # of analysts
'
'   Save the 4 values that we want to scrape from each web page loaded, into an array
    ArraySlotNumber = ArraySlotNumber + 1
    ar(ArraySlotNumber) = num(Doc.getElementsByClassName("right-border-separator")(1).innerText)  ' Avg Analyst 1 yr price
'
    ArraySlotNumber = ArraySlotNumber + 1
    ar(ArraySlotNumber) = Doc.getElementsByClassName("block__colored-header")(3).innerText        ' Analyst stock strength
'
    ArraySlotNumber = ArraySlotNumber + 1
    ar(ArraySlotNumber) = Doc.getElementsByClassName("block__average_value")(3).innerText         ' Analyst rating 1 - 5
'
    ArraySlotNumber = ArraySlotNumber + 1
    ar(ArraySlotNumber) = Doc.getElementsByClassName("bold")(3).innerText                         ' # of analysts
'
Next i
'
'    Array should now be filled with 4 values from each of the 250 web pages that were loaded ... 1000 values total
'
'Range("O1").Resize(UBound(ar), 1).Value = Application.Transpose(ar)    ' Writes values vertically down an Excel column
Range("O1").Resize(1, UBound(ar)).Value = ar    ' Writes values horizontally across an Excel row

I haven't tested that code yet (I hope to shortly, after the grandchildren go to bed), but that is what I have whipped up so far. I'm not sure how to break the array results into 4 columns wide per line, though.
 
You have a 1 dimensional array in your example. You can make it multidimensional as below.

This is untested but should work or at least get you going in the right direction.

VBA Code:
Dim AR(1 To 250, 1 To 4)
PagesToScrape = 250    ' Number of web pages to load/scrape from, we will scrape 4 values from each web page that is loaded
ArraySlotNumber = 0    ' Reset ArraySlotNumber to zero

For i = 1 To PagesToScrape
'
'    Load Webpage to scrape from
'    Do some code ...

    ArraySlotNumber = ArraySlotNumber + 1   ' Increment first, so the index runs 1 to 250 (incrementing after would start at 0, outside the array bounds)
    AR(ArraySlotNumber, 1) = num(Doc.getElementsByClassName("right-border-separator")(1).innerText)  ' Avg Analyst 1 yr price
    AR(ArraySlotNumber, 2) = Doc.getElementsByClassName("block__colored-header")(3).innerText        ' Analyst stock strength
    AR(ArraySlotNumber, 3) = Doc.getElementsByClassName("block__average_value")(3).innerText         ' Analyst rating 1 - 5
    AR(ArraySlotNumber, 4) = Doc.getElementsByClassName("bold")(3).innerText                         ' # of analysts
Next i

Range("O1").Resize(UBound(AR), UBound(AR, 2)).Value = AR
 
Well, you seem to be making some gradual improvements, but lrobbo314's point is a good one, and you will certainly see speed gains by implementing this technique.

As an aside, there aren't any other Excel spreadsheets open at the same time, are there? Or are there any formulas in the worksheet that call VBA code that might be slowing things down? That's usually a problem that I encounter.

That seems like a recipe for a CPU cycle waste, aka "Battery Burner".
This hadn't occurred to me, and you may be right (especially when you have c.250 companies/shares to check). That said, it wouldn't be running for particularly long as a proportion of the entire process. In any event, I went to investigate and learnt a whole lot in the process (thank you!!). You tend to see DoEvents used as part of the waiting loop with IE, but just because others do it doesn't mean it necessarily should be used here. From what I can tell (and hopefully you or someone will correct me if I'm wrong), I think it comes down to how long it takes for IE to finish loading each site; it appears that Wait should be used for durations of 1 second or more (in one-second increments), and that's exactly what you have.

You may want to test both approaches to see what the time impact is for each - if it is causing the process to wait an unnecessary extra second at each site, then scaled out to 250 sites that (obviously) becomes 250 seconds (4-ish minutes). Out of curiosity, what kind of variance in overall time do you see each time you run the Phase 3 process? It would be very interesting if you could time how long it takes to run the process for each share/company.
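To time each share/company individually, one simple sketch uses VBA's built-in Timer function; the variable names here are illustrative, not taken from the code above:

VBA Code:
```vba
' Sketch: time each page load with VBA's Timer function.
' Timer returns the number of seconds elapsed since midnight,
' which is fine for short intervals (it wraps at midnight).
Dim StartTime As Double

StartTime = Timer                                           ' capture the start time
' ... navigate to the page and run the wait loop here ...
Debug.Print "Loaded in " & Format(Timer - StartTime, "0.00") & " s"
```

Accumulating those per-page figures would also give the average and variance being asked about here.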
 
Other solutions
I've been giving it some thought and I've been thinking about how to bring down the overall time it takes to do this. Here's what I've got:
- Selenium - IE is slow. I think it's the slowest of all the browsers (?). If you were to use Selenium to run Chrome or Firefox, that might lead to an overall reduction in time. I might be wrong.
- API - you could use one of the many scraping API services. PhantomJSCloud is one I use and quite like. There is a free tier of 500 requests per day, which would meet your needs, but you'd only be able to run your process a max of 2 times a day before you need to start paying for it.
- "Multithreading" - article - I don't know how useful this would be here when each "thread" is operating an instance of Internet Explorer. I'd be interested to see how long it would take as compared to the others.

Essentially, we have hit the limits of my knowledge (such as it is) here, and I'm guessing now. It's very interesting, so please keep me posted.
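By way of illustration, the Selenium option above can be sketched as below. This assumes the SeleniumBasic add-in is installed and its type library is referenced in the VBA project, and MSFT is just a placeholder symbol:

VBA Code:
```vba
' Sketch: drive Chrome via SeleniumBasic instead of automating Internet Explorer.
' Assumes SeleniumBasic is installed and referenced, and Chrome is available.
Sub FetchWithSelenium()
    Dim Driver As New Selenium.ChromeDriver

    Driver.Get "https://finance.yahoo.com/quote/MSFT"   ' load the page in Chrome
    Debug.Print Driver.FindElementsByTag("td").Count    ' count the <td> cells found
    Driver.Quit                                         ' always close the browser when done
End Sub
```

Unlike XMLHTTP, a real browser does execute the page's JavaScript, so script-built elements should be present, at the cost of still running a full browser per page.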
 
This is a good point. If I were doing this as a personal project, I'd probably steer clear of Excel entirely. I would use Python and the Beautiful Soup library.
 
@Dan_W as far as my experience goes, so take it for what it's worth, the only thing DoEvents does is allow you to use Ctrl+Break to stop the code during long executions that make Excel hang. I know it slows things down, and it's probably at least worth trying to run the code without it in there. The speed increase may be marginal for all I know, but worth a shot.
 
lrobbo314 said:
You have a 1 dimensional array in your example. You can make it multidimensional as below.

Thank You lrobbo314!

I hope you won't mind, but I "dumbed" that code back down so that I could compare it to your first, 1-dimensional example.

VBA Code:
Sub ShortWay2Dimensional()
'
    Dim ar(1 To 250, 1 To 4)                            ' Prep Array for 250 web pages, 4 values scraped per page
'
    ArraySlotNumber = 0                                 ' Reset our ArraySlotNumber to 0
'
'   Load the Array into Excel cells 4 columns wide each
    For i = 1 To 250                                    ' Number of web pages to scrape from
        For J = 1 To 4                                  '   Number of values to scrape from each web page loaded
            ArraySlotNumber = ArraySlotNumber + 1       '       Increment the ArraySlotNumber + 1
            ar(i, J) = ArraySlotNumber                  '       Save the ArraySlotNumber into the Array
        Next J                                          '   Loop back
    Next i                                              ' Loop back
'
    Range("A1").Resize(UBound(ar), UBound(ar, 2)).Value = ar    ' Load 250 Rows of array values into Excel cells, 4 cells wide at a time

End Sub

Lo and behold, it works! It loads the array and then prints out 4 columns wide by 250 rows !!! Thank you so much for that! I wanted to 'dumb' it back just so I could understand what was happening.

This should allow me to continue to incorporate the logic into my program and continue. Thank you!
 
Yikes! Looks like I may have caused a small controversy with my comments above, about the ' Loop/DoEvents/NoDelay/Loop back' approach being a CPU cycle waster/Potential Battery Burner for a laptop.

I was just 'reporting' what I have previously read. I have googled it again to find a link that may explain it better. AvoidDoEvents

From my limited understanding, the loop itself is the 'CPU cycle burner'. DoEvents opens a very small (fast) window to allow mouse clicks and such. The loop appears to be the problem in that it fires as fast as it can (dependent on processor speed), over and over again, unless an additional delay is included in that loop via Sleep/Wait, etc.
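One common pattern that keeps Excel responsive without spinning the CPU flat-out is to sleep in short slices between DoEvents calls. This is a sketch rather than a drop-in replacement for the loops above, and it assumes a module-level Declare for the Windows Sleep API:

VBA Code:
```vba
' Sketch: a wait loop that yields the CPU instead of spinning flat-out.
' The Declare must sit at module level, above any Subs.
#If VBA7 Then
    Private Declare PtrSafe Sub Sleep Lib "kernel32" (ByVal Milliseconds As Long)
#Else
    Private Declare Sub Sleep Lib "kernel32" (ByVal Milliseconds As Long)
#End If

Sub WaitForPage(IE As Object)
    Do While IE.Busy Or IE.readyState <> 4
        Sleep 100       ' give the CPU back for 100 ms each pass
        DoEvents        ' still allow mouse clicks and Ctrl+Break
    Loop
End Sub
```

With a 100 ms sleep, the loop body fires roughly 10 times per second instead of as fast as the processor allows, which is the difference being discussed here.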

I could set up a timer to see how many times the Loop/DoEvents/Loop back is triggered each second, but I think it would be a very large amount, especially when some of the web pages I am loading appear to take upwards of 10 seconds or more.

Lemme know.

Again, sorry for any controversy I may have caused.
 
