Hi,
I'm new to VBA (and to this forum) and was wondering if someone could help me with a problem getting some web page data into Excel 2010 using VBA.
Basically, the scenario I am trying to create is very similar to this YouTube tutorial: Excel VBA Pull Data From A Website - YouTube
My scenario, however, is set up with the following titles in cells A1, B1, C1, D1 and E1: POST CODE, OUTLET, ADDRESS, TELEPHONE, EMAIL.
The result I want to achieve: I enter a post code into cell A2, for example, and Excel then uses IE to navigate to the relevant web page as defined in the VBA code. I then want the following to happen:
- The InnerText of the web page's h1 tag is then inserted into the OUTLET cell (B2)
- The first instance of the p tag is then inserted into the ADDRESS cell (C2)
- The second instance of the p tag is then inserted into the TELEPHONE cell (D2)
- The third instance of the p tag is then inserted into the EMAIL cell (E2)
All instances of the p tag are contained in a div element with class="adBox_content". There are also five other DIVs above that div in the hierarchy (whether that helps or not, I don't know).
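In case it helps, this is roughly how far I've got with locating that div. It's only a rough sketch: the URL is just a placeholder for the real one, and since I wasn't sure getElementsByClassName is available in every IE document mode, it walks through the divs and checks className instead.

Sub FindAdBoxDiv_Sketch()
    Dim ie As Object, doc As Object, divs As Object, adBox As Object, i As Long

    Set ie = CreateObject("InternetExplorer.Application")
    ie.Visible = True
    ie.navigate "http://www.example.com/outlet?postcode=AB12CD" ' placeholder URL
    Do While ie.Busy Or ie.readyState <> 4
        DoEvents
    Loop
    Set doc = ie.document

    ' Walk every div and keep the one whose class is adBox_content
    Set divs = doc.getElementsByTagName("div")
    For i = 0 To divs.Length - 1
        If divs(i).className = "adBox_content" Then
            Set adBox = divs(i)
            Exit For
        End If
    Next i

    If Not adBox Is Nothing Then
        ' The p tags I need are all inside this one div
        Debug.Print adBox.getElementsByTagName("p").Length & " p tags found"
    End If

    ie.Quit
End Sub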
Following the YouTube tutorial, the method has worked for me using getElementsByTagName("h1").innerText. However, when I try adding a second call, getElementsByTagName("p")(01).innerText, the whole thing fails.
So I'm left with two problems. First, I can't make the VBA get more than one element at a time from the page; I can only have either the h1 or the first instance of the p tag. I've tried all the getElementBy methods and none of them seem to work in getting the second and third instances to show.
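For reference, this is roughly what I've been aiming for with the multi-element part. As far as I can tell, the element collections are numbered from 0, so (0) is the first p tag, (1) the second and (2) the third; it's only a sketch, assuming doc already holds the document of the loaded page.

Sub ReadOutletDetails_Sketch(doc As Object)
    Dim ps As Object
    Dim outlet As String, address As String, phone As String, email As String

    ' doc is the ie.document of the already-loaded page
    outlet = doc.getElementsByTagName("h1")(0).innerText

    Set ps = doc.getElementsByTagName("p")
    If ps.Length >= 3 Then
        address = ps(0).innerText   ' first p tag
        phone = ps(1).innerText     ' second p tag
        email = ps(2).innerText     ' third p tag
    End If

    Debug.Print outlet, address, phone, email
End Sub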
Second, I also need the data to go ONLY on the same row as where the post code was entered. In this example of entering a post code into A2, the OUTLET needs to land in cell B2 only, the ADDRESS in C2 only, and so on.
Following the YouTube tutorial above, where the cells are given names to refer to in the code, the data ends up being entered in all further rows with identical cell names. I need it not to do that.
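What I have in mind for the row problem is something like the sketch below, where the row number of the post code cell decides where the results go instead of named cells. FetchOutletData is just a made-up name for whatever routine ends up doing the actual scraping.

Sub FillRowForPostcode_Sketch(postcodeCell As Range)
    Dim r As Long
    Dim outlet As String, address As String, phone As String, email As String

    r = postcodeCell.Row   ' e.g. 2 when the post code is in A2

    ' Made-up helper: looks up one post code and returns the scraped
    ' values through the ByRef string arguments
    FetchOutletData postcodeCell.Value, outlet, address, phone, email

    With postcodeCell.Worksheet
        .Cells(r, 2).Value = outlet    ' OUTLET into column B, same row
        .Cells(r, 3).Value = address   ' ADDRESS into column C
        .Cells(r, 4).Value = phone     ' TELEPHONE into column D
        .Cells(r, 5).Value = email     ' EMAIL into column E
    End With
End Sub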
Any help would be hugely appreciated. The code is needed for around 300 rows of post codes that will be entered and refreshed every week or so.
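And for the weekly refresh, I was picturing a simple loop down column A that calls the routine above for every row that has a post code in it (the sheet name here is just a placeholder):

Sub RefreshAllPostcodes_Sketch()
    Dim ws As Worksheet, lastRow As Long, r As Long

    Set ws = ThisWorkbook.Worksheets("Sheet1")   ' placeholder sheet name
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row

    For r = 2 To lastRow                          ' row 1 holds the titles
        If Len(ws.Cells(r, 1).Value) > 0 Then
            FillRowForPostcode_Sketch ws.Cells(r, 1)
        End If
    Next r
End Sub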
Thanks,
Matt