How to use VBA or web crawler to capture website data
Common methods of web scraping with VBA

1. XMLHTTP/WinHTTP method:

Use the XMLHTTP or WinHTTP object to send requests directly to the server and receive the data it returns.

Advantages: high efficiency and essentially no compatibility problems.

Disadvantages: a tool such as Fiddler is usually needed to inspect the browser's traffic so you can reconstruct the HTTP requests by hand.
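The request-and-response flow described above can be sketched as follows. This is a minimal illustration using late binding to `MSXML2.XMLHTTP`; the URL is a placeholder, not from the original article.

```vba
' Minimal sketch of the XMLHTTP method; the URL is a placeholder.
Sub GetPageByXmlHttp()
    Dim http As Object
    Set http = CreateObject("MSXML2.XMLHTTP")
    http.Open "GET", "https://example.com/", False   ' synchronous GET request
    http.setRequestHeader "User-Agent", "Mozilla/5.0"
    http.Send
    If http.Status = 200 Then
        Debug.Print http.responseText                ' raw HTML returned by the server
    End If
End Sub
```

For POST requests, pass the form data as the argument to `http.Send` and set the `Content-Type` header accordingly; this is where a traffic inspector like Fiddler helps you see exactly what the browser sends.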

2. IE/WebBrowser method:

Create an IE control or WebBrowser control and use the methods and properties of the HTMLDocument object to simulate browser operations and read data from the rendered page.

Advantages: this method can simulate most browser operations. It is what-you-see-is-what-you-get: any data the browser can display can be obtained through code.

Disadvantages: all kinds of pop-ups are annoying, compatibility across IE versions is a constant headache, and file uploads cannot be automated through the IE control.
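A minimal sketch of the approach above, using late binding to `InternetExplorer.Application`; the URL is a placeholder for illustration.

```vba
' Minimal sketch of the IE/WebBrowser method; the URL is a placeholder.
Sub GetPageByIE()
    Dim ie As Object
    Set ie = CreateObject("InternetExplorer.Application")
    ie.Visible = False
    ie.navigate "https://example.com/"
    Do While ie.Busy Or ie.readyState <> 4   ' wait for READYSTATE_COMPLETE
        DoEvents
    Loop
    Debug.Print ie.document.body.innerText   ' text exactly as the browser renders it
    ie.Quit
End Sub
```

Because the control runs a full browser, `ie.document` also exposes DOM methods such as `getElementById`, which is how form filling and clicks are simulated.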

3. QueryTables method:

Because it is built into Excel, it barely counts as a separate method. In practice it works much like XMLHTTP: it sends a GET or POST request, then writes the server's response directly into worksheet cells.

Advantages: Excel ships with it, you can generate the code by recording a macro, and it is very convenient for handling tables. The code is short, which makes it a quick way to pull tabular data out of a page's source.

Disadvantages: Referer and other request headers cannot be set.
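The kind of code the macro recorder produces for this method looks roughly like the sketch below; the URL is a placeholder, and `WebTables = "1"` assumes you want the first table on the page.

```vba
' Minimal sketch of the QueryTables method; the URL is a placeholder.
Sub GetTableByQueryTables()
    With ActiveSheet.QueryTables.Add( _
            Connection:="URL;https://example.com/", _
            Destination:=Range("A1"))
        .WebSelectionType = xlSpecifiedTables
        .WebTables = "1"                 ' import only the first table on the page
        .Refresh BackgroundQuery:=False  ' fetch synchronously into the sheet
        .Delete                          ' drop the query, keep the imported data
    End With
End Sub
```

Note there is no place in this API to set a Referer or other custom headers, which is exactly the limitation mentioned above.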

If you prefer not to write code at all, you can also use off-the-shelf collection tools to scrape data from web pages.