Excel
Most people are familiar with Excel. Besides everyday data processing and statistics, it can also fetch web data. Here is a brief walkthrough, using PM2.5 data as an example. The main steps are as follows:
1. First, create a new Excel file and open it. On the ribbon, click Data -> "From Web".
2. In the "New Web Query" dialog that pops up, enter the URL you want to crawl and click "Go" to load the target page.
3. Next, click the "Import" button in the lower right corner, choose an existing worksheet (or create a new one) to hold the data, and click "OK". The data is then imported automatically.
4. If you need the data refreshed periodically, click "Properties" on the ribbon and set a refresh frequency in the dialog that appears.
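For readers who prefer code, the core of what Excel's "From Web" import does is pull an HTML table apart into rows and columns. Below is a minimal sketch using only the Python standard library; the sample table is invented for illustration (a real run would first fetch the page over HTTP, e.g. with `urllib.request`):

```python
import csv
import io
from html.parser import HTMLParser

# Invented stand-in for the PM2.5 page Excel would load.
SAMPLE_HTML = """
<table>
  <tr><th>City</th><th>PM2.5</th></tr>
  <tr><td>Beijing</td><td>85</td></tr>
  <tr><td>Shanghai</td><td>42</td></tr>
</table>
"""

class TableParser(HTMLParser):
    """Collects the cell text of every <tr> into a list of rows."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._row = None
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag in ("td", "th"):
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and self._row is not None:
            self._row.append(data.strip())

def table_to_csv(html: str) -> str:
    """Parse the first HTML table and render it as CSV text."""
    parser = TableParser()
    parser.feed(html)
    buf = io.StringIO()
    csv.writer(buf).writerows(parser.rows)
    return buf.getvalue()

print(table_to_csv(SAMPLE_HTML))
```

The resulting CSV opens directly in Excel; libraries such as pandas (`pandas.read_html`) do the same job in one call, but the stdlib version shows what is happening under the hood.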
Octopus
Octopus is crawler software built specifically for collecting data, and it is easy to learn and master. You mark the elements you want on a page, and it crawls them automatically; results can be saved to Excel or exported to a database. Here is a brief walkthrough of installing and using it:
1. Download Octopus directly from its official website, then run the installer.
2. After installation, open the software and click "Custom Collection" on the main page.
3. Enter the URL of the page you want to crawl on the task page. Here we use Dianping review data as an example.
4. Click "Save URL" and the page opens automatically.
5. Select the tagged data you want to capture directly on the page; just follow the on-screen prompts step by step, which is very simple.
6. Once configured, click "Start Local Collection" and data capture begins automatically. The result contains exactly the fields selected in the previous step.
7. Click "Export Data" to export the captured data in whatever format you need: Excel, CSV, a database, etc.
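Octopus's point-and-click element selection maps directly onto what a hand-written crawler does: match elements by tag and class, then write the extracted fields out. Here is a minimal stdlib sketch of that idea; the HTML snippet, class names, and output filename are all invented for illustration (a real crawler would fetch pages over HTTP and should respect the site's terms and robots.txt):

```python
import csv
from html.parser import HTMLParser

# Invented stand-in for a listing page; Octopus would render the real site.
SAMPLE_PAGE = """
<div class="shop"><span class="name">Cafe A</span><span class="rating">4.5</span></div>
<div class="shop"><span class="name">Cafe B</span><span class="rating">4.0</span></div>
"""

class FieldScraper(HTMLParser):
    """Captures the text of <span> elements whose class is in `fields`,
    grouping them into one record per <div class="shop"> block."""
    def __init__(self, fields):
        super().__init__()
        self.fields = fields          # class names to capture, in column order
        self.records = []             # one dict per shop block
        self._current_field = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div" and attrs.get("class") == "shop":
            self.records.append({})   # start a new record
        elif tag == "span" and attrs.get("class") in self.fields:
            self._current_field = attrs["class"]

    def handle_data(self, data):
        if self._current_field and self.records:
            self.records[-1][self._current_field] = data.strip()

    def handle_endtag(self, tag):
        if tag == "span":
            self._current_field = None

def scrape_to_rows(html, fields):
    """Return the captured records as CSV-ready rows, one per record."""
    scraper = FieldScraper(fields)
    scraper.feed(html)
    return [[rec.get(f, "") for f in fields] for rec in scraper.records]

rows = scrape_to_rows(SAMPLE_PAGE, ["name", "rating"])
with open("shops.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "rating"])
    writer.writerows(rows)
```

The exported `shops.csv` can then be opened in Excel, much like Octopus's own "Export Data" step.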
That covers capturing web data with Excel and Octopus. Generally speaking, both tools are very simple to use; once you are familiar with the basic operations you will master them quickly. Of course, you can also use other crawler software, such as Locomotive, whose basic features are similar to Octopus's, and there are plenty of materials and tutorials online if you want to dig deeper. I hope the content shared above is helpful to you; comments and questions are welcome.