Preface
The idea behind crawlers is actually quite simple, but many beginners understand it in theory and still don't know how to analyze a page when they sit down to write one. Honestly, the cure is practice: copy less code and write more of your own.
A crawler really comes down to three steps: download the data, parse the data, and save the data. This article illustrates each of the three steps with a short code example.
Download data
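The download step fetches the raw HTML of a page. The sketch below uses only the standard library (`urllib`); the User-Agent string and timeout are illustrative choices, not fixed requirements.

```python
# Minimal "download" step using the standard library.
# The User-Agent value is a placeholder; many sites reject
# requests that do not send a browser-like User-Agent.
from urllib.request import Request, urlopen

def download(url, timeout=10):
    """Fetch a page and return its text, or None on failure."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            # Decode permissively; real pages may declare their own charset.
            return resp.read().decode("utf-8", errors="replace")
    except OSError:
        # URLError / HTTPError are subclasses of OSError.
        return None
```

In practice many people reach for the third-party `requests` library instead, but the structure is the same: send the request, read the body, and handle failures instead of letting them crash the crawler.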
Parse data
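The parse step extracts the fields you care about from the downloaded HTML. Real crawlers often use BeautifulSoup or lxml; to keep this sketch dependency-free, it uses the standard-library `HTMLParser` and, as an example target, collects every link (`href` of `<a>` tags).

```python
# Minimal "parse" step: extract all <a href="..."> values from HTML.
from html.parser import HTMLParser

class LinkParser(HTMLParser):
    """Collects href attributes of <a> tags while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def parse(html):
    """Return the list of link URLs found in the HTML string."""
    parser = LinkParser()
    parser.feed(html)
    return parser.links
```

Whatever you extract (titles, prices, links), this step's job is the same: turn a wall of HTML into structured records.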
Save data
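The save step writes the parsed records somewhere durable. A CSV file is the simplest choice; the filename and column names below are placeholders for whatever fields your parser produced.

```python
# Minimal "save" step: write parsed rows to a CSV file.
import csv

def save(rows, path="result.csv"):
    """Write an iterable of (title, link) tuples to a CSV file."""
    # newline="" prevents blank lines on Windows (per the csv docs).
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["title", "link"])  # example header columns
        writer.writerows(rows)
```

For larger projects you would swap the CSV file for a database, but the shape of this step stays the same.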
Final main function
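The main function just wires the three steps together: download, then parse, then save. To keep this sketch self-contained, the three step functions are passed in as parameters; in a real script you would call the `download`, `parse`, and `save` functions you defined above directly, and the URL here is a placeholder.

```python
# Sketch of the final main function: download -> parse -> save.
# The three steps are injected as callables so this block stands alone;
# in a real script you would call your own download/parse/save directly.

def main(url, download, parse, save):
    """Run the three-step pipeline; return how many records were saved."""
    html = download(url)        # step 1: download
    if html is None:
        print("download failed:", url)
        return 0
    rows = parse(html)          # step 2: parse
    save(rows)                  # step 3: save
    return len(rows)
```

A typical script ends with an `if __name__ == "__main__":` guard that calls `main` with the target URL.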
These are the most basic crawler routines; with them, crawling data from static websites is straightforward.