One article to reveal how epidemic prevention and control data is obtained (a Python walkthrough)

2020 was a year of ups and downs. After nearly a year of fighting the epidemic, just as the situation seemed to be improving, the arrival of winter gave the virus a new opening. There is still some way to go before we can take off our masks.

How do you use Python to obtain epidemic prevention and control data?

Screenshot from Beijing Daily

As ordinary people, besides complying with the relevant regulations and taking daily precautions, we can appreciate how accurate and timely our country's epidemic prevention and control has been. So where does the data behind it come from? Today, Heima ("Dark Horse") will show you how to use Python to easily obtain epidemic data.

Course content:

1. Overview of Web Crawlers

2. requests request library

3. BeautifulSoup parsing library

4. Regular Expressions

5. json module

6. Epidemic crawler project

7. Introduction to epidemic data visualization

For people:

1. Current students and recent graduates who are interested in web crawlers.

2. Working professionals who want to build on their current roles and move into high-paying jobs in the data industry.

3. Anyone else interested in the data industry.

The main content of the basic course includes:

Phase 1: Overview of web crawlers

1. The difference between web crawlers and browsers

2. The concept of web crawlers

 

Phase 2: requests request library

1. Introduction and installation of requests

2. Basic use of requests

3. Case: Request the Epidemic Homepage
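A minimal sketch of what the requests case above covers. The URL is a placeholder, since the course does not name the actual epidemic homepage here; the headers and error handling are common practice, not the course's exact code.

```python
import requests

def fetch_page(url, timeout=10):
    """Request a page and return its decoded HTML, or None on any failure."""
    headers = {"User-Agent": "Mozilla/5.0"}  # many sites reject the default UA
    try:
        resp = requests.get(url, headers=headers, timeout=timeout)
        resp.raise_for_status()                 # treat 4xx/5xx responses as failures
        resp.encoding = resp.apparent_encoding  # Chinese pages are often GBK-encoded
        return resp.text
    except requests.RequestException:
        return None

if __name__ == "__main__":
    html = fetch_page("https://example.com/")
    print("got page" if html else "request failed")
```

Returning None instead of raising keeps the later crawler loops simple: a failed page is skipped rather than crashing the whole run.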

 

Phase 3. BeautifulSoup parsing library

1. Introduction and installation of BeautifulSoup

2. Introduction and creation of BeautifulSoup objects

3. Find method of BeautifulSoup object

4. Case: Extract the latest epidemic data for each country from the epidemic homepage
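The find method mentioned above might be used like this. The HTML fragment is a tiny stand-in for the epidemic homepage (the real markup and tag names will differ), but the find()/find_all() calls are the standard BeautifulSoup API.

```python
from bs4 import BeautifulSoup

# Illustrative markup only; the actual page structure is an assumption here.
html = """
<div id="country-list">
  <p class="item">CountryA: 1000 confirmed</p>
  <p class="item">CountryB: 800 confirmed</p>
</div>
"""

soup = BeautifulSoup(html, "html.parser")        # build the parse tree
container = soup.find("div", id="country-list")  # find() returns the first match
rows = container.find_all("p", class_="item")    # find_all() returns every match
latest = [p.get_text(strip=True) for p in rows]
print(latest)
```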

 

Phase 4: Regular expressions

1. The concept and function of regular expressions

2. Common syntax of regular expressions

3. re.findall() method

4. Raw strings (the r prefix) in regular expressions

5. Case: Extract the json string of the latest epidemic data
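The extraction case could look like the sketch below. Pages often embed their data as a JSON literal inside a script block; the variable name and structure here are invented for illustration, not the course's exact page.

```python
import re

# Simulated script fragment with an embedded JSON array.
script = 'try { window.epidemicData = [{"name":"CountryA","confirmed":1000}] }catch(e){}'

# r"..." raw strings keep backslashes literal, so regex escapes stay readable.
# Greedy [.*] grabs everything from the first "[" to the last "]".
matches = re.findall(r"\[.*\]", script)
json_str = matches[0] if matches else ""
print(json_str)
```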

 

Phase 5: json module

1. Introduction to json module

2. Convert json to python

3. Convert python to json

4. Case: Parse the json string of the latest epidemic data
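Both conversion directions listed above fit in a few lines. The sample string is made up; the loads()/dumps() calls are the standard json module API.

```python
import json

json_str = '[{"country": "CountryA", "confirmed": 1000}]'

# json -> python: loads() turns the string into native lists and dicts
data = json.loads(json_str)
print(data[0]["country"], data[0]["confirmed"])

# python -> json: dumps() serializes back to a string;
# ensure_ascii=False keeps non-ASCII text (e.g. Chinese names) readable
out = json.dumps(data, ensure_ascii=False)
print(out)
```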

 

Phase 6: Epidemic crawler project

1. Collect epidemic data for countries around the world over the most recent day

2. Collect epidemic data for countries around the world since January 23

3. Collect epidemic data for provinces across the country over the most recent day

4. Collect epidemic data for provinces across the country since January 23

5. Refactor the crawler project code
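The refactoring step usually means factoring the four collectors into a shared base class. The shape below is one possible sketch; every class and method name is invented for illustration, and the subclass's data is stubbed where a real crawler would request and parse a page.

```python
import json

class Crawler:
    """Minimal base class: subclasses supply the data, the base handles saving."""

    def get_data(self):
        raise NotImplementedError  # each concrete crawler fetches/parses its own source

    def save(self, path):
        data = self.get_data()
        with open(path, "w", encoding="utf-8") as f:
            json.dump(data, f, ensure_ascii=False)
        return len(data)

class LastDayCountryCrawler(Crawler):
    def get_data(self):
        # a real subclass would request and parse the epidemic page; stubbed here
        return [{"country": "CountryA", "confirmed": 1000}]

if __name__ == "__main__":
    print(LastDayCountryCrawler().save("last_day.json"))
```

With this split, adding the "since January 23" or per-province collectors only requires a new get_data() implementation.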

 

Phase 7: Visualization of epidemic data

1. Visualize epidemic data for countries around the world over the most recent day

2. Visualize epidemic data for countries around the world since January 23

3. Visualize epidemic data for provinces across the country over the most recent day

4. Visualize epidemic data for provinces across the country since January 23
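A "most recent day" chart like the ones listed above can be sketched with matplotlib (one common choice; the course does not name its plotting library here). The numbers are illustrative only; real values would come from the crawler output.

```python
import matplotlib
matplotlib.use("Agg")  # render to a file without needing a display
import matplotlib.pyplot as plt

# Placeholder data standing in for the crawled results.
countries = ["CountryA", "CountryB", "CountryC"]
confirmed = [1000, 800, 600]

fig, ax = plt.subplots()
ax.bar(countries, confirmed)
ax.set_title("Confirmed cases (most recent day)")
ax.set_ylabel("cases")
fig.savefig("last_day.png")
plt.close(fig)
```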

Course: Introduction to Python Crawlers | Easily obtain epidemic data in 180 minutes


Origin blog.csdn.net/cz_00001/article/details/111676512