[Learn Python from Zero] 88. The WSGI Interface in Detail: Simple and Efficient Web Development


WSGI interface

The WSGI interface definition is very simple: it only requires the web developer to implement a single function that responds to HTTP requests. Let's look at the simplest possible web version of "Hello, web!":

def application(environ, start_response):
    # Send the status line and response headers first.
    start_response('200 OK', [('Content-Type', 'text/html')])
    # Return the body as an iterable of bytes, as WSGI requires.
    return [b'<h1>Hello, web!</h1>']

The application() function above is an HTTP handler that conforms to the WSGI standard, and it receives two parameters:

  • environ: a dict object containing all of the HTTP request information;
  • start_response: a function used to start the HTTP response by sending the status and headers.

Inside the application() function, the call:

start_response('200 OK', [('Content-Type', 'text/html')])

sends the headers of the HTTP response. Note that the headers can only be sent once, that is, start_response() can only be called once. The start_response() function takes two parameters: the HTTP status code, and a list of HTTP headers, where each header is represented as a tuple of two strs.

Normally, the Content-Type header should be sent to the browser, and many other commonly used HTTP headers may be sent as well.
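For illustration only (the extra header here is an example of my own, not something the original code sends), a call that passes several headers in the list might look like this:

# Illustrative: sending more than one header in a single start_response() call.
start_response('200 OK', [
    ('Content-Type', 'text/html; charset=utf-8'),
    ('Cache-Control', 'no-cache'),
])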

Then, the function's return value [b'<h1>Hello, web!</h1>'] will be sent to the browser as the body of the HTTP response.

With WSGI, all we care about is how to read the HTTP request information from the environ dict, construct the HTML, send the headers via start_response(), and finally return the body.
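As a minimal sketch (an assumption for illustration, not part of the original example), an application can inspect environ, for instance the PATH_INFO key, to build a response that depends on the requested URL:

def application(environ, start_response):
    # PATH_INFO holds the request path, e.g. '/' or '/Michael'.
    name = environ.get('PATH_INFO', '/')[1:] or 'web'
    body = ('<h1>Hello, %s!</h1>' % name).encode('utf-8')
    start_response('200 OK', [('Content-Type', 'text/html')])
    return [body]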

The application() function itself does not involve any HTTP parsing at all; in other words, we don't have to write that low-level code ourselves, and we only need to think about how to respond to requests at a higher level.

But wait, how does this application() function get called? If we called it ourselves, we would have no way to provide the two parameters environ and start_response, and the returned body could not be sent to the browser.

So the application() function must be called by a WSGI server. There are many servers that conform to the WSGI specification, and we can pick any of them. For now, though, we just want to verify as quickly as possible that the application() function we wrote can really deliver HTML to the browser, so we need the simplest possible WSGI server to run our web application.

The good news is that Python has a built-in WSGI server in the wsgiref module, a reference implementation of a WSGI server written in pure Python. A "reference implementation" means that it fully complies with the WSGI standard but pays no attention to runtime efficiency, so it is intended only for development and testing.
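A minimal sketch of running the application with wsgiref could look like the following (the file name server.py is just an assumption; application() is the function defined above):

# server.py - run the WSGI application on the built-in reference server
from wsgiref.simple_server import make_server

# Assume application() is defined above (or imported from another module).
httpd = make_server('', 8000, application)
print('Serving HTTP on port 8000...')
# Start the server loop; visit http://localhost:8000/ to see the response.
httpd.serve_forever()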

