[Learn Python from Zero] 91. A concise web application that uses decorators and a dictionary to manage request paths

Import the necessary modules

from wsgiref.simple_server import make_server
from utils import load_html, load_template

These two lines import the make_server function from the standard library, along with the custom load_html and load_template functions, for later use.
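The utils module is the author's own and its source is not shown in this article, so the following is only a hypothetical sketch of what load_html and load_template might look like. The TEMPLATE_DIR location and the use of string.Template placeholders are assumptions, not the author's actual implementation.

```python
import os
from string import Template

TEMPLATE_DIR = './templates'  # assumed location of the HTML files


def load_html(file_name, start_response):
    """Read a static HTML file and return it as a WSGI response body."""
    path = os.path.join(TEMPLATE_DIR, file_name.lstrip('/'))
    with open(path, encoding='utf-8') as f:
        content = f.read()
    start_response('200 OK', [('Content-Type', 'text/html;charset=utf-8')])
    return [content.encode('utf-8')]


def load_template(file_name, start_response, **kwargs):
    """Read an HTML file and fill in $name-style placeholders from kwargs."""
    path = os.path.join(TEMPLATE_DIR, file_name.lstrip('/'))
    with open(path, encoding='utf-8') as f:
        content = Template(f.read()).safe_substitute(**kwargs)
    start_response('200 OK', [('Content-Type', 'text/html;charset=utf-8')])
    return [content.encode('utf-8')]
```

Any real implementation could differ (e.g. a different template syntax), but both functions must follow the same WSGI convention: call start_response with a status and headers, then return an iterable of bytes.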

Create the routing dictionary

g_url_route = {}

A global variable g_url_route is defined as the routing dictionary, used to store the mapping between request paths and handler functions.

The route decorator function

def route(url):
    def handle_action(action):
        # Register the handler for this path at decoration time
        g_url_route[url] = action

        # Wrapper that simply delegates to the original handler
        def do_action(start_response):
            return action(start_response)

        return do_action

    return handle_action

route is a decorator factory used when defining routes. It takes a single url parameter and returns an inner function, handle_action. handle_action registers the handler function action under the path url in the routing dictionary, then defines a wrapper do_action that takes start_response and delegates to action. Finally, it returns do_action.
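The registration step runs at import time, the moment each decorator is applied. A minimal, self-contained demonstration (the handler below is a stand-in, not one of the article's routes):

```python
# Routing dictionary and decorator, as defined in the article
g_url_route = {}

def route(url):
    def handle_action(action):
        g_url_route[url] = action  # registration happens at decoration time

        def do_action(start_response):
            return action(start_response)

        return do_action

    return handle_action

@route('/index.html')  # outer decorator registers the wrapper from the inner one
@route('/')            # inner decorator registers the original function
def show_home(start_response):
    return ['home page']

# Both paths now dispatch to the same handler body.
print(sorted(g_url_route))               # ['/', '/index.html']
print(g_url_route['/'](None))            # ['home page']
print(g_url_route['/index.html'](None))  # ['home page']
```

Note the subtlety with stacked decorators: the inner @route('/') registers the original function, while the outer @route('/index.html') registers the do_action wrapper returned by the inner one. Since the wrapper delegates to the original, both entries run the same handler body.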

Route definitions and handler functions

@route('/index.html')
@route('/')
def show_home(start_response):
    return load_html('/index.html', start_response)

This example uses the route decorator twice to define two routes, /index.html and /, and associates both with the show_home function. When the request path matches either route, show_home is executed.

@route('/test.html')
def show_test(start_response):
    start_response('200 OK', [('Content-Type', 'text/html;charset=utf-8')])
    # The body reads "I am a piece of ordinary text"
    return ['我是一段普通的文字'.encode('utf-8')]

Here is another example: the route decorator defines a /test.html route and associates it with the show_test function. When the request path matches this route, show_test is executed; it sets the response headers and returns a plain piece of text as the response body.
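The .encode('utf-8') call in show_test is not optional: a WSGI response body must be an iterable of bytes, not str. A quick illustration of that round trip:

```python
# WSGI response bodies must contain bytes, which is why the handler
# encodes the str with .encode('utf-8') before returning it in a list.
text = '我是一段普通的文字'        # "I am a piece of ordinary text"
body = text.encode('utf-8')

print(type(body).__name__)           # bytes
print(body.decode('utf-8') == text)  # True: the round trip is lossless
print(len(text), len(body))          # 9 characters, 27 bytes (3 per CJK char)
```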

@route('/info.html')
def show_info(start_response):
    # Render the template with name='张三' ("Zhang San") and age=18
    return load_template('/info.html', start_response, name='张三', age=18)

Similarly, the route decorator defines an /info.html route and associates it with the show_info function. When the request path matches, show_info calls load_template to load the /info.html template, passing in name and age as parameters for rendering.

The application request handler

def application(environ, start_response):
    file_name = environ.get('PATH_INFO')
    try:
        return g_url_route[file_name](start_response)
    except KeyError:
        # Catch only a missing route, so handler errors are not silently
        # reported as 404. The body reads "Sorry, page not found".
        start_response('404 NOT FOUND', [('Content-Type', 'text/html;charset=utf-8')])
        return ['对不起,界面未找到'.encode('utf-8')]

The application function is the entry point of the WSGI application. It accepts two parameters, environ and start_response, for handling HTTP requests. It first reads the request path into file_name, then tries to look up a matching handler function in the routing dictionary and executes it. If no handler is found, it returns a 404 status code and an error message.
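Because application is a plain function, the dispatch logic can be exercised without starting a server by passing a minimal fake environ and start_response. The route table and handler below are stand-ins for this demonstration:

```python
# Stand-in route table with one registered handler
g_url_route = {}

def show_ok(start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'ok']

g_url_route['/ok'] = show_ok

# Same dispatch logic as the article's entry point (English 404 body here)
def application(environ, start_response):
    file_name = environ.get('PATH_INFO')
    try:
        return g_url_route[file_name](start_response)
    except KeyError:
        start_response('404 NOT FOUND', [('Content-Type', 'text/plain')])
        return [b'not found']

# Fake start_response that just records the status line
captured = []
def fake_start_response(status, headers):
    captured.append(status)

print(application({'PATH_INFO': '/ok'}, fake_start_response))       # [b'ok']
print(application({'PATH_INFO': '/missing'}, fake_start_response))  # [b'not found']
print(captured)                                                     # ['200 OK', '404 NOT FOUND']
```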

Start the server and listen

if __name__ == '__main__':
    httpd = make_server('', 8000, application)
    print("Serving HTTP on port 8000...")
    httpd.serve_forever()

This code starts the server listening on the specified port (8000). When the script is run directly (rather than imported as a module), a WSGI server is created with the application function as its request handler. A message is printed to show that the server is running, and httpd.serve_forever() keeps it listening indefinitely.
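A self-contained smoke test of this startup pattern: run the server on an OS-chosen free port in a background thread, fetch a page, then shut it down. The app handler here is a stand-in for the article's application:

```python
import threading
from urllib.request import urlopen
from wsgiref.simple_server import make_server


def app(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'running']


# Port 0 asks the OS for any free port, which keeps the demo self-contained
httpd = make_server('', 0, app)
port = httpd.server_address[1]
threading.Thread(target=httpd.serve_forever, daemon=True).start()

body = urlopen(f'http://127.0.0.1:{port}/').read()
httpd.shutdown()
print(body)  # b'running'
```

In the article's version, make_server('', 8000, application) binds the fixed port 8000 on all interfaces, and serve_forever() blocks the main thread instead of running in the background.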

Full code

from wsgiref.simple_server import make_server
from utils import load_html, load_template

g_url_route = {}


def route(url):
    def handle_action(action):
        g_url_route[url] = action

        def do_action(start_response):
            return action(start_response)

        return do_action

    return handle_action


@route('/index.html')
@route('/')
def show_home(start_response):
    return load_html('/index.html', start_response)


@route('/test.html')
def show_test(start_response):
    start_response('200 OK', [('Content-Type', 'text/html;charset=utf-8')])
    return ['我是一段普通的文字'.encode('utf-8')]


@route('/info.html')
def show_info(start_response):
    return load_template('/info.html', start_response, name='张三', age=18)


def application(environ, start_response):
    file_name = environ.get('PATH_INFO')
    try:
        return g_url_route[file_name](start_response)
    except KeyError:
        # Catch only a missing route; the body reads "Sorry, page not found"
        start_response('404 NOT FOUND', [('Content-Type', 'text/html;charset=utf-8')])
        return ['对不起,界面未找到'.encode('utf-8')]


if __name__ == '__main__':
    httpd = make_server('', 8000, application)
    print("Serving HTTP on port 8000...")
    httpd.serve_forever()



Origin blog.csdn.net/qq_33681891/article/details/132487978