Python basics | Developing a decorator that makes a task exit on timeout

Author: Little Ming

Common applications of decorators

Task timeout exit

Many of the network request libraries we use every day provide a timeout parameter, such as the requests library. This parameter makes a request stop waiting once the timeout is reached and raise a timeout error directly, instead of blocking indefinitely.
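For reference, a typical requests call with a timeout looks roughly like this (a minimal sketch; the URL is just a placeholder):

import requests

try:
    # give up and raise if no response arrives within 3 seconds
    resp = requests.get('https://example.com', timeout=3)
except requests.exceptions.Timeout:
    print('request timed out')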

How can we add the same capability to functions we write ourselves?

There are many ways, but the simplest and most direct is to use concurrent.futures from the standard library. For ease of use, I wrapped it in a decorator. The code is as follows:

import functools
from concurrent import futures

executor = futures.ThreadPoolExecutor(1)  # a single worker thread shared by all decorated calls

def timeout(seconds):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kw):
            future = executor.submit(func, *args, **kw)
            # wait up to `seconds` for a result; raise TimeoutError if it takes longer
            return future.result(timeout=seconds)
        return wrapper
    return decorator

With the function above defined, we have a decorator that exits on timeout. Let's test it:

import time

@timeout(1)
def task(a, b):
    time.sleep(1.2)
    return a+b

task(2, 3)

result:

---------------------------------------------------------------------------
TimeoutError                              Traceback (most recent call last)
...
D:\Anaconda3\lib\concurrent\futures\_base.py in result(self, timeout)
    432                 return self.__get_result()
    433             else:
--> 434                 raise TimeoutError()
    435 
    436     def exception(self, timeout=None):

TimeoutError: 

Above, we set the function's timeout to 1 second via the decorator. Since the sleep-simulated work runs for more than 1 second, a timeout exception is raised as expected.

When the function completes within the timeout period:

@timeout(1)
def task(a, b):
    time.sleep(0.9)
    return a+b

task(2, 3)

result:

5

As you can see, the result is returned normally.

In this way, we can add a timeout to any function with a decorator: if the work can't finish within the specified time, the call stops waiting and ends immediately with a timeout error.
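If the caller should keep going after a timeout rather than crash, the exception can be caught at the call site. A minimal sketch (using the first, 1.2-second version of task, which does time out):

from concurrent.futures import TimeoutError

try:
    result = task(2, 3)
except TimeoutError:
    # the decorated call stopped waiting after 1 second
    result = None

print(result)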

In the version above, the decorator relies on a variable (the executor) defined outside of it. We can encapsulate it further as a class-based decorator. The code is as follows:

import functools
from concurrent import futures

class timeout:
    __executor = futures.ThreadPoolExecutor(1)

    def __init__(self, seconds):
        self.seconds = seconds

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kw):
            future = timeout.__executor.submit(func, *args, **kw)
            return future.result(timeout=self.seconds)
        return wrapper

Testing shows that the class-based decorator has the same effect.
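For reference, a quick test of the class-based version (re-decorating the same 1.2-second task) looks like this:

import time

@timeout(1)
def task(a, b):
    time.sleep(1.2)
    return a + b

task(2, 3)   # raises TimeoutError after about 1 second, as before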

Note: @functools.wraps is used because, without it, the decorated function func's metadata (name, docstring, etc.) would be replaced by the wrapper function's metadata. @functools.wraps(func) copies func's metadata onto wrapper, so although the wrapper function is returned, its metadata is still that of the original func.
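A quick sketch to see the effect, using the task function decorated just above:

print(task.__name__)   # prints 'task'; without functools.wraps it would print 'wrapper'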

In functional-programming terms, wrapper is a closure: a function object returned by another function that captures variables (here func and seconds) from the enclosing scope.

Logging

If we need to record how long certain functions take to run, or print some logs before and after a function executes, decorators are a very convenient choice.

The code is as follows:

import time
import functools
 
def log(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        res = func(*args, **kwargs)
        end = time.perf_counter()
        print(f'Function {func.__name__} took {(end - start) * 1000} ms')
        return res
    return wrapper

The decorator log records the running time of a function and returns its execution result.

Let's test it:

@log
def now():
    print('2021-7-1')
    
now()

result:

2021-7-1
Function now took 0.09933599994838005 ms

Cache

If a function is called frequently, often with the same arguments, caching the result saves processing time the next time it is called with those arguments.

Define the function:

import math
import random
import time


def task(x):
    time.sleep(0.01)
    return round(math.log(x**3 / 15), 4)

Run it:

%%time
for i in range(500):
    task(random.randrange(5, 10))

result:

Wall time: 5.01 s

With a cache, the difference is dramatic. Many decorators implement caching, so rather than reinventing the wheel, here I use lru_cache from the functools package:

from functools import lru_cache

@lru_cache()
def task(x):
    time.sleep(0.01)
    return round(math.log(x**3 / 15), 4)

Run it again:

%%time
for i in range(500):
    task(random.randrange(5, 10))

result:

Wall time: 50 ms
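functools.lru_cache also exposes a couple of introspection helpers, which are handy for confirming the cache is actually being hit. A small sketch (the exact numbers depend on the random draws):

print(task.cache_info())
# e.g. CacheInfo(hits=495, misses=5, maxsize=128, currsize=5)

task.cache_clear()   # empty the cache if stale results ever become a concern
# note: lru_cache only works when all arguments are hashable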

Constrain the number of times a function can be executed

If we want a function to be executed only once, or at most N times, over the entire lifetime of the program, we can write a decorator like this:

import functools


class allow_count:
    def __init__(self, count):
        self.count = count
        self.i = 0

    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kw):
            if self.i >= self.count:
                # call limit reached: skip execution and return None
                return
            self.i += 1
            return func(*args, **kw)
        return wrapper

Test it:

@allow_count(3)
def job(x):
    x += 1
    return x


for i in range(5):
    print(job(i))

result:

1
2
3
None
None
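A design note (my variant, not part of the original article): the version above silently returns None once the limit is reached. If an explicit error is preferred, the check inside wrapper can raise instead:

import functools

class allow_count_strict(allow_count):
    def __call__(self, func):
        @functools.wraps(func)
        def wrapper(*args, **kw):
            if self.i >= self.count:
                # limit reached: fail loudly instead of returning None
                raise RuntimeError(f'{func.__name__} may only be called {self.count} times')
            self.i += 1
            return func(*args, **kw)
        return wrapper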


Origin blog.csdn.net/as604049322/article/details/113980325