Get started with Celery, a Python asynchronous task framework: a quick look!

1. Introduction

Celery is a distributed task scheduling framework written in Python.

It has several main concepts:

celery application

  • The user-written code that defines the tasks to be executed and then sends those tasks to the message queue through the broker

broker

  • The broker coordinates between clients and workers via message queues.

  • Celery itself does not include a message queue; it supports several message queues:

    RabbitMQ

    Redis

    Amazon SQS

    Zookeeper

  • For more information about brokers, see the official documentation

backend

  • A store (such as a database) used to save the results returned by tasks.

worker

  • Workers execute the tasks delivered by the broker.

task

  • Tasks are the defined units of work to be performed

version requirements

Celery 5.1 requires:

  • Python (3.6, 3.7, 3.8)

Celery is a minimally funded project, so we don't support Microsoft Windows.

For more detailed version requirements, see the official documentation

Install

Install using pip:

pip install -U Celery


Bundles

Celery also defines a group of bundles that install Celery together with the dependencies for a given feature.

These dependencies can be specified using brackets in the pip command:

pip install "celery[librabbitmq]"
pip install "celery[librabbitmq,redis,auth,msgpack]"

2. Basic usage

1. Choose a broker

To use celery, you first need to choose a message queue. Install any of the aforementioned celery-supported message queues you're familiar with.
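
For example, if you choose Redis, as the rest of this article does, one quick way to get a local instance running (assuming you have Docker installed; the image below is the official one) is:

docker run -d -p 6379:6379 redis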

2. Write a celery application

First we need to write a celery application, which is used to create tasks and manage workers, and it needs to be importable by other modules.

Create a tasks.py file:


from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):   
    return x + y

The first argument, tasks, is the name of the current module. It can be omitted, but using the current module's name is recommended so that task names can be generated automatically.

The second keyword parameter broker='redis://localhost:6379/0' specifies that we use Redis as the message queue and specifies the connection address.
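
As a general sketch, Redis connection URLs follow the pattern below; all parts except the scheme are optional, and the names are placeholders for illustration:

redis://:password@hostname:port/db_number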

3. Run the worker service of celery

cd to the directory where tasks.py is located, and then run the following command to start the worker service

celery -A tasks worker --loglevel=INFO
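
The worker command accepts additional options. For example, --concurrency sets the number of worker processes, which by default matches the number of CPU cores on the machine:

celery -A tasks worker --loglevel=INFO --concurrency=4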

4. Invoke the task

>>> from tasks import add
>>> add.delay(4,4)

Execute the task by calling its delay() method. Celery sends the execution message to the broker, and the broker delivers it to the worker service. If everything is normal, you will see the task being received and executed in the worker service's logs.
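
delay() is a shortcut for the more general apply_async(), which also accepts execution options. For example, countdown (a standard Celery option, in seconds) delays execution:

>>> add.apply_async((4, 4), countdown=10)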

5. Save the result

If you want to track the state of a task and save its result, Celery needs to store them somewhere. Celery provides a variety of result backends.

Here we take Redis as an example; modify the code in tasks.py to add a Redis backend.

app = Celery('tasks', broker='redis://localhost:6379/0', backend='redis://localhost:6379/1')

See the official documentation for more result backends.

Restart the worker service and reopen the Python interpreter:

>>> from tasks import add
>>> result = add.delay(4,4)

The ready() method returns whether the task has finished executing:

>>> result.ready()
False

You can also wait for the result to complete, but this is rarely done because it converts the asynchronous call into a synchronous one:

>>> result.get(timeout=1)
8
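
You can also inspect the task's state, or suppress re-raising an exception from a failed task; both are standard AsyncResult features, and the output below assumes the task succeeded:

>>> result.state
'SUCCESS'
>>> result.get(propagate=False)
8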

3. Using celery in an application

Create a project

Project structure:

proj/__init__.py
    /celery.py
    /tasks.py

proj/celery.py

from celery import Celery

app = Celery('proj',
             broker='redis://localhost:6379/0',
             backend='redis://localhost:6379/1',
             include=['proj.tasks'])

# Configuration
app.conf.update(
    result_expires=3600,  # result expiry time (in seconds)
)

In this module we create a Celery instance. To use celery in your project, just import this instance.
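
app.conf.update() accepts any Celery setting, not just result_expires. A sketch of a few commonly used options (the values here are illustrative):

app.conf.update(
    result_expires=3600,       # results are deleted after one hour
    task_serializer='json',    # serialize task messages as JSON
    result_serializer='json',  # serialize results as JSON
    accept_content=['json'],   # reject messages in other formats
    timezone='Asia/Shanghai',  # timezone for ETAs and schedules
    enable_utc=True,           # keep internal timestamps in UTC
)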

proj/tasks.py


from .celery import app


@app.task
def add(x, y): 
   return x + y


@app.task
def mul(x, y):
   return x * y


@app.task
def xsum(numbers):
    return sum(numbers)

Start the worker

celery -A proj worker -l INFO

Call a task

>>> from proj.tasks import add
>>> add.delay(2, 2)
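
Because these tasks return their results through the backend, they can also be composed. Celery's signatures and the chain primitive let you pipe one task's result into another; a small sketch using the tasks defined above:

>>> from celery import chain
>>> from proj.tasks import add, mul
>>> # run add(2, 2), then feed its result into mul(result, 10)
>>> res = chain(add.s(2, 2), mul.s(10))()
>>> res.get(timeout=10)
40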

4. Using celery in Django

To use celery in your django project, you first need to define an instance of Celery.

If your django project is as follows:

- proj/ 
 - manage.py
 - proj/ 
   - __init__.py
   - settings.py
   - urls.py

The recommended way is then to create a new proj/proj/celery.py module to define the celery instance:

proj/proj/celery.py


import os

from celery import Celery

# Set the default Django settings module for the 'celery' program
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')

# Read configuration from the Django settings
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django apps
app.autodiscover_tasks()

@app.task(bind=True)
def debug_task(self): 
   print(f'Request: {self.request!r}')

Then you need to import this application in your proj/proj/__init__.py module. This ensures that the application is loaded when Django starts up, facilitating the use of the @shared_task decorator.

proj/proj/__init__.py:

from .celery import app as celery_app

__all__ = ('celery_app',)

Note that this example project layout is suitable for larger projects, for simple projects a single module containing the defined application and tasks can be used.

Next, let's walk through the code in celery.py. First, we set a default value for the DJANGO_SETTINGS_MODULE environment variable used by the celery command-line program:

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

The purpose of this line is to load the current Django project's settings, which matters especially when the ORM needs to be used in asynchronous tasks. It must come before the application instance is created.

app = Celery('proj')

We also added the Django settings module as a configuration source for Celery. This means that instead of using multiple configuration files, we can configure Celery directly in Django's configuration files.

app.config_from_object('django.conf:settings', namespace='CELERY')

The uppercase namespace means that all Celery configuration items must be specified in uppercase and start with CELERY_; for example, the broker_url setting becomes CELERY_BROKER_URL.

For example, a configuration file for a Django project might include:

settings.py:

CELERY_TIMEZONE = "Asia/Shanghai"
CELERY_TASK_TRACK_STARTED = True
CELERY_TASK_TIME_LIMIT = 30 * 60
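
With the CELERY_ namespace, the broker and result backend are usually configured here as well; for a local Redis setup like the one used earlier, that might look like:

CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'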

Next, a common practice for reusable applications is to define all tasks in a separate tasks.py module. Celery has a way to automatically discover these modules:

app.autodiscover_tasks()

With the above line, Celery will automatically discover tasks from all installed applications following the tasks.py convention:

- app1/
  - tasks.py
  - models.py
- app2/
  - tasks.py
  - models.py

This avoids having to manually add individual modules to the CELERY_IMPORTS setting.

Using the @shared_task decorator

The tasks we write may exist in reusable applications, and reusable applications cannot depend on the project itself, so they cannot directly import celery application instances.

The @shared_task decorator allows us to create tasks without any specific celery instance:

demoapp/tasks.py

# Create your tasks here

from demoapp.models import Widget

from celery import shared_task


@shared_task
def add(x, y):
   return x + y


@shared_task
def mul(x, y):
   return x * y


@shared_task
def xsum(numbers):
   return sum(numbers)


@shared_task
def count_widgets(): 
   return Widget.objects.count()


@shared_task
def rename_widget(widget_id, name):
   w = Widget.objects.get(id=widget_id)
   w.name = name
   w.save()
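
Calling a shared task works the same as calling an app-bound task. A sketch of triggering one from a Django view; demoapp/views.py and its URL wiring are assumed here, not part of the original example:

demoapp/views.py

from django.http import JsonResponse

from demoapp.tasks import add


def start_add(request):
    # enqueue the task and return its id so the client can poll for the result
    result = add.delay(2, 3)
    return JsonResponse({'task_id': result.id})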
