Django + Celery: an asynchronous task queue

Background

In development we often run into time-consuming tasks, for example:

Upload an Excel file, parse roughly 10,000 rows of data, and finally persist them to the database.

In my program this task takes about 6 seconds, and for the user, waiting 6 seconds is a disaster.

A better approach is:

  1. Receive the task request
  2. Add the task to a queue
  3. Return immediately with a message like "operation successful, processing in the background"
  4. A background consumer reads the queue and executes the task

Let's follow this idea and implement it with Celery.
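Before bringing in Celery, the four steps above can be sketched with just Python's standard-library queue and a worker thread (a minimal illustration of the pattern, not the Celery implementation):

```python
import queue
import threading

task_queue = queue.Queue()
results = []

def worker():
    # Step 4: background consumer pulls tasks off the queue and runs them.
    while True:
        job = task_queue.get()
        if job is None:  # sentinel to stop the worker
            break
        results.append(job())
        task_queue.task_done()

def handle_upload_request():
    # Steps 1-3: accept the request, enqueue the work, return immediately.
    task_queue.put(lambda: "parsed and saved")
    return "operation successful, processing in background"

threading.Thread(target=worker, daemon=True).start()
response = handle_upload_request()  # returns without waiting for the task
task_queue.join()                   # (demo only) wait so we can inspect results
```

Celery plays the role of the worker thread and queue here, but with real processes and a durable message broker in between.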

Implementation

This article uses the following environment:

  • Python 3.6.7
  • RabbitMQ 3.8
  • Celery 4.3

Installing RabbitMQ with Docker

Celery relies on a message broker; candidates include RabbitMQ, Redis, and others. Here I chose RabbitMQ.

For ease of installation, I ran RabbitMQ directly with Docker:

docker run -d --name anno-rabbit -p 5672:5672 rabbitmq:3

Once it starts successfully, the message queue is reachable at amqp://localhost.
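To confirm the broker is actually listening before wiring up Celery, a quick TCP check against port 5672 is enough (a small helper I'm adding for illustration; `broker_reachable` is not part of Celery or RabbitMQ):

```python
import socket

def broker_reachable(host="localhost", port=5672, timeout=1.0):
    """Return True if a TCP connection to the broker port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

up = broker_reachable()  # True once the RabbitMQ container above is running
```

This only proves the port is open; a full AMQP handshake is left to Celery itself.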

Install and configure Celery

Celery is implemented in Python and can be installed directly with pip:

pip install celery

Assume the current project folder is proj, the project name is myproj, and the application name is myapp.

After installation, create a celery.py file under the proj/myproj/ path to initialize the Celery instance:

proj/myproj/celery.py

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproj.settings')

app = Celery('myproj',
             broker='amqp://localhost//',
             backend='amqp://localhost//')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()

Then, in proj/myproj/__init__.py, add a reference to the Celery object to ensure Django initializes Celery at startup:

proj/myproj/__init__.py

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ('celery_app',)

If you need no other special configuration, that is all the basic Celery setup required.
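If you prefer to keep the broker settings in Django's settings.py rather than hard-coding them in celery.py, the `namespace='CELERY'` argument used above means Celery will pick up any Django setting with a `CELERY_` prefix. A sketch of the relevant excerpt (values here match this article's local RabbitMQ):

```python
# proj/myproj/settings.py (excerpt)
# With app.config_from_object('django.conf:settings', namespace='CELERY'),
# every setting prefixed with CELERY_ is read by Celery:
CELERY_BROKER_URL = 'amqp://localhost//'
CELERY_RESULT_BACKEND = 'amqp://localhost//'
CELERY_TASK_SERIALIZER = 'json'
```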

Write a time-consuming task

To simulate a time-consuming task, we create a function that simply "sleeps" for 10 seconds and register it as a Celery task:

proj/myapp/tasks.py

import time
from myproj.celery import app as celery_app

@celery_app.task
def waste_time():
    time.sleep(10)
    return "Run function 'waste_time' finished."

Start Celery Worker

Once Celery is configured and the task is created, start Celery in asynchronous-task mode:

celery -A myproj worker -l info

I deliberately emphasized asynchronous mode because, besides asynchronous tasks, Celery also supports scheduled (periodic) tasks, so the mode must be specified at startup.

Note also that once Celery has started, any change to a task (here, waste_time) requires restarting Celery to take effect.
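For reference, the scheduled (periodic) mode mentioned above is driven by Celery's beat scheduler, which runs as a separate process alongside the worker (a sketch, assuming the same myproj app):

```shell
# Worker process for asynchronous tasks, as started above
celery -A myproj worker -l info

# Separate beat process that dispatches periodic tasks to the workers
celery -A myproj beat -l info
```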

Calling the task

In the request-handling code, call the task created above:

proj/myapp/views.py

from django.http import JsonResponse
from django.views.decorators.http import require_http_methods
from .tasks import waste_time

@require_http_methods(["POST"])
def upload_files(request):
    waste_time.delay()
    # Status code 202: Accepted — the asynchronous task has been accepted
    # and may still be processing.
    return JsonResponse({"results": "Operation successful; uploading in the background, please wait..."}, status=202)

Calling waste_time.delay() adds waste_time to the task queue, where it waits for an idle Celery worker to execute it.
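delay() is shorthand for apply_async(), which accepts extra options such as countdown (delay execution by N seconds). A conceptual stand-in showing that contract (FakeTask is a toy class for illustration, not Celery's real implementation):

```python
class FakeTask:
    """Toy model of a Celery task's calling interface."""
    def __init__(self, func):
        self.func = func
        self.pending = []  # stands in for messages published to the broker

    def apply_async(self, args=(), kwargs=None, countdown=0):
        # Real Celery serializes the call and publishes it to the broker;
        # here we only record it, to show nothing runs synchronously.
        self.pending.append((args, dict(kwargs or {}), countdown))

    def delay(self, *args, **kwargs):
        # delay(*args) == apply_async(args=args) with default options
        return self.apply_async(args=args, kwargs=kwargs)

waste_time = FakeTask(lambda: None)
waste_time.delay()                    # what the view above calls
waste_time.apply_async(countdown=60)  # run at least 60 seconds from now
```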

Effect

When we send the request, the interface immediately returns {"results": "Operation successful; uploading in the background, please wait..."} instead of blocking for ten seconds, so the user experience is much better.

Summary

Processing asynchronous tasks with Celery is a common approach in Python. Although the total time until the task actually finishes is unchanged, or even longer (for example, when busy workers cause processing to lag), the user experience is far more acceptable: after clicking upload on a large file, users can go on with other things instead of waiting on the page.

This article does not cover Celery's many other features; Celery has very detailed documentation, so consult it directly when needed.

Reference

Reproduced from: https://juejin.im/post/5d043d28e51d4510bf1d666e
