python3 + celery + redis asynchronous tasks

First, the principle

Celery is a distributed task queue framework for Python. It schedules task execution across distributed machines, processes, and threads. Although the library itself is written in Python, its protocol can also be used from other languages such as Ruby, PHP, and JavaScript. Besides asynchronous tasks dispatched through a message queue, it also supports periodic (scheduled) tasks.

Celery is a powerful distributed task queue: it lets tasks run completely outside the main program, and they can even be dispatched to other hosts. We usually use it to implement asynchronous tasks (async tasks) and scheduled tasks (crontab). Its architecture is shown in the figure below.

[Figure: Celery architecture - producers send tasks to a broker, workers consume and execute them, and results are written to a backend]

Components:

1. Task (tasks) - a user-defined function that implements the actual work, for example a job that takes a long time to run

2. Broker - the place where tasks are stored while they wait to run; it may need to hold a very large number of tasks, and it must guarantee that workers can fetch them

3. Worker - the executor that performs tasks, i.e. it actually calls the functions defined in tasks

4. Backend - stores the results returned by tasks, so users can view or retrieve them later
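
Tying the four components together, here is a minimal standalone sketch (the module name demo_tasks is hypothetical; it assumes a local Redis at redis://localhost:6379/0):

# demo_tasks.py -- minimal Celery app; broker and backend both use local Redis
from celery import Celery

app = Celery('demo_tasks',
             broker='redis://localhost:6379/0',    # the Broker: queues the tasks
             backend='redis://localhost:6379/0')   # the Backend: stores the results

@app.task
def add(x, y):
    # a Task: a user-defined function that a Worker will execute
    return x + y

# Start a Worker in another shell:  celery -A demo_tasks worker --loglevel=info
# Then, from a Python shell:
#   from demo_tasks import add
#   r = add.delay(2, 3)    # send the task to the broker
#   r.get(timeout=10)      # read 5 back from the backend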

Second, the implementation process

1. Install the environment (RabbitMQ / Redis, Celery, django-celery, flower)
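
For example, with pip (Redis itself must be installed and running separately):

pip install celery redis django-celery flower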

2. Create the project

[Figure: project directory layout, with the files that need changes circled in red]

The files circled in red need to be modified:

In web_order, modify the following files: celery.py, __init__.py, and the settings file (settings.py).

In web_test, modify the following files: tasks.py and longTask.py. (A sketch of the assumed project layout follows.)
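
For reference, a layout consistent with the steps below (the exact structure is an assumption, since the original screenshot is unavailable):

web_order/                  # project root
├── manage.py
├── web_order/              # the Django project package
│   ├── __init__.py         # imports the Celery app (step 2)
│   ├── settings.py         # Celery settings (step 1)
│   └── celery.py           # the Celery app instance (step 3)
└── web_test/               # a Django app
    ├── tasks.py            # shared tasks (step 5)
    └── longTask.py         # view that dispatches a task (step 4)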

3. Modify the files

1) Modify settings.py. Append the following code at the end of the settings file:

# CELERY SETTINGS
BROKER_URL = 'redis://localhost:6379'    # the message broker (middleware)
CELERY_RESULT_BACKEND = 'redis://localhost:6379'   # store results in the local Redis
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Shanghai'
CELERY_IMPORTS = ("web_test.tasks",)   # register task modules (note the trailing comma: this must be a tuple)
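
(Note: the uppercase CELERY_* names above are the old-style settings used by Celery 3.x and django-celery; Celery 4 and later prefer lowercase names such as broker_url, although the uppercase forms are still recognized.)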

2) The __init__.py file.

# Absolute imports, to avoid a conflict between our celery.py module and the celery library
from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.

from .celery import app as celery_app

__all__ = ['celery_app']

3) The celery.py file.

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "web_order.settings")  # so celery can find the Django settings when run from the command line
app = Celery('web_order')  # create the app instance
# app = Celery('tcelery', backend='redis://localhost:6379/0', broker='redis://localhost:6379/0')
app.conf.CELERY_IGNORE_RESULT = False  # do not ignore task results
# app.conf.CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'  # store results in redis

app.config_from_object('django.conf:settings')  # load the configuration from the Django settings
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)  # auto-discover tasks; note: this looks for a tasks.py file under each app, so tasks must live in tasks.py

4) longTask.py

from django.http import JsonResponse
from .tasks import test   # import the asynchronous task


def sendlongtask(request):
    # dispatch the task to a Celery worker via .delay() instead of calling it directly;
    # the arguments here match the test(x, y) task defined in tasks.py
    run_res = test.delay(1, 2)
    return JsonResponse("success", safe=False)
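
For the view to be reachable over HTTP it also needs a route; a minimal sketch, assuming Django 2.0+ and the layout above (the URL path is hypothetical):

# web_order/urls.py
from django.urls import path
from web_test.longTask import sendlongtask

urlpatterns = [
    path('sendlongtask/', sendlongtask),
]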

5) tasks.py

from celery import shared_task

@shared_task
def test(x, y):
    # a trivial shared task used to demonstrate asynchronous execution
    return x + y
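
Because a result backend is configured, the return value can be fetched after the fact; a minimal sketch, assuming the worker is running:

from web_test.tasks import test

res = test.delay(1, 2)        # returns an AsyncResult immediately
print(res.id)                 # the task id; it can be stored and looked up later
print(res.get(timeout=10))    # blocks until the worker writes 3 to the backend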

4. Start Django and celery

  Run the following from the project root:

  python manage.py runserver 0.0.0.0:8000

  python manage.py celery worker -c 4 --loglevel=info
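
The manage.py celery subcommand comes from django-celery. With a plain Celery 4+ installation (no django-celery), the equivalent worker command would be:

celery -A web_order worker -c 4 --loglevel=info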

 

5. In addition, Celery provides a monitoring tool, flower, which tracks the execution of every task and the health of every worker and displays them in a web UI, as shown below:

[Figure: flower web dashboard showing task progress and worker status]

Under Django, it is set up as follows:

1.) Install flower:

pip install flower

2.) Start flower (by default it starts a webserver on port 5555):

python manage.py celery flower

3.) Visit http://localhost:5555 to view the dashboard.
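
Without django-celery, flower can also be launched through the celery command itself (assuming the web_order app defined above):

celery -A web_order flower --port=5555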

6. A Python 3 pitfall: Python 3.7 is incompatible with older celery releases

[Figure: SyntaxError traceback pointing into celery/backends/async.py]

1. When this error occurs, rename every occurrence of async in the files named in the traceback to asynchronous (or any other identifier); async became a reserved keyword in Python 3.7.

Running the following script applies the fix quickly:

# rename celery's async module, which collides with the async keyword added in Python 3.7
TARGET=/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/celery/backends
cd "$TARGET"
if [ -e async.py ]
then
    mv async.py asynchronous.py
    # sed -i.bak edits the files in place (portable across GNU and BSD sed) and keeps .bak backups
    sed -i.bak 's/async/asynchronous/g' redis.py
    sed -i.bak 's/async/asynchronous/g' rpc.py
fi

After running it, celery works normally again.

2. Alternatively, downgrade Python to 3.6 or below; celery then works normally as well.
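
3. Upgrading Celery itself is a third option: releases from 4.3 onward support Python 3.7 natively.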

 
