Python: Using Celery in Django

I. A Request in Django

  The process by which a Django web application handles an HTTP request and returns an HTML page is as follows:

    An HTTP request is initiated

    The request passes through the middleware

      HTTP handling (the request is parsed)

    URL mapping (the matching URL pattern is found and routed to the corresponding view)

    The view runs the business logic (including create/read/update/delete calls to the database through the Model classes)

    The response passes back through the middleware

    The corresponding template / response is returned.

  

  Synchronous request: all logic processing and data computation are finished in the view before the response is returned. While the view is processing the task, the user is left waiting until the page returns the result.

  Asynchronous request: the view returns a response first, and the task is then processed in the background. The user does not have to wait and can keep browsing the site; when the task finishes, the user can be notified.
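The difference can be sketched in plain Python, using a thread as a stand-in for the Celery worker (the function and variable names here are hypothetical, for illustration only):

```python
import threading
import time

results = []  # stand-in for a result backend


def handle_sync(request):
    # Synchronous: the slow work runs inside the view,
    # so the user waits for the full duration.
    time.sleep(0.2)  # stand-in for a slow computation
    return 'result ready'


def handle_async(request):
    # Asynchronous: hand the slow work to a background worker
    # and respond immediately.
    def work():
        time.sleep(0.2)
        results.append('result ready')

    threading.Thread(target=work).start()
    return 'accepted, processing in background'
```

With Celery, `handle_async` would call the task's `delay()` or `apply_async()` instead of starting a thread, and a separate worker process would execute the task.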

II. Using Celery in Django

Installation

pip3 install django-celery

Configuration

  First create a Django project. (The original post shows the directory structure in a screenshot; judging from the files below, it contains the project package with settings.py and urls.py, a celeryapp application with tasks.py and views.py, and manage.py.)

    Then add a celeryconfig.py configuration file in the same directory as settings.py. More configuration options can be found in the official documentation.

import djcelery
from datetime import timedelta

djcelery.setup_loader()

# Import tasks
CELERY_IMPORTS = [
    'celeryapp.tasks'
]
# Define the queues
CELERY_QUEUES = {
    'beat_tasks': {
        'exchange': 'beat_tasks',
        'exchange_type': 'direct',
        'binding_key': 'beat_tasks'
    },
    'work_queue': {
        'exchange': 'work_queue',
        'exchange_type': 'direct',
        'binding_key': 'work_queue'
    }
}
# Set the default queue; tasks not routed to another queue go here
CELERY_DEFAULT_QUEUE = 'work_queue'

# May prevent deadlocks in some cases
CELERYD_FORCE_EXECV = True

# Set the number of concurrent worker processes
CELERYD_CONCURRENCY = 4

# Each worker executes at most 100 tasks, to prevent memory leaks
CELERYD_MAX_TASKS_PER_CHILD = 100

# Maximum execution time of a single task, in seconds
CELERYD_TASK_TIME_LIMIT = 12 * 30

# Set the periodic task schedule
CELERYBEAT_SCHEDULE = {
    'task1': {
        'task': 'course-task',
        'schedule': timedelta(seconds=5),
        'options': {
            'queue': 'beat_tasks'
        }
    }
}

CELERY_ACCEPT_CONTENT = ['pickle', 'json', ]

BROKER_BACKEND = 'redis'
BROKER_URL = 'redis://localhost:6379/1'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/2'
celeryconfig.py
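As a quick sanity check on the schedule above: with `schedule = timedelta(seconds=5)`, beat enqueues `course-task` onto the `beat_tasks` queue every five seconds. A minimal sketch of the resulting run times (the start time is an arbitrary example):

```python
from datetime import datetime, timedelta

schedule = timedelta(seconds=5)          # same interval as CELERYBEAT_SCHEDULE above
start = datetime(2019, 9, 13, 12, 0, 0)  # arbitrary example start time

# The first few times at which beat would send 'course-task'
runs = [start + i * schedule for i in range(4)]
for t in runs:
    print(t.time())
```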
from .celeryconfig import *  # Import the Celery configuration


INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'celeryapp.apps.CeleryappConfig',
    'djcelery',  # register djcelery
]
settings.py
import time
from celery.task import Task


class Course(Task):
    name = 'course-task'

    def run(self, *args, **kwargs):
        print('start...')
        time.sleep(3)
        print(f'args={args},kwargs={kwargs}')
        print('end task....')
tasks.py
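Because `run` is an ordinary method, the task logic can be sanity-checked directly, without a broker or worker. The sketch below redefines the class without the Celery base so it is self-contained, and returns the formatted string so the result is observable; with django-celery installed you would call `Course().run(...)` on the real class instead:

```python
import time


class Course:
    # Mirror of the Task subclass above, minus the Celery base class
    name = 'course-task'

    def run(self, *args, **kwargs):
        print('start...')
        time.sleep(0.1)
        print(f'args={args},kwargs={kwargs}')
        print('end task....')
        return f'args={args},kwargs={kwargs}'


out = Course().run('hello', source='beat')
```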
from django.http import JsonResponse
from celeryapp.tasks import Course


def course(request, *args, **kwargs):
    # Run the task asynchronously
    print('start course...')
    # Course.delay()
    # apply_async can pass arguments and specify the queue
    Course.apply_async(args=('hello',), queue='work_queue')
    print('end course...')
    return JsonResponse({'result': 'ok'})
views.py
from django.contrib import admin
from django.urls import path
from celeryapp.views import course

urlpatterns = [
    path('admin/', admin.site.urls),
    path('course/', course),
]
urls.py

Start Redis as the message broker

redis-server 

Start the Django project, then visit http://localhost:8000/course/ to trigger the task

python manage.py runserver

Start worker

python manage.py celery worker -l info   

    In the worker output you can view the loaded configuration and the executed tasks.

Start beat

python manage.py celery beat -l info

III. Common Errors

♦  AttributeError: ‘str’ object has no attribute ‘items’

Solution: the installed redis client library is too new; downgrading it fixes the error

pip install redis==2.10.6

 

♦  from kombu.async.timer import Entry, Timer as Schedule, to_timestamp, logger — SyntaxError: invalid syntax

  kombu does not currently support Python 3.7 (async became a reserved keyword in 3.7, so the kombu.async module name is a syntax error); downgrade Python to 3.6, which can be installed directly with conda

conda install python=3.6.8

IV. Process Monitoring

  Celery provides a tool called flower, which monitors the execution of each task and the health of each worker, and presents the results visually.

Install the monitoring tool

pip install flower

Run flower

python manage.py celery flower

View the monitoring dashboard at local port 5555

Refresh the course page, then look at the Tasks page: the task that just completed appears there

  View the broker

 

 Open the Monitor page to view task execution: successes, failures, time consumed, and the tasks in each queue

Click a worker to view that worker's details

 You can add basic authentication to flower; once added, a username and password are required to access it

python manage.py celery flower --basic_auth=username:password 

 

Origin: www.cnblogs.com/zivli/p/11517797.html