Running Celery Tasks Across Multiple Queues

Requires the Celery Python package and a running RabbitMQ broker.

  • Code structure (the original post shows a screenshot of the project layout here; the files are celery_app.py, settings.py, and tasks.py)

  • Create celery_app.py:

from celery import Celery

app = Celery("TestTask")

app.config_from_object("settings")

  • Create settings.py (the module loaded by config_from_object("settings") above):

from datetime import timedelta
from kombu import Queue
from kombu import Exchange

# RabbitMQ broker: amqp://user:password@host:port
BROKER_URL = 'amqp://{}:{}@{}:{}'.format(
    'admin',
    'admin',
    '127.0.0.1',
    '5672'
)

# Scheduler for periodic (beat) tasks; uncomment to use django-celery's database scheduler
# CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'

# Accepted content types
CELERY_ACCEPT_CONTENT = ['application/json']
# Task serialization format
CELERY_TASK_SERIALIZER = 'json'
# Result serialization format
CELERY_RESULT_SERIALIZER = 'json'
# Time zone
CELERY_TIMEZONE = 'Asia/Shanghai'
CELERY_ENABLE_UTC = False

# Run 10 worker processes by default
CELERYD_CONCURRENCY = 10
# CELERYD_MAX_TASKS_PER_CHILD = 3  # recycle a worker process after it has run 3 tasks, guarding against memory leaks


CELERY_QUEUES = (
    Queue('Default', exchange=Exchange('default'), routing_key='default'),
    Queue('Tasks_Main', exchange=Exchange('Tasks_Main'), routing_key='Tasks_Main'),
)

# Routing: send tasks.main to the Tasks_Main queue
CELERY_ROUTES = {
    'tasks.main': {'queue': 'Tasks_Main', 'routing_key': 'Tasks_Main'}
}

# Tasks sent without an explicit queue fall back to Default
CELERY_DEFAULT_QUEUE = 'Default'
CELERY_DEFAULT_EXCHANGE = 'default'
CELERY_DEFAULT_ROUTING_KEY = 'default'

CELERY_IMPORTS = ('tasks',)
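
The effect of CELERY_ROUTES plus the Default fallback can be sketched as a plain lookup (a minimal illustration of the semantics only, not Celery's real code; resolve_queue is a hypothetical helper):

```python
# Sketch of how the settings above pick a queue for a task name.
CELERY_ROUTES = {
    'tasks.main': {'queue': 'Tasks_Main', 'routing_key': 'Tasks_Main'},
}
CELERY_DEFAULT_QUEUE = 'Default'
CELERY_DEFAULT_ROUTING_KEY = 'default'

def resolve_queue(task_name):
    """Return (queue, routing_key) for a task name."""
    route = CELERY_ROUTES.get(task_name)
    if route:
        return route['queue'], route['routing_key']
    # No explicit route: fall back to the default queue
    return CELERY_DEFAULT_QUEUE, CELERY_DEFAULT_ROUTING_KEY

print(resolve_queue('tasks.main'))   # ('Tasks_Main', 'Tasks_Main')
print(resolve_queue('tasks.other'))  # ('Default', 'default')
```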
  • Create tasks.py:

from celery_app import app

@app.task(name='tasks.main')
def task_main(param):
    print('Called successfully: ' + str(param))
  • Run one worker per queue:
celery -A celery_app worker -Q Default -n Queue_Default@%h
celery -A celery_app worker -Q Tasks_Main -n Queue_Tasks_Main@%h
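
What -Q does can be sketched in plain Python: each worker only picks up messages from the queues it was started with (an illustration only; dispatch is a hypothetical helper, not Celery internals):

```python
# Sketch: a worker started with `-Q <queues>` consumes only those queues.
def dispatch(messages, worker_queues):
    """Return the messages a worker started with -Q worker_queues would consume."""
    return [m for m in messages if m['queue'] in worker_queues]

messages = [
    {'queue': 'Default', 'body': 'Default'},
    {'queue': 'Tasks_Main', 'body': 'Tasks_Main'},
]

# celery ... worker -Q Default      -> sees only the Default message
print(dispatch(messages, {'Default'}))
# celery ... worker -Q Tasks_Main   -> sees only the Tasks_Main message
print(dispatch(messages, {'Tasks_Main'}))
```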
  • Send tasks, choosing the queue per call:

from tasks import task_main

task_main.apply_async(args=['Tasks_Main'], queue='Tasks_Main', routing_key='Tasks_Main')
task_main.apply_async(args=['Default'], queue='Default', routing_key='default')
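
Note that the queue passed to apply_async wins over CELERY_ROUTES, which in turn wins over the default queue. That precedence can be sketched as (a hypothetical effective_queue helper, not Celery's API):

```python
# Sketch of queue-selection precedence:
# explicit apply_async(queue=...) > CELERY_ROUTES entry > default queue.
def effective_queue(task_name, queue=None, routes=None, default='Default'):
    """Return the queue a task would actually be sent to."""
    if queue:
        return queue                        # per-call override
    routes = routes or {}
    if task_name in routes:
        return routes[task_name]['queue']   # static route from settings
    return default                          # fallback

routes = {'tasks.main': {'queue': 'Tasks_Main'}}
print(effective_queue('tasks.main', queue='Default', routes=routes))  # 'Default'
print(effective_queue('tasks.main', routes=routes))                   # 'Tasks_Main'
print(effective_queue('tasks.other', routes=routes))                  # 'Default'
```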
  • Result: each worker handles only the tasks sent to its queue (the original post shows screenshots of both workers' output here).

Reposted from blog.csdn.net/weixin_39956308/article/details/84992139