1. Install Celery with pip (the versions matter)
celery == 3.1.23
django-celery == 3.1.17
pip install celery==3.1.23 django-celery==3.1.17
2. Install RabbitMQ (here installed with Docker)
docker pull rabbitmq:3.7.7-management
docker run -d --name rabbitmq3.7.7 -p 5672:5672 -p 15672:15672 -v `pwd`/data:/var/lib/rabbitmq --hostname myRabbit -e RABBITMQ_DEFAULT_VHOST=my_vhost -e RABBITMQ_DEFAULT_USER=admin -e RABBITMQ_DEFAULT_PASS=admin <image id>
Parameter notes: RABBITMQ_DEFAULT_VHOST: virtual host name;
RABBITMQ_DEFAULT_USER: username;
RABBITMQ_DEFAULT_PASS: password.
Port 5672 is the AMQP port used by Celery; port 15672 serves the management web UI.
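Once the container is up, it can be sanity-checked from the host (container name as used above):

```shell
# Confirm the container is running
docker ps --filter name=rabbitmq3.7.7
# The management web UI is exposed on port 15672; log in with admin/admin:
# http://localhost:15672
```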
3. Add the following lines to settings.py:
from __future__ import absolute_import  # must be at the very top of the file
# Celery configuration
import djcelery
djcelery.setup_loader()
BROKER_URL = 'amqp://admin:[email protected]:5672/my_vhost'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # periodic tasks; schedules are stored in the database
CELERY_ACCEPT_CONTENT = ['pickle', 'json']
CELERYD_CONCURRENCY = 8  # number of concurrent worker processes
CELERYD_FORCE_EXECV = True  # important: can prevent deadlocks in some situations
CELERYD_MAX_TASKS_PER_CHILD = 100  # recycle each worker after 100 tasks to guard against memory leaks
CELERY_DISABLE_RATE_LIMITS = True  # disable task rate limiting
# add djcelery to the existing INSTALLED_APPS list
INSTALLED_APPS = [
    'djcelery',
]
4. Configure adminx.py
from __future__ import absolute_import, unicode_literals
from djcelery.models import (
TaskState, WorkerState,
PeriodicTask, IntervalSchedule, CrontabSchedule,
)
from xadmin.sites import site
site.register(IntervalSchedule)  # interval schedules for periodic tasks
site.register(CrontabSchedule)   # crontab schedules for periodic tasks
site.register(PeriodicTask)      # the periodic tasks themselves
site.register(TaskState)         # task execution state
site.register(WorkerState)       # state of the workers that run tasks
5. Create celery.py in the project package (next to settings.py)
from __future__ import absolute_import
import os
from celery import Celery, platforms
from django.conf import settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mcenter_backstage.settings')
app = Celery('mcenter_backstage')
platforms.C_FORCE_ROOT = True  # allow running the worker as the root user
app.config_from_object('django.conf:settings')  # read the CELERY* settings from settings.py
app.autodiscover_tasks(packages=settings.INSTALLED_APPS)  # discover tasks.py in every installed app
@app.task(bind=True)
def debug_task(self):
print('Request: {0!r}'.format(self.request))
6. Add the following to the project package's __init__.py
from __future__ import absolute_import, unicode_literals
from .celery import app as celery_app
__all__ = ['celery_app']
7. Create tasks.py inside your app
from __future__ import absolute_import
from celery import shared_task

@shared_task
def test_task():
    print("test task executed")
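To try the task by hand while a worker is running, it can be enqueued from the Django shell (the import path assumes the app is named `app` — substitute your actual app name):

```shell
python manage.py shell
>>> from app.tasks import test_task
>>> test_task.delay()
```

The `print` output then appears in the worker's console, not in the shell that enqueued the task.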
8. Startup command
python manage.py celery worker --beat
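The --beat flag embeds the beat scheduler inside the worker process, which is convenient for development. In production the worker and the scheduler are commonly run as two separate processes, which django-celery also supports:

```shell
# run the worker and the beat scheduler as separate processes
python manage.py celery worker -l info
python manage.py celery beat -l info
```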