Getting Started with Celery

Celery asynchronous tasks

Reference blog

  • Install on Linux: pip install celery (install Redis: yum install redis-server). Celery is said to not work well on Windows.

  • A small example: task1.py

    from celery import Celery
    import time
    
    app = Celery(
                'tasks',   # name of the app
                broker='redis://localhost:6379',   # message broker
                backend='redis://localhost:6379')  # result backend
    
    @app.task
    def hello(a, b):
        print('running tasks...')
        time.sleep(10)
        c = a + b
        return 'hello world %d' % c
    
    • Start the worker: celery -A task1 worker -l debug
    • In the Python interpreter:
      >>> import task1
      >>> t=task1.hello.delay(1,2)
      
      1. t.get() fetches the task result; if the task raised an exception, it is re-raised here.
      2. t.get(timeout=1) raises socket.timeout if no result arrives within the timeout.
      3. t.get(propagate=False) returns the exception object instead of re-raising it when the task failed.
      4. t.ready() checks whether the task has finished.
      5. t.traceback shows the exception traceback (these calls are combined in the sketch below).
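    • A minimal sketch that ties these calls together, assuming the worker above is running and task1.py is importable (the 0.5-second poll interval is only an illustrative choice):
      import time
      import task1
      
      t = task1.hello.delay(1, 2)       # enqueue the task; returns an AsyncResult
      while not t.ready():              # poll until the worker finishes
          time.sleep(0.5)
      print(t.get(propagate=False))     # 'hello world 3', or the exception object on failure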
  • Using Celery in a project

    • Directory layout
      celery_proj/
          __init__.py
          celery.py
          task1.py
          task2.py
      
    • Example 1:
    # celery.py
    # __future__ import so that module imports are resolved as absolute imports
    from __future__ import absolute_import, unicode_literals
    from celery import Celery
     
    app = Celery('celery_proj',
                broker='redis://localhost:6379',    # message broker
                backend='redis://localhost:6379',   # result backend
                include=['celery_proj.task1', 'celery_proj.task2'])  # task modules to load
     
    # Optional configuration, see the application user guide.
    app.conf.update(
        # how long to keep task results (seconds)
        result_expires=3600,
    )
     
    if __name__ == '__main__':
        app.start()
    
    # task1.py
    from __future__ import absolute_import, unicode_literals
    from .celery import app
    
    
    @app.task
    def add(x, y):
        return x + y
    
    
    @app.task
    def mul(x, y):
        return x * y
    
    
    @app.task
    def xsum(numbers):
        return sum(numbers)
    
    # task2.py
    from __future__ import absolute_import, unicode_literals
    from .celery import app
    import random
    import time
    
    @app.task
    def rannum(x, y):
        # simulate a slow task, then return a random integer in [x, y]
        time.sleep(6)
        return random.randint(x, y)
    
    • Start a worker: celery -A celery_proj worker -l info
    • Run workers in the background: celery multi start/stop/restart [node name] -A celery_proj -l info --pidfile=proj.pid --logfile=proj.log
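    • With a worker running, the project tasks can be called from a Python shell started in the directory that contains celery_proj; a minimal sketch:
      >>> from celery_proj import task1, task2
      >>> task1.add.delay(4, 4).get(timeout=10)   # 8
      >>> r = task2.rannum.delay(1, 100)          # worker sleeps ~6 seconds first
      >>> r.ready()                               # False until the sleep finishes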
  • Celery periodic tasks

    • Keep the directory layout above and add periodic_task.py to the package:
    from __future__ import absolute_import, unicode_literals
    from .celery import app
    from celery.schedules import crontab
    
    
    @app.on_after_configure.connect
    def setup_periodic_tasks(sender, **kwargs):
        # Calls test('hello') every 10 seconds.
        sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')
    
        # Calls test('world') every 30 seconds
        sender.add_periodic_task(30.0, test.s('world'), expires=10)
    
        # Executes every day at 14:39 (adjust hour/minute/day_of_week in the crontab as needed)
        sender.add_periodic_task(
            crontab(hour=14, minute=39),
            test.s('Happy Mondays!'),
        )
    
    
    @app.task
    def test(arg):
        print(arg)
    
    • In celery.py, add 'celery_proj.periodic_task' to the include list and append the following (runs task1.add every 5 seconds; a crontab-based variant is sketched after the notes below):
    app.conf.beat_schedule = {
        'add-every-5-seconds': {
            'task': 'celery_proj.task1.add',
            'schedule': 5.0,
            'args': (16, 16)
        },
    }
    app.conf.timezone = 'UTC'
    
    • Start the worker: celery -A celery_proj worker -l debug
    • Start the beat scheduler: celery -A celery_proj.periodic_task beat -l debug
    • Note: if beat only prints beat: Waking up in 5.00 minutes., the periodic tasks were not registered. The cause is usually a configuration problem or an environment conflict. After double-checking the configuration, delete the celerybeat-schedule file in the current directory and clear all keys in Redis (run flushall in redis-cli), then try again.
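    • The beat_schedule dict also accepts crontab schedules instead of a plain number of seconds; a minimal sketch (the entry name and the Monday 7:30 schedule are only illustrative):
      from celery.schedules import crontab
      
      app.conf.beat_schedule['add-every-monday-morning'] = {
          'task': 'celery_proj.task1.add',
          'schedule': crontab(hour=7, minute=30, day_of_week=1),  # Mondays at 07:30
          'args': (16, 16),
      }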
  • Using Celery in Django

    1. Add the Celery settings to desk_demo/settings.py
      # Celery configuration
      CELERY_BROKER_URL = 'redis://localhost:6379'
      CELERY_RESULT_BACKEND = 'redis://localhost:6379'
      
    2. Create celery.py in the same directory as settings.py (the desk_demo/__init__.py snippet after this list is also needed so the app is loaded when Django starts):
      from __future__ import absolute_import, unicode_literals
      import os
      from celery import Celery
      
      # set the default Django settings module for the 'celery' program.
      os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'desk_demo.settings')
      
      # The Celery app name can be anything
      app = Celery('proj')
      
      # Treat settings.py entries whose names start with CELERY_ as Celery configuration
      app.config_from_object('django.conf:settings', namespace='CELERY')
      
      # Auto-discover tasks.py in every installed app
      app.autodiscover_tasks()
      
      
      @app.task(bind=True)
      def debug_task(self):
          print('Request: {0!r}'.format(self.request))
      
    3. Create tasks.py in each app -> note that the file must be named exactly tasks.py
      from __future__ import absolute_import, unicode_literals
      from celery import shared_task
      import random
      import time
      # shared_task makes a task usable from any app
      
      @shared_task
      def add(x, y):
          return x + y
      
      
      @shared_task
      def mul(x, y):
          return x * y
      
      
      @shared_task
      def xsum(numbers):
          return sum(numbers)
      
      
      @shared_task
      def randnum(x, y):
          # slow task called by the celery_call view in step 5 (mirrors the earlier rannum example)
          time.sleep(6)
          return random.randint(x, y)
      
    4. Add the routes to the project-level urls.py
      from django.urls import path
      from app.app01.views import *
      urlpatterns = [
          ...
          path('celery_call/', celery_call),
          path('celery_result/', celery_result),
          ...]
      
    5. Add the views in views.py
      import random
      # from extra_app.helpdesk import tasks
      from app.app01 import tasks
      from django.shortcuts import render, HttpResponse
      from celery.result import AsyncResult
      
      
      def celery_call(request):
          # kick off the task and return its task_id
          num = random.randint(1, 10000)
          t = tasks.randnum.delay(1, num)
          return HttpResponse(t.id)
      
      
      def celery_result(request):
          # look up the result for the task_id passed as ?id=
          task_id = request.GET.get('id')
          res = AsyncResult(id=task_id)
          if res.ready():
              return HttpResponse(res.get())
          else:
              return HttpResponse(res.ready())
      
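    • As referenced in step 2, the standard Celery/Django setup also imports the app in desk_demo/__init__.py so it is loaded when Django starts; a minimal sketch:
      # desk_demo/__init__.py
      from __future__ import absolute_import, unicode_literals
      
      # make sure the Celery app is loaded when Django starts,
      # so that @shared_task binds tasks to it
      from .celery import app as celery_app
      
      __all__ = ('celery_app',)
    • To try it out: start a worker with celery -A desk_demo worker -l info, open /celery_call/ to get a task_id, then poll /celery_result/?id=<task_id> until it returns the result.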
  • Configuring periodic tasks with django-celery-beat

    1. Install it: pip install django-celery-beat
    2. Add 'django_celery_beat' to INSTALLED_APPS in settings.py
    3. Create its database tables: python manage.py migrate
    4. Configure the periodic tasks in the admin site (or from code, as sketched below)
    5. Start the worker: celery -A desk_demo worker -l info
    6. Run the scheduler: celery -A desk_demo beat -l info -S django
    • Note: when adding a Periodic task in the admin site, fill in exactly one of Interval, Crontab, or Solar for the Schedule.
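    • The same schedules can be created programmatically through the django_celery_beat models; a minimal sketch (the 10-second interval and the task path app.app01.tasks.add assume the app layout used above):
      import json
      from django_celery_beat.models import IntervalSchedule, PeriodicTask
      
      # an interval that fires every 10 seconds
      schedule, _ = IntervalSchedule.objects.get_or_create(
          every=10, period=IntervalSchedule.SECONDS)
      
      # a periodic task bound to that interval; args are stored as a JSON string
      PeriodicTask.objects.create(
          interval=schedule,
          name='add every 10 seconds',
          task='app.app01.tasks.add',
          args=json.dumps([16, 16]),
      )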