Updating the Django cache automatically with Celery

Celery

Official website:

  • Celery official website: http://www.celeryproject.org/
  • Celery official documentation (English): http://docs.celeryproject.org/en/latest/index.html
  • Celery official documentation (Chinese): http://docs.jinkan.org/docs/celery/

Celery architecture

Celery's architecture consists of three parts: a message broker, task execution units (workers), and a task result store.

  • Message broker
    • Celery does not provide a messaging service itself, but it integrates easily with third-party message brokers, including RabbitMQ, Redis, and others.
  • Task execution unit
    • The worker is the unit that executes tasks in Celery; workers run concurrently on the nodes of a distributed system.
  • Task result store
    • The task result store saves the results of the tasks executed by workers. Celery supports several result backends, including AMQP and Redis.

Use cases

Asynchronous tasks: hand time-consuming operations to Celery for asynchronous execution, such as sending SMS or email, pushing notifications, and audio/video processing.

Scheduled tasks: run something on a schedule, such as computing daily statistics.

Celery Installation and Configuration

pip install celery

Message broker: RabbitMQ / Redis

app = Celery('task name', broker='xxx', backend='xxx')
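The `broker` and `backend` arguments are plain connection URLs; for Redis the form is `redis://host:port/db`. As a quick illustration (the URL below is just an example local Redis address), the standard library can pick the URL apart:

```python
from urllib.parse import urlparse

# A Redis broker/backend address is an ordinary URL: scheme://host:port/db.
# This local address is illustrative, not part of any required setup.
url = urlparse('redis://127.0.0.1:6379/1')

print(url.scheme)              # the transport, e.g. 'redis'
print(url.hostname, url.port)  # where the broker listens
print(url.path.lstrip('/'))    # the Redis database number
```

Keeping the broker and the result backend in different Redis databases (e.g. `/1` and `/2`, as in the examples below) avoids mixing queued messages with stored results.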

Celery perform asynchronous tasks

Package layout:

project
    |-- celery_task     # the celery package
    |   |-- __init__.py # package marker
    |   |-- celery.py   # celery connection and configuration
    |   └── tasks.py    # all the task functions
    |-- add_task.py     # submit tasks
    └── get_result.py   # fetch results

Basic use:

celery.py:

# celery.py

# 1) Create the app and its tasks.

# 2) Start the celery (app) service:
    # non-Windows: celery worker -A celery_task -l info
    # Windows:
        # pip install eventlet
        # celery worker -A celery_task -l info -P eventlet

# 3) Submit tasks: done manually, via a custom task-submitting script (run the script directly).

# 4) Fetch results: done manually, via a custom result-fetching script (run the script directly).

from celery import Celery

broker = 'redis://127.0.0.1:6379/1'
backend = 'redis://127.0.0.1:6379/2'
app = Celery(broker=broker, backend=backend, include=['celery_task.tasks'])

tasks.py:

# tasks.py
from .celery import app

@app.task
def add(n, m):
    print('result of n + m: %s' % (n + m))
    return n + m

add_tasks.py:

# add_tasks.py
from celery_task import tasks

# Submit a task for immediate execution
t1 = tasks.add.delay(10, 20)

# Submit a delayed task (eta must be a UTC datetime)
from datetime import datetime, timedelta
def eta_second(second):
    ctime = datetime.now()
    utc_ctime = datetime.utcfromtimestamp(ctime.timestamp())
    time_delay = timedelta(seconds=second)
    return utc_ctime + time_delay

tasks.add.apply_async(args=(200, 300), eta=eta_second(10))

get_tasks.py:

# get_tasks.py
from celery_task.celery import app

from celery.result import AsyncResult

id = '21325a40-9d32-44b5-a701-9a31cc3c74b5'
if __name__ == '__main__':
    # note: 'async' is a reserved word in Python 3.7+, so use another name
    async_result = AsyncResult(id=id, app=app)
    if async_result.successful():
        result = async_result.get()
        print(result)
    elif async_result.failed():
        print('task failed')
    elif async_result.status == 'PENDING':
        print('task is waiting to be executed')
    elif async_result.status == 'RETRY':
        print('task raised an error and is being retried')
    elif async_result.status == 'STARTED':
        print('task has started executing')
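The `if/elif` chain above branches on Celery's documented built-in state names (`SUCCESS`, `FAILURE`, `PENDING`, `RETRY`, `STARTED`). Since these are plain strings, the dispatch can be sketched without a broker; the message texts here are just illustrative:

```python
# Map Celery's documented state names to human-readable messages.
STATE_MESSAGES = {
    'SUCCESS': 'task finished successfully',
    'FAILURE': 'task failed',
    'PENDING': 'task is waiting to be executed',
    'RETRY': 'task raised an error and is being retried',
    'STARTED': 'task has started executing',
}

def describe(state):
    # Fall back to the raw state string for anything unexpected.
    return STATE_MESSAGES.get(state, state)

print(describe('PENDING'))
```

In the real script, `state` would come from `async_result.status`; keeping the lookup in one dict makes it easy to extend if you enable custom states.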

Advanced use:

celery.py:

# celery.py

# 1) Create the app and its tasks.

# 2) Start the celery (app) service:
    # non-Windows: celery worker -A celery_task -l info
    # Windows:
        # pip3 install eventlet
        # celery worker -A celery_task -l info -P eventlet

# 3) Submit tasks: they are submitted automatically, so start a beat service to schedule them.
# Command: celery beat -A celery_task -l info

# 4) Fetch results.

from celery import Celery

broker = 'redis://127.0.0.1:6379/1'
backend = 'redis://127.0.0.1:6379/2'
app = Celery(broker=broker, backend=backend, include=['celery_task.tasks'])

# timezone
app.conf.timezone = 'Asia/Shanghai'
# whether to use UTC
app.conf.enable_utc = False

# scheduled (beat) task configuration
from datetime import timedelta
from celery.schedules import crontab
app.conf.beat_schedule = {
    'add_task': {
        'task': 'celery_task.tasks.add',
        'schedule': timedelta(seconds=3),
        # 'schedule': crontab(hour=8, day_of_week=1),  # every Monday at 8 a.m.
        'args': (300, 200)
    }
}

tasks.py:

# tasks.py
from .celery import app

@app.task
def add(n, m):
    print('result of n + m: %s' % (n + m))
    return n + m

get_result.py: same as in the basic example above.

Using Celery in Django

celery.py

# Important: add the folder containing project_name.settings to the path
# import sys
# sys.path.append(r'absolute path of the project')

# enable django support
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project_name.settings')
import django
django.setup()



# 1) Create the app and its tasks.

# 2) Start the celery (app) service:
# non-Windows
# Command: celery worker -A celery_task -l info
# Windows:
# pip3 install eventlet
# celery worker -A celery_task -l info -P eventlet

# 3) Submit tasks: they are submitted automatically, so start a beat service to schedule them.
# Command: celery beat -A celery_task -l info

# 4) Fetch results.


from celery import Celery

broker = 'redis://127.0.0.1:6379/1'
backend = 'redis://127.0.0.1:6379/2'
app = Celery(broker=broker, backend=backend, include=['celery_task.tasks'])


# timezone
app.conf.timezone = 'Asia/Shanghai'
# whether to use UTC
app.conf.enable_utc = False

# scheduled (beat) task configuration
from datetime import timedelta
from celery.schedules import crontab
app.conf.beat_schedule = {
    'django-task': {
        'task': 'celery_task.tasks.test_django_celery',
        'schedule': timedelta(seconds=3),
        'args': (),
    }
}
tasks.py:

# tasks.py
from .celery import app
# model classes from the project
from api.models import Banner

@app.task
def test_django_celery():
    banner_query = Banner.objects.filter(is_delete=False).all()
    print(banner_query)

Origin www.cnblogs.com/17vv/p/11980068.html