Asynchronous task distribution with Celery

Celery overview

Celery is a full-featured, plug-and-play task queue. It spares us from having to deal with the complex details of queueing, and it is very simple to use. Celery is well suited to asynchronous handling of time-consuming operations such as sending email, uploading files, or processing images: by executing these asynchronously, users do not have to wait for them to finish, which improves the user experience.

Celery's main characteristics are:

  • Simple: easy to use and maintain, with rich documentation.
  • Efficient: a single Celery process can handle millions of tasks per minute.
  • Flexible: almost every part of Celery can be extended or customized.

Celery is also very easy to integrate into many web development frameworks.

Usage scenarios

When developing the back end of a website, we often face a requirement like this: a user fills out registration information on our site, and we send an activation email to the user's mailbox. If, for whatever reason, sending this email takes a long time, the client is left waiting, resulting in a poor user experience.

We can use Celery, an asynchronous task distribution system, to solve this problem.

We run the time-consuming task asynchronously in the background, so the user's other operations are not affected. Besides registration, time-consuming tasks such as file uploads and image processing can be handled along the same lines. How do we execute a task asynchronously? We can use Celery.

Besides asynchronous tasks, Celery can also run certain tasks on a timer.

Celery concepts, configuration, and usage

A task queue is a mechanism for distributing work across threads and machines.

A task queue contains units of work called tasks. Dedicated worker processes constantly monitor the task queue, pull in new tasks, and process them.

Celery communicates via messages, usually using a broker (intermediary) to coordinate clients (task senders) and workers (task handlers). A client sends a message to the queue, and the broker distributes the messages in the queue to workers for processing.

A Celery system may comprise many workers and brokers, which improves availability and allows performance to scale.

Installing Celery

Celery can be installed directly with the pip package manager:

pip install celery
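
To verify the installation, you can print the installed version:

celery --version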

You can also download the source package directly from the official site at https://pypi.python.org/pypi/celery/ and install it manually:

tar xvfz celery-0.0.0.tar.gz
cd celery-0.0.0
python3 setup.py build
python3 setup.py install

Choosing a broker

Celery needs a way to send and receive messages; the middleware that stores and relays these messages is called a message broker (also known as an intermediary).

As the broker, we have several options to choose from:

RabbitMQ

RabbitMQ is a full-featured, stable, and easy-to-install broker. It is the best choice for production environments. For details on RabbitMQ, see the following link:  http://docs.celeryproject.org/en/latest/getting-started/brokers/rabbitmq.html#broker-rabbitmq

If we are using an Ubuntu or Debian distribution of Linux, we can install RabbitMQ directly with the following command: sudo apt-get install rabbitmq-server. After the installation completes, the RabbitMQ server is already running in the background. If you are not using Ubuntu or Debian, go to  http://www.rabbitmq.com/download.html  to find the version of the software you need.

Redis

Redis is also a full-featured broker option, but it may lose data in the event of an unexpected interruption or power failure.

For details on using Redis as a broker, visit the following URL:  http://docs.celeryproject.org/en/latest/getting-started/brokers/redis.html#broker-redis

Storing results

If we want to track the status of tasks, Celery needs to save the results somewhere. Several storage options are available: SQLAlchemy, Django ORM, Memcached, Redis, and RPC (RabbitMQ/AMQP).

A simple Celery example

The directory structure of the project is as follows:
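
(A minimal layout consistent with the code below; the top-level name is an assumption, and tasks must be a Python package containing an __init__.py.)

project/
    tasks/
        __init__.py
        task_demo.py
    transfer.py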

We put the tasks to be called asynchronously in the tasks package, and write the concrete task in the task_demo.py file:

# -*- coding: utf-8 -*-
from celery import Celery

# Define the Celery object.
# 123 is my redis password; redis database 1 is used as the broker,
# and database 2 is used to store the results locally.
celery_app = Celery("demo1",
                    broker="redis://:[email protected]:6800/1",
                    backend="redis://:[email protected]:6800/2")


@celery_app.task
def print_now(now):
    """Asynchronous task demo."""
    print(now)
    # Define a return value; if omitted, the task returns None by default.
    return "Current Time: {}".format(now)

Then add the code that calls this task in an external transfer.py file:

# -*- coding: utf-8 -*-
import time

import redis
from tasks.task_demo import print_now


# Create a redis connection and select database 2 (where the results live).
conn = redis.Redis(host="127.0.0.1", port=6800, password="123", db=2)

# Call the Celery task asynchronously.
# delay() returns immediately after the call.
now = time.strftime("%Y-%m-%d %X")
# The task's arguments go into delay()!
ret = print_now.delay(now)
print(ret, type(ret))  # 7af04c3c-8070-4bf1-9563-b220fd5aac9c <class 'celery.result.AsyncResult'>
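
Because delay() returns an AsyncResult, we can also block until the task finishes and read its return value (a small sketch using Celery's standard AsyncResult API; it requires the worker to be running):

# Wait for the worker to finish and fetch the return value.
print(ret.get(timeout=10))   # Current Time: ...
print(ret.status)            # e.g. SUCCESS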

Then start the Celery worker from the root directory of the project:

celery -A tasks.task_demo worker -l info

The startup and results are as follows:

If Celery reports AttributeError: async at startup, it is recommended to upgrade Celery to version 4.1.1.

After a successful start, the terminal blocks while the worker waits for tasks. Then run the transfer.py file, and the task output appears in the worker's terminal.

We can then use a redis visualization tool to look at the results: after the task has executed once, redis database 1 holds Celery's broker messages, and database 2 holds the execution result.

Using Celery in Django to generate a static home page and store the results in a local database *****

In practice, the content of a site's home page hardly changes. If every request for the home page queried the database, the load on the database would be heavy.

We can use Celery to generate a static page: as long as no data has changed, every user request for the home page is served this pre-generated "home page", and only when the data changes do we regenerate the "static page".

The project directory is as follows:
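
(A layout consistent with the code in this section; names not referenced in the code are assumptions.)

djangoCeleryDemo/
    djangoCeleryDemo/
        settings.py
        urls.py
    demo/
        views.py
    tasks/
        __init__.py
        static_index.py
    templates/
        index.html          (generated by the task)
        index_static.html   (source template)
    manage.py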

 

The routes and views are as follows.

Routes:

from django.contrib import admin
from django.urls import path

from demo import views

urlpatterns = [
    path('admin/', admin.site.urls),
    path('index/',views.Index.as_view()),
]

Views:

from django.views import View
from django.shortcuts import render

from tasks.static_index import index_static

class Index(View):

    def get(self, request):

        # This merely simulates whether any back-end data has changed.
        # When data has been modified, detect that, pass the new data along,
        # and regenerate the page!
        # a == 1 means no back-end data was modified.
        a = 12
        if a == 1:
            return render(request, "index.html")
        # Data was modified, so regenerate the static page.
        else:
            index_static.delay()
            return render(request, "index.html")

Configuration for storing the results in the local database

This requires the extra package django_celery_results; first install it:

pip3 install django-celery-results

Then register the application in celery_demo/settings.py:

INSTALLED_APPS = [
    xxx

    # Custom apps and third-party apps
    'demo.apps.DemoConfig',
    # The app Celery uses to store results
    'django_celery_results',
    # The app Celery uses for scheduled tasks
    'django_celery_beat'
]

At this point, the configuration and the code that generates the static page in the tasks/static_index.py file are as follows:

import os

from celery import Celery
from django.conf import settings
from django.template import loader, RequestContext

### Note! The Django environment must be loaded: set the settings environment variable for Celery.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djangoCeleryDemo.settings')


## Create the application
# Define the Celery object.
# 123 is my redis password; redis database 1 is used as the broker.
celery_app = Celery("demo2")

## Configure the application
celery_app.conf.update(
    # Use redis database 1 as the message queue.
    BROKER_URL="redis://:[email protected]:6800/1",
    # Store task results in the project database.
    CELERY_RESULT_BACKEND='django-db',
    # To store the results in redis database 2 instead, configure it like this:
    # CELERY_RESULT_BACKEND = "redis://:[email protected]:6800/2",
)

@celery_app.task
def index_static():

    # The context could just as well come from ORM queries, etc.
    context = {
        "msg": "This is a static home page generated with Celery"
    }

    # Use the template system:
    # load the template file and get a template object,
    temp = loader.get_template("index_static.html")
    # then render the template.
    index_static_html = temp.render(context)
    # Write the generated static home page into the templates directory.
    index_static_html_path = os.path.join(settings.BASE_DIR, 'templates', 'index.html')
    with open(index_static_html_path, 'w') as f:
        f.write(index_static_html)

    # For testing, return the string "OK" here.
    return "OK"
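
With the django-db backend, results land in the tables managed by django_celery_results; as a quick check, they can be queried through that package's TaskResult model (a sketch, run for example inside python3 manage.py shell):

from django_celery_results.models import TaskResult

for r in TaskResult.objects.all():
    print(r.task_id, r.status, r.result)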

Create the database tables required by the django_celery_results application by running the migrations.

Note that since we will use the admin page later, we first need to migrate the tables of the default apps (auth and the other apps registered by default):

python3 manage.py migrate

Then migrate the django_celery_results tables:

python3 manage.py migrate django_celery_results

Starting Celery

Go to the project root directory and start Celery:

celery -A tasks.static_index worker -l info

Visit the index page to check the result

Now when we visit the index page again, we can see the result in the results table:

Using Celery for scheduled tasks in Django *****

If we want to run a task at a specific date and time, or run a task at regular intervals, Celery can do that as well.

Scheduled tasks require installing an extra package:

pip3 install django_celery_beat

Then register the application in settings.py:

INSTALLED_APPS = [
    xxx

    # Custom apps and third-party apps
    'demo.apps.DemoConfig',
    # The app Celery uses to store results
    'django_celery_results',
    # The app Celery uses for scheduled tasks
    'django_celery_beat'
]

Then the configuration and the scheduled-task code in the tasks/static_index.py file are as follows:

# -*- coding:utf-8 -*-
import os

from celery import Celery
from django.conf import settings
from django.template import loader, RequestContext

### Note! The Django environment must be loaded: set the settings environment variable for Celery.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'djangoCeleryDemo.settings')


## Create the application
# Define the Celery object.
# 123 is my redis password; redis database 1 is used as the broker.
celery_app = Celery("demo2")

## Configure the application
celery_app.conf.update(
    # Use redis database 1 as the message queue.
    BROKER_URL="redis://:[email protected]:6800/1",
    # Store task results in the project database.
    CELERY_RESULT_BACKEND='django-db',
    # Or store the results in redis database 2:
    # CELERY_RESULT_BACKEND = "redis://:[email protected]:6800/2",
    # Configure the scheduler module; the schedule is stored in the database.
    CELERYBEAT_SCHEDULER='django_celery_beat.schedulers.DatabaseScheduler',
)


### Scheduled task
@celery_app.task
def interval_task():
    print("I run every 5 seconds...")
    # For testing, return 666 here.
    return 666

Since the scheduler information is stored in the database, we need to generate the corresponding tables first. Run the migrations for django_celery_beat to create them:

python3 manage.py migrate django_celery_beat

The other tables were generated by the earlier migrations.

Since we need to add records to the database before we can use scheduled tasks, create an admin account first:

python3 manage.py createsuperuser

Then log in to the admin interface:

Crontabs are used to schedule a task at a specific time;

Intervals are used to run a task every fixed period of time;

The tasks themselves are created in the Periodic tasks table.

We want to run a task every 5 seconds, so click the Add button next to the Intervals table name:

Then click the Add button next to the Periodic tasks table name and add the task:
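
The same records can also be created in code instead of through the admin, using django_celery_beat's models (a sketch; the display name is arbitrary):

# e.g. run inside python3 manage.py shell
from django_celery_beat.models import IntervalSchedule, PeriodicTask

# An interval of every 5 seconds.
schedule, _ = IntervalSchedule.objects.get_or_create(
    every=5,
    period=IntervalSchedule.SECONDS,
)

# Bind the interval to the task by its registered name.
PeriodicTask.objects.create(
    interval=schedule,
    name="print every 5 seconds",
    task="tasks.static_index.interval_task",
)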


Starting the scheduled tasks requires adding the --beat flag to the worker command:

celery -A tasks.static_index worker -l info --beat
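
For reference, -B is the short form of --beat, so the following command is equivalent:

celery -A tasks.static_index worker -l info -B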

The results are as follows:

We can view the results in the admin's Task results table:

Of these, the top three entries are the results of the scheduled task, and the bottom one is the result of generating the static page.
