Building a simple Python 3 + Flask + Celery + Redis framework

A detailed walkthrough of Python 3 + Flask + Celery + Redis.

What is Celery?

Celery is an asynchronous distributed task queue.

Running tasks in the background through Celery is not as simple as spawning a thread, but Celery enables much better application scalability, because it has a distributed architecture. Celery has three core components:

Producer (Celery client). The producer sends task messages. When working with Flask, the producer (Celery client) runs inside the Flask application.

Consumers (Celery workers). Workers process the background tasks. They can be local or remote: we can start with a single worker running on the same server as Flask, and add more workers as the workload grows.

Message broker. Producers (Celery clients) and consumers (Celery workers) exchange information through a message queue. Celery supports several brokers; the most common are RabbitMQ and Redis.
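The three roles above can be sketched with an in-memory queue. This is a stdlib-only toy, not Celery itself; `broker`, `delay`, and `worker` are hypothetical names used only to illustrate the message flow:

```python
# Toy illustration of the producer/broker/worker pattern.
# queue.Queue stands in for the broker (Redis in this article); real Celery
# serializes messages and distributes them across processes and machines.
import queue
import threading

broker = queue.Queue()   # message broker stand-in
results = {}             # result backend stand-in

def worker():
    """Consumer: pull task messages off the broker and execute them."""
    while True:
        task_id, func, args = broker.get()
        results[task_id] = func(*args)
        broker.task_done()

threading.Thread(target=worker, daemon=True).start()

def delay(task_id, func, *args):
    """Producer: enqueue a task message instead of running it inline."""
    broker.put((task_id, func, args))

delay("t1", lambda a, b: a + b, 10, 20)
broker.join()            # wait for the worker to drain the queue
print(results["t1"])     # -> 30
```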


(Image: Celery architecture diagram)

Without further ado, let's get to the code.


First, the basic project structure

(Image: project structure diagram)

Second, the key files are configured as follows

To integrate Celery with Flask you need to do two things:

the name of the Celery instance must be the Flask application's name, otherwise Celery will fail to start;

Celery must be able to load the initialization file successfully.

1. __init__.py (initializes Flask and Celery)

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from celery import Celery
from config import *
import pymysql
import os

pymysql.install_as_MySQLdb()

db = SQLAlchemy()

# Celery configuration
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
CELERY_BROKER_URL = "redis://localhost:6379/0"

def create_app(config_name):
    app = Flask(__name__)
    app.config.from_object(config[config_name])
    config[config_name].init_app(app)
    db.init_app(app)
    register_blueprint(app)
    return app

def make_celery(app=None):
    app = app or create_app(os.getenv('FLASK_CONFIG') or 'default')
    # Redis was already installed and configured as a cache server in the first
    # article on Celery task scheduling, so its setup is not repeated here
    celery = Celery(__name__, broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            # run every task inside the Flask application context
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery

def register_blueprint(app):
    from app.main import main
    app.register_blueprint(main)
    from app.mail import mail
    app.register_blueprint(mail)
    from app.tests import tests
    app.register_blueprint(tests)
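The ContextTask subclass in make_celery ensures every task body runs inside the Flask application context, so tasks can use db and app.config. The wrapping idea can be shown stand-alone with a plain context manager (illustrative names, no Flask or Celery required):

```python
# Minimal sketch of the ContextTask idea: override __call__ on a base task
# class so every invocation is wrapped in a context manager, the way
# make_celery wraps tasks in app.app_context().
import contextlib

log = []

@contextlib.contextmanager
def app_context():  # stands in for flask's app.app_context()
    log.append("push")
    try:
        yield
    finally:
        log.append("pop")

class TaskBase:
    def __call__(self, *args, **kwargs):
        return self.run(*args, **kwargs)

class ContextTask(TaskBase):
    def __call__(self, *args, **kwargs):
        with app_context():  # every task now runs inside the context
            return super().__call__(*args, **kwargs)

class AddTask(ContextTask):
    def run(self, a, b):
        return a + b

print(AddTask()(1, 2), log)  # -> 3 ['push', 'pop']
```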


2. tasks.py (background task definitions)

"""

Tasks performed on the file

"""

from .import make_celery

celery= make_celery(app=None)

@celery.task()

def add_together(a,b):

    return a + b

@celery.task()

def print_hello():

    print('Hello World!')
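Roughly, @celery.task leaves the function directly callable (it runs inline when called normally) and adds methods such as .delay(), which sends a message to the broker instead of executing immediately. A simplified stdlib sketch of that behavior (not Celery's actual implementation; `sent` is a hypothetical stand-in for the broker):

```python
# Rough sketch of what @celery.task provides: the decorated function stays
# directly callable, and gains .delay(), which here only records a "message"
# instead of executing the function.
sent = []

def task(func):
    def delay(*args, **kwargs):
        sent.append((func.__name__, args, kwargs))  # "publish" to the broker
    func.delay = delay
    return func

@task
def add_together(a, b):
    return a + b

print(add_together(2, 3))   # direct call runs inline -> 5
add_together.delay(10, 20)  # queued, not executed here
print(sent)                 # -> [('add_together', (10, 20), {})]
```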


3. config.py (project configuration)

import os

basedir = os.path.abspath(os.path.dirname(__file__))

class config:
    SECRET_KEY = os.environ.get('SECRET_KEY') or 'this is a secret string'
    SQLALCHEMY_TRACK_MODIFICATIONS = True

    @staticmethod
    def init_app(app):
        pass

class DevelopmentConfig(config):
    DEBUG = True
    SQLALCHEMY_DATABASE_URI = 'mysql+pymysql://username:pwd@sqldbadress/db'

class TestingConfig(config):
    TESTING = True
    SQLALCHEMY_DATABASE_URI = 'mysql+pymysql://username:pwd@sqldbadress/db'

class ProductionConfig(config):
    SQLALCHEMY_DATABASE_URI = 'mysql+pymysql://username:pwd@sqldbadress/db'

config = {
    'development': DevelopmentConfig,
    'testing': TestingConfig,
    'production': ProductionConfig,
    'default': DevelopmentConfig,
}
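Both manage.py and make_celery select a configuration class by indexing the `config` dict with the FLASK_CONFIG environment variable, with `or 'default'` covering the unset (and empty) case. A minimal sketch of that lookup pattern, with `pick` as a hypothetical helper:

```python
# Sketch of the config-selection pattern used by
# create_app(os.getenv('FLASK_CONFIG') or 'default').
class DevelopmentConfig:
    DEBUG = True

class ProductionConfig:
    DEBUG = False

config = {
    'development': DevelopmentConfig,
    'production': ProductionConfig,
    'default': DevelopmentConfig,
}

def pick(env):
    """Mimics create_app(os.getenv('FLASK_CONFIG') or 'default')."""
    return config[env.get('FLASK_CONFIG') or 'default']

print(pick({}))                               # unset -> DevelopmentConfig
print(pick({'FLASK_CONFIG': 'production'}))   # -> ProductionConfig
```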


4. manage.py (the script that runs the Flask project)

import os
from app import create_app, db
from flask_script import Manager, Shell
from flask_migrate import Migrate, MigrateCommand

app = create_app(os.getenv('FLASK_CONFIG') or 'default')
manager = Manager(app)
migrate = Migrate(app, db)

def make_shell_context():
    return dict(app=app, db=db)

manager.add_command("shell", Shell(make_context=make_shell_context))
manager.add_command('db', MigrateCommand)

if __name__ == '__main__':
    manager.run()


5. views.py (the main Flask business interface)

from . import main
from flask import Flask, request, jsonify
from app.tasks import *

@main.route("/api/task_start", methods=['POST'])
def task_start():
    result = add_together.delay(10, 20)
    print(result.wait())  # blocks until the worker finishes the task
    return jsonify({"msg": "Welcome to my app!"})


6. Steps to start the project:

1) Start the Flask app: python manage.py runserver -h 127.0.0.1 -p 8090

2) Start the Celery worker: celery -A app.tasks worker --loglevel=info

Note: if running Flask and Celery together reports the error

NotImplementedError: No result backend is configured.

then configuring the backend explicitly when creating the Celery instance resolves it:

CELERY_RESULT_BACKEND = "redis://localhost:6379/0"
CELERY_BROKER_URL = "redis://localhost:6379/0"
celery = Celery(__name__, broker=CELERY_BROKER_URL, backend=CELERY_RESULT_BACKEND)

Reproduced from: https://www.jianshu.com/p/bdd9dcbf1e21


Origin blog.csdn.net/weixin_34375251/article/details/91158926