Celery periodic tasks

1. What is Celery Beat?

  celery beat is a scheduler that kicks off tasks at regular intervals; the tasks are then executed by whatever worker processes are available on the cluster nodes.

  By default, the entries are taken from the beat_schedule setting, but custom stores can also be used, for example storing the entries in an SQL database.

  You must make sure that only a single scheduler is running a given schedule at any time, otherwise tasks will end up being executed more than once. Using a centralized approach means the schedule does not have to be synchronized, and the service can run without locks.
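The warning above matters in practice: if two beat instances run the same schedule, every entry fires twice. Celery itself does not stop you from starting beat twice, so deployments sometimes add their own guard. Below is a minimal sketch of one such guard using only the Python standard library; the helper name and lock-file path are made-up examples, not part of Celery:

```python
import fcntl
import os


def acquire_single_instance_lock(path):
    """Try to take an exclusive, non-blocking lock on `path`.

    Returns the open file handle on success (keep it open for the
    lifetime of the process), or None if another instance holds the lock.
    """
    fh = open(path, "w")
    try:
        # LOCK_NB makes this non-blocking: fail fast instead of waiting.
        fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        fh.close()
        return None
    # Record our PID for debugging; the lock itself is what matters.
    fh.write(str(os.getpid()))
    fh.flush()
    return fh
```

You would call this once before starting beat and exit if it returns None; the lock is released automatically when the process exits.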

 

2. Adding a task to celery beat

   To invoke a task periodically, add an entry to the beat schedule list:

tasks.py

from celery import Celery
from celery.schedules import crontab

app = Celery('tasks', broker='pyamqp://celery:[email protected]:5672/celery_vhost', backend='redis://localhost:6379/0')
#app = Celery('tasks', backend='redis://localhost', broker='pyamqp://')

app.conf.update(
    task_serializer='json',
    accept_content=['json'],  # Ignore other content
    result_serializer='json',
    timezone='Asia/Shanghai',
    enable_utc=True,
)

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls test('hello') every 10 seconds.
    sender.add_periodic_task(10.0, test.s('hello'), name='add every 10')
    
    # Calls add(2,2) every 30 seconds
    sender.add_periodic_task(30.0, add.s(2,2), expires=10)

    # Executes every Monday morning at 7:30 a.m.
    sender.add_periodic_task(
        crontab(hour=7, minute=30, day_of_week=1),
        test.s('Happy Mondays!'),
    )

@app.task
def test(arg):
    print(arg)

@app.task
def add(x, y):
    return x + y
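The setup_periodic_tasks hook above is one way to register entries; the same schedule can equivalently be declared through the beat_schedule setting mentioned in section 1. A sketch of the dictionary form, assuming the same tasks module (the entry names here are arbitrary labels):

```python
from celery.schedules import crontab

app.conf.beat_schedule = {
    'test-every-10-seconds': {
        'task': 'tasks.test',
        'schedule': 10.0,            # seconds
        'args': ('hello',),
    },
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': 30.0,
        'args': (2, 2),
    },
    'monday-morning-greeting': {
        'task': 'tasks.test',
        'schedule': crontab(hour=7, minute=30, day_of_week=1),
        'args': ('Happy Mondays!',),
    },
}
```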

 

3. Start Celery Beat

Beat needs to store the last run times of the tasks in a local database file (named celerybeat-schedule by default), so it needs write access to the current directory; alternatively, you can specify a custom location for the file:

celery -A tasks beat -s /var/run/celery/celerybeat-schedule

Then start the worker in another terminal:

celery -A tasks worker -l info

You should see log output like:

[2019-10-24 14:45:53,448: INFO/ForkPoolWorker-4] Task tasks.add[e028900c-f2a3-468e-8cb8-4ae72d0e77fe] succeeded in 0.0020012762397527695s: 4
[2019-10-24 14:46:03,370: INFO/MainProcess] Received task: tasks.test[0635b276-19c9-4d76-9941-dbe9e7320a7f]
[2019-10-24 14:46:03,372: WARNING/ForkPoolWorker-6] hello
[2019-10-24 14:46:03,374: INFO/ForkPoolWorker-6] Task tasks.test[0635b276-19c9-4d76-9941-dbe9e7320a7f] succeeded in 0.0021341098472476006s: None
[2019-10-24 14:46:13,371: INFO/MainProcess] Received task: tasks.test[afcfa84c-3a3b-48bf-9191-59ea55b08eea]
[2019-10-24 14:46:13,373: WARNING/ForkPoolWorker-8] hello
[2019-10-24 14:46:13,375: INFO/ForkPoolWorker-8] Task tasks.test[afcfa84c-3a3b-48bf-9191-59ea55b08eea] succeeded in 0.002273786813020706s: None

 

You can also embed beat inside the worker by starting it with the -B option. This is convenient if you will never run more than one worker node, but it is not commonly used and is not recommended for production environments:

celery -A tasks worker -B -l info 

Origin www.cnblogs.com/zydev/p/11732129.html