# Celery use -- 3

1. Broker

Redis is used as both the broker and the result backend.

2. The base case

Create tasks.py:

# tasks.py
from celery import Celery

di = 'redis://:****@localhost:6379/0'
app = Celery('tasks', backend=di, broker=di)

@app.task
def add(x, y):
    return x + y

Run:
celery -A tasks worker -l info -P eventlet

Create temp.py:
# temp.py
from tasks import add
rv = add.delay(4, 4)

2.1 Operating results

Run the worker:

E:\python\code test>celery -A tasks worker -l info -P eventlet

 -------------- celery@*** v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows0 2019-09-21 22:08:04
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x1aebfdcf98
- ** ---------- .> transport:   redis://:**@localhost:6379/0
- ** ---------- .> results:     disabled://
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . tasks.add

[2019-09-21 22:08:04,802: INFO/MainProcess] Connected to redis://:**@192.168.199.113:6379/0
[2019-09-21 22:08:04,813: INFO/MainProcess] mingle: searching for neighbors
[2019-09-21 22:08:05,849: INFO/MainProcess] mingle: all alone
[2019-09-21 22:08:05,886: INFO/MainProcess] celery@*** ready.
[2019-09-21 22:08:05,905: INFO/MainProcess] pidbox: Connected to redis://:**@...../0.

Run temp.py:

[2019-09-21 22:11:27,198: INFO/MainProcess] Received task: tasks.add[06d745c6-5318-4f48-8a1e-2ab8f8563994]
[2019-09-21 22:11:27,200: INFO/MainProcess] Task tasks.add[06d745c6-5318-4f48-8a1e-2ab8f8563994] succeeded in 0.0s: 8
[2019-09-21 22:11:31,935: INFO/MainProcess] Received task: tasks.add[115c3b5d-eba7-472b-86ab-bd356f650e13]
[2019-09-21 22:11:31,936: INFO/MainProcess] Task tasks.add[115c3b5d-eba7-472b-86ab-bd356f650e13] succeeded in 0.0s: 8

2.2 Problems

Two problems arise at runtime:

  1. A redis-py version problem: with redis-py 2.* an upgrade is currently required:
     pip install --upgrade redis
     which upgrades it to 4.***
  2. Error: ValueError: not enough values to unpack (expected 3, got 0)
     Solution:
     Others describe this problem occurring with Celery 4.x; the workaround below is reported to work on Win10, though the underlying cause is unclear:

     Install eventlet:
     pip install eventlet

     Then add the pool parameter when starting the worker, as follows:
     celery -A tasks worker -l info -P eventlet
     Tasks can then be called normally.
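
As an alternative workaround, the solo pool that ships with Celery also avoids this error on Windows, at the cost of processing one task at a time (a minimal sketch, assuming the same tasks.py):

celery -A tasks worker -l info -P solo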

3. A more complex test environment

In general, Celery project code is divided into three parts:

  1. worker definition
  2. task definitions
  3. task invocation

Structure:

proj/__init__.py
    /celery_worker.py   # worker definition
    /celery_tasks.py    # task definitions
    /celery_run.py      # calling code

proj/celery_worker.py

# celery test -- worker
from celery import Celery

di_broker = 'redis://:****@192.168.199.113:6379/0'
di_backend = 'redis://:****@192.168.199.113:6379/1'

def create_worker():
    # app = Celery('tasks', broker=di_broker)
    app = Celery('tasks',
                 backend=di_backend,
                 broker=di_broker,
                 include=['celery_tasks'])

    app.conf.update(result_expires=3600,)
    return app

app = create_worker()


if __name__ == '__main__':
    app.start()
    

proj/celery_tasks.py

from celery_worker import app

@app.task
def add(x, y):
    return x + y

@app.task
def mul(x, y):
    return x * y

@app.task
def xsum(numbers):
    return sum(numbers)

proj/celery_run.py

# celery test
from celery_tasks import add
rv = add.delay(4, 4)

out = rv.get(timeout=1)
print(out)
out = rv.ready()
print(out)
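
The other two tasks in celery_tasks.py can be invoked the same way; a minimal sketch (assumes the worker started below is running):

# celery test -- other tasks
from celery_tasks import mul, xsum

print(mul.delay(3, 4).get(timeout=5))        # 12
print(xsum.delay([1, 2, 3]).get(timeout=5))  # 6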

Start the worker:
celery -A celery_tasks worker -l info -P eventlet

Stop the worker:
Ctrl+C

The test environment is now complete; the sections below exercise more complex features.

4. Calling tasks

Interface

add(4, 4)         # local call
add.delay(4, 4)   # executed by the worker

delay() is actually a star-argument shortcut to another method called apply_async():
add.apply_async((2, 2))
apply_async() accepts more parameters, e.g.:
add.apply_async((2, 2), queue='lopri', countdown=10)
This sends the task to the lopri queue and waits at least 10 seconds before executing it.
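
Note that a task sent to the lopri queue is only picked up by a worker that consumes that queue; a sketch of starting such a worker (-Q selects the queues to consume from):

celery -A celery_tasks worker -l info -P eventlet -Q lopri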

Each task is assigned a unique id.
delay() and apply_async() both return an AsyncResult instance;
if a result backend is configured, you can inspect the task's execution.

res = add.delay(2, 2)
res.get(timeout=1)
4

You can find the task’s id by looking at the id attribute:
res.id
d6b3aea2-fb9b-4ebc-8da4-848818db9114

You can also inspect the exception and traceback if the task raised an exception, in fact result.get() will propagate any errors by default:
res = add.delay(2)
res.get(timeout=1)

If you don’t wish for the errors to propagate then you can disable that by passing the propagate argument:
res.get(propagate=False)
TypeError('add() takes exactly 2 arguments (1 given)',)
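
The AsyncResult also exposes the failure state directly, so you can check it without get() raising; a sketch using the failed result above (state, failed() and traceback are standard AsyncResult members):

res.state       # 'FAILURE'
res.failed()    # True
res.traceback   # the worker-side traceback, as a string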

5. Server/worker

5.1 The startup output explained

(vir_venv) E:\python\code>celery -A celery_tasks worker -l info -P eventlet

 -------------- celery@** v4.3.0 (rhubarb)
---- **** -----
--- * ***  * -- Windows-8.1-6.3. 2019-09-22 10:50:49
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x38ac527d30
- ** ---------- .> transport:   redis://:**@***:6379/0
- ** ---------- .> results:     redis://:**@***:6379/1
- *** --- * --- .> concurrency: 4 (eventlet) # concurrency level
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

The eventlet pool used here means each task executes in a separate green thread.
The task events setting determines whether the worker emits events for monitoring.
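
To actually monitor the worker, start it with -E and attach a monitor in another terminal; a minimal sketch using the built-in curses monitor:

celery -A celery_tasks worker -l info -P eventlet -E
celery -A celery_tasks events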

5.2 Running in the background

celery multi start w1 -A celery_worker -l info
celery multi restart w1 -A celery_worker -l info
celery multi stop w1 -A celery_worker -l info
# wait until running tasks have completed
celery multi stopwait w1 -A celery_worker -l info
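
Since multi detaches the workers, it is usually worth giving them explicit pid and log file locations (a sketch; the paths are placeholders, %n expands to the node name):

celery multi start w1 -A celery_worker -l info --pidfile=./run/%n.pid --logfile=./log/%n.log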

6. Task composition / workflows

Tasks support signatures:

add.signature((2, 2), countdown=10)   # a signature wrapping tasks.add(2, 2)
There's also a shortcut using star arguments:
add.s(2, 2)                           # equivalent signature of tasks.add(2, 2)

def func2():
    r = add.s(2, 2)           # build a signature
    print(type(r))            # inspect the signature type
    rv = r.delay()            # send it to the worker
    out = rv.get(timeout=5)
    print(out)
    out = rv.ready()
    print(out)

A signature looks like a partial, but it is also a serializable wrapper around a task invocation; its purpose is to build more complex task structures.
The following composite structures are supported:
group, chain, chord, map, starmap, and chunks
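
For example, a chain passes each task's result to the next signature in line (a sketch using the tasks from section 3; the partial mul.s(10) receives add's result as its first argument):

>>> from celery import chain
>>> chain(add.s(2, 2), mul.s(10)).delay().get()   # add(2, 2) -> 4, then mul(4, 10)
40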

Take group as an example:

>>> from celery import group
>>> g = group(add.s(i) for i in range(10))
>>> g(10).get()
[10, 11, 12, 13, 14, 15, 16, 17, 18, 19]
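
A chord runs a group and then feeds the list of results to a callback task (a sketch, again assuming the tasks from section 3):

>>> from celery import chord
>>> chord((add.s(i, i) for i in range(10)), xsum.s())().get()
90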

Origin: www.cnblogs.com/wodeboke-y/p/11600951.html