[Django] File upload with Celery

File uploads, asynchronous tasks, initialization scripts

 Interview question:
How do you handle static files (JS/CSS/images) in a project? Ways to optimize inside your own server room:
   1. Nginx/lighttpd (lighty, Douban's fork): Nginx serves static assets very fast and has built-in caching.
   2. Port 80: Nginx -> {server config} -> django:8080 -> static folders -> cloud storage
 
Do you know CDNs? Can you explain how they work?

Follow-up: how do you replace an image already cached on a CDN? The renaming trick.

   1. Append a parameter to the request URL:
      
      1. abc.jpg
      2. abc.jpg?20191011111042asdflj2
      3. abc.jpg?20191011111043
      
   2. Rename the file:
      1. abc_201910111042.jpg
      2. abc_md5.jpg
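
Either trick is easy to generate in code. A minimal sketch in plain Python (the function names are illustrative):

import hashlib
import time

def busted_url(base_url, filename):
    # Approach 1: a changing query parameter makes the CDN fetch a
    # fresh copy instead of serving the stale cached one.
    return f'{base_url}/{filename}?{int(time.time())}'

def md5_named(path):
    # Approach 2: bake the file content's MD5 into the name, so a
    # changed image always gets a brand-new URL (and CDN cache entry).
    with open(path, 'rb') as f:
        digest = hashlib.md5(f.read()).hexdigest()
    stem, dot, ext = path.rpartition('.')
    return f'{stem}_{digest}{dot}{ext}'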

Qiniu Cloud integration

  1. Sign up for a Qiniu Cloud account
  2. Create a storage space (Bucket) -> it gets its own domain through which it can be accessed
  3. Collect the relevant configuration
    • AccessKey: obtained in Personal Center -> Key Management
    • SecretKey: obtained in Personal Center -> Key Management
    • Bucket_name: the name of the storage space you created
    • Bucket_URL: the access URL of the storage space you created
  4. Install the Qiniu SDK: pip install qiniu
  5. Wrap the upload/download calls following the SDK's interface documentation
  6. Package those wrappers as asynchronous tasks where needed
  7. Processing flows
    1. Server-side upload
      1. The user uploads the picture to our server
      2. Our program then calls the Qiniu Cloud API and uploads the picture to Qiniu
      3. After a successful upload, build the avatar's URL:
        1. Qiniu Bucket_URL / filename; store this URL in the database as the avatar
    2. Direct client upload
      1. The client uploads the picture directly to the cloud service
      2. The client then tells the server the picture's address, and the server updates the database
      3. Handing credentials to the client is a security risk, so in practice (sketched below):
        1. The client first obtains an upload token from the server, then uploads
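
The token-issuing step in flow 7.2 can be sketched with the same SDK used below (the view name is illustrative; render_json is this project's own JSON helper; the credentials come from step 3):

from qiniu import Auth

def avatar_upload_token(request):
    # The server signs a short-lived upload token; the client uploads
    # to Qiniu directly with it, so the SecretKey never leaves the server.
    q = Auth(access_key, secret_key)
    key = f'avatar-{request.user.id}.jpg'
    token = q.upload_token(bucket_name, key, 3600)  # valid for one hour
    return render_json({'token': token, 'key': key})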

Code example using Celery

from django.conf import settings
from qiniu import Auth, etag, put_file

from user.models import User   # adjust to wherever your User model lives
from worker import celery_app  # the Celery instance created below
# render_json (this project's JSON response helper) is assumed to be in scope

# Upload an avatar image
def user_avatar(request):
    # 1. Save the upload locally first
    # Filename to save the upload under
    file_name = f'avatar-{request.user.id}.jpg'
    # Path to save the upload to
    file_path = f'{settings.BASE_DIR}/static/{file_name}'
    # Receive the uploaded file content
    f = request.FILES['avatar']

    with open(file_path, 'wb+') as destination:
        for chunk in f.chunks():
            destination.write(chunk)
    print('save local ok.')
    user_id = request.user.id
    # delay() is essential: it enqueues the task instead of running it inline
    upload_qiniu.delay(file_name, file_path, user_id)
    return render_json('queued in the celery-redis queue')

# This step runs asynchronously via Celery
@celery_app.task
def upload_qiniu(file_name, file_path, user_id):
    # 2. Upload through the Qiniu Cloud SDK
    # Replace these with your own Access Key and Secret Key
    access_key = 'KXfx2ZiBP311HkZZ8l8JHCmqlqPTJCK2sraihexx'
    secret_key = 'pe5svTTUAJQoUPhppsf0Gg9wbEJiYahnRMAy1rxx'
    # Bucket to upload to
    bucket_name = 'liu'
    # Domain of the bucket
    bucket_domain = 'liu.s3-cn-south-1.qiniucs.com'
    # Build the authentication object
    q = Auth(access_key, secret_key)
    token = q.upload_token(bucket_name, file_name, 3600)
    ret, info = put_file(token, file_name, file_path)
    print(info)
    # Sanity-check the upload result
    assert ret['key'] == file_name
    assert ret['hash'] == etag(file_path)
    print('save qiniu ok.')
    avatar_url = f'{bucket_domain}/{file_name}'
    # 3. Update the user's avatar
    user = User.objects.get(id=user_id)
    # request is not serializable, so pass user_id instead of request.user
    user.avatar = avatar_url
    user.save()

    return True
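
With a result backend configured (see the configuration further down), the caller can also poll the task's outcome through Celery's standard AsyncResult API, for example:

result = upload_qiniu.delay(file_name, file_path, user_id)
print(result.id)               # task id; can be stored and looked up later
print(result.ready())          # False while the worker is still busy
print(result.get(timeout=10))  # blocks until done; returns True here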

Asynchronous tasks

Celery and asynchronous task processing

  1. The principle behind any asynchronous framework

    1. At the core: a message queue

    2. The two ends of the queue go by several paired names:

    3. Producer: Consumer

    4. Publisher: Subscriber

    5. Sender: Receiver

      [Figure: Celery architecture diagram - tasks flow from the task module through the broker to workers, with results stored in the backend]

    • Task module (Task): contains both asynchronous and periodic tasks. Asynchronous tasks are usually triggered by business logic and sent straight to the task queue, while periodic tasks are sent to the task queue by the Celery Beat process on a schedule.

    • Message middleware (Broker): the broker is the task-scheduling queue. It receives messages (i.e. tasks) from producers, and its key job is to buffer that mass of messages and put the tasks into the queue. Celery provides no queue service itself; the official recommendation is RabbitMQ, Redis, etc., or simply a cloud message-queuing service such as SQS.

    • Task execution unit (Worker): the worker is the unit that actually executes tasks. It monitors the message queue in real time, fetches scheduled tasks from the queue, and runs them.

    • Task result store (Backend): the backend stores task execution results so they can be queried. As with the message middleware, the store can be RabbitMQ, Redis, MongoDB, and so on.
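
The producer/consumer principle itself needs nothing Celery-specific; a toy illustration with Python's standard library (not how Celery is implemented, just the idea):

import queue
import threading

q = queue.Queue()  # the message queue at the core

def producer():
    for i in range(5):
        q.put(f'task-{i}')  # business logic enqueues work and moves on

def consumer():
    while True:
        task = q.get()      # the worker blocks until work arrives
        print('processing', task)
        q.task_done()

threading.Thread(target=consumer, daemon=True).start()
producer()
q.join()  # wait until every enqueued task has been processed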

  2. Message queues:

    1. MQ: Message Queue. Like servers, storage, caches, and databases, it is a basic building block of modern Internet back-end architecture.
    2. What it is used for:
      1. Asynchrony: work that need not run synchronously is parked in the message queue and executed later by other programs, spreading tasks across machines.
      2. Decoupling: dependencies between applications are loosened by going through the queue.
        1. Services A and B are strongly coupled
        2. With a queue in between: A -> queue -> B
      3. Peak shaving: when Internet traffic spikes, buffer the requests and work through them one by one.
        1. Flash-sale and panic-buying scenarios flatten request peaks this way.
      4. Rate limiting (see the sketch after this list):
        1. Start a counter at 1000; -1 for every person arriving, +1 for every person leaving.
        2. When it drops to 0, tell newcomers: we're full.
    3. Common message queues:
      1. Redis: the simplest
      2. Kafka: best suited to log processing; hundreds of thousands of requests per second on a single machine
      3. RabbitMQ: a standards-compliant MQ (tens of thousands per second)
      4. RocketMQ: a standards-compliant MQ (hundreds of thousands per second)
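
A hedged sketch of that counter-based limiter using Redis (the key name and capacity are illustrative):

import redis

r = redis.Redis()
SLOTS = 'room:slots'  # hypothetical key
r.set(SLOTS, 1000)    # capacity: 1000 concurrent users

def enter():
    # DECR is atomic, so concurrent arrivals cannot oversubscribe
    if r.decr(SLOTS) < 0:
        r.incr(SLOTS)  # give the slot back
        return False   # tell the newcomer: we're full
    return True

def leave():
    r.incr(SLOTS)      # free a slot on the way out
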
  3. Installation. Note: Celery 4.3's default worker pool does not run on Windows 10, so an extra package (eventlet, used via -P eventlet in the start command below) is needed there.

    pip install celery[redis]
    
  4. Create the app instance

    import os
    from celery import Celery

    from worker import config

    # Point the Celery worker process at the Django settings module
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "social.settings")

    celery_app = Celery('social')
    celery_app.config_from_object(config)  # load the configuration below
    celery_app.autodiscover_tasks()        # find tasks.py in installed apps

  5. General Configuration

    broker_url = 'redis://127.0.0.1:6379/0'
    broker_pool_limit = 1000  # broker connection pool size; the default is 10

    timezone = 'Asia/Shanghai'
    accept_content = ['pickle', 'json']

    task_serializer = 'pickle'
    result_expires = 3600  # task results expire after one hour

    result_backend = 'redis://127.0.0.1:6379/1'
    result_serializer = 'pickle'
    result_cache_max = 10000  # maximum number of cached task results

    worker_redirect_stdouts_level = 'INFO'
    
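Celery Beat, mentioned above for periodic tasks, reads its schedule from this same configuration. A hedged sketch (the task path and timing are illustrative; the task must exist as a @celery_app.task):

    from celery.schedules import crontab

    beat_schedule = {
        'clean-stale-avatars-nightly': {
            'task': 'user.tasks.clean_stale_avatars',  # hypothetical task path
            'schedule': crontab(hour=3, minute=0),     # every day at 03:00
        },
    }

The beat process is started separately from the worker, e.g. celery beat -A worker --loglevel=info.
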
  6. Start the Worker

On Linux / macOS:
celery worker -A worker --loglevel=info

On Windows (hence the eventlet install above):
celery worker -A worker --loglevel=info -P eventlet
