Django cache, signals, and high-concurrency issues

Advanced Django caching

Django's six kinds of cache backends:

Development/debugging (dummy cache)

Local memory

File-based

Database

Memcached (python-memcached module)

Memcached (pylibmc module)
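As a reference, a minimal CACHES entry for the file-based backend might look like this (the cache directory path is an arbitrary example):

# settings.py: file-based cache (directory path is an example)
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': '/var/tmp/django_cache',
    }
}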

Page caching

1. Caching a view with the @cache_page decorator

The @cache_page decorator accepts the cache timeout, the cache alias to use, and a key prefix:

from django.views.decorators.cache import cache_page
from django.http import HttpResponse
import random
import string

@cache_page(timeout=10, cache='html', key_prefix='page')
def list(request):
    # verify login (omitted)
    chrs = string.ascii_letters
    char = random.choice(chrs)
    return HttpResponse('User list page: <br> %s' % char)
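The same decorator can also be applied in the URLconf instead of on the view itself; a sketch (URL pattern and import path are illustrative):

# urls.py: decorating the view at URL-registration time
from django.urls import path
from django.views.decorators.cache import cache_page
from user import views  # illustrative import path

urlpatterns = [
    path('user/list/', cache_page(10)(views.list)),
]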

2. Caching middleware: create a cache middleware that, on each request, first checks whether a cached copy of the page exists

from django.core.cache import cache
from django.http import HttpResponse
from django.utils.deprecation import MiddlewareMixin

class CachePageMiddleware(MiddlewareMixin):

    # paths whose pages should be cached
    cache_page_path = [
        '/user/list/'
    ]

    # implement process_request and process_response, keyed on request.path
    def process_request(self, request):
        # does the current request support caching?
        if request.path in self.cache_page_path:
            # has the page already been cached?
            if cache.has_key(request.path):
                return HttpResponse(cache.get(request.path))

    def process_response(self, request, response):
        # should the current request path be cached?
        if request.path in self.cache_page_path:
            # cache the response content under the request path, with a timeout
            cache.set(request.path,
                      response.content, timeout=5)

        return response
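For the middleware to run, it must be registered in settings; the module path below is hypothetical:

# settings.py ('myapp.middleware' is a hypothetical module path)
MIDDLEWARE = [
    # ... Django's default middleware ...
    'myapp.middleware.CachePageMiddleware',
]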

Caching with Redis

Configure the CACHES entry in settings.py:

CACHES = {

    # default cache
    'default': {

        # cache backend: django_redis.cache.RedisCache
        'BACKEND': 'django_redis.cache.RedisCache',

        # cache location (Redis URL)
        'LOCATION': 'redis://127.0.0.1:6379/10',

        # client class and socket/connection timeouts
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
            'SOCKET_CONNECT_TIMEOUT': 10,
            'SOCKET_TIMEOUT': 10
        }
    }

}
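With this in place, the low-level cache API reads and writes Redis database 10 transparently; a quick sanity check:

from django.core.cache import cache

cache.set('greeting', 'hello', timeout=30)  # written to Redis DB 10
print(cache.get('greeting'))                # 'hello'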

Using the cache as the session backend

# Session engine: store sessions in the cache
SESSION_ENGINE = 'django.contrib.sessions.backends.cache'

# Session cookie name
SESSION_COOKIE_NAME = 'SESSION_ID'

# Session cookie path
SESSION_COOKIE_PATH = '/'

# Which cache alias to use
SESSION_CACHE_ALIAS = 'default'

# Session lifetime in seconds (two weeks)
SESSION_COOKIE_AGE = 1209600
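Once configured, every session read and write goes through the cache; a minimal hypothetical view:

from django.http import HttpResponse

def visit(request):
    # hypothetical view: count visits in the cached session
    count = request.session.get('visits', 0) + 1
    request.session['visits'] = count
    return HttpResponse('visit #%d' % count)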

Signal mechanism

Listen for Django's internal events

Decouple complex business logic

Built-in signals

Method 1: register a handler for a built-in signal with the signal's connect() function

from django.db.models.signals import pre_delete

def model_delete_pre(sender, **kwargs):
    from user.models import Order

    if sender == Order:
        print('an Order is about to be deleted')

pre_delete.connect(model_delete_pre)

Method 2: connect the handler with the @receiver decorator

from django.dispatch import receiver
from django.db.models.signals import post_delete

@receiver(post_delete)
def delete_model_post(sender, **kwargs):
    print(sender, kwargs)

# @receiver already calls connect(), so no explicit post_delete.connect() is needed

Custom signals

Create a signals package and declare the signal in its __init__.py:

from django import dispatch

# providing_args is documentation-only (deprecated in Django 3.0, removed in 4.0)
codeSignal = dispatch.Signal(providing_args=['path', 'phone', 'code'])

Sending the signal

Call send() wherever the business logic needs to emit the event:

from django.http import HttpResponse
from signals import codeSignal

def new_code(request):

    # generate an SMS verification code (code_new_code_str is a project helper)
    code_text = code_new_code_str(4)
    print(code_text)

    phone = request.GET.get('phone')
    print(phone)

    # send the signal
    codeSignal.send('new_code',
                    path=request.path,
                    phone=phone,
                    code=code_text)

    return HttpResponse('%s...' % phone)

Receiving the signal

from signals import codeSignal
from django import dispatch

@dispatch.receiver(codeSignal)
def cache_code(sender, **kwargs):
    print('...')
    print(sender, kwargs)
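Note that a receiver only fires if its module has been imported. A common pattern is to import the handlers in AppConfig.ready(); the app name and handler module below are hypothetical:

# user/apps.py (app name and handler module are hypothetical)
from django.apps import AppConfig

class UserConfig(AppConfig):
    name = 'user'

    def ready(self):
        # importing the module registers its @receiver handlers
        import user.handlers  # noqa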

HttpResponse subclasses

JsonResponse

HttpResponseRedirect

HttpResponseNotAllowed

HttpResponseGone
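For example, JsonResponse serializes a dict to JSON and sets the Content-Type header; a minimal sketch (view name is illustrative):

from django.http import JsonResponse

def api_status(request):
    # hypothetical endpoint returning JSON
    return JsonResponse({'status': 'ok', 'count': 3})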

Django middleware hook methods

process_request()

process_response()

process_view()

process_exception()

process_template_response()
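A pass-through skeleton showing where each hook sits in the request/response cycle (illustrative only):

from django.utils.deprecation import MiddlewareMixin

class DemoMiddleware(MiddlewareMixin):

    def process_request(self, request):
        pass  # before URL resolution; return a response to short-circuit

    def process_view(self, request, view_func, view_args, view_kwargs):
        pass  # after URL resolution, before the view runs

    def process_exception(self, request, exception):
        pass  # only when the view raises an exception

    def process_template_response(self, request, response):
        return response  # only for TemplateResponse objects

    def process_response(self, request, response):
        return response  # on the way out, for every response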

Paginator Page attributes and methods

number

object_list

has_previous()

has_next()

previous_page_number()

next_page_number()
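A short usage sketch of Django's Paginator with dummy data:

from django.core.paginator import Paginator

items = ['a', 'b', 'c', 'd', 'e']   # dummy data
paginator = Paginator(items, per_page=2)

page = paginator.page(2)
print(page.number)                  # 2
print(page.object_list)             # ['c', 'd']
print(page.has_previous())          # True
print(page.has_next())              # True
print(page.next_page_number())      # 3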

High concurrency solutions

Use a Celery + Redis task queue

Celery helps with the C10K problem: it handles high concurrency by offloading work to a message broker and background task execution units.

Celery components

Message broker (the middleware)

Task execution units (workers)

Result store (the backend that stores task results)

Configuration

Documentation: http://docs.celeryproject.org/en/latest/django/index.html

Newer Celery versions do not support the multiprocessing pool on Windows, so the worker has to run with a coroutine pool (e.g. gevent).
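The gevent pool requires the gevent package, which can be installed first:

pip3 install gevent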

Create celery.py in the project's main package (next to settings.py):

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'advicedjango.settings')

app = Celery('advicedjango',
             broker='redis://10.36.174.1:6379/8')
app.config_from_object('django.conf:settings')

# auto-discover tasks.py modules in all installed apps
app.autodiscover_tasks()

In the project package's main __init__.py, expose the app so it loads with Django:

from .celery import app as celery_app

__all__ = ('celery_app',)

In the application module, create a tasks module (tasks.py):

import time
from celery import shared_task

@shared_task
def qbuy(id, name):
    print(id, name)
    time.sleep(1)
    return '%s,%s' % (id, name)

In settings.py, register the task module:

CELERY_IMPORTS = ('stockapp.tasks',)

Start celery

celery -A advicedjango worker -P gevent -l info

-A specifies the project, -P specifies the execution pool (gevent coroutines), -l sets the log level.

Dispatch the task to the queue (calling the function directly would run it synchronously; .delay() sends it to the worker):

qbuy.delay('1991', 3)

The Celery worker receives and executes the task, and its output appears in the worker log.

However, a warning is raised about the result, because no result backend has been configured to store the task's return value.

pip3 install django-celery-results

Configure the result backend in settings.py:

Add 'django_celery_results' to INSTALLED_APPS

Add CELERY_RESULT_BACKEND = 'django-db' (or 'django-cache')
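Concretely, the two additions in settings.py:

# settings.py
INSTALLED_APPS = [
    # ...
    'django_celery_results',
]

# store task results in the database (or use 'django-cache')
CELERY_RESULT_BACKEND = 'django-db'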

Run the migration to create the result tables:

python manage.py migrate django_celery_results

Restart Celery once the migration succeeds.


Source: blog.51cto.com/14418623/2435616