Django Caching

Disclaimer: This is an original article by the blogger, licensed under the CC 4.0 BY-SA agreement. Please include the original source link and this statement when reproducing it.
Original link: https://blog.csdn.net/weixin_39726347/article/details/88035356

First, setting up the cache

Django supports database, file-system, and in-memory caching. The first step is usually to configure it: Django's cache settings live in the CACHES configuration item in settings.py.

Django supports the following cache backends:

1. Memcached

Memcached is the caching system natively supported by Django; it is fast and efficient. Memcached is a memory-based caching service, originally developed to solve the load problems of LiveJournal.com and later open-sourced by Danga. Large sites such as Facebook and Wikipedia use it to reduce the number of database accesses and significantly improve site performance.

Memcached runs as a daemon and is allocated a dedicated block of memory. Its main job is to provide a fast interface for adding, looking up, and deleting entries in that buffer. All data is stored directly in memory, so it cannot replace a database or a file system. If you are already familiar with caches, these points are easy to understand.

If you are new to this, be clear about the following:

  • Memcached is not bundled with Django; it is standalone software that you must install, configure, and start as a service;
  • After installing Memcached, you also need to install a Python client library for it; the most commonly used are python-memcached and pylibmc (see the install sketch just below this list);
  • Only once both of the above are in place can you configure it in Django.
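
For example, the client library can be installed with pip (Memcached itself is installed and started through your operating system's package manager or service manager):

pip install python-memcached
# or
pip install pylibmc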

Configuration:

  • Depending on which Python client library you installed, set BACKEND in CACHES to either django.core.cache.backends.memcached.MemcachedCache (for python-memcached) or django.core.cache.backends.memcached.PyLibMCCache (for pylibmc).
  • Set LOCATION to the host IP and port of your Memcached daemon as an 'ip:port' string, or to 'unix:path' when connecting through a Unix socket.

The following reference example assumes Memcached is running on localhost (127.0.0.1), port 11211, and uses the python-memcached library:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
    }
}

The following example assumes Memcached is listening on the local Unix socket /tmp/memcached.sock and uses python-memcached:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': 'unix:/tmp/memcached.sock',
    }
}


The following example uses /tmp/memcached.sock without the unix:/ prefix, which is how the pylibmc library expects it:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.PyLibMCCache',
        'LOCATION': '/tmp/memcached.sock',
    }
}

Memcached supports running as a distributed service across several machines at once; just add their addresses to LOCATION as a list:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': [
            '172.19.26.240:11211',
            '172.19.26.242:21423',
            '172.19.26.244:11213',
        ]
    }
}

Memory-based cache systems have a significant drawback: data is lost when power is lost. Memcached in particular does not support persistence, so be sure to pay attention to data safety.

In fact, nowadays, with Redis flourishing, Redis is the usual choice of cache; it also supports persistence.

2. Database Cache

The big reason we use a cache is to reduce database operations, so caching into the database itself rather defeats the purpose...

So try not to use the database-backed cache. It will not be described in detail here; a simple configuration example follows:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'my_cache_table',
    }
}
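
Note that, unlike the other backends, the database cache requires its table (named by LOCATION, here my_cache_table) to be created before use. Django provides a management command for this:

python manage.py createcachetable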

Redis Cache

  • Required software: the Redis server and the django-redis module
  • To start using it:
  • Install: pip install django-redis
  • Configure in settings.py:
CACHES = {
    "default": {
        "BACKEND": "django_redis.cache.RedisCache",
        "LOCATION": "redis://127.0.0.1:6379/1",
        "OPTIONS": {
           "CLIENT_CLASS": "django_redis.client.DefaultClient",
        }
    }
}
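
With a backend configured (Redis here, but the same applies to any of the others), the low-level cache API can be used anywhere in your code. A minimal sketch, where the key names and values are arbitrary examples:

from django.core.cache import cache

# store a value for 30 seconds, then read it back
cache.set('greeting', 'hello', timeout=30)
cache.get('greeting')             # 'hello'
cache.get('missing', 'fallback')  # 'fallback' is returned when the key is absent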

That said, the database cache does still have its uses in some scenarios, for example when you have a fast database with efficient indexing.

3. File System Cache

Think the database cache is slow? The file-system-based cache is even slower! But when you have neither Redis nor Memcached at hand, and the database is not an option either, you can still make do with it. Here are two configuration examples:

Unix-based systems:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': '/var/tmp/django_cache',
    }
}

On a Windows operating system, the drive letter must be included in the path:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': 'c:/foo/bar',
    }
}

 

 

4. Local Memory Cache

If your local machine has enough memory and it is fast enough, you can also use that memory directly as the cache. The configuration is as follows:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
        'LOCATION': 'unique-snowflake',
    }
}
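
This local-memory backend is also what Django uses by default when no CACHES setting is configured. Note that it is per-process: each server process keeps its own private cache, and the LOCATION string simply names an individual in-memory store so that several local caches can be kept separate.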

 

5. The Dummy Cache for Development

Django has thoughtfully designed a cache for development. It is for the situation where your production environment uses a large caching system that you cannot (or don't want to) run on your development machine, because it is such a big, heavy thing. Yet during actual development your code still has to go through the cache and call the caching API. In that case, the development cache makes things effortless: it accepts the same API calls but does not actually cache anything.

Configuration is as follows:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.dummy.DummyCache',
    }
}
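
To make the effect concrete: with DummyCache the cache API still accepts every call, it just never stores anything. A small illustration (the key names are arbitrary):

from django.core.cache import cache

cache.set('key', 'value')       # accepted, but silently discarded
cache.get('key')                # None
cache.get('key', 'fallback')    # 'fallback'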

6. Custom Cache Backend

The most advanced option, of course, is to use a caching system you have developed yourself. Django supports this, provided you are able to write one! The configuration is very simple:

CACHES = {
    'default': {
        'BACKEND': 'path.to.backend',
    }
}
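
Here 'path.to.backend' is the dotted Python path to your backend class. As a rough illustration only, a minimal backend can subclass Django's BaseCache; the sketch below lives in a hypothetical module myapp/cache.py, ignores timeouts and culling, and is nowhere near production-ready:

from django.core.cache.backends.base import DEFAULT_TIMEOUT, BaseCache


class SimpleDictCache(BaseCache):
    """A toy backend that stores everything in a plain in-process dict."""

    def __init__(self, location, params):
        super().__init__(params)
        self._store = {}

    def add(self, key, value, timeout=DEFAULT_TIMEOUT, version=None):
        # Only store the value if the key is not already present.
        key = self.make_key(key, version)
        if key in self._store:
            return False
        self._store[key] = value
        return True

    def get(self, key, default=None, version=None):
        return self._store.get(self.make_key(key, version), default)

    def set(self, key, value, timeout=DEFAULT_TIMEOUT, version=None):
        self._store[self.make_key(key, version)] = value

    def delete(self, key, version=None):
        return self._store.pop(self.make_key(key, version), None) is not None

With that in place, BACKEND would point to something like 'myapp.cache.SimpleDictCache'.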

 

7. Cache Parameters

Each of the cache backends above can be given some extra parameters to control the cache's behavior. The following parameters can be set:

  • TIMEOUT

The default cache expiration time, in seconds. The default is 300 seconds; None means entries never expire, and 0 makes entries expire immediately (which defeats the purpose of caching).

  • OPTIONS

Extra options passed to the cache backend; which options are meaningful differs from backend to backend.

  • KEY_PREFIX

A string prefix automatically prepended to all cache keys used by this Django server.

  • VERSION

The default version number for cache keys generated by this Django server.

  • KEY_FUNCTION

A string containing the dotted path to a function that defines how the prefix, the version, and the key are combined into the final cache key (see the sketch below).
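
For illustration, a key function equivalent to Django's default behavior could look like the following; the dotted path you would then put in KEY_FUNCTION (for example 'myapp.cache.make_key') is only a hypothetical placement:

def make_key(key, key_prefix, version):
    # Combine prefix, version and key into the final cache key,
    # mirroring Django's default '<prefix>:<version>:<key>' scheme.
    return '%s:%s:%s' % (key_prefix, version, key)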

The following example configures a file-system-based cache backend with an expiration time of 60 seconds and a maximum of 1,000 entries:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.filebased.FileBasedCache',
        'LOCATION': '/var/tmp/django_cache',
        'TIMEOUT': 60,
        'OPTIONS': {
            'MAX_ENTRIES': 1000
        }
    }
}

The following example configures a backend based on the python-memcached library, with the maximum object size limited to 2 MB:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',
        'LOCATION': '127.0.0.1:11211',
        'OPTIONS': {
            'server_max_value_length': 1024 * 1024 * 2,
        }
    }
}

The following configures a backend based on the pylibmc library, with the binary protocol, SASL authentication, and ketama behavior enabled:

CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.PyLibMCCache',
        'LOCATION': '127.0.0.1:11211',
        'OPTIONS': {
            'binary': True,
            'username': 'user',
            'password': 'pass',
            'behaviors': {
                'ketama': True,
            }
        }
    }
}

Second, site-wide caching

The simplest way to use the caching system is to cache the entire site.

This requires adding 'django.middleware.cache.UpdateCacheMiddleware' and 'django.middleware.cache.FetchFromCacheMiddleware' to the MIDDLEWARE setting, as follows:

MIDDLEWARE = [
    'django.middleware.cache.UpdateCacheMiddleware',
    'django.middleware.common.CommonMiddleware',
    'django.middleware.cache.FetchFromCacheMiddleware',
]

Note: UpdateCacheMiddleware must be placed at the beginning of the list and FetchFromCacheMiddleware at the end. This is how Django applies middleware: the two are order-dependent.

Then add the following required parameters to your settings file:

CACHE_MIDDLEWARE_ALIAS: the alias of the cache to use for storage
CACHE_MIDDLEWARE_SECONDS: how many seconds each page should be cached
CACHE_MIDDLEWARE_KEY_PREFIX: the cache key prefix
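
A minimal sketch of how these might look in settings.py (the values are arbitrary examples):

CACHE_MIDDLEWARE_ALIAS = 'default'      # which CACHES entry to use
CACHE_MIDDLEWARE_SECONDS = 600          # cache each page for 10 minutes
CACHE_MIDDLEWARE_KEY_PREFIX = 'mysite'  # e.g. the site name, to avoid key collisions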

Third, caching views

Another, more granular approach is to cache the output of individual views. django.views.decorators.cache defines a decorator, cache_page, that automatically caches the response of a view, and it is very simple to use:

from django.views.decorators.cache import cache_page

@cache_page(60 * 15)
def my_view(request):
    ...

cache_page takes one argument: the timeout, in seconds. In the example above, the result of the my_view() view will be cached for 15 minutes (written as 60 * 15 for readability).

Like the site-wide cache, the view cache is keyed by the URL. If multiple URLs point to the same view, each URL is cached separately. Continuing the my_view example, suppose the URLconf looks like this:

urlpatterns = [
    url(r'^foo/([0-9]{1,2})/$', my_view),
]

Then requests to /foo/23/ and /foo/1/ are cached separately. But once a specific URL (for example /foo/23/) has been requested, subsequent requests to that URL will reuse the cached content.

cache_page also accepts additional keyword arguments, such as cache, which specifies exactly which cache backend to use:

@cache_page(60 * 15, cache="special_cache")
def my_view(request):
    ...

The optional keyword argument key_prefix can also be used to give each view its own cache key prefix, as follows:

@cache_page(60 * 15, key_prefix="site1")
def my_view(request):
    ...

Fourth, caching template fragments

We can also use the cache template tag to cache a fragment of a template. To use this tag, you must first add {% load cache %} near the top of the template.

The {% cache %} template tag caches the contents of the enclosed block for the specified time. It requires at least two arguments: the cache time (in seconds) and a name for the cached fragment, like this:

{% load cache %}
{% cache 500 sidebar %}
    .. sidebar ..
{% endcache %}

You can also cache multiple versions of a fragment based on dynamic content inside it. For example, the site can cache a separate version of the sidebar for each user. Just pass an extra argument to the {% cache %} tag to uniquely identify each cached fragment, as follows:

{% load cache %}
{% cache 500 sidebar request.user.username %}
    .. sidebar for logged in user ..
{% endcache %}

The cache timeout argument can be a template variable, as long as the variable resolves to an integer value. For example, if the template variable my_timeout is set to 600, the following two examples are equivalent:

{% cache 600 sidebar %} ... {% endcache %}
{% cache my_timeout sidebar %} ... {% endcache %}

Source: www.cnblogs.com/AbnerLc/p/11946191.html