Three key factors that determine cache performance

Caching is an important architectural technique: it is relatively simple to implement, yet it has a significant effect on performance, so it is used almost everywhere. When using a cache, pay attention to three key factors that determine how effective and efficient the cache will be:

1. The size of the cache key set

2. The amount of memory available to the cache

3. The cache object lifetime

Reading this article will take about five minutes and will help you improve your cache hit rate.

 

What is the cache hit rate?
The main characteristic of a cache is write once, read many: by serving as many reads as possible from the cache, it reduces load on the database and improves performance. So whether a cache is effective mainly depends on whether data written to it once can be read out repeatedly to serve business requests. This metric is called the cache hit rate. How is it calculated? Divide the number of queries that got the correct result from the cache by the total number of queries; the resulting ratio is the cache hit rate. For example, if nine out of ten queries get the right result from the cache, the hit rate is 90%.
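The calculation above can be sketched in a few lines of Python; the function name `cache_hit_rate` is my own, chosen for illustration:

```python
def cache_hit_rate(hits: int, total_queries: int) -> float:
    """Fraction of queries answered correctly from the cache."""
    if total_queries == 0:
        return 0.0  # no traffic yet, so no meaningful hit rate
    return hits / total_queries

# Nine of ten queries were served from the cache:
print(cache_hit_rate(9, 10))  # 0.9, i.e. a 90% hit rate
```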

Three main factors affect the cache hit rate: the size of the cache key set, the amount of available memory, and the cache object lifetime.

 

1. The size of the cache key set
Every cached object is identified by a cache key. For example, in a key-value pair where the key is the string abc and the value is the string hello, abc is the cache key. The key is the unique identifier within the cache, and the only way to locate an object is an exact match on its cache key.

For example, if we want to cache information about each product in an online store, we use the product ID as the cache key. In other words, the cache key space is the set of all keys your application can generate. Statistically speaking, the more unique keys an application generates, the lower the chance that any given key is reused. For example, caching weather data by IP address could require more than 4 billion keys, but caching weather data by country needs only a few hundred keys, since there are only a couple of hundred countries in the world.

So minimize the number of cache keys: the fewer keys there are, the more efficient the cache. When designing a cache, focus on how the key set is defined, and keep its range as small as possible while still being useful; that is when cache performance is at its best.
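The weather example above can be made concrete. The two key functions below are hypothetical, written only to contrast a fine-grained key space with a coarse-grained one:

```python
def weather_key_by_ip(ip: str) -> str:
    # Fine-grained: up to ~4 billion distinct IPv4 keys,
    # so each key is rarely reused and the hit rate suffers.
    return f"weather:ip:{ip}"

def weather_key_by_country(country_code: str) -> str:
    # Coarse-grained: only a few hundred possible keys,
    # so each key is reused often and the hit rate improves.
    return f"weather:country:{country_code}"

print(weather_key_by_ip("203.0.113.7"))      # weather:ip:203.0.113.7
print(weather_key_by_country("US"))          # weather:country:US
```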

 

2. The amount of memory available to the cache
The memory available to the cache, together with the average size of a cached object, determines how many objects can be cached. Because a cache is typically stored in memory, the space available for cached objects is relatively expensive and strictly limited.

If you want to cache more objects than fit, you must delete old objects to add new ones, and deleting those old objects lowers the cache hit rate. Therefore, the larger the cache's physical space, the more objects it can hold and the higher the cache hit rate.

 

3. Cache object lifetime
The lifetime of a cache object is called its TTL (time to live).

The longer an object stays in the cache, the higher the chance it will be reused. A cache entry can be invalidated in two ways:

1) Timeout expiration

With timeout expiration, a timeout period is set for each cache object when it is written to the cache. Before the timeout, accessing the cache returns the cached data; once the timeout elapses, the entry becomes invalid, and accessing it returns nothing.
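A minimal sketch of timeout expiration, assuming an in-process dictionary as the store (real caches like Redis handle TTL for you; the class name `TTLCache` here is illustrative):

```python
import time

class TTLCache:
    """Minimal timeout-expiration cache: each entry stores its own deadline."""

    def __init__(self):
        self._store = {}  # key -> (value, expires_at)

    def set(self, key, value, ttl_seconds):
        # Record the deadline alongside the value when writing.
        self._store[key] = (value, time.monotonic() + ttl_seconds)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None                      # never cached
        value, expires_at = entry
        if time.monotonic() >= expires_at:   # timed out: entry is invalid
            del self._store[key]
            return None
        return value

cache = TTLCache()
cache.set("abc", "hello", ttl_seconds=0.05)
print(cache.get("abc"))   # hello (accessed before the timeout)
time.sleep(0.06)
print(cache.get("abc"))   # None (the timeout has elapsed)
```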

2) Real-time invalidation

With real-time invalidation, when a cached object's data is updated, the application directly tells the cache to clear the stale entry. After it is cleared, the next access to that cache key finds nothing in the cache, so the application has to read from the database, and this time it gets the latest data, because updates are always written to the database.
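This update-then-invalidate flow can be sketched with two plain dictionaries standing in for the real database and cache; the function names `read` and `update` are my own:

```python
database = {"product:1": "old description"}
cache = {}

def read(key):
    # Read-through: serve from the cache, fall back to the database
    # and populate the cache on a miss.
    if key in cache:
        return cache[key]
    value = database[key]
    cache[key] = value
    return value

def update(key, value):
    # The update always goes to the database first,
    # then the stale cache entry is cleared immediately.
    database[key] = value
    cache.pop(key, None)

print(read("product:1"))          # old description (now cached)
update("product:1", "new description")
print(read("product:1"))          # new description (miss forced a DB read)
```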

There is also a third case: an entry has not expired, but a new object needs to be written to the cache and memory is insufficient. In that case some old cache objects must be cleaned out to make room for the new one.

The main algorithm used to reclaim memory is the LRU (Least Recently Used) algorithm. It evicts the objects that have gone unaccessed for the longest time, and it is implemented with a linked-list structure: all cache objects are placed on one list, and whenever an object is accessed it is moved to the head of the list. When the LRU algorithm needs to evict, it simply searches from the tail of the list; the closer an object is to the tail, the longer it has gone without being accessed, so those objects are cleared first to free memory for new objects to join.
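The list-based scheme described above can be sketched with Python's `collections.OrderedDict`, which is backed by a doubly linked list, so "move to head on access" and "evict from tail" are both O(1). This is a teaching sketch, not a production cache:

```python
from collections import OrderedDict

class LRUCache:
    """LRU cache: most recently used entries at one end of the list,
    least recently used at the other; evict from the LRU end."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._items = OrderedDict()  # insertion/access order is the list

    def get(self, key):
        if key not in self._items:
            return None
        self._items.move_to_end(key)      # accessed: move to the MRU end
        return self._items[key]

    def set(self, key, value):
        if key in self._items:
            self._items.move_to_end(key)
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)  # evict the least recently used

cache = LRUCache(capacity=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")            # "a" becomes the most recently used
cache.set("c", 3)         # over capacity: "b" is evicted, not "a"
print(cache.get("b"))     # None
print(cache.get("a"))     # 1
```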

These three factors are the key elements that determine the cache hit rate; master them and you will have a much deeper understanding of caching.

The above is excerpted from the LaGou course by an Alibaba veteran architect, Lecture 02 (Part 1): Distributed Caching.

Lecturer: Li Zhihui, former Alibaba technical expert and author of "Technical Architecture of Large-Scale Websites".


 

Originally published at www.cnblogs.com/lagou/p/10948825.html