A simple Python implementation of LRUCache

Brief introduction

As we all know, Redis manages hot data with an eviction policy, which in most scenarios is the LRU (Least Recently Used) algorithm. Starting from a simple example of using a dict to cache Fibonacci numbers, this article leads into LRU usage scenarios and then implements a simple LRUCache in Python.

Using a cache to reduce computational overhead or access to the primary database

In real business scenarios, we often use a cache to reduce computational overhead or to cut down on the program's frequent access to the primary database. For example, suppose I expose a fib interface: when a user requests it with some number n, the computed result is returned to the user:

def fib(n):
    if n <= 2:
        return 1
    return fib(n-1) + fib(n-2)
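As an aside (not from the original post), the naive recursion above recomputes the same subproblems over and over, which is easy to see by counting calls; the counter variable below is my own addition for illustration:

```python
calls = 0

def fib(n):
    """Naive recursive Fibonacci, instrumented to count calls."""
    global calls
    calls += 1
    if n <= 2:
        return 1
    return fib(n - 1) + fib(n - 2)

fib(20)
print(calls)  # → 13529 calls just to compute fib(20)
```

The call count grows exponentially with n, which is exactly the overhead caching eliminates.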

To reduce the program's computational overhead, we can use a dictionary to "cache" the results. That is, we can compute the results for the numbers users are likely to request ahead of time; if the number a user requests is already in the "cache", we can return the cached result directly, effectively reducing the program's cost!

Achieving the "cache effect" with a decorator

# Pre-fill the dictionary with some computed values to reduce computational overhead
DIC = {1: 1, 2: 1, 3: 2}

def wrapper(func):
    def inner(n):
        if n in DIC:
            return DIC[n]
        res = func(n)
        DIC[n] = res
        return res
    return inner

@wrapper
def fib(n):
    if n <= 2:
        return 1
    return fib(n-1) + fib(n-2)

if __name__ == "__main__":
    for i in range(1, 6):
        ret = fib(i)
        print(ret)

The code above effectively achieves a "cache" effect.

But then a new problem arises: the cache size is in fact bounded by the machine's memory, so the data in the cache cannot grow without limit! If we let the cached data grow indefinitely, memory will blow up sooner or later and hurt the performance of the service!

This calls for a management strategy for the cached data, and LRU is a commonly used algorithm for managing hot data in a cache.
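As a side note (my addition, not part of the original post), Python's standard library already ships an LRU cache decorator, functools.lru_cache, which does out of the box what we are about to build by hand:

```python
from functools import lru_cache

@lru_cache(maxsize=10)  # keep at most 10 results; least recently used is evicted
def fib(n):
    if n <= 2:
        return 1
    return fib(n - 1) + fib(n - 2)

print(fib(30))           # → 832040
print(fib.cache_info())  # shows hits, misses, maxsize, currsize
```

Even with maxsize=10 the recursion stays fast, because each fib(k) only needs the two most recently cached values. Building our own version below is still worthwhile for understanding how LRU works internally.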

A simple LRU implementation

The LRU (Least Recently Used) strategy deletes the least recently accessed data from the cache first (of course, only once the cache exceeds its size threshold).

Implementation idea

Take the dic dictionary cache above as an example:

If the data a user requests is in the cache, we look up the corresponding value directly, return it to the user, and move the just-accessed key to the back of the cache;

If the requested data is not in the cache, we need to call the fib function to compute the result, and also check whether the cache is full: if the cache is not full, return the result and update the cache, putting the just-accessed key at the back; if the cache is full, we first need to delete the "earliest"-accessed key from dic, and then put the newly computed result at the back of dic.
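The steps above can be sketched with collections.OrderedDict, which remembers insertion order (this sketch is my own illustration of the idea; the MAXSIZE value and cached_fib name are made up, and it is not the implementation the article builds next):

```python
from collections import OrderedDict

MAXSIZE = 3
dic = OrderedDict()

def fib(n):
    if n <= 2:
        return 1
    return fib(n - 1) + fib(n - 2)

def cached_fib(n):
    if n in dic:
        dic.move_to_end(n)       # move the just-accessed key to the back
        return dic[n]
    res = fib(n)
    if len(dic) >= MAXSIZE:
        dic.popitem(last=False)  # cache full: evict the "earliest"-accessed key
    dic[n] = res                 # put the new result at the back
    return res

for i in [1, 2, 3, 4, 2]:
    cached_fib(i)
print(list(dic))  # → [3, 4, 2]: key 1 was evicted, key 2 was refreshed by its hit
```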

A simple LRUCache using a circular doubly linked list

# -*- coding:utf-8 -*-
class Node(object):
    def __init__(self, prev=None, next=None, key=None, value=None):
        self.prev, self.next, self.key, self.value = prev, next, key, value

# Circular doubly linked list
class CircularDoubleLinkedList(object):
    def __init__(self):
        node = Node()
        node.prev, node.next = node, node
        self.rootnode = node

    # Head node
    def headnode(self):
        return self.rootnode.next

    # Tail node
    def tailnode(self):
        return self.rootnode.prev

    # Remove a node
    def remove(self, node):
        if node is self.rootnode:
            return
        node.prev.next = node.next
        node.next.prev = node.prev

    # Append a node at the tail of the list
    def append(self, node):
        tailnode = self.tailnode()
        tailnode.next = node
        node.prev = tailnode
        node.next = self.rootnode
        # The root node's prev must point back to the new tail
        self.rootnode.prev = node

# LRU Cache
class LRUCache(object):
    def __init__(self, maxsize=10):
        self.maxsize = maxsize
        self.cache = {}
        self.access = CircularDoubleLinkedList()
        self.isfull = len(self.cache) >= self.maxsize

    # Class-decorator style
    def __call__(self, func):
        def inner(n):
            cachenode = self.cache.get(n, None)
            # If cached, fetch the node for n from the cache and move it to the tail
            if cachenode:
                self.access.remove(cachenode)
                self.access.append(cachenode)
                # Return the node's value
                return cachenode.value
            # If not cached, check whether the cache is full and handle accordingly
            else:
                result = func(n)
                # If the cache is not full, just add the new entry
                if not self.isfull:
                    tailnode = self.access.tailnode()
                    new_node = Node(tailnode, self.access.rootnode, n, result)
                    self.access.append(new_node)
                    # Cache the new node
                    self.cache[n] = new_node
                    self.isfull = len(self.cache) >= self.maxsize
                # If the cache is full, evict lru_node first
                else:
                    lru_node = self.access.headnode()
                    # Delete lru_node first
                    del self.cache[lru_node.key]
                    self.access.remove(lru_node)
                    # Then insert the new node
                    tailnode = self.access.tailnode()
                    new_node = Node(tailnode, self.access.rootnode, n, result)
                    self.access.append(new_node)
                    # Cache the new node
                    self.cache[n] = new_node
                return result
        # The decorator finally returns inner
        return inner

@LRUCache()
def fib(n):
    if n <= 2:
        return 1
    return fib(n-1) + fib(n-2)


for i in range(1, 10):
    print(fib(i))
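To see the eviction at work, here is my own condensed, self-contained variant of the same design (dict for lookups plus a circular doubly linked list for access order; the square function and maxsize=3 are made up for the demo):

```python
class Node:
    def __init__(self, prev=None, next=None, key=None, value=None):
        self.prev, self.next, self.key, self.value = prev, next, key, value

class LRUCache:
    """Condensed LRU cache: dict + circular doubly linked list, decorator style."""
    def __init__(self, maxsize=3):
        self.maxsize = maxsize
        self.cache = {}
        root = Node()
        root.prev = root.next = root  # sentinel root node of the circular list
        self.root = root

    def _remove(self, node):
        node.prev.next, node.next.prev = node.next, node.prev

    def _append(self, node):
        tail = self.root.prev
        tail.next = node
        node.prev, node.next = tail, self.root
        self.root.prev = node

    def __call__(self, func):
        def inner(n):
            node = self.cache.get(n)
            if node is not None:
                self._remove(node)       # hit: refresh by moving to the tail
                self._append(node)
                return node.value
            if len(self.cache) >= self.maxsize:
                lru = self.root.next     # head node = least recently used
                del self.cache[lru.key]
                self._remove(lru)
            node = Node(key=n, value=func(n))
            self._append(node)
            self.cache[n] = node
            return node.value
        return inner

cache = LRUCache(maxsize=3)

@cache
def square(n):
    return n * n

for i in [1, 2, 3, 4]:
    square(i)
print(sorted(cache.cache))  # → [2, 3, 4]: key 1 (least recently used) was evicted
```

Unlike the article's version, this variant evicts before computing rather than after; for a cache of fixed capacity the observable behavior is the same.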


Origin www.cnblogs.com/paulwhw/p/12150981.html