LRU algorithm Python implementation

LRU algorithm description

The LRU algorithm is really a data-structure design problem: first you receive a capacity parameter as the maximum size of the cache, and then you implement two APIs: a put(key, val) method that stores a key-value pair, and a get(key) method that returns the val corresponding to key, or -1 if the key does not exist.

Note that both the get and put methods must run in O(1) time. Let's walk through a concrete example to see how the LRU algorithm works.

  

/* The cache capacity is 2 */
LRUCache cache = new LRUCache(2);
// You can think of the cache as a queue
// Assume the left side is the head of the queue and the right side is the tail
// The most recently used entry is at the head; the least recently used is at the tail
// Parentheses denote key-value pairs (key, val)

cache.put(1, 1);
// cache = [(1, 1)]
cache.put(2, 2);
// cache = [(2, 2), (1, 1)]
cache.get(1); // returns 1
// cache = [(1, 1), (2, 2)]
// Explanation: key 1 was just accessed, so it moves to the head of the queue,
// and its value 1 is returned
cache.put(3, 3);
// cache = [(3, 3), (1, 1)]
// Explanation: the cache is full, so an entry must be deleted to make room.
// The entry unused the longest, at the tail of the queue, is deleted first,
// and then the new entry is inserted at the head
cache.get(2); // returns -1 (not found)
// cache = [(3, 3), (1, 1)]
// Explanation: there is no entry with key 2 in the cache
cache.put(1, 4);
// cache = [(1, 4), (3, 3)]
// Explanation: key 1 already exists, so its value is overwritten from 1 to 4.
// Don't forget to move the key-value pair to the head of the queue
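The trace above can be reproduced in Python with collections.OrderedDict. This is only a quick sketch (the class and method names mirror the interface described above, not code from the original post); move_to_end plays the role of "advance to the head of the queue":

```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()  # most recently used keys live at the end

    def get(self, key):
        if key not in self.data:
            return -1
        self.data.move_to_end(key)  # just accessed: mark as most recently used
        return self.data[key]

    def put(self, key, val):
        if key in self.data:
            self.data.move_to_end(key)
        elif len(self.data) == self.capacity:
            self.data.popitem(last=False)  # evict the least recently used key
        self.data[key] = val

cache = LRUCache(2)
cache.put(1, 1)      # cache = [(1, 1)]
cache.put(2, 2)      # cache = [(2, 2), (1, 1)]
print(cache.get(1))  # 1
cache.put(3, 3)      # full: evicts key 2
print(cache.get(2))  # -1
cache.put(1, 4)      # key 1 exists: overwrite its value
print(cache.get(1))  # 4
```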

 

LRU algorithm design

Analyzing the operations above, for the put and get methods to run in O(1) time, the cache data structure must satisfy these conditions: fast lookup, fast insertion, fast deletion, and ordering.

Clearly the cache must be ordered, so that recently used data can be distinguished from data that has not been used for a long time; we need to look up whether a key already exists in the cache; when the capacity is full, we must delete the last entry; and on every access we must move the entry to the head of the queue.

So, what data structure satisfies all of these conditions at once? Hash table lookup is fast, but its data has no fixed order; a linked list is ordered, and insertion and deletion are fast, but lookup is slow. Combining them gives a new data structure: the hash-linked list.

The core data structure of the LRU cache algorithm is the hash-linked list: a combination of a doubly linked list and a hash table. This data structure looks like this:

[Figure: the hash-linked list — a hash table whose entries point into a doubly linked list — omitted in this copy]

Deletion is the key operation: deleting a node requires not only the pointer to the node itself but also manipulation of its predecessor's pointer, and a doubly linked list supports finding the predecessor directly, keeping the operation O(1).
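To make this concrete, here is a sketch of the hash-linked list in Python: a plain dict as the hash table (key -> node), plus a doubly linked list bounded by dummy head and tail nodes, ordered from least to most recently used. The Node class and helper names (_remove, _append) are illustrative, not from the original post:

```python
class Node:
    def __init__(self, key, val):
        self.key = key
        self.val = val
        self.prev = None
        self.next = None

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.map = {}              # key -> Node, giving O(1) lookup
        self.head = Node(0, 0)     # dummy head: least recently used side
        self.tail = Node(0, 0)     # dummy tail: most recently used side
        self.head.next = self.tail
        self.tail.prev = self.head

    def _remove(self, node):
        # O(1) unlink: the doubly linked list gives direct access to prev
        node.prev.next = node.next
        node.next.prev = node.prev

    def _append(self, node):
        # insert just before the dummy tail (most recently used position)
        node.prev = self.tail.prev
        node.next = self.tail
        self.tail.prev.next = node
        self.tail.prev = node

    def get(self, key):
        if key not in self.map:
            return -1
        node = self.map[key]
        self._remove(node)
        self._append(node)         # refresh recency
        return node.val

    def put(self, key, val):
        if key in self.map:
            self._remove(self.map[key])
        elif len(self.map) == self.capacity:
            lru = self.head.next   # least recently used node
            self._remove(lru)
            del self.map[lru.key]
        node = Node(key, val)
        self.map[key] = node
        self._append(node)
```

Every operation touches only a constant number of pointers and one dict entry, which is what makes both get and put genuinely O(1).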

# Method one: an ordinary dict plus a list that records access order
class LRUCache:
    # @param capacity, an integer
    def __init__(self, capacity):
        self.cache = {}
        self.used_list = []
        self.capacity = capacity

    # @return an integer
    def get(self, key):
        # The list records the access order: the earliest-accessed key sits at
        # the front and the most recently accessed at the back, so when the
        # cache is full we delete used_list[0] and then insert the new item.
        if key in self.cache:
            if key != self.used_list[-1]:
                self.used_list.remove(key)
                self.used_list.append(key)
            return self.cache[key]
        else:
            return -1

    def put(self, key, value):
        if key in self.cache:
            self.used_list.remove(key)
        elif len(self.cache) == self.capacity:
            self.cache.pop(self.used_list.pop(0))
        self.used_list.append(key)
        self.cache[key] = value

(Note that list.remove and list.pop(0) scan the list, so this version is actually O(n) per operation rather than the required O(1); it is simply the easiest version to understand.)
 
# Method two: based on collections.OrderedDict
import collections

class LRUCache(collections.OrderedDict):
    '''
    Implements a least-recently-used cache using collections.OrderedDict.
    OrderedDict has a special method popitem(): with last=False it behaves
    like a queue and pops the first-inserted element; with last=True it
    behaves like a stack and pops the most recently inserted element.
    Two methods are implemented: get(key) returns the value for key, or
    None if it is absent; put(key, value) adds an element with LRU eviction.
    '''
    def __init__(self, size=5):
        self.size = size
        self.cache = collections.OrderedDict()  # ordered dictionary

    def get(self, key):
        if key in self.cache.keys():
            # pop and re-insert so that the access order is recorded
            value = self.cache.pop(key)
            # the most recently accessed key is always at the end
            self.cache[key] = value
            return value
        else:
            return None

    def put(self, key, value):
        if key in self.cache.keys():
            self.cache.pop(key)
            self.cache[key] = value
        elif self.size == len(self.cache):
            self.cache.popitem(last=False)  # evict the least recently used
            self.cache[key] = value
        else:
            self.cache[key] = value
 
if __name__ == '__main__':
    test = LRUCache()
    test.put('a',1)
    test.put('b',2)
    test.put('c',3)
    test.put('d',4)
    test.put('e',5)
    # test.put('f',6)
    print(test.get('a'))

 

References:
https://blog.csdn.net/qq_35810838/article/details/83035759
https://labuladong.gitbook.io/algo/gao-pin-mian-shi-xi-lie/lru-suan-fa

 


Origin www.cnblogs.com/cassielcode/p/12723024.html