Binary tree traversal, time and space complexity, cache eviction strategies, LRU data structures, dynamic programming and greedy algorithms

  1. Preorder, inorder, and postorder traversal of a binary tree

    1. Preorder traversal
      1. Traversal order: root → left → right
      2. ABCDEFGHK
    2. Inorder traversal
      1. Traversal order: left → root → right
      2. BDCAEHGKF
    3. Postorder traversal
      1. Traversal order: left → right → root
      2. DCBHKGFEA
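The three traversal orders above can be sketched in Python. The tree below is reconstructed from the example sequences (the node layout is my own inference from those sequences, not stated in the original):

```python
# Minimal sketch of the three binary tree traversals.

class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def preorder(n):   # root -> left -> right
    return [n.val] + preorder(n.left) + preorder(n.right) if n else []

def inorder(n):    # left -> root -> right
    return inorder(n.left) + [n.val] + inorder(n.right) if n else []

def postorder(n):  # left -> right -> root
    return postorder(n.left) + postorder(n.right) + [n.val] if n else []

# Tree inferred from the example sequences above (an assumption for illustration):
tree = Node("A",
            Node("B", None, Node("C", Node("D"))),
            Node("E", None, Node("F", Node("G", Node("H"), Node("K")))))

print("".join(preorder(tree)))   # ABCDEFGHK
print("".join(inorder(tree)))    # BDCAEHGKF
print("".join(postorder(tree)))  # DCBHKGFEA
```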
  2. What are time complexity and space complexity?

    1. Time complexity
      1. Refers to how much time the algorithm consumes during execution
    2. Space complexity
      1. Refers to how much memory the algorithm occupies during execution
    3. The efficiency of an algorithm is evaluated mainly by its time complexity and space complexity. Sometimes you cannot have both, so a balance has to be struck between them
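A small illustration of that trade-off (my own example, not from the original post): detecting a duplicate in a list either with little extra memory but quadratic time, or with linear time at the cost of extra memory.

```python
# Two ways to detect a duplicate, trading time against space.

def has_duplicate_slow(xs):      # O(n^2) time, O(1) extra space
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_fast(xs):      # O(n) time, O(n) extra space
    seen = set()
    for x in xs:
        if x in seen:            # trades memory for speed
            return True
        seen.add(x)
    return False

print(has_duplicate_slow([1, 2, 3, 2]))  # True
print(has_duplicate_fast([1, 2, 3, 4]))  # False
```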
  3. Which cache eviction strategies do you know?

  4. If you were asked to implement the LRU algorithm, which data structure would you choose?

    1. Principle of the LRU (Least Recently Used) algorithm:
      1. Four implementations of LRU
        1. LRU-K
          1. Principle: K stands for the number of recent uses, so plain LRU can be regarded as LRU-1. The main purpose of LRU-K is to solve the "cache pollution" problem of LRU; its core idea is to extend the criterion of "used once recently" to "used K times recently"
          2. Implementation: compared with LRU, LRU-K maintains one extra queue to record the access history of all data. Data is moved into the cache only once its access count reaches K. When data must be evicted, LRU-K evicts the entry whose K-th most recent access is the furthest in the past.
            1. process
              1. When data is accessed for the first time, it is added to the access-history list
              2. If the data does not reach K accesses while in the history list, it is evicted from the history list according to some rule (FIFO or LRU)
              3. When the access count of data in the history queue reaches K, the data is removed from the history queue, moved into the cache queue, and the cache queue is re-sorted by access time
              4. Whenever data in the cache queue is accessed again, the queue is re-sorted
              5. When eviction is needed, the data at the tail of the cache queue is removed, i.e., the entry whose K-th most recent access is the oldest
          3. LRU-K reduces the cache-pollution problem and achieves a higher hit rate than LRU, but it is essentially a priority queue, so its algorithmic complexity and cost are higher
          4. cost:
            1. Since LRU-K also records objects that have been accessed but not yet placed into the cache, it consumes more memory than LRU; with a large amount of data, the memory consumption can be significant
            2. It also needs to sort by time (either lazily at eviction time or eagerly on every access), so its CPU consumption is higher than LRU's
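The LRU-K steps above can be sketched as follows. This is a minimal sketch under my own assumptions (class name, K=2 default, and storing values only after the K-th access are all choices of this example, not from the post):

```python
# Simplified LRU-K sketch: data enters the cache only after K accesses;
# eviction removes the entry whose K-th most recent access is oldest.
from collections import deque

class LRUKCache:
    def __init__(self, capacity, k=2):
        self.capacity, self.k = capacity, k
        self.tick = 0                   # logical clock for access times
        self.history = {}               # key -> deque of last k access times
        self.cache = {}                 # key -> value, only after k accesses

    def _touch(self, key):
        self.tick += 1
        times = self.history.setdefault(key, deque(maxlen=self.k))
        times.append(self.tick)

    def get(self, key):
        if key not in self.cache:       # misses don't return uncached data
            return None
        self._touch(key)
        return self.cache[key]

    def put(self, key, value):
        self._touch(key)
        if len(self.history[key]) < self.k:
            return                      # stays in the history queue only
        if key not in self.cache and len(self.cache) >= self.capacity:
            # evict the cached key whose k-th most recent access is oldest
            victim = min(self.cache, key=lambda kk: self.history[kk][0])
            del self.cache[victim]
        self.cache[key] = value
```

With `k=2`, a key must be put or hit twice before it occupies cache space, which is exactly how LRU-K resists one-off "cache pollution" accesses.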
        2. Two queues(2Q)
          1. Principle: similar to LRU-2, except that 2Q replaces LRU-2's access-history queue (which, note, does not cache data) with a FIFO cache queue. The 2Q algorithm therefore has two cache queues: one FIFO queue and one LRU queue
          2. Implementation: when data is accessed for the first time, 2Q puts it into the FIFO queue; when it is accessed a second time, the data is moved from the FIFO queue into the LRU queue. Each of the two queues evicts data in its own way
            1. Implementation process
              1. Newly accessed data is inserted into the FIFO queue
              2. If data in the FIFO queue is never accessed again, it is eventually evicted according to the FIFO rule
              3. If data in the FIFO queue is accessed again, it is moved to the head of the LRU queue
              4. If data in the LRU queue is accessed again, it is moved to the head of the LRU queue
              5. The LRU queue evicts data from its tail
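The two-queue flow above can be sketched with two `OrderedDict`s (the class name and queue sizes are my own assumptions for illustration; `OrderedDict` keeps insertion order, so its front plays the role of the queue tail being evicted):

```python
# Compact 2Q sketch: first access goes into a FIFO queue;
# a second access promotes the entry into an LRU queue.
from collections import OrderedDict

class TwoQueueCache:
    def __init__(self, fifo_size, lru_size):
        self.fifo = OrderedDict()       # first-time entries, FIFO eviction
        self.lru = OrderedDict()        # re-accessed entries, LRU eviction
        self.fifo_size, self.lru_size = fifo_size, lru_size

    def get(self, key):
        if key in self.lru:             # LRU hit: move to most-recent end
            self.lru.move_to_end(key)
            return self.lru[key]
        if key in self.fifo:            # second access: promote to LRU queue
            value = self.fifo.pop(key)
            self.lru[key] = value
            if len(self.lru) > self.lru_size:
                self.lru.popitem(last=False)   # drop least recently used
            return value
        return None

    def put(self, key, value):
        if key in self.lru or key in self.fifo:
            self.get(key)                # treat the update as another access
            (self.lru if key in self.lru else self.fifo)[key] = value
            return
        self.fifo[key] = value           # first access: into FIFO queue
        if len(self.fifo) > self.fifo_size:
            self.fifo.popitem(last=False)  # FIFO eviction: oldest first
```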
        3. Multi Queue(MQ)
          1. Principle: the MQ algorithm divides data into multiple queues by access frequency, and different queues have different access priorities. The core idea: preferentially cache data that is accessed many times
          2. Implementation: the MQ algorithm splits the LRU cache into multiple queues, each corresponding to a different access priority. The access priority is calculated from the access count
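A rough sketch of the multi-queue idea. The structure here (three levels, promotion by raw access count, eviction from the lowest non-empty queue) is my own simplification and not prescribed by the post:

```python
# Rough MQ sketch: entries live in one of several LRU queues chosen by
# access count; more accesses -> higher-priority queue.
from collections import OrderedDict

class MultiQueueCache:
    def __init__(self, capacity, levels=3):
        self.capacity = capacity
        self.queues = [OrderedDict() for _ in range(levels)]  # 0 = lowest priority
        self.counts = {}                 # key -> access count

    def _level(self, key):
        # priority grows with access count, capped at the highest queue
        return min(self.counts[key], len(self.queues)) - 1

    def get(self, key):
        for q in self.queues:
            if key in q:
                value = q.pop(key)
                self.counts[key] += 1
                self.queues[self._level(key)][key] = value  # most recent at end
                return value
        return None

    def put(self, key, value):
        if self.get(key) is not None:    # existing key: update in place
            self.queues[self._level(key)][key] = value
            return
        if sum(map(len, self.queues)) >= self.capacity:
            for q in self.queues:        # evict LRU entry of lowest non-empty queue
                if q:
                    victim, _ = q.popitem(last=False)
                    del self.counts[victim]
                    break
        self.counts[key] = 1
        self.queues[0][key] = value      # new data starts at the lowest priority
```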
        4. LRU
          1. Principle: the algorithm evicts data based on historical access records; its core idea is "if data has been accessed recently, the probability that it will be accessed in the future is also higher"
          2. Implementation: the most common implementation uses a linked list to store the cached data
            1. process
              1. New data is inserted at the head of the list
              2. Whenever there is a cache hit (i.e., cached data is accessed), the data is moved to the head of the list
              3. When the list is full, the data at the tail of the list is discarded
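The linked-list process above can be sketched with Python's `OrderedDict`, which is backed by a doubly linked list (the class name is my own; its end plays the role of the list head, its front the tail):

```python
# Minimal LRU cache sketch over OrderedDict.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()        # front = least recent, end = most recent

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)       # cache hit: mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)   # updating counts as an access
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # full: discard least recent entry
```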
  5. Do you know dynamic programming and greedy algorithms?

    1. Dynamic Programming
      1. The global optimal solution must contain some local optimal solution, but not necessarily the immediately preceding one, so all previous optimal solutions need to be recorded
      2. The key to dynamic programming is the state transition equation, i.e., how to derive the global optimal solution from the local optimal solutions already obtained
      3. Boundary conditions: the simplest cases, whose local optimal solutions can be obtained directly
    2. Greedy algorithm
      1. It always makes what looks like the best choice for the current step; that is, it does not consider global optimality and only produces a locally optimal solution
      2. Every greedy decision, once made, cannot be changed, because each step's optimal solution is derived from the previous step's optimal solution, while the optimal solutions before the previous step are not retained
      3. Condition for a greedy algorithm to be correct: the optimal solution of every step must contain the optimal solution of the previous step
    3. In common: both derive the global optimal solution from local optimal solutions step by step
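A small illustration of the difference (my own example): minimum coins for amount 6 with denominations {1, 3, 4}. Greedy locks in the largest coin at each step and cannot revisit it; DP records every subproblem's optimum via the state transition equation.

```python
# Greedy vs dynamic programming on the coin-change problem.

def greedy_coins(coins, amount):
    count = 0
    for c in sorted(coins, reverse=True):  # always take the biggest coin
        count += amount // c
        amount %= c
    return count                            # decisions are never revisited

def dp_coins(coins, amount):
    best = [0] + [float("inf")] * amount    # best[i] = fewest coins for amount i
    for i in range(1, amount + 1):
        for c in coins:                     # state transition:
            if c <= i:                      # best[i] = min(best[i], best[i-c] + 1)
                best[i] = min(best[i], best[i - c] + 1)
    return best[amount]

print(greedy_coins([1, 3, 4], 6))  # 3  (4 + 1 + 1): locally optimal only
print(dp_coins([1, 3, 4], 6))      # 2  (3 + 3): globally optimal
```

Here the greedy condition from point 3 fails: the optimal solution for 6 (3 + 3) does not contain the "take a 4 first" step, so greedy overshoots while DP finds the true optimum.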

Origin www.cnblogs.com/tulintao/p/11576221.html