Study Notes on Data Structures and Algorithms: Session Nine

First, the pre-class problem

We know that CPU resources are limited, and task processing speed is not linearly correlated with the number of threads. On the contrary, too many threads cause frequent CPU context switches and hurt processing performance.
Therefore, the thread pool size is generally set in advance according to the characteristics of the tasks and the hardware environment.

When we request a thread from a fixed-size thread pool and no idle thread is available, how should the pool handle the request? Reject it, or queue it up?
And how are these handling strategies implemented?

In fact, these questions are not complicated. The underlying data structure behind them is exactly what we will learn today: the queue.

Second, how to understand the "queue"?

1. What is a queue?

You can think of it as queuing to buy tickets: first come, first served; newcomers can only stand at the end of the line, and jumping the queue is not allowed. First in, first out: this is a typical "queue".

2. Operations a queue supports

1. enqueue(): put an item at the tail of the queue;
2. dequeue(): take an element from the head of the queue.
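In Java, these two operations correspond to offer() and poll() on the standard Queue interface. A tiny sketch of my own (using java.util.ArrayDeque; not part of the original notes) shows the first-in, first-out behavior:

```java
import java.util.ArrayDeque;
import java.util.Queue;

public class QueueBasics {
    public static void main(String[] args) {
        Queue<String> q = new ArrayDeque<>();
        // enqueue: add to the tail
        q.offer("a");
        q.offer("b");
        q.offer("c");
        // dequeue: take from the head; first in, first out
        System.out.println(q.poll()); // a
        System.out.println(q.poll()); // b
    }
}
```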

3. Queue application scenarios

1. High-performance queues such as Disruptor and the Linux ring buffer use concurrent circular queues;
2. The Java concurrent package (java.util.concurrent) uses ArrayBlockingQueue to implement fair locks.

Like the stack, the queue is also a linear data structure with restricted operations.

Third, sequential queues and linked queues

1. How to implement a queue

Like stacks, queues can be implemented with arrays or with linked lists. A stack implemented with an array is called a sequential stack, and a stack implemented with a linked list is called a linked stack. Likewise, a queue implemented with an array is called a sequential queue, and a queue implemented with a linked list is called a linked queue.

2. The sequential queue

1. Array-based queue code

// Array-based implementation of a queue
public class ArrayQueue {
  // array: items, array size: n
  private String[] items;
  private int n = 0;
  // head is the index of the queue head, tail is the index of the queue tail
  private int head = 0;
  private int tail = 0;

  // allocate an array of size capacity
  public ArrayQueue(int capacity) {
    items = new String[capacity];
    n = capacity;
  }

  // enqueue
  public boolean enqueue(String item) {
    // tail == n means the queue is full
    if (tail == n) return false;
    items[tail] = item;
    ++tail;
    return true;
  }

  // dequeue
  public String dequeue() {
    // head == tail means the queue is empty
    if (head == tail) return null;
    // for readers coming from other languages, the increment
    // is written on a separate line for clarity
    String ret = items[head];
    ++head;
    return ret;
  }
}
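To make the behavior concrete, here is a small usage sketch of my own (the demo class and its compact inline copy of ArrayQueue are not from the original notes). Note how, once tail reaches n, enqueue keeps failing even after a dequeue frees a slot; this is exactly the problem the next subsections address:

```java
public class ArrayQueueDemo {
    // compact copy of the ArrayQueue from the notes, for a runnable demo
    static class ArrayQueue {
        private String[] items;
        private int n;
        private int head = 0, tail = 0;
        ArrayQueue(int capacity) { items = new String[capacity]; n = capacity; }
        boolean enqueue(String item) {
            if (tail == n) return false;   // full
            items[tail++] = item;
            return true;
        }
        String dequeue() {
            if (head == tail) return null; // empty
            return items[head++];
        }
    }

    public static void main(String[] args) {
        ArrayQueue q = new ArrayQueue(3);
        System.out.println(q.enqueue("a")); // true
        System.out.println(q.enqueue("b")); // true
        System.out.println(q.enqueue("c")); // true
        System.out.println(q.enqueue("d")); // false: tail == n, queue full
        System.out.println(q.dequeue());    // a: first in, first out
        // even after a dequeue, enqueue still fails, because tail is stuck at n
        System.out.println(q.enqueue("d")); // false
    }
}
```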

2. Implementation idea

A queue requires two pointers:

  1. head, pointing to the head of the queue;
  2. tail, pointing to the tail of the queue.

You can refer to the figure below. After a, b, c, d are enqueued in order, head points to index 0 and tail points to index 4.

After we call dequeue twice, head points to index 2, while tail still points to index 4.

3. Data movement

As enqueue and dequeue operations go on, head and tail keep moving backward. When tail reaches the far right of the array, no more data can be enqueued even if the array still has free space. How do we solve this?
One option is to shift the whole remaining queue forward on every dequeue, as if deleting the element at index 0 of an array; but then the time complexity of dequeue degrades from O(1) to O(n). Can we do better?

4. How to optimize the data movement?

In fact, we do not need to move data at dequeue time at all. We only move data at enqueue time, and only when there is no free space left at the tail, triggering one batch move.

With this idea, the dequeue() function stays unchanged, and we only slightly modify the enqueue() implementation to solve the problem. Here is the concrete code:

  // enqueue operation: put item at the queue tail
  public boolean enqueue(String item) {
    // tail == n means there is no free space at the end of the queue
    if (tail == n) {
      // tail == n && head == 0 means the whole queue is full
      if (head == 0) return false;
      // data migration
      for (int i = head; i < tail; ++i) {
        items[i - head] = items[i];
      }
      // update head and tail after the move
      tail -= head;
      head = 0;
    }

    items[tail] = item;
    ++tail;
    return true;
  }

As the code shows, when tail has moved to the rightmost end of the array and new data needs to be enqueued, we move the data between head and tail, as a whole block, to the range 0 through tail - head of the array.
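Putting the modified enqueue() together with the unchanged dequeue() gives the following runnable sketch (the class name DynamicArrayQueue and the demo are my own, not from the notes); it shows that a slot freed by dequeue is reclaimed by the batch move:

```java
public class DynamicArrayQueue {
    private String[] items;
    private int n;
    private int head = 0, tail = 0;

    public DynamicArrayQueue(int capacity) {
        items = new String[capacity];
        n = capacity;
    }

    // enqueue with data migration: only when tail hits the end do we
    // shift the live elements [head, tail) back to the start of the array
    public boolean enqueue(String item) {
        if (tail == n) {
            if (head == 0) return false; // truly full
            for (int i = head; i < tail; ++i) {
                items[i - head] = items[i];
            }
            tail -= head;
            head = 0;
        }
        items[tail++] = item;
        return true;
    }

    public String dequeue() {
        if (head == tail) return null;
        return items[head++];
    }

    public static void main(String[] args) {
        DynamicArrayQueue q = new DynamicArrayQueue(3);
        q.enqueue("a"); q.enqueue("b"); q.enqueue("c"); // tail == n now
        q.dequeue();                                    // frees index 0
        // unlike the naive version, this enqueue succeeds:
        // the batch move shifts b, c to indices 0, 1 first
        System.out.println(q.enqueue("d")); // true
        System.out.println(q.dequeue());    // b
    }
}
```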

5. The linked queue

For the linked-list-based implementation, we also need two pointers: head and tail, pointing to the first node and the last node of the linked list.

As the picture shows:

  1. On enqueue: tail->next = new_node; tail = tail->next;
  2. On dequeue: head = head->next.

I will put the concrete code on GitHub. Try to implement it yourself first, then compare it with my code on GitHub to see whether you wrote it correctly.
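The GitHub code itself is not reproduced here, but a minimal linked queue following the two rules above might look like this (a sketch of mine; the class and field names are assumptions):

```java
public class LinkedQueue {
    // a singly linked node
    private static class Node {
        String data;
        Node next;
        Node(String data) { this.data = data; }
    }

    private Node head = null; // first node: the dequeue side
    private Node tail = null; // last node: the enqueue side

    // enqueue: tail.next = newNode; tail = tail.next
    public void enqueue(String item) {
        Node newNode = new Node(item);
        if (tail == null) {
            // empty queue: head and tail both point at the new node
            head = tail = newNode;
        } else {
            tail.next = newNode;
            tail = tail.next;
        }
    }

    // dequeue: head = head.next
    public String dequeue() {
        if (head == null) return null;  // empty
        String ret = head.data;
        head = head.next;
        if (head == null) tail = null;  // queue became empty: reset tail too
        return ret;
    }
}
```

One edge case the two rules alone do not cover: when the last node is dequeued, tail must be reset as well, otherwise the next enqueue would append to a node no longer reachable from head.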

Fourth, the circular queue

1. What is a circular queue?

As its name implies, a circular queue looks like a ring. The array was originally a straight line from start to end; now we join the two ends to form a ring. I drew a diagram so you can see it intuitively.

We can see that:

  1. The queue size in the figure is 8, and currently head = 4, tail = 7. When a new element a is enqueued, we place it at index 7.
  2. But this time we do not update tail to 8; instead we move it one step forward along the ring, to index 0.
  3. When another element b is enqueued, we place b at index 0 and update tail to 1.

Therefore, after a and b are enqueued in turn, the circular queue looks like the following:

2. Determining the full and empty conditions

To write bug-free circular queue code, I think the most critical part is determining the conditions for "queue full" and "queue empty".

  • In the array-based non-circular queue, the full condition is tail == n, and the empty condition is head == tail.
  • For the circular queue, how do we determine full and empty?
  • The empty condition is still head == tail. The full condition, however, is somewhat more complicated.

I drew a diagram of a full queue. Look at it and try to summarize the rule:

 

3. The queue-full condition

In the full-queue case I drew, tail = 3, head = 4, n = 8.

The rule sums up as: (3 + 1) % 8 = 4. Draw a few more full-queue diagrams and you will find that when the queue is full, (tail + 1) % n == head.

Have you noticed that when the queue is full, the slot that tail points to actually stores no data? So a circular queue wastes one slot of the array's storage space.
Talk is cheap; if you still do not quite get it, let the code show you.

public class CircularQueue {
  // array: items, array size: n
  private String[] items;
  private int n = 0;
  // head is the index of the queue head, tail is the index of the queue tail
  private int head = 0;
  private int tail = 0;

  // allocate an array of size capacity
  public CircularQueue(int capacity) {
    items = new String[capacity];
    n = capacity;
  }

  // enqueue
  public boolean enqueue(String item) {
    // the queue is full
    if ((tail + 1) % n == head) return false;
    items[tail] = item;
    tail = (tail + 1) % n;
    return true;
  }

  // dequeue
  public String dequeue() {
    // head == tail means the queue is empty
    if (head == tail) return null;
    String ret = items[head];
    head = (head + 1) % n;
    return ret;
  }
}
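A short usage sketch (mine, with a compact inline copy of the class above) makes the wasted slot visible: a circular queue of size 8 holds at most 7 elements, and dequeues still come out in FIFO order after tail wraps around:

```java
public class CircularQueueDemo {
    // compact copy of the CircularQueue from the notes
    static class CircularQueue {
        private String[] items;
        private int n;
        private int head = 0, tail = 0;
        CircularQueue(int capacity) { items = new String[capacity]; n = capacity; }
        boolean enqueue(String item) {
            if ((tail + 1) % n == head) return false; // full
            items[tail] = item;
            tail = (tail + 1) % n;
            return true;
        }
        String dequeue() {
            if (head == tail) return null; // empty
            String ret = items[head];
            head = (head + 1) % n;
            return ret;
        }
    }

    public static void main(String[] args) {
        CircularQueue q = new CircularQueue(8);
        int stored = 0;
        while (q.enqueue("x" + stored)) stored++;
        // one slot is sacrificed to distinguish full from empty
        System.out.println(stored); // 7, not 8
        // after some dequeues, tail can wrap past the end of the array
        q.dequeue(); q.dequeue();
        System.out.println(q.enqueue("y0")); // true: reuses the freed slots
        System.out.println(q.dequeue());     // x2, still FIFO
    }
}
```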

Fifth, blocking queues and concurrent queues

1. What is a blocking queue?

A blocking queue is simply a queue with blocking operations added on top.

  • Simply put, when the queue is empty, taking data from the head blocks, because there is no data yet; it returns only once the queue has data.
  • If the queue is full, inserting data blocks until the queue has a free slot, and only then does the insert return.

You may have noticed that this is exactly the definition of a "producer-consumer model"! Yes, with a blocking queue we can easily implement a producer-consumer model.

2. Blocking queues and the "producer-consumer model"

A producer-consumer model implemented on a blocking queue can effectively coordinate production and consumption rates:

  • When the "producers" produce data faster than the "consumers" can consume it, the queue storing the data quickly fills up.
  • At that point the producers block; once the consumers consume some data, the producers are woken up and continue "producing".
  • Moreover, based on a blocking queue, we can also improve overall data-processing efficiency by coordinating the numbers of "producers" and "consumers".
  • For example, in the scenario above, we can configure several "consumers" to keep up with one "producer".
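This coordination can be sketched with Java's ArrayBlockingQueue, whose put() blocks when the queue is full and take() blocks when it is empty. The thread setup below is my own minimal example, not from the notes:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ProducerConsumer {
    public static void main(String[] args) throws InterruptedException {
        // a tiny bounded buffer: put() blocks when full, take() blocks when empty
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(2);

        Thread producer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    queue.put(i); // blocks while the queue is full
                    System.out.println("produced " + i);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread consumer = new Thread(() -> {
            try {
                for (int i = 0; i < 5; i++) {
                    int v = queue.take(); // blocks while the queue is empty
                    System.out.println("consumed " + v);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        producer.start();
        consumer.start();
        producer.join();
        consumer.join();
    }
}
```

Because the buffer holds only two items, a fast producer is automatically throttled to the consumer's pace, which is exactly the coordination described above.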

3. Concurrent queues: how to implement a thread-safe queue?

In the blocking queue we just discussed, multiple threads may operate on the queue simultaneously, which raises thread-safety issues. So how do we implement a thread-safe queue?

  • We call a thread-safe queue a concurrent queue. The most straightforward implementation is simply to put a lock on enqueue() and dequeue().
  • However, with such a coarse lock granularity, concurrency is low: only one enqueue or dequeue operation can run at a time.
  • In fact, based on an array circular queue and CAS atomic operations, we can implement a very efficient lock-free concurrent queue. This is also why circular queues are applied more widely than linked queues.
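The "straightforward locking" approach can be sketched by guarding the circular queue's two operations with a single ReentrantLock (a deliberately coarse-grained example of my own; real lock-free queues replace the lock with CAS on head and tail):

```java
import java.util.concurrent.locks.ReentrantLock;

public class ConcurrentCircularQueue {
    private final String[] items;
    private final int n;
    private int head = 0, tail = 0;
    // one coarse lock: only one enqueue OR dequeue can run at a time
    private final ReentrantLock lock = new ReentrantLock();

    public ConcurrentCircularQueue(int capacity) {
        items = new String[capacity];
        n = capacity;
    }

    public boolean enqueue(String item) {
        lock.lock();
        try {
            if ((tail + 1) % n == head) return false; // full
            items[tail] = item;
            tail = (tail + 1) % n;
            return true;
        } finally {
            lock.unlock();
        }
    }

    public String dequeue() {
        lock.lock();
        try {
            if (head == tail) return null; // empty
            String ret = items[head];
            head = (head + 1) % n;
            return ret;
        } finally {
            lock.unlock();
        }
    }
}
```

Lock-free variants keep the same preallocated circular array but make head and tail the only contended state, updated with CAS, which is why the array-based circular layout matters so much for high-performance queues.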

When I later cover Disruptor in detail, I will talk more about applications of concurrent queues.

Sixth, answering the opening question

1. Two handling strategies

We generally have two handling strategies. The first is non-blocking: directly reject the task request.

The other is blocking: queue the request, and when an idle thread appears, take a queued request and continue processing it.

So how should the queued requests be stored?

We want to treat every queued request fairly, first come, first served, so a queue is a very suitable data structure for storing queued requests. As we said, queues have both linked-list-based and array-based implementations.
What difference do the two implementations make for queued requests?

2. The difference between linked-list-based and array-based queues

The linked-list-based approach:

It can implement an unbounded queue that supports effectively unlimited queueing, but too many queued requests may make request response times too long. So for systems that are sensitive to response time, a thread pool based on an unbounded linked queue is not appropriate.

The array-based implementation gives a bounded queue:

  • The queue size is limited, so when the queued requests in the thread pool exceed the queue size, subsequent requests are rejected.
  • For systems that are sensitive to response time, this is relatively more reasonable.
  • However, setting a reasonable queue size is quite delicate.
  • If the queue is too large, too many requests end up waiting; if it is too small, system resources cannot be fully utilized and performance cannot be maximized.
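In Java, this trade-off shows up directly in ThreadPoolExecutor, whose constructor takes the work queue as an argument. The sketch below (sizes and sleep times are my own illustrative choices) uses a bounded ArrayBlockingQueue with the default AbortPolicy, so a request is rejected once the worker is busy and the queue is full:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolDemo {
    public static void main(String[] args) {
        // 1 worker thread and a bounded queue of 2 queued requests;
        // AbortPolicy rejects (throws) once both are exhausted
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(2),
                new ThreadPoolExecutor.AbortPolicy());

        Runnable slowTask = () -> {
            try { Thread.sleep(200); } catch (InterruptedException e) { /* demo only */ }
        };

        pool.execute(slowTask); // taken by the single worker thread
        pool.execute(slowTask); // queued (slot 1 of 2)
        pool.execute(slowTask); // queued (slot 2 of 2)
        try {
            pool.execute(slowTask); // queue full: rejected
        } catch (RejectedExecutionException e) {
            System.out.println("request rejected: queue is full");
        }
        pool.shutdown();
    }
}
```

Swapping the queue for an unbounded LinkedBlockingQueue would remove the rejections but allow unbounded waiting, which is exactly the linked-list trade-off described above.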

3. Other queue scenarios

Besides the thread pool scenario of queueing requests mentioned earlier, a queue can be applied to any finite resource pool to queue up requests, such as a database connection pool.

In fact, for most limited-resource scenarios, when there are no free resources, requests can basically be queued up with the "queue" data structure.

Seventh, summary

Today we talked about the queue, a data structure very similar to the stack. As long as you master the following points, you are fine for this session.

  • The biggest feature of a queue is first in, first out; its two main operations are enqueue and dequeue. Like the stack, it can be implemented with an array or with a linked list.
  • The array implementation is called a sequential queue, and the linked-list implementation is called a linked queue.
  • In particular, the circular queue looks like a ring.
  • A sequential queue needs data-movement operations when enqueueing; to avoid data movement, we need the ring-shaped circular queue.

The circular queue is the focus of this session. To write bug-free circular queue code, the key is determining the queue-full and queue-empty conditions; you should be able to write the concrete code yourself.

In addition, we talked about several advanced queue structures: the blocking queue and the concurrent queue. Underneath, they are still queues, but with many extra features added. A blocking queue is a queue whose enqueue and dequeue operations can block; a concurrent queue is a queue that is safe to operate from multiple threads.

Eighth, after-class thinking

1. Besides the thread pool, in what other pool-like structures or scenarios do you know of requests being queued?

  • When the network card sends and receives packets, the Linux kernel protocol stack processes them with circular queues.
  • Database connection pools use array-based sequential queues.
  • The Redis connection pool uses an array-based sequential queue.

2. Regarding the concurrent queues we talked about today, there is a lot of online discussion about how to implement lock-free concurrent queues. What is your view on this?

  • RCU in the Linux kernel and urcu in user space implement lock-free concurrent access to shared data; they are well suited to read-mostly, rarely-written scenarios.
  • The core idea is to duplicate the linked-list data into a copy and then atomically swing the list pointer; there is no real lock operation.


Origin www.cnblogs.com/luoahong/p/11816944.html