BlockingQueue (blocking queue) in detail

Note: The content of this article is quoted in full from http://wsmajunfeng.iteye.com/blog/1629354. It is well written; thank you very much. I am copying it here for safekeeping, in case I cannot find it again in the future.

1. Introduction

  In the concurrent package, BlockingQueue solves the problem of how to "transmit" data between threads efficiently and safely. These efficient, thread-safe queue classes make it much easier for us to build high-quality multi-threaded programs quickly. This article describes all the members of the BlockingQueue family, including their respective functions and common usage scenarios.

2. Get to know BlockingQueue

  Blocking queue, as the name suggests, is first of all a queue. The role of a queue in a data structure is roughly as shown in the following figure:

  From the figure above we can see clearly that through a shared queue, data enters from one end and leaves from the other.

  There are two main types of commonly used queues (of course, many other kinds of queues can be derived through different implementations; DelayQueue is one of them):

    First in, first out (FIFO): elements inserted into the queue first also leave it first, much like people waiting in line. To some extent, this kind of queue reflects a kind of fairness.

    Last in, first out (LIFO): elements inserted later leave the queue first, which gives priority to the most recent events.

  In a multi-threaded environment, data sharing can be achieved easily through a queue. Take the classic "producer/consumer" model: suppose we have several producer threads and several consumer threads, and the producers share their prepared data with the consumers by passing it through a queue. But what happens when their processing speeds do not match during some period? Ideally, if producers generate data faster than consumers can process it, then once the produced data accumulates past a certain point the producers must pause and wait (i.e. their threads block) until the consumers have worked through the backlog, and vice versa.

  Before the concurrent package was released, every programmer had to control these details personally in a multi-threaded environment, especially with regard to efficiency and thread safety, which added considerable complexity to our programs. Fortunately, the powerful concurrent package came along, and with it the powerful BlockingQueue. (In the multi-threading world, "blocking" means that under certain conditions a thread is suspended, and once the conditions are met the suspended thread is automatically woken up.) The following two figures illustrate the two common blocking scenarios of BlockingQueue:

  As shown in the first figure: when there is no data in the queue, all threads on the consumer side are automatically blocked (suspended) until data is put into the queue.

  As shown in the second figure: when the queue is full of data, all threads on the producer side are automatically blocked (suspended) until a slot opens up in the queue, at which point the threads are automatically woken up.

  This is why we need BlockingQueue in a multi-threaded environment. As users of BlockingQueue, we no longer have to worry about when to block a thread and when to wake it up, because BlockingQueue handles all of that for us. Since BlockingQueue is so powerful, let's take a look at its common methods:

3. The core methods of BlockingQueue

  1. Inserting data

    (1) offer(anObject): adds anObject to the BlockingQueue if possible; that is, if the queue can accommodate it, this returns true, otherwise it returns false. (This method never blocks the calling thread.)

    (2) offer(E o, long timeout, TimeUnit unit): a waiting time can be specified. If the element cannot be added to the BlockingQueue within the specified time, the method returns false.

    (3) put(anObject): adds anObject to the BlockingQueue. If the queue has no free space, the calling thread is blocked until space becomes available.
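The contrast between the three insertion methods can be sketched on a deliberately tiny bounded queue (the class name InsertDemo and the 100 ms timeout are illustrative, not from the original article):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class InsertDemo {
    public static boolean[] demo() throws InterruptedException {
        // A bounded queue with room for exactly one element
        BlockingQueue<String> queue = new ArrayBlockingQueue<>(1);

        boolean first = queue.offer("a");   // true: the queue has room
        boolean second = queue.offer("b");  // false: the queue is full, returns immediately
        // Timed offer: waits up to 100 ms for space to appear, then gives up
        boolean third = queue.offer("c", 100, TimeUnit.MILLISECONDS);
        // queue.put("d") would block this thread until a consumer removed an element

        return new boolean[] { first, second, third };
    }

    public static void main(String[] args) throws InterruptedException {
        boolean[] r = demo();
        System.out.println(r[0] + " " + r[1] + " " + r[2]);
    }
}
```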

  2. Retrieving data

    (1) poll(): retrieves and removes the object at the head of the BlockingQueue; if no element can be taken immediately, it returns null without blocking;

    (2) poll(long timeout, TimeUnit unit): retrieves and removes the head of the BlockingQueue. If data becomes available within the specified time, it is returned immediately; otherwise, once the timeout elapses with no data available, the method returns null.

    (3) take(): retrieves and removes the object at the head of the BlockingQueue. If the queue is empty, the calling thread blocks and waits until new data is added to the queue;

    (4) drainTo(): transfers all available elements out of the BlockingQueue at once (a maximum count can also be specified). This method improves efficiency because it does not need to acquire and release the lock repeatedly for each element.
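A minimal sketch of the retrieval methods (the class name RetrieveDemo and the 100 ms timeout are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class RetrieveDemo {
    public static List<String> demo() throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        queue.put("a");
        queue.put("b");
        queue.put("c");

        String head = queue.take();  // "a"; would block if the queue were empty

        List<String> drained = new ArrayList<>();
        queue.drainTo(drained);      // moves "b" and "c" out in a single bulk operation

        // The queue is now empty: a timed poll waits 100 ms, then returns null
        String leftover = queue.poll(100, TimeUnit.MILLISECONDS);

        List<String> result = new ArrayList<>();
        result.add(head);
        result.addAll(drained);
        result.add(leftover == null ? "null" : leftover);
        return result;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(demo());
    }
}
```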

4. Common BlockingQueue implementations

  After understanding the basic functions of BlockingQueue, let's take a look at the members of the BlockingQueue family.

  1. ArrayBlockingQueue

  An array-based blocking queue implementation. Internally, ArrayBlockingQueue maintains a fixed-length array to buffer the data objects in the queue. This is a commonly used blocking queue. Besides the fixed-length array, ArrayBlockingQueue also holds two integer variables that mark the positions of the queue head and tail within the array.

  ArrayBlockingQueue shares a single lock object between producers putting data and consumers taking data, which means the two can never truly run in parallel; this is notably different from LinkedBlockingQueue. In principle, ArrayBlockingQueue could use separate locks to allow producer and consumer operations to run fully in parallel. The reason Doug Lea did not do so is probably that the read and write operations of ArrayBlockingQueue are already lightweight enough that introducing a separate locking mechanism would add extra complexity to the code without yielding any performance benefit. Another notable difference between ArrayBlockingQueue and LinkedBlockingQueue is that the former creates and destroys no extra object instances when inserting or removing elements, while the latter creates an extra Node object for each element. In a system that must process large volumes of data concurrently over a long period, this makes a measurable difference in GC pressure. Finally, when creating an ArrayBlockingQueue, we can also choose whether its internal lock is fair; by default it is unfair.
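The fairness choice mentioned above is made in the constructor; a small sketch (class name FairnessDemo is illustrative):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class FairnessDemo {
    public static void main(String[] args) {
        // One-argument constructor: capacity 100, unfair internal lock (the default)
        BlockingQueue<Integer> unfair = new ArrayBlockingQueue<>(100);

        // Second argument true requests a fair lock:
        // blocked producers and consumers acquire it in FIFO order
        BlockingQueue<Integer> fair = new ArrayBlockingQueue<>(100, true);

        System.out.println(unfair.remainingCapacity()); // 100
        System.out.println(fair.remainingCapacity());   // 100
    }
}
```

Fairness reduces throughput somewhat but prevents any single blocked thread from waiting indefinitely behind newer arrivals.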

  2. LinkedBlockingQueue

  A linked-list-based blocking queue. Like ArrayBlockingQueue, it maintains an internal data buffer (here composed of a linked list). When a producer puts a piece of data into the queue, the queue takes the data from the producer, caches it internally, and the producer returns immediately. Only when the buffer reaches its maximum capacity (LinkedBlockingQueue lets you specify this value through the constructor) does the producer block, until a consumer takes a piece of data from the queue and the producer thread is woken up; the consumer side works on the same principle in reverse. The reason LinkedBlockingQueue handles concurrent data so efficiently is that it uses separate locks for the producer side and the consumer side to control data synchronization, which means that under high concurrency, producers and consumers can operate on the queue's data in parallel, improving the concurrent performance of the whole queue.

  As developers, we need to be aware that if a LinkedBlockingQueue is constructed without specifying a capacity, it defaults to an effectively unbounded capacity (Integer.MAX_VALUE). In that case, if producers are faster than consumers, the system's memory may be exhausted long before the queue ever fills up and blocks.
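This pitfall can be checked directly via remainingCapacity() (the class name CapacityDemo and the bound of 1024 are illustrative):

```java
import java.util.concurrent.LinkedBlockingQueue;

public class CapacityDemo {
    public static void main(String[] args) {
        // No capacity argument: the queue is effectively unbounded
        LinkedBlockingQueue<String> unbounded = new LinkedBlockingQueue<>();
        System.out.println(unbounded.remainingCapacity() == Integer.MAX_VALUE); // true

        // Safer: always pass a bound sized for your workload,
        // so back-pressure blocks producers before memory runs out
        LinkedBlockingQueue<String> bounded = new LinkedBlockingQueue<>(1024);
        System.out.println(bounded.remainingCapacity()); // 1024
    }
}
```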

  ArrayBlockingQueue and LinkedBlockingQueue are the two most common and most frequently used blocking queues. In general, these two classes are sufficient for handling producer-consumer problems between multiple threads.

  The following code demonstrates how to use BlockingQueue:

  (1) Test class

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.LinkedBlockingQueue;

    public class BlockingQueueTest {

        public static void main(String[] args) throws InterruptedException {
            // Declare a bounded buffer queue with a capacity of 10
            BlockingQueue<String> queue = new LinkedBlockingQueue<String>(10);

            // Create three producers and one consumer
            Producer producer1 = new Producer(queue);
            Producer producer2 = new Producer(queue);
            Producer producer3 = new Producer(queue);
            Consumer consumer = new Consumer(queue);

            // Run them with the help of Executors
            ExecutorService service = Executors.newCachedThreadPool();
            service.execute(producer1);
            service.execute(producer2);
            service.execute(producer3);
            service.execute(consumer);

            // Let them run for 10 seconds
            Thread.sleep(10 * 1000);
            producer1.stop();
            producer2.stop();
            producer3.stop();

            Thread.sleep(2000);
            // Shut down the Executor
            service.shutdown();
        }
    }

  (2) Producer class

    import java.util.Random;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    /**
     * Producer thread
     *
     * @author jackyuj
     */
    public class Producer implements Runnable {

        private volatile boolean isRunning = true;                 // running flag
        private BlockingQueue<String> queue;                       // the blocking queue
        private static AtomicInteger count = new AtomicInteger(); // atomically updated counter
        private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;

        // Constructor
        public Producer(BlockingQueue<String> queue) {
            this.queue = queue;
        }

        public void run() {
            String data = null;
            Random r = new Random();

            System.out.println("Start producer thread!");
            try {
                while (isRunning) {
                    System.out.println("Producing data...");
                    // Sleep a random 0~DEFAULT_RANGE_FOR_SLEEP milliseconds
                    Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));

                    data = "data:" + count.incrementAndGet(); // atomically add 1 to count
                    System.out.println("Put data: " + data + " into the queue...");
                    // Wait up to 2 s; offer returns false if the data could not be added in time
                    if (!queue.offer(data, 2, TimeUnit.SECONDS)) {
                        System.out.println("Failed to put data: " + data);
                    }
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
                Thread.currentThread().interrupt();
            } finally {
                System.out.println("Exit the producer thread!");
            }
        }

        public void stop() {
            isRunning = false;
        }
    }

  (3) Consumer class

    import java.util.Random;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    /**
     * Consumer thread
     *
     * @author jackyuj
     */
    public class Consumer implements Runnable {

        private BlockingQueue<String> queue;
        private static final int DEFAULT_RANGE_FOR_SLEEP = 1000;

        // Constructor
        public Consumer(BlockingQueue<String> queue) {
            this.queue = queue;
        }

        public void run() {
            System.out.println("Start consumer thread!");
            Random r = new Random();
            boolean isRunning = true;
            try {
                while (isRunning) {
                    System.out.println("Getting data from the queue...");
                    // Take from the head if data is available; otherwise wait up to 2 s, then poll returns null
                    String data = queue.poll(2, TimeUnit.SECONDS);
                    if (null != data) {
                        System.out.println("Get data: " + data);
                        System.out.println("Consuming data: " + data);
                        Thread.sleep(r.nextInt(DEFAULT_RANGE_FOR_SLEEP));
                    } else {
                        // No data for over 2 s: assume all producers have exited, so the consumer exits too
                        isRunning = false;
                    }
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
                Thread.currentThread().interrupt();
            } finally {
                System.out.println("Exit the consumer thread!");
            }
        }
    }

  3. DelayQueue

  An element in a DelayQueue can be taken from the queue only after its specified delay has elapsed. DelayQueue is unbounded, so inserting data (the producer side) never blocks; only retrieving data (the consumer side) blocks.

  Usage scenarios:

  DelayQueue has few use cases, but the ones it has are quite clever. A common example is using a DelayQueue to manage a queue of connections that have timed out without a response.
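Elements of a DelayQueue must implement the Delayed interface; a minimal sketch (the class name DelayedTask and the 50/200 ms delays are illustrative):

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayedTask implements Delayed {
    private final String name;
    private final long expiresAtNanos;

    public DelayedTask(String name, long delayMillis) {
        this.name = name;
        this.expiresAtNanos = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(delayMillis);
    }

    @Override
    public long getDelay(TimeUnit unit) {
        // Remaining delay; a value <= 0 means the element is ready to be taken
        return unit.convert(expiresAtNanos - System.nanoTime(), TimeUnit.NANOSECONDS);
    }

    @Override
    public int compareTo(Delayed other) {
        // Order elements by remaining delay so the soonest-expiring one is the head
        return Long.compare(getDelay(TimeUnit.NANOSECONDS), other.getDelay(TimeUnit.NANOSECONDS));
    }

    public String name() { return name; }

    public static String[] demo() throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<>();
        queue.put(new DelayedTask("expires-later", 200));
        queue.put(new DelayedTask("expires-first", 50));
        // take() blocks until the head's delay elapses, so insertion order does not matter
        return new String[] { queue.take().name(), queue.take().name() };
    }

    public static void main(String[] args) throws InterruptedException {
        for (String name : demo()) {
            System.out.println(name);
        }
    }
}
```

For the connection-timeout scenario, each connection would be wrapped in such a Delayed element whose delay is the timeout deadline; a reaper thread simply loops on take() and closes whatever comes out.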

  4. PriorityBlockingQueue

  A priority-based blocking queue (priority ordering is determined by the Comparator passed to the constructor). Note that PriorityBlockingQueue never blocks data producers; it only blocks data consumers when no consumable data is available. Take special care when using it: producers must not produce data faster than consumers can consume it, or over time all available heap space will eventually be exhausted. In its implementation, PriorityBlockingQueue's internal synchronization lock is a fair lock.
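A minimal sketch of a Comparator-ordered PriorityBlockingQueue (the class name PriorityDemo and the length-based ordering are illustrative):

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityDemo {
    public static String demo() {
        // Shorter strings are treated as higher priority (an arbitrary example ordering)
        PriorityBlockingQueue<String> queue =
                new PriorityBlockingQueue<>(10, Comparator.comparingInt(String::length));

        queue.put("ccc"); // put never blocks: the queue grows as needed
        queue.put("a");
        queue.put("bb");

        return queue.poll(); // returns the highest-priority (shortest) element
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```

Note that the 10 passed to the constructor is only an initial capacity hint, not a bound; this is exactly why the producer side is never blocked.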

  5. SynchronousQueue

   An unbuffered waiting queue, similar to a direct transaction with no middleman; it is a bit like producers and consumers in a primitive society. The producer takes a product to market and sells it directly to the product's final consumer, and the consumer must go to the market in person to find the direct producer of the product they want. If either party fails to find a suitable match, then sorry, everyone waits at the market. Compared with a buffered BlockingQueue, there is no intermediate dealer link (the buffer). With a dealer, the producer wholesales products to the dealer without caring which consumers the dealer eventually sells them to, and since the dealer can stock some goods, the dealer model usually achieves higher overall throughput than direct trading (goods can be bought and sold in batches). On the other hand, introducing a dealer adds an extra hop between producer and consumer, so the timely response for any single product may suffer.

  A SynchronousQueue can be declared in two different ways, with different behavior between them: fair mode and unfair mode.

  In fair mode: SynchronousQueue uses a fair lock together with a FIFO queue to block the surplus producers and consumers, achieving an overall fairness strategy for the system.

  In unfair mode (the SynchronousQueue default): SynchronousQueue uses an unfair lock together with a LIFO stack to manage the surplus producers and consumers. In this mode, if there is a gap between producer and consumer processing speeds, starvation becomes likely; that is, some producers' or consumers' data may never be processed.
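The no-buffer behavior and the fairness flag can be sketched as follows (the class name SynchronousDemo and the 1-second timeout are illustrative):

```java
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.TimeUnit;

public class SynchronousDemo {
    public static boolean[] demo() throws InterruptedException {
        // true selects fair (FIFO) mode; the no-argument constructor is unfair (LIFO)
        SynchronousQueue<String> queue = new SynchronousQueue<>(true);

        // There is no buffer: a plain offer fails unless a consumer is already waiting
        boolean noConsumer = queue.offer("item");

        // With a consumer blocked in take(), a timed offer succeeds by direct hand-off
        Thread consumer = new Thread(() -> {
            try {
                queue.take();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();
        boolean handedOff = queue.offer("item", 1, TimeUnit.SECONDS);
        consumer.join();

        return new boolean[] { noConsumer, handedOff };
    }

    public static void main(String[] args) throws InterruptedException {
        boolean[] r = demo();
        System.out.println(r[0] + " " + r[1]);
    }
}
```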

5. Summary

  BlockingQueue not only implements the complete functionality of a basic queue, but also automatically manages the waiting and waking of multiple threads in a multi-threaded environment, allowing programmers to ignore those details and focus on higher-level functionality.
