Common Queue introduction

 

 

 

1. Common methods of Queue:

remove: removes and returns the head of the queue; throws a NoSuchElementException if the queue is empty.

poll: removes and returns the head of the queue; returns null if the queue is empty.

peek: returns the head of the queue without removing it; returns null if the queue is empty.

add: inserts an element into the queue; throws an exception if the queue is full (for a bounded queue).

offer: inserts an element into the queue; returns false if the queue is full.

put: inserts an element into the queue; blocks while the queue is full, until space becomes available (BlockingQueue only).

take: removes and returns the head of the queue; blocks while the queue is empty, until an element becomes available (BlockingQueue only).
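
For example, here is a minimal sketch (the class name T_QueueMethods and the capacity of 2 are only for illustration) contrasting the methods that throw an exception with the ones that return a special value:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class T_QueueMethods {

    public static void main(String[] args) {
        // A bounded queue with capacity 2, used only to demonstrate the "full" cases.
        BlockingQueue<String> q = new ArrayBlockingQueue<>(2);

        System.out.println(q.offer("a")); // true
        System.out.println(q.offer("b")); // true
        System.out.println(q.offer("c")); // false - the queue is full, no exception
        // q.add("c");                    // would throw IllegalStateException: Queue full

        System.out.println(q.peek());     // a - the head is not removed
        System.out.println(q.poll());     // a - the head is removed
        System.out.println(q.remove());   // b

        System.out.println(q.poll());     // null - the queue is empty
        // q.remove();                    // would throw NoSuchElementException
    }
}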

 

PriorityQueue, the priority queue. A priority queue guarantees that the element taken out each time is the smallest element currently in the queue.

In Java, PriorityQueue implements the Queue interface and does not allow null elements. It is implemented as a heap, specifically a min-heap built on a complete binary tree (the value of any non-leaf node is not greater than the values of its left and right child nodes), which means an array can be used as the underlying storage of PriorityQueue.

 

 

Consider the following case:

 

import java.util.PriorityQueue;

public class T_PriorityQueque {

 

    public static void main(String[] args) {

 

        PriorityQueue<String> q = new PriorityQueue<>();

        q.add("c");

        q.add("e");

        q.add("a");

        q.add("d");

        q.add("z");

 

        for (int i = 0; i <5 ; i++) {

            System.out.println(q.poll());

        }

 

    }

}

 

result:

a
c
d
e
z

ConcurrentLinkedQueue

 

An unbounded thread-safe queue based on linked nodes. This queue orders elements according to the FIFO (first-in, first-out) principle: the element at the head of the queue is the one that has been in the queue the longest, and the element at the tail is the one that has been in the queue for the shortest time.
New elements are inserted at the tail of the queue, and retrieval operations obtain elements from the head. When many threads share access to a common collection, ConcurrentLinkedQueue is an appropriate choice. This queue does not allow null elements.

ConcurrentLinkedQueue consists of a head node and a tail node. Each node (Node) is composed of a node element (item) and a reference to the next node (next). Nodes are linked to one another through this next reference, forming the linked-list structure of the queue.

 

Simple to use:

 

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

public class T_ConcurrentQueue {

 

    public static void main(String[] args) {

        Queue<String> strs = new ConcurrentLinkedQueue<>();

 

        for (int i=0;i<10;i++){

            strs.offer("a"+i); // add; ConcurrentLinkedQueue is unbounded, so offer always succeeds

        }

        System.out.println(strs);

 

        System.out.println(strs.size());

 

        System.out.println(strs.poll());

 

        System.out.println(strs.peek());

 

        System.out.println(strs.size());

 

    }

}

 

 

result:

[a0, a1, a2, a3, a4, a5, a6, a7, a8, a9]
10
a0
a1
9

BlockingQueue

BlockingQueue is a blocking queue. It is used heavily in high-concurrency scenarios, for example in thread pools: once the number of running threads has reached the core pool size, newly submitted tasks are placed into a BlockingQueue. The defining characteristic of the queue itself is easy to understand: first in, first out.
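
As a minimal sketch of that thread-pool usage (the class name T_PoolQueue and all pool parameters here are chosen only for illustration), the work queue handed to a ThreadPoolExecutor is a BlockingQueue, and submitted tasks wait in it while all core threads are busy:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class T_PoolQueue {

    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                             // core pool size
                4,                             // maximum pool size
                60, TimeUnit.SECONDS,          // keep-alive time for non-core threads
                new ArrayBlockingQueue<>(10)); // tasks queue here once the core threads are busy

        for (int i = 0; i < 8; i++) {
            final int n = i;
            pool.execute(() -> System.out.println(Thread.currentThread().getName() + " runs task " + n));
        }

        System.out.println("tasks waiting in the queue: " + pool.getQueue().size()); // snapshot only; depends on timing
        pool.shutdown();
    }
}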

 

ArrayBlockingQueue

 

Simple to use:

 

import java.util.Random;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class T_ArrayBlockingQueue {

 

    static BlockingQueue<String> strs = new ArrayBlockingQueue<>(18); // bounded queue with a capacity of 18

 

    static Random r = new Random();

 

    public static void main(String[] args) throws InterruptedException {

        for (int i = 0; i < 10; i++) {

            strs.put("a" + i); // put would block here if the queue were full

        }

 

        strs.offer("aaa", 1, TimeUnit.SECONDS); // waits up to 1 second for free space; the queue still has room here, so it succeeds immediately

 

        System.out.println(strs);

 

    }

}

 

 

result:

[a0, a1, a2, a3, a4, a5, a6, a7, a8, a9, aaa]
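
The queue above still has spare capacity, so nothing ever blocks. Below is a small sketch (the capacity of 2 and the class name T_BoundedPut are just illustrative) in which put actually blocks until a consumer thread frees a slot:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

public class T_BoundedPut {

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> q = new ArrayBlockingQueue<>(2);

        q.put("a0");
        q.put("a1");

        // Free one slot after a second so the blocked put below can proceed.
        new Thread(() -> {
            try {
                TimeUnit.SECONDS.sleep(1);
                System.out.println("taken: " + q.take());
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start();

        q.put("a2"); // blocks here until the consumer thread takes "a0"
        System.out.println(q); // [a1, a2]
    }
}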

LinkedBlockingQueue

 

LinkedBlockingQueue is also a blocking queue. What is the difference from ArrayBlockingQueue?

LinkedBlockingQueue stores its elements as a linked list: an internal Node class has a member variable Node next, so the nodes form a linked structure and the next element is reached simply through next. ArrayBlockingQueue stores its elements in an array. Also, LinkedBlockingQueue is optionally bounded: created with the no-argument constructor, its capacity is Integer.MAX_VALUE, so it is effectively unbounded.

LinkedBlockingQueue uses two separate internal locks, one for inserts (put) and one for removals (take), whereas ArrayBlockingQueue shares a single lock for both reads and writes.

 

Simple to use:

 

 

import java.util.Random;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class T_LinkedBlockingQueue {

 

    static BlockingQueue<String> strs = new LinkedBlockingQueue<>(); // effectively unbounded (capacity Integer.MAX_VALUE)

 

    static Random r = new Random();

 

    public static void main(String[] args) {

        new Thread(() -> {

 

            for (int i = 0; i < 10; i++) {

 

                try {

                    strs.put("a" + i); // put would block if the queue were full

                    TimeUnit.MILLISECONDS.sleep(r.nextInt(1000));

                } catch (InterruptedException e) {

                    e.printStackTrace();

                }

            }

 

        }, "p1").start();

 

        for (int i = 0; i < 5; i++) {

            new Thread(() -> {

 

                for (; ; ) {

                    try {

                        System.out.println(Thread.currentThread().getName() + "take -" + strs.take()); //If it is empty, it will wait

                    } catch (InterruptedException e) {

                        e.printStackTrace();

                    }

                }

 

            }, "c" + i).start();

        }

 

 

    }

 

 

}

 

result:

 

 

 

SynchronousQueue

SynchronousQueue is a kind of BlockingQueue, so SynchronousQueue is thread-safe. The difference between SynchronousQueue and the other BlockingQueues is that the capacity of SynchronousQueue is 0; that is, SynchronousQueue does not store any elements.

In other words, every insert operation on a SynchronousQueue must wait for a remove operation from another thread, and every remove operation must wait for an insert operation from another thread.

This behaviour is reminiscent of Exchanger. Unlike Exchanger, however, SynchronousQueue hands an object across in one direction only: one thread puts the object, and the other thread takes it.

 

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class T_SynchronusQueue {

 

    public static void main(String[] args) throws InterruptedException {

        BlockingQueue<String> strs = new SynchronousQueue<>();

        new Thread(()->{

            try {

                System.out.println(strs.take());

            } catch (InterruptedException e) {

                e.printStackTrace();

            }

 

        }).start();

 

        strs.put("aaa"); // blocks until another thread takes the element

        System.out.println(strs.size()); // always 0: a SynchronousQueue never holds elements

    }

}

 

 

result:

 

 

 

DelayQueue

 

DelayQueue is an unbounded BlockingQueue used to hold objects that implement the Delayed interface; an object can only be taken from the queue once its delay has expired. The queue is ordered: the element at the head is the one whose delay expires first (the one that has been expired the longest). Note: null elements cannot be put into this queue.

 

Simple to use:

import java.util.Random;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class T_DelayQueue {

 

    static BlockingQueue<MyTask> tasks = new DelayQueue<>();

 

    static Random r = new Random();

 

    static class MyTask implements Delayed{

 

        String name;

        long runningTime;

 

        MyTask(String name,long rt){

            this.name = name;

            this.runningTime = rt;

        }

 

        @Override

        public long getDelay(TimeUnit unit) {

            return unit.convert(runningTime-System.currentTimeMillis(),TimeUnit.MILLISECONDS);

        }

 

        @Override

        public int compareTo(Delayed o) {

           if(this.getDelay(TimeUnit.MILLISECONDS)<o.getDelay(TimeUnit.MILLISECONDS)){

               return -1;

           }else if(this.getDelay(TimeUnit.MILLISECONDS)>o.getDelay(TimeUnit.MILLISECONDS)){

               return 1;

           }else

               return 0;

        }

 

        @Override

        public String toString() {

            return "MyTask{" +

                    "name='" + name + '\'' +

                    ", runningTime=" + runningTime +

                    '}';

        }

    }

 

 

    public static void main(String[] args) throws InterruptedException {

        long now = System.currentTimeMillis();

        MyTask t1 = new MyTask("t1",now+1000);

        MyTask t2 = new MyTask("t2",now+2000);

        MyTask t3 = new MyTask("t3",now+1500);

        MyTask t4 = new MyTask("t4",now+2500);

        MyTask t5 = new MyTask("t5",now+500);

 

        tasks.put(t1);

        tasks.put(t2);

        tasks.put(t3);

        tasks.put(t4);

        tasks.put(t5);

 

        System.out.println(tasks);

 

        for (int i = 0; i <5 ; i++) {

            System.out.println(tasks.take());

        }

    }

 

}

 

result:

 

[MyTask{name='t5', runningTime=1592299436085}, MyTask{name='t1', runningTime=1592299436585}, MyTask{name='t3', runningTime=1592299437085}, MyTask{name='t4', runningTime=1592299438085}, MyTask{name='t2', runningTime=1592299437585}]

MyTask{name='t5', runningTime=1592299436085}

MyTask{name='t1', runningTime=1592299436585}

MyTask{name='t3', runningTime=1592299437085}

MyTask{name='t2', runningTime=1592299437585}

MyTask{name='t4', runningTime=1592299438085}

 

Process finished with exit code 0

 

 

DelayQueue application scenarios

1. Taobao orders: if an order is not paid within 30 minutes of being placed, it is cancelled automatically (a sketch of this case follows below).

2. Ele.me order notifications: send the user an SMS notification 60 seconds after an order is placed successfully.

3. Closing idle connections: a server has many client connections that need to be closed after they have been idle for some time.

4. Caching: objects that have exceeded their idle time need to be removed from the cache.

5. Task timeout handling: for example, in a network protocol's sliding-window request/response interaction, handling requests that have timed out without a response.
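
A minimal sketch of scenario 1, with a hypothetical OrderCancelTask type; the 30-minute timeout is shortened to a couple of seconds and the "cancellation" is just a println:

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class T_OrderTimeout {

    static class OrderCancelTask implements Delayed {

        final String orderId;
        final long expireAt; // absolute time (ms) at which the unpaid order should be cancelled

        OrderCancelTask(String orderId, long delayMillis) {
            this.orderId = orderId;
            this.expireAt = System.currentTimeMillis() + delayMillis;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(expireAt - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed o) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS), o.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<OrderCancelTask> pending = new DelayQueue<>();
        pending.put(new OrderCancelTask("order-1", 2000));
        pending.put(new OrderCancelTask("order-2", 1000));

        for (int i = 0; i < 2; i++) {
            OrderCancelTask t = pending.take(); // blocks until the earliest delay has expired
            System.out.println("cancel unpaid order " + t.orderId);
        }
    }
}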

 

LinkedTransferQueue

 

LinkedTransferQueue is a special blocking queue added to the java.util.concurrent (JUC) package in JDK 1.7. In addition to the usual blocking-queue operations, it has a special transfer method.

In an ordinary blocking queue, when the queue is empty, the consumer thread (the thread calling take or poll) blocks and waits for a producer thread to put elements into the queue. The transfer method of LinkedTransferQueue behaves differently:

  1. When a consumer thread is blocked and waiting, the producer thread that calls the transfer method will not store the element in the queue, but directly transfer the element to the consumer;
  2. If the producer thread calling the transfer method finds that there is no consumer thread waiting, it will enqueue the element and then block and wait until there is a consumer thread to obtain the element.

 

Simple to use:

 

import java.util.concurrent.LinkedTransferQueue;

public class T_TransferQueue {

 

    public static void main(String[] args) throws InterruptedException {

        LinkedTransferQueue<String> strs = new LinkedTransferQueue<>();

 

        new Thread(()->{

 

            try {

                System.out.println(strs.take());

            } catch (InterruptedException e) {

                e.printStackTrace();

            }

 

        }).start();

 

        strs.transfer("aaa"); // hands the element directly to a waiting consumer, or blocks until one arrives

    }

}

 

result:

aaa

 

 

Origin blog.csdn.net/huzhiliayanghao/article/details/106791733