Java: Quickly Understand the Basic Principles of the Thread Pool


Foreword

       When it comes to thread pools, I am sure they are no stranger to you: they are one of the must-ask topics in interviews, especially at companies with demanding "high concurrency" requirements, which will almost certainly touch on them. There are already many articles and videos online about thread pools; this one is designed to help you quickly understand and master the basic principles of the thread pool, without going deeply into advanced usage.



Table of Contents

  1. Concurrent Queues
  2. Introduction to the Thread Pool
  3. Why We Need a Thread Pool
  4. Principles of the Thread Pool
  5. Classification of Thread Pools



1. Concurrent Queues

1. The concept of a concurrent queue

       A concurrent queue is an unbounded, thread-safe queue based on linked nodes. It orders elements FIFO (first-in, first-out): when we add an element, it is appended to the tail of the queue; when we retrieve an element, the element at the head of the queue is returned.
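       For instance, ConcurrentLinkedQueue in java.util.concurrent is exactly this kind of queue; a minimal sketch of its FIFO behavior:

import java.util.concurrent.ConcurrentLinkedQueue;

public class ConcurrentQueueDemo {
    public static void main(String[] args) {
        // An unbounded, thread-safe, linked-node FIFO queue
        ConcurrentLinkedQueue<String> queue = new ConcurrentLinkedQueue<>();

        // Elements are added at the tail of the queue
        queue.offer("task-1");
        queue.offer("task-2");

        // Elements are taken from the head of the queue (FIFO order)
        System.out.println(queue.poll()); // task-1
        System.out.println(queue.poll()); // task-2
        System.out.println(queue.poll()); // null: the queue is empty
    }
}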

2. Classification of concurrent queues

       Concurrent queues are divided into blocking queues and non-blocking queues. The following example illustrates the difference: suppose a conventional queue of length 10, and 11 elements that need to be enqueued.
(Figure: schematic of a length-10 queue receiving 11 elements)

The difference between the two kinds of queue
  1. When enqueuing
    Non-blocking queue: when element 10 enters the queue, the queue is full; if element 11 is then enqueued, that element is lost.

    Blocking queue: when the queue is full, the enqueue operation waits; once an element is dequeued and space frees up, element 11 can enter.

  2. When dequeuing
    Non-blocking queue: if the queue has no elements, a dequeue operation at this point returns null.

    Blocking queue: when there is no element in the queue, a dequeue operation waits; once an element is put in, it can be taken out.

       Note in particular that the thread pool is implemented on top of a blocking queue.
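
       A minimal sketch of the two behaviors using ArrayBlockingQueue, whose offer/poll methods behave like a non-blocking queue while put/take block:

import java.util.concurrent.ArrayBlockingQueue;

public class QueueBehaviorDemo {
    public static void main(String[] args) throws InterruptedException {
        // A bounded queue with capacity 1
        ArrayBlockingQueue<Integer> queue = new ArrayBlockingQueue<>(1);

        // Non-blocking style: offer() fails immediately when the queue is full,
        // poll() returns null immediately when the queue is empty
        System.out.println(queue.offer(1)); // true
        System.out.println(queue.offer(2)); // false: queue is full, element is dropped
        System.out.println(queue.poll());   // 1
        System.out.println(queue.poll());   // null: queue is empty

        // Blocking style: put() waits until space is available,
        // take() waits until an element is available
        queue.put(3);                     // succeeds immediately, the queue is empty
        System.out.println(queue.take()); // 3
    }
}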



2. Introduction to the Thread Pool


       A thread pool is a form of multithreaded processing in which tasks are added to a queue. The pool creates a number of idle threads when the system starts; when the program passes a task to the thread pool, the pool starts (or reuses) a thread to execute it. After execution finishes, the thread does not die; instead it returns to the pool, becomes idle again, and waits for the next task.


       In simple terms, a thread pool is a collection of threads.




3. Why We Need a Thread Pool


The normal life cycle of a thread is shown below:
(Figure: thread life cycle, with an assumed time cost for each stage)
       For the sake of analysis, assume each stage takes the time shown in the figure (in reality each stage takes only milliseconds). If we could skip the other stages and let each thread do nothing but run tasks, a single thread could save about five seconds per task. This is exactly what the thread pool gives us: the pool creates a number of idle threads in advance, so a task entering the pool can be executed directly by a pool thread, which releases its resources when it finishes and moves on to the next task.
       For example: suppose 100 tasks need to be processed and at most 10 threads can be created at a time. Creating plain threads, 10 at a time, the 100 tasks take about 60 seconds in total, whereas with a thread pool, executing 10 tasks at a time, they take about 10 seconds (using the per-stage times assumed in the figure).
       To sum up: a thread pool has a clear advantage in systems that handle large numbers of tasks with high concurrency.



4. Principles of the Thread Pool


1. The ThreadPoolExecutor core class

       The topmost interface of the thread pool is Executor, which defines one core method, execute(Runnable command), used to submit a task; ThreadPoolExecutor ultimately implements it. ThreadPoolExecutor is the core thread pool class, and its constructors are as follows:


public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue<Runnable> workQueue);

public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue<Runnable> workQueue, ThreadFactory threadFactory);

public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue<Runnable> workQueue, RejectedExecutionHandler handler);

public ThreadPoolExecutor(int corePoolSize, int maximumPoolSize, long keepAliveTime, TimeUnit unit, BlockingQueue<Runnable> workQueue, ThreadFactory threadFactory, RejectedExecutionHandler handler);


The meaning of each parameter:

Parameter — Meaning
corePoolSize — the number of core threads, i.e. the base size of the pool
maximumPoolSize — the maximum number of threads the pool may hold
keepAliveTime — the maximum idle time a non-core thread may survive while waiting for work
unit — the time unit of keepAliveTime
workQueue — the blocking queue that stores tasks waiting to be executed
threadFactory — the thread factory used to create new threads
handler — the rejection policy, applied when the number of submitted tasks exceeds the maximum pool size plus the queue capacity

More specifically:
workQueue is generally one of the following three blocking queues:
SynchronousQueue: a direct hand-off queue (used, for example, by Executors.newCachedThreadPool())
ArrayBlockingQueue: a bounded queue
LinkedBlockingQueue: an unbounded queue
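
As a quick sketch, here is how each of these queues could be constructed (the capacities shown are illustrative):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class WorkQueueChoices {
    // Direct hand-off: holds no elements, each insert waits for a matching take
    BlockingQueue<Runnable> handoff = new SynchronousQueue<>();

    // Bounded queue: at most 10 waiting tasks
    BlockingQueue<Runnable> bounded = new ArrayBlockingQueue<>(10);

    // Unbounded queue: tasks queue up without limit (capacity Integer.MAX_VALUE)
    BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>();
}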

threadFactory is the factory the pool uses whenever it needs a new worker thread, for example when the queue is full but the total number of threads is still below maximumPoolSize. If none is supplied, Executors.defaultThreadFactory() is used.
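
As an illustration only, a minimal custom ThreadFactory that gives worker threads a recognizable name (the "worker-" prefix is just an example):

import java.util.concurrent.ThreadFactory;
import java.util.concurrent.atomic.AtomicInteger;

public class NamedThreadFactory implements ThreadFactory {
    private final AtomicInteger counter = new AtomicInteger(1);

    @Override
    public Thread newThread(Runnable r) {
        // Called by the pool whenever it needs a new worker thread
        Thread t = new Thread(r, "worker-" + counter.getAndIncrement());
        t.setDaemon(false); // keep workers as normal (non-daemon) threads
        return t;
    }
}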

When the handler is triggered, there are four built-in rejection policies:
ThreadPoolExecutor.AbortPolicy (the default): discards the task and throws a RejectedExecutionException.
ThreadPoolExecutor.DiscardPolicy: discards the task silently, without throwing an exception.
ThreadPoolExecutor.DiscardOldestPolicy: discards the oldest task at the head of the queue, then retries submitting the new task (repeating this process if necessary).
ThreadPoolExecutor.CallerRunsPolicy: the task is executed by the thread that called execute().
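
Putting all seven parameters together, here is a sketch of the fully specified constructor; the concrete sizes and the choice of CallerRunsPolicy are illustrative, not prescriptive:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FullConstructorDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                        // corePoolSize
                4,                                        // maximumPoolSize
                60, TimeUnit.SECONDS,                     // keepAliveTime for non-core threads
                new ArrayBlockingQueue<>(10),             // bounded work queue
                Executors.defaultThreadFactory(),         // thread factory
                new ThreadPoolExecutor.CallerRunsPolicy() // rejected tasks run on the caller thread
        );

        for (int i = 0; i < 20; i++) {
            final int id = i;
            pool.execute(() ->
                    System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}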



2. Thread Pool Schematic

(Figure: thread pool schematic)

3. A thread pool example

       Next, a simple example, read together with the schematic above, to understand the basic principles of the thread pool:


import java.util.concurrent.LinkedBlockingDeque;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class test02 {
    public static void main(String[] args) {
        ThreadPoolExecutor pool =
        new ThreadPoolExecutor(1, 2, 3, TimeUnit.SECONDS, new LinkedBlockingDeque<>(3));
        // Use the threads in the pool to start executing tasks
        // Execute the first task
        pool.execute(new TestThread());

        // Three tasks wait in the queue
        pool.execute(new TestThread());
        pool.execute(new TestThread());
        pool.execute(new TestThread());

        // Execute the fifth task
        pool.execute(new TestThread());
        // Executing a sixth task would be rejected and throw an exception
        //pool.execute(new TestThread());
        // The pool now holds 2 threads: 1 core thread + 1 newly created thread = maximumPoolSize

        // Shut down the thread pool
        pool.shutdown();
    }
}

class TestThread implements Runnable {
    @Override
    public void run() {
        // Print the name of the worker thread executing this task
        System.out.println(Thread.currentThread().getName());
    }
}


       We start by creating a simple thread pool, using the constructor with only five parameters; their meanings are:
1: the number of core threads
2: the maximum number of threads
3: the idle time, i.e. how long a newly created (non-core) thread may wait for a new task after finishing one
TimeUnit.SECONDS: the time unit, here seconds
new LinkedBlockingDeque<>(3): a blocking queue of capacity 3





The execution result when the sixth task is not submitted:

(Figure: console output without the sixth task)


The execution result when the sixth task is submitted:
(Figure: console output when the sixth task is rejected)

Analysis of the code execution:

(Figure: analysis schematic)
       Initially the pool has only one core thread, thread1. The first task submitted to the pool is executed by thread1, while tasks 2 to 4 wait in the queue for execution. When task 5 is submitted, the queue is full but core + 1 <= maximum, so a new thread, thread2, is created; thread1 and thread2 then share the tasks, which the output confirms.

       When task 6 is added as well, the queue is already full and the thread count has reached the maximum, so there is no spare thread to execute it and the queue cannot hold it; the rejection policy is triggered and the task is rejected with an error.
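
       To observe this behavior directly, the pool's state can be printed after each submission; a small sketch using ThreadPoolExecutor's getPoolSize() and getQueue() accessors:

import java.util.concurrent.LinkedBlockingDeque;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolStateDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool =
                new ThreadPoolExecutor(1, 2, 3, TimeUnit.SECONDS, new LinkedBlockingDeque<>(3));

        for (int i = 1; i <= 5; i++) {
            pool.execute(() -> {
                try {
                    Thread.sleep(100); // keep the worker busy so tasks pile up in the queue
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            // Watch the thread count grow from 1 to 2 and the queue fill up to 3
            System.out.println("after task " + i
                    + ": threads=" + pool.getPoolSize()
                    + ", queued=" + pool.getQueue().size());
        }
        pool.shutdown();
    }
}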



5. Classification of Thread Pools


Thread pools can be divided into the following four categories (a short usage sketch follows the list):


1. Cached: newCachedThreadPool
  • Role: creates a thread pool that creates new threads as needed, but reuses previously constructed threads when they become available.

  • Features: flexible; the maximum number of threads is Integer.MAX_VALUE, so the pool is effectively unbounded.

2. Fixed-size: newFixedThreadPool
  • Role: creates a thread pool that reuses a fixed number of threads, which take their work from a shared unbounded queue.

  • Features: the number of threads is fixed, so the degree of concurrency can be controlled well.

3. Scheduled: newScheduledThreadPool
  • Role: creates a thread pool whose tasks can run after a delay or periodically.

  • Features: a pool with a specified number of threads that supports delayed and periodic execution, suitable for timed and recurring tasks.

4. Single-threaded: newSingleThreadExecutor
  • Role: creates a thread pool with only one thread, whose keep-alive time is effectively unlimited; when the thread is busy, new tasks wait in an unbounded blocking queue.
  • Features: suitable for scenarios where tasks must execute one at a time, in order.
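
A short usage sketch of the four factory methods (the pool sizes and delay are arbitrary examples):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ExecutorsDemo {
    public static void main(String[] args) {
        Runnable task = () -> System.out.println(Thread.currentThread().getName());

        // 1. Cached: creates threads on demand and reuses idle ones
        ExecutorService cached = Executors.newCachedThreadPool();
        cached.execute(task);
        cached.shutdown();

        // 2. Fixed-size: exactly 3 worker threads share an unbounded queue
        ExecutorService fixed = Executors.newFixedThreadPool(3);
        fixed.execute(task);
        fixed.shutdown();

        // 3. Scheduled: run a task after a 1-second delay
        ScheduledExecutorService scheduled = Executors.newScheduledThreadPool(2);
        scheduled.schedule(task, 1, TimeUnit.SECONDS);
        scheduled.shutdown();

        // 4. Single-threaded: tasks execute one at a time, in submission order
        ExecutorService single = Executors.newSingleThreadExecutor();
        single.execute(task);
        single.shutdown();
    }
}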


