ThreadPoolExecutor: thread pool theory, saturation strategies, and work queuing strategies

Source link: https://blog.csdn.net/wangmx1993328/article/details/80582803
Directory

Introduction

Thread pool overview

Executor framework structure

Benefits of using a thread pool

How a thread pool works

Thread pool saturation strategies

AbortPolicy

DiscardPolicy

DiscardOldestPolicy

User-defined rejection policy (most common)

Thread pool workflow chart

Work queuing strategies

SynchronousQueue

LinkedBlockingQueue

ArrayBlockingQueue

Introduction
This article covers the theoretical foundations of the Java thread pool.
How many ways are there to create a new thread in Java?
Extend Thread, or implement Runnable
A more advanced option: the thread pool
Thread pool overview
The thread pool was introduced in JDK 1.5. It is also known as the Executor framework, part of the Java concurrency framework.
The thread pool API lives in the java.util.concurrent package; the commonly used classes and interfaces are:
java.util.concurrent.Executor: an interface containing a single method; its abstract meaning is an executor that runs a Runnable task.
java.util.concurrent.ExecutorService: extends the Executor interface and adds many methods for managing the life cycle of tasks and of the executor itself.
java.util.concurrent.ThreadFactory: an interface for creating new threads. By implementing it, users can supply their own thread-creation logic to the thread pool.
java.util.concurrent.Executors: a utility class providing many practical factory methods that create and return instances of different executors, such as thread-pool-based executors.
java.util.concurrent.ThreadPoolExecutor: this class maintains a thread pool. When a task is submitted to this Executor, it is executed by an existing pooled thread rather than a newly created one, which can significantly reduce per-task overhead when running a huge number of short-lived tasks.
Executor framework structure
The Executor framework consists of three parts: tasks, task execution, and the results of asynchronous computation.
Tasks: a task to be executed must implement the required interface, namely Runnable or Callable.
Task execution: the core execution mechanism is the Executor interface, together with the ExecutorService interface that extends it. The framework has two key classes that implement ExecutorService: ThreadPoolExecutor and ScheduledThreadPoolExecutor.
Results of asynchronous computation: the Future interface and its implementation class, FutureTask.


Benefits of using a thread pool
Reduced resource consumption: reusing already-created threads avoids the cost of repeatedly creating and destroying threads.
Faster response: when a task arrives, it can run immediately without waiting for a thread to be created.
Better thread manageability: threads are a scarce resource. Creating them without limit not only consumes system resources but also reduces system stability; a thread pool allows unified allocation, monitoring, and tuning.
How a thread pool works
After a new task is submitted to the thread pool, the processing flow is as follows:
The pool checks whether the core pool is full. If it is not, a new core worker thread is created to execute the task; if the core pool is full, go to the second step.
The pool checks whether the work queue is full. If it is not, the newly submitted task is stored in the work queue to wait for execution; if the queue is full, go to the third step.
The pool checks whether all of its threads (including the non-core threads beyond the core pool) are busy. If not, a new worker thread is created to execute the task; if the pool is full, the task is handed to the saturation strategy.
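The three steps above can be sketched as a small decision function. This is a simplified illustration of the flow only, not the real JDK implementation (which also handles pool state transitions and locking); the class and method names are mine.

```java
public class SubmitFlow {
    // Simplified sketch of the submission flow described above.
    static String decide(int running, int core, int max, boolean queueFull) {
        if (running < core) {
            return "create core worker";       // step 1: core pool not full
        }
        if (!queueFull) {
            return "enqueue task";             // step 2: work queue has room
        }
        if (running < max) {
            return "create non-core worker";   // step 3: pool below maximum
        }
        return "apply saturation policy";      // pool and queue both full
    }

    public static void main(String[] args) {
        // core = 3, max = 4
        System.out.println(decide(1, 3, 4, false)); // create core worker
        System.out.println(decide(3, 3, 4, false)); // enqueue task
        System.out.println(decide(3, 3, 4, true));  // create non-core worker
        System.out.println(decide(4, 3, 4, true));  // apply saturation policy
    }
}
```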
Thread pool saturation strategies
The commonly used saturation strategies are listed below.
They are inner classes of the ThreadPoolExecutor class and can be used directly.


AbortPolicy
The default rejection policy of the Java thread pool: the new task is not executed, and a RejectedExecutionException (a runtime exception) is thrown. Remember to wrap ThreadPoolExecutor.execute() in a try/catch; otherwise the uncaught runtime exception will propagate and can terminate the program.
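A minimal sketch of catching the rejection: a pool with one thread and a queue of capacity one rejects the third blocking task under the default AbortPolicy. The class name and sizes are mine for illustration.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class AbortDemo {
    public static boolean submitAndCatch() {
        // 1 core thread, 1 max thread, queue of 1, default AbortPolicy.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(1));
        Runnable sleepy = () -> {
            try { Thread.sleep(1000); } catch (InterruptedException ignored) { }
        };
        pool.execute(sleepy);  // occupies the single worker thread
        pool.execute(sleepy);  // fills the queue
        boolean rejected = false;
        try {
            pool.execute(sleepy);  // third task: queue full, max reached
        } catch (RejectedExecutionException e) {
            rejected = true;       // AbortPolicy throws instead of queuing
        }
        pool.shutdownNow();
        return rejected;
    }

    public static void main(String[] args) {
        System.out.println(submitAndCatch()); // true
    }
}
```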
DiscardPolicy
Silently discards the new task without executing it; its rejectedExecution() is an empty method.
DiscardOldestPolicy
Discards the oldest task at the head of the work queue, then tries to submit the new task again.
User-defined denial policy (most common)
Implement the RejectedExecutionHandler interface and define your own rejection policy.
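As a sketch of a user-defined policy: the handler below counts and logs rejections instead of throwing. The class name and behavior are my own example, not a standard JDK policy.

```java
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class LoggingRejectionHandler implements RejectedExecutionHandler {
    final AtomicInteger rejectedCount = new AtomicInteger();

    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        // Custom policy: record the rejection and log it; the task is dropped.
        int n = rejectedCount.incrementAndGet();
        System.err.println("Task rejected, total so far: " + n);
    }

    public static void main(String[] args) {
        LoggingRejectionHandler handler = new LoggingRejectionHandler();
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new SynchronousQueue<Runnable>(), handler);
        Runnable sleepy = () -> {
            try { Thread.sleep(500); } catch (InterruptedException ignored) { }
        };
        pool.execute(sleepy);  // accepted: creates the single worker
        pool.execute(sleepy);  // rejected: no waiting thread, max reached
        System.out.println(handler.rejectedCount.get()); // 1
        pool.shutdownNow();
    }
}
```

Other common custom behaviors include running the task in the caller's thread or persisting it for a later retry.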
Thread pool workflow chart
The following steps walk through the ThreadPoolExecutor workflow shown in the original diagram (not reproduced here):

If fewer threads than corePoolSize (the number of core threads) are currently running, create a new thread to execute the task (note that this step requires acquiring the global lock).
If the number of running threads is equal to or greater than corePoolSize, add the task to the BlockingQueue (the blocking/task queue).
If the task cannot be added to the BlockingQueue (the queue is full), create a new non-core thread to handle the task (this step also requires acquiring the global lock).
If creating a new thread would push the number of running threads above maximumPoolSize, the task is rejected and the saturation strategy is applied, e.g. RejectedExecutionHandler.rejectedExecution() is called.
The overall design idea behind these steps in execute() is to avoid acquiring the global lock as much as possible (it would be a serious scalability bottleneck). Once the ThreadPoolExecutor has warmed up (the number of running threads is greater than or equal to corePoolSize), almost all calls to execute() take step 2, and step 2 does not need the global lock.
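The warm-up behavior can be observed directly: with an unbounded queue, once corePoolSize threads exist, every further submission goes through step 2 and the pool never grows. This is a small sketch; the class name and pool sizes are mine.

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class WarmupDemo {
    public static int poolSizeAfterWarmup() throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 1L, TimeUnit.MINUTES,
                new LinkedBlockingQueue<Runnable>());
        // Submit more tasks than corePoolSize. The first two submissions
        // create the core threads (step 1); all later ones are enqueued
        // (step 2), so the pool never grows past the core size.
        for (int i = 0; i < 10; i++) {
            pool.execute(() -> {
                try { Thread.sleep(100); } catch (InterruptedException ignored) { }
            });
        }
        int size = pool.getPoolSize();
        pool.shutdownNow();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return size;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(poolSizeAfterWarmup()); // 2
    }
}
```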
Work queuing strategies
As noted above, when the number of worker threads in the pool exceeds the number of core threads, newly added tasks are placed in the work queue to wait for execution.
To use a thread pool you create a ThreadPoolExecutor object. The ThreadPoolExecutor constructor specifies the work queue as a BlockingQueue<Runnable>; in practice you choose a concrete implementation of this interface, as shown below.
SynchronousQueue
Direct hand-off policy: the work queue stores no waiting tasks at all; each submitted task is handed directly to a thread for execution.
A good default choice for a work queue is SynchronousQueue, which hands tasks off to threads without otherwise holding them.
If no thread is immediately available to run a task, the attempt to queue it will fail, so a new thread will be constructed.
This policy avoids lockups when handling sets of requests that might have internal dependencies. Direct hand-offs generally require an unbounded maximumPoolSize to avoid rejecting newly submitted tasks.
The thread pool created by the Executors.newCachedThreadPool() method uses this queuing policy:
public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>());
}
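The "holds no tasks" behavior is easy to verify: offer() on a SynchronousQueue succeeds only if a consumer is already waiting in take(). A minimal sketch (the class name is mine):

```java
import java.util.concurrent.SynchronousQueue;

public class SyncQueueDemo {
    public static boolean offerWithoutConsumer() {
        // A SynchronousQueue has no capacity: offer() fails unless another
        // thread is already blocked in take(). Here nobody is waiting.
        SynchronousQueue<Runnable> queue = new SynchronousQueue<>();
        return queue.offer(() -> { });
    }

    public static void main(String[] args) {
        System.out.println(offerWithoutConsumer()); // false
    }
}
```

This failed offer() is exactly what makes ThreadPoolExecutor fall through to creating a new thread in a cached pool.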
LinkedBlockingQueue
Unbounded queue policy: the work queue has no size limit, so an unlimited number of waiting tasks can be added.
With an unbounded queue, new tasks wait in the queue whenever all corePoolSize threads are busy. The number of threads therefore never exceeds corePoolSize, and the maximumPoolSize value has no effect; for this reason corePoolSize is usually set equal to maximumPoolSize.
An unbounded queue is suitable when each task is completely independent of the others, i.e. task executions do not affect each other.
The thread pool created by the Executors.newFixedThreadPool(int nThreads) method uses this queuing policy:
public static ExecutorService newFixedThreadPool(int nThreads) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>());
}
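To illustrate "unbounded" concretely: a LinkedBlockingQueue built with no capacity argument is limited only by Integer.MAX_VALUE, so offer() effectively always succeeds. A small sketch (the class name is mine):

```java
import java.util.concurrent.LinkedBlockingQueue;

public class UnboundedQueueDemo {
    public static boolean offersAlwaysSucceed() {
        // No capacity argument: the queue is bounded only by Integer.MAX_VALUE.
        LinkedBlockingQueue<Integer> queue = new LinkedBlockingQueue<>();
        boolean allAccepted = true;
        for (int i = 0; i < 100_000; i++) {
            allAccepted &= queue.offer(i);  // never fails in practice
        }
        return allAccepted && queue.size() == 100_000;
    }

    public static void main(String[] args) {
        System.out.println(offersAlwaysSucceed()); // true
    }
}
```

Because the enqueue in step 2 never fails, the pool never reaches step 3, which is why maximumPoolSize is irrelevant for newFixedThreadPool.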
ArrayBlockingQueue
Bounded queue policy: the work queue has a size limit.
The advantage is that it prevents resource exhaustion, because letting the work queue grow without bound is dangerous.
When the work queue fills up, the thread pool applies the saturation strategy.
// Thread pool constructor
ThreadPoolExecutor threadPool = new ThreadPoolExecutor(3, 4,
        3, TimeUnit.SECONDS,
        new ArrayBlockingQueue<Runnable>(2),
        new ThreadPoolExecutor.DiscardOldestPolicy());
The pool above has 3 core threads, a work queue of capacity 2 (i.e. at most 2 tasks can wait in the queue), and a maximum of 4 threads.
So while 3 or fewer worker threads are busy, new tasks are executed directly by a thread. Beyond 3, tasks are added to the work queue to wait; once the queue holds its 2 waiting tasks, a new (fourth) thread is created, bringing the pool to its maximum of 4. After that, with 4 busy threads and a full queue (6 tasks in total), any further task triggers the saturation strategy.
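The capacity arithmetic above can be exercised directly. This sketch (class name mine) submits 7 blocking tasks to the pool configured as described: 4 occupy threads, 2 wait in the queue, and the 7th triggers DiscardOldestPolicy, which evicts the oldest queued task and re-submits the new one, leaving the queue at size 2.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedQueueDemo {
    public static int queuedAfterSaturation() throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(3, 4,
                3, TimeUnit.SECONDS,
                new ArrayBlockingQueue<Runnable>(2),
                new ThreadPoolExecutor.DiscardOldestPolicy());
        CountDownLatch release = new CountDownLatch(1);
        // Tasks 1-3 occupy the core threads, 4-5 fill the queue,
        // 6 creates the fourth (non-core) thread, 7 is rejected:
        // DiscardOldestPolicy drops task 4 and re-queues task 7.
        for (int i = 0; i < 7; i++) {
            pool.execute(() -> {
                try { release.await(); } catch (InterruptedException ignored) { }
            });
        }
        int queued = pool.getQueue().size(); // still 2: one evicted, one added
        release.countDown();
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return queued;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(queuedAfterSaturation()); // 2
    }
}
```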
 
----------------
Disclaimer: This article is an original post by the CSDN blogger "Chi descendant", licensed under the CC 4.0 BY-SA copyright agreement. When reproducing it, please attach the original source link and this statement.
Original link: https://blog.csdn.net/wangmx1993328/article/details/80582803
