[Base] Java thread pool

Why use a thread pool

Thread pools are used for multithreading. A pool controls the number of running threads according to what the system can handle, so that work is executed with optimal results. Its main job is to limit the number of running threads: submitted tasks are placed in a queue and are started once a thread is available. If the number of tasks exceeds the maximum number of threads, the excess tasks wait; as soon as a thread finishes, it takes the next task from the queue and executes it.

Thread pool features

  • Thread reuse
  • Control over the maximum number of concurrent threads
  • Thread management

    Advantages of the thread pool

  • Reduced resource consumption: by reusing threads that have already been created, the cost of creating and destroying threads is reduced.
  • Improved response speed: when a task arrives, it can be executed immediately without waiting for a thread to be created.
  • Better thread manageability: threads are a scarce resource. Creating them without limit not only consumes system resources but also reduces system stability; a thread pool allows threads to be allocated, tuned, and monitored in a unified way.

    Several ways to create a thread

  • Extend Thread
  • Implement Runnable
  • Implement Callable (all three are sketched below)
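
    A minimal sketch of the three approaches, assuming nothing beyond the standard library; the class and printed messages are illustrative only.

import java.util.concurrent.FutureTask;

public class CreateThreadDemo {
    // 1. Extend Thread and override run()
    static class MyThread extends Thread {
        @Override
        public void run() {
            System.out.println("extends Thread");
        }
    }

    public static void main(String[] args) throws Exception {
        new MyThread().start();

        // 2. Implement Runnable (here as a lambda) and hand it to a Thread
        new Thread(() -> System.out.println("implements Runnable")).start();

        // 3. Implement Callable: it returns a result and can throw, so wrap it in a FutureTask
        FutureTask<Integer> task = new FutureTask<>(() -> 42);
        new Thread(task).start();
        System.out.println("Callable result: " + task.get());
    }
}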

    How to use the thread pool

    Thread pool architecture

    Coding
  • Executors.newSingleThreadExecutor(): the pool contains only one thread, so all submitted tasks are executed sequentially.
  • Executors.newCachedThreadPool(): a pool for many tasks that need to run concurrently; idle threads are reused for new tasks, and any thread that stays idle for more than 60 seconds is terminated and removed from the pool.
  • Executors.newFixedThreadPool(): a pool with a fixed number of threads; when there are no tasks to execute, the threads simply wait.
  • Executors.newScheduledThreadPool(): used to schedule tasks for delayed or periodic execution in the thread pool.
  • Executors.newWorkStealingPool(): suited to very time-consuming, parallelizable work. It is not an extension of ThreadPoolExecutor but of the newer ForkJoinPool, although it is exposed through the same Executors class; because it makes full use of the CPU for parallel computation, it fits CPU-heavy tasks well. (The first four factory methods are sketched after this list.)
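
    A quick sketch of how the first four factory methods are used; the pool sizes, delay, and printed messages are arbitrary example values.

import java.util.concurrent.*;

public class ExecutorsDemo {
    public static void main(String[] args) {
        // One thread: all tasks run sequentially, in submission order
        ExecutorService single = Executors.newSingleThreadExecutor();
        // Grows on demand, reuses idle threads, terminates threads idle for more than 60 seconds
        ExecutorService cached = Executors.newCachedThreadPool();
        // Exactly 3 threads; extra tasks wait in the queue
        ExecutorService fixed = Executors.newFixedThreadPool(3);
        // Supports delayed and periodic execution
        ScheduledExecutorService scheduled = Executors.newScheduledThreadPool(2);

        single.execute(() -> System.out.println("single: " + Thread.currentThread().getName()));
        cached.execute(() -> System.out.println("cached: " + Thread.currentThread().getName()));
        fixed.execute(() -> System.out.println("fixed: " + Thread.currentThread().getName()));
        scheduled.schedule(() -> System.out.println("runs after 1 second"), 1, TimeUnit.SECONDS);

        single.shutdown();
        cached.shutdown();
        fixed.shutdown();
        scheduled.shutdown();
    }
}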

    ThreadPoolExecutor

    ThreadPoolExecutor is the low-level implementation provided by the java.util.concurrent package. Through its internal thread pool it offers services such as task execution, thread scheduling, and thread pool management to the outside.

    Important parameters of the thread pool

    Parameter                  Effect
    corePoolSize               Size of the core thread pool
    maximumPoolSize            Maximum size of the thread pool
    keepAliveTime              Maximum time an idle thread beyond corePoolSize may survive; calling allowCoreThreadTimeOut(true) makes this timeout apply to core threads as well
    TimeUnit                   Time unit of keepAliveTime
    workQueue                  Blocking queue that holds waiting tasks
    threadFactory              Factory used to create new threads
    RejectedExecutionHandler   Handler invoked when the number of submitted tasks exceeds maximumPoolSize plus the capacity of the workQueue
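
    For reference, a sketch of how the parameters in the table above map onto the ThreadPoolExecutor constructor; the concrete values chosen here are arbitrary.

import java.util.concurrent.*;

public class PoolParamsSketch {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                     // corePoolSize
                4,                                     // maximumPoolSize
                60L,                                   // keepAliveTime
                TimeUnit.SECONDS,                      // time unit of keepAliveTime
                new ArrayBlockingQueue<>(100),         // workQueue (bounded blocking queue)
                Executors.defaultThreadFactory(),      // threadFactory
                new ThreadPoolExecutor.AbortPolicy()); // RejectedExecutionHandler
        // Optionally let core threads time out as well
        pool.allowCoreThreadTimeOut(true);
        pool.shutdown();
    }
}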

    How the thread pool works under the hood

    The relationship that is most often misunderstood is the one between corePoolSize, maximumPoolSize, and workQueue; the steps below are exercised in the sketch after this list.
  1. When the number of threads in the pool is less than corePoolSize, submitting a new task creates a new thread to execute it, even if there are idle threads in the pool.

  2. When the pool has reached corePoolSize, a newly submitted task is placed into the workQueue and waits for the pool's threads to schedule it for execution.

  3. When the workQueue is full and maximumPoolSize is greater than corePoolSize, a new thread is created to execute the newly submitted task.

  4. When the number of submitted tasks exceeds maximumPoolSize plus the capacity of the workQueue, newly submitted tasks are handled by the RejectedExecutionHandler.

  5. When the number of threads exceeds corePoolSize and a thread's idle time reaches keepAliveTime, the idle thread is shut down.

  6. When allowCoreThreadTimeOut(true) is set, core threads that have been idle for keepAliveTime are also shut down, so the pool can shrink below corePoolSize.
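
    A small sketch that exercises the steps above, assuming a pool with corePoolSize = 1, maximumPoolSize = 2 and a queue of capacity 1, so the fourth submission is rejected by the default AbortPolicy.

import java.util.concurrent.*;

public class PoolLifecycleDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 5L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1));
        Runnable sleepy = () -> {
            try { Thread.sleep(2000); } catch (InterruptedException ignored) { }
        };
        pool.execute(sleepy); // step 1: below corePoolSize, a core thread is created
        pool.execute(sleepy); // step 2: core thread busy, the task goes into the workQueue
        pool.execute(sleepy); // step 3: queue full, a non-core thread is created (pool grows to maximumPoolSize)
        try {
            pool.execute(sleepy); // step 4: pool and queue both full, the default AbortPolicy rejects the task
        } catch (RejectedExecutionException e) {
            System.out.println("rejected: " + e.getMessage());
        }
        pool.shutdown();
    }
}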

How to set reasonable thread pool parameters in a production environment

What the thread pool rejection policies are

When the waiting queue is full and cannot accept any more tasks, and the number of threads in the pool has already reached the maximum, the pool can no longer serve newly submitted tasks; they have to be rejected according to a policy.

Thread pool rejection policies

  • AbortPolicy: the handler rejects the task by throwing a runtime RejectedExecutionException.
  • CallerRunsPolicy: the thread that called execute runs the task itself. This policy provides a simple feedback-control mechanism that slows down the rate at which new tasks are submitted.
  • DiscardPolicy: the task that cannot be executed is silently dropped.
  • DiscardOldestPolicy: if the executor has not been shut down, the task at the head of the work queue is dropped and execution is retried (if that fails again, the procedure is repeated). A custom handler is sketched after this list.
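
Besides the four built-in policies, the RejectedExecutionHandler interface can also be implemented directly. The handler below, which just logs the rejection and drops the task, is a hypothetical example, not part of the JDK.

import java.util.concurrent.*;

public class LoggingRejectedHandler implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        // Hypothetical policy: record the rejection and silently drop the task
        System.err.println("Task " + r + " rejected; pool size = " + executor.getPoolSize()
                + ", queue size = " + executor.getQueue().size());
    }
}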

Of the three ways of creating a thread pool used at work (single-threaded, fixed-size, and variable-size), which do you use most? This is a huge pitfall.

If the reader already understands Java's blocking queues, the reason may become clear here.

In Java there are two main BlockingQueue implementations: ArrayBlockingQueue and LinkedBlockingQueue.

ArrayBlockingQueue is a bounded blocking queue backed by an array; its capacity must be provided.

LinkedBlockingQueue is a blocking queue backed by a linked list; its capacity is optional. If no capacity is set, the queue is effectively unbounded, with a maximum length of Integer.MAX_VALUE.

The problem lies exactly there: if no capacity is set, the queue becomes effectively unbounded, with a maximum length of Integer.MAX_VALUE. In other words, if we do not set the capacity of a LinkedBlockingQueue, its default capacity is Integer.MAX_VALUE.

And when newFixedThreadPool creates its LinkedBlockingQueue, it does not specify a capacity. The LinkedBlockingQueue is therefore an unbounded queue, and tasks can keep being added to it without limit, which can lead to an out-of-memory error when too many tasks accumulate.

The problem just described mainly affects the newFixedThreadPool and newSingleThreadExecutor factory methods. That does not mean newCachedThreadPool and newScheduledThreadPool are safe: the maximum number of threads those two methods may create is Integer.MAX_VALUE, and creating that many threads can just as easily lead to an OOM.
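
To make the risk concrete, here is roughly what two of the factory methods look like, simplified from the java.util.concurrent.Executors sources (overloads and minor details omitted):

import java.util.concurrent.*;

// Simplified reproduction of two Executors factory methods
class SimplifiedExecutors {
    static ExecutorService newFixedThreadPool(int nThreads) {
        // The LinkedBlockingQueue has no capacity argument, so waiting tasks can pile up
        // until memory runs out
        return new ThreadPoolExecutor(nThreads, nThreads,
                0L, TimeUnit.MILLISECONDS,
                new LinkedBlockingQueue<Runnable>());
    }

    static ExecutorService newCachedThreadPool() {
        // maximumPoolSize is Integer.MAX_VALUE, so the number of threads is effectively unbounded
        return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                60L, TimeUnit.SECONDS,
                new SynchronousQueue<Runnable>());
    }
}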

How is the thread pool used in practical work? Is it used through a custom thread pool?

Custom thread pool

import java.util.concurrent.*;

/**
 * The fourth way to obtain Java multithreading: a thread pool
 */
public class MyThreadPoolDemo {
    public static void main(String[] args) {
        // Custom pool: 3 core threads, at most 5 threads, a bounded queue of capacity 3,
        // and DiscardPolicy so that tasks beyond 5 + 3 = 8 are silently dropped
        ExecutorService threadPool = new ThreadPoolExecutor(3, 5, 1L,
                TimeUnit.SECONDS,
                new LinkedBlockingDeque<>(3),
                Executors.defaultThreadFactory(),
                new ThreadPoolExecutor.DiscardPolicy());
//new ThreadPoolExecutor.AbortPolicy();
//new ThreadPoolExecutor.CallerRunsPolicy();
//new ThreadPoolExecutor.DiscardOldestPolicy();
//new ThreadPoolExecutor.DiscardPolicy();
        try {
            // Submit 10 tasks; with DiscardPolicy the 2 excess tasks are dropped
            for (int i = 1; i <= 10; i++) {
                threadPool.execute(() -> {
                    System.out.println(Thread.currentThread().getName() + "\t handling request");
                });
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            threadPool.shutdown();
        }
    }
}

How do you go about configuring the thread pool reasonably?

CPU-intensive

  • CPU-intensive means the task requires a lot of computation and does not block, so the CPU runs at full speed.
  • CPU-intensive tasks should use as few threads as possible; the usual rule of thumb is a pool of (number of CPU cores + 1) threads.

IO-intensive

  • Since an IO-intensive task does not keep its thread busy the whole time, you can allocate a few more threads, for example CPU core count * 2.
  • You can also use the formula: CPU core count / (1 - blocking coefficient), where the blocking coefficient is between 0.8 and 0.9. Both rules of thumb are sketched below.
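
A small sketch of these sizing rules in code; the blocking coefficient of 0.9 used here is just an assumed example value.

public class PoolSizing {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();

        // CPU-intensive: roughly (cores + 1) threads
        int cpuBoundThreads = cores + 1;

        // IO-intensive: cores / (1 - blocking coefficient), with the coefficient assumed to be 0.9
        double blockingCoefficient = 0.9;
        int ioBoundThreads = (int) (cores / (1 - blockingCoefficient));

        System.out.println("CPU-intensive pool size: " + cpuBoundThreads);
        System.out.println("IO-intensive pool size:  " + ioBoundThreads);
    }
}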

Origin www.cnblogs.com/zhangxinying/p/12483819.html