Detailed explanation of Java thread pools

1. Introduction to the thread pool

Using a thread pool brings the following benefits:

(1) Reduces system resource consumption: reusing existing threads avoids the overhead of repeatedly creating and destroying threads;

(2) Improves response speed: when a task arrives it can run immediately on an existing thread, without waiting for a new thread to be created;

(3) Makes it easy to control the number of concurrent threads: creating threads without limit can exhaust memory and cause an OOM, and it also causes excessive CPU context switching (each switch has a time cost, since the current thread's state must be saved and the next thread's state restored);

(4) Provides more powerful features, such as delayed and periodic execution via a scheduled thread pool.

2. How to create a thread pool

Executors is a class in the java.util.concurrent package that provides convenient factory methods for creating thread pools.

Executors can create four commonly used thread pools:

(1) newCachedThreadPool creates a cacheable thread pool. If the pool has more threads than it currently needs, idle threads are reclaimed; if no reusable thread is available, a new one is created. There is no upper limit on the number of threads, so submitted tasks are executed immediately.

(2) newFixedThreadPool creates a fixed-size thread pool, which limits the maximum number of concurrent threads; excess tasks wait in a queue.

(3) newScheduledThreadPool creates a fixed-length thread pool that supports timing and periodic task execution.

(4) newSingleThreadExecutor creates a thread pool with a single worker thread that executes tasks sequentially.
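To show the API shape (not a recommendation, given the drawbacks discussed next), here is a minimal sketch that creates each of the four pools via the Executors factory methods; the pool sizes are arbitrary example values:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

public class ExecutorsDemo {
    public static void main(String[] args) {
        ExecutorService cached = Executors.newCachedThreadPool();                 // reuses idle threads, no upper bound
        ExecutorService fixed = Executors.newFixedThreadPool(4);                  // at most 4 threads
        ScheduledExecutorService scheduled = Executors.newScheduledThreadPool(2); // timed / periodic tasks
        ExecutorService single = Executors.newSingleThreadExecutor();             // one worker thread

        fixed.execute(() -> System.out.println("task on " + Thread.currentThread().getName()));

        cached.shutdown();
        fixed.shutdown();
        scheduled.shutdown();
        single.shutdown();
    }
}
```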

However, all four of these factory methods have serious drawbacks. Take newFixedThreadPool as an example:

    public static ExecutorService newFixedThreadPool(int nThreads) {
        return new ThreadPoolExecutor(nThreads, nThreads,
                                      0L, TimeUnit.MILLISECONDS,
                                      new LinkedBlockingQueue<Runnable>());
    }

Creating thread pools this way is not allowed by the Alibaba Java development specification.


Furthermore, because no capacity is specified, the LinkedBlockingQueue defaults to Integer.MAX_VALUE, effectively an unbounded queue, which can lead to an OOM if tasks pile up.

3. Custom thread pool

The ThreadPoolExecutor class's fullest constructor takes seven parameters, described below.
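For reference, the seven-parameter constructor declared by java.util.concurrent.ThreadPoolExecutor is:

```java
public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler)
```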

corePoolSize: the number of core threads in the thread pool. Core threads are created as tasks arrive, up to this number, and are not destroyed even when idle, which reduces the performance overhead of creating a new thread every time a task comes in. If allowCoreThreadTimeOut is set to true, idle core threads will also be reclaimed.

maximumPoolSize: the maximum number of threads. When all core threads are busy, new tasks first enter the blocking queue; only when the queue is full does the pool create additional threads, up to this limit. If the queue is full and the maximum number of threads has been reached, the rejection policy is applied.

keepAliveTime: the survival time of idle threads beyond the core threads. Once one of these extra threads finishes its task and stays idle for this long, it is destroyed.

threadFactory: the thread factory used to create new threads.

unit: the time unit of keepAliveTime.

workQueue: the blocking queue that holds pending tasks. Since there may be many tasks but only a few threads, tasks that cannot be executed immediately enter this FIFO queue and are taken out when a thread becomes idle. We normally use an existing queue implementation rather than writing our own.

handler: the rejection policy, applied when the queue is full and the maximum number of threads has been reached. There are four built-in policies:

(1) AbortPolicy: do not execute the new task and throw a RejectedExecutionException, indicating that the thread pool is full (the default)
  (2) DiscardPolicy: silently discard the new task without throwing an exception
  (3) DiscardOldestPolicy: discard the oldest task at the head of the queue and submit the new task instead
  (4) CallerRunsPolicy: the thread that submitted the task runs it itself by calling run() directly
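A minimal sketch of how these parameters interact, with small made-up numbers chosen so that the bounded queue fills up and the rejection policy fires:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionDemo {
    public static void main(String[] args) {
        // 2 core threads, up to 4 threads, bounded queue of 2 -> at most 6 tasks accepted at once
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 30, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(2),
                new ThreadPoolExecutor.AbortPolicy());   // reject by throwing an exception

        for (int i = 0; i < 8; i++) {
            final int id = i;
            try {
                pool.execute(() -> {
                    try {
                        TimeUnit.SECONDS.sleep(1);       // keep workers busy so the queue fills
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                    System.out.println("task " + id + " on " + Thread.currentThread().getName());
                });
            } catch (RejectedExecutionException e) {
                System.out.println("task " + id + " rejected"); // AbortPolicy triggered
            }
        }
        pool.shutdown();
    }
}
```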

Frequently used queues

1. LinkedBlockingQueue
  FixedThreadPool and SingleThreadExecutor both use a LinkedBlockingQueue with a capacity of Integer.MAX_VALUE, which can be treated as an unbounded queue. Since the number of threads in a FixedThreadPool is fixed, it cannot add extra threads to handle bursts of tasks, so it needs a queue with no practical capacity limit, such as LinkedBlockingQueue, to hold them. Note that because this task queue can never become full, the pool only ever creates the core number of threads; the maximum pool size is therefore meaningless here, since it never triggers the creation of threads beyond the core threads.
2. SynchronousQueue
  The second blocking queue is SynchronousQueue, used by CachedThreadPool. The maximum number of threads in CachedThreadPool is Integer.MAX_VALUE, so the pool can effectively grow without bound. CachedThreadPool is the opposite of FixedThreadPool: where FixedThreadPool has an effectively unbounded queue, CachedThreadPool has an effectively unbounded number of threads, so it does not need a queue that stores tasks. A submitted task is handed directly to an idle thread, or a new thread is created to run it, without being stored anywhere.
When creating your own pool with a SynchronousQueue, if you do not want tasks to be rejected, set the maximum number of threads as large as possible; otherwise, when the number of tasks exceeds the maximum number of threads, there is no queue to hold them and not enough threads to run them.
3. DelayedWorkQueue
  The third blocking queue is DelayedWorkQueue, used by ScheduledThreadPool and SingleThreadScheduledExecutor. The defining feature of these two pools is delayed execution: running a task after a certain delay, or running it periodically. DelayedWorkQueue orders its elements not by insertion time but by the length of their delay, using a heap internally. ScheduledThreadPool and SingleThreadScheduledExecutor choose DelayedWorkQueue precisely because they execute tasks based on time, and a delay queue sorts tasks by time, which is exactly what they need.
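As a brief illustration of the delayed/periodic behavior backed by DelayedWorkQueue, the sketch below schedules a periodic task; the pool size and intervals are arbitrary example values:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class ScheduledDemo {
    public static void main(String[] args) throws InterruptedException {
        // ScheduledThreadPool uses a DelayedWorkQueue internally, ordering tasks by trigger time
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

        // run after an initial delay of 1 second, then every 3 seconds
        scheduler.scheduleAtFixedRate(
                () -> System.out.println("periodic task on " + Thread.currentThread().getName()),
                1, 3, TimeUnit.SECONDS);

        // let it run for a while, then shut down (example only)
        TimeUnit.SECONDS.sleep(10);
        scheduler.shutdown();
    }
}
```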


4. How to configure the thread pool

For CPU-intensive tasks, use a relatively small thread pool, generally the number of CPU cores + 1. CPU-intensive tasks already drive CPU usage very high, and opening too many threads only adds context-switching overhead.

For IO-intensive tasks, a slightly larger pool can be used, generally 2 * the number of CPU cores. IO-intensive tasks do not keep the CPU busy, so while some threads wait on IO, the CPU can run other threads and its time is used more fully.

Mixed tasks can be split into an IO-intensive part and a CPU-intensive part, each handled by its own thread pool. As long as the execution times of the two parts are roughly similar after the split, this is more efficient than running them serially.
If their execution times differ by an order of magnitude, the split is pointless: the part that finishes first still has to wait for the slower part, so the overall time is still determined by the slower part, and the extra overhead of splitting and merging tasks makes it not worth the effort.
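A minimal sketch of these sizing rules, assuming the pool sizes are derived directly from Runtime.getRuntime().availableProcessors(); the factors and the queue capacity are the rules of thumb above, not hard requirements:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolSizing {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();

        // CPU-intensive: roughly cores + 1 threads
        ThreadPoolExecutor cpuBoundPool = new ThreadPoolExecutor(
                cores + 1, cores + 1,
                0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(100),           // bounded queue, example capacity
                new ThreadPoolExecutor.AbortPolicy());

        // IO-intensive: roughly 2 * cores threads
        ThreadPoolExecutor ioBoundPool = new ThreadPoolExecutor(
                2 * cores, 2 * cores,
                0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(100),
                new ThreadPoolExecutor.AbortPolicy());

        System.out.println("cores=" + cores
                + ", cpu pool size=" + (cores + 1)
                + ", io pool size=" + (2 * cores));

        cpuBoundPool.shutdown();
        ioBoundPool.shutdown();
    }
}
```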

5. Thread pool utility class

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public final class ThreadPoolProvider {
    private ThreadPoolProvider() {}

    /**
     * Thread pool.
     * Note: do not create pools all over the code base; use this single shared instance.
     */
    private static final ExecutorService FIXED_THREAD_POOL = new ThreadPoolExecutor(
            10, 500, 5, TimeUnit.SECONDS,
            new ArrayBlockingQueue<Runnable>(1000),
            new DefaultThreadFactory(),
            new ThreadPoolExecutor.AbortPolicy());

    /**
     * Returns the shared fixed-size thread pool.
     */
    public static ExecutorService newFixedThreadPool() {
        return FIXED_THREAD_POOL;
    }

    /**
     * Thread factory that gives the created threads readable names.
     */
    private static class DefaultThreadFactory implements ThreadFactory {
        private static final AtomicInteger poolNumber = new AtomicInteger(1);
        private final ThreadGroup group;
        private final AtomicInteger threadNumber = new AtomicInteger(1);
        private final String namePrefix;

        DefaultThreadFactory() {
            this("thread-pool-t-");
        }

        DefaultThreadFactory(String prefix) {
            SecurityManager s = System.getSecurityManager();
            group = (s != null) ? s.getThreadGroup()
                                : Thread.currentThread().getThreadGroup();
            namePrefix = prefix + poolNumber.getAndIncrement();
        }

        public Thread newThread(Runnable r) {
            Thread t = new Thread(group, r,
                    namePrefix + threadNumber.getAndIncrement(),
                    0);
            if (t.isDaemon()) t.setDaemon(false);
            if (t.getPriority() != Thread.NORM_PRIORITY) t.setPriority(Thread.NORM_PRIORITY);
            return t;
        }
    }
}
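Usage would then look something like this:

```java
import java.util.concurrent.ExecutorService;

public class ThreadPoolProviderUsage {
    public static void main(String[] args) {
        // reuse the single shared pool defined above
        ExecutorService pool = ThreadPoolProvider.newFixedThreadPool();
        pool.execute(() -> System.out.println("running on " + Thread.currentThread().getName()));
    }
}
```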

Origin: blog.csdn.net/CharlesYooSky/article/details/124160264