2024 Java Interview – Multi-threading (2)

Table of Contents of Series Articles

  1. 2024 Java Interview (1) – Spring Chapter
  2. 2024 Java Interview (2) – Spring Chapter
  3. 2024 Java Interview (3) – Spring Chapter
  4. 2024 Java Interview (4) – Spring Chapter
  5. 2024 Java Interview – Collection
  6. 2024 Java Interview – Redis (1)
  7. 2024 Java Interview – Redis (2)


Thread Pool

1. Using a thread pool

(At the bottom layer, the worker threads created by the factory execute each task's run method.)

static class DefaultThreadFactory implements ThreadFactory {
    private static final AtomicInteger poolNumber = new AtomicInteger(1);
    private final ThreadGroup group;
    private final AtomicInteger threadNumber = new AtomicInteger(1);
    private final String namePrefix;

    DefaultThreadFactory() {
        SecurityManager s = System.getSecurityManager();
        group = (s != null) ? s.getThreadGroup() : Thread.currentThread().getThreadGroup();
        namePrefix = "pool-" + poolNumber.getAndIncrement() + "-thread-";
    }

    public Thread newThread(Runnable r) {
        Thread t = new Thread(group, r, namePrefix + threadNumber.getAndIncrement(), 0);
        if (t.isDaemon())
            t.setDaemon(false);  // never a daemon thread
        if (t.getPriority() != Thread.NORM_PRIORITY)
            t.setPriority(Thread.NORM_PRIORITY); // always normal priority
        return t;
    }
}
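The factory above is the JDK's default thread factory, obtainable via `Executors.defaultThreadFactory()`. A minimal sketch of its effect: every thread it produces is a non-daemon thread with normal priority, named `pool-N-thread-M`.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class DefaultFactoryDemo {
    public static void main(String[] args) throws Exception {
        // newFixedThreadPool uses Executors.defaultThreadFactory() internally;
        // passing it explicitly here just makes the choice visible.
        ExecutorService pool = Executors.newFixedThreadPool(2, Executors.defaultThreadFactory());
        // The worker's name follows the "pool-N-thread-M" pattern built in the factory.
        pool.submit(() -> System.out.println(Thread.currentThread().getName()));
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```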

Benefits:

1. Improve response speed (reduce the time to create new threads)

2. Reduce resource consumption (reuse threads in the thread pool, no need to create them every time)

3. Facilitate thread management

Advantages: by reusing created threads, resource consumption is reduced; threads can directly process tasks in the queue, which speeds up response; and it facilitates unified monitoring and management.


2. Thread pool constructor

/**
 * The 7 parameters of the thread pool constructor
 */
public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler) {
    // ...
}

Parameter introduction:

  • corePoolSize: the number of core threads, i.e. the number of worker threads created under normal load. Once created, these threads are not reclaimed; they stay resident in the pool.
  • maximumPoolSize: the maximum number of threads allowed in the pool. When the core threads are all busy and demand still cannot be met, new threads are created, but the total number of threads never exceeds this limit.
  • keepAliveTime: the maximum idle time for threads beyond corePoolSize. Core threads are never reclaimed, but threads above the core count that stay idle longer than this are terminated. The value can be changed with setKeepAliveTime.
  • unit: the TimeUnit of keepAliveTime.
  • workQueue: the blocking queue that holds tasks waiting to be executed. When all core threads are in use, incoming tasks are placed in this queue; only once the queue is full do new tasks trigger the creation of threads beyond the core count.
  • threadFactory: the factory used to create worker threads. The default factory produces threads that belong to the same group, share the same (normal) priority, and are not daemon threads; you can also supply a custom factory, typically one per business concern.
  • handler (RejectedExecutionHandler): the rejection policy. A task is rejected in two situations: first, when the pool has been closed with shutdown or a similar method; even if unfinished tasks are still running inside the pool, newly submitted tasks are refused. Second, when the pool has reached maximumPoolSize and the workQueue is full, so the pool no longer has the capacity to handle newly submitted tasks.

Rejection policies:

  • AbortPolicy: throws an exception directly; this is the default policy;
  • CallerRunsPolicy: runs the task in the caller's own thread;
  • DiscardOldestPolicy: discards the task at the head of the blocking queue and then executes the current task;
  • DiscardPolicy: discards the task directly.

You can also implement the RejectedExecutionHandler interface yourself and customize the saturation policy for your scenario, for example logging rejected tasks or persisting them to storage.
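A minimal sketch of wiring all seven parameters together (the pool sizes, queue capacity, and chosen policy here are arbitrary example values, not recommendations):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolConfigDemo {
    public static void main(String[] args) throws Exception {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                  // corePoolSize: resident threads
                4,                                  // maximumPoolSize: hard upper bound
                60L, TimeUnit.SECONDS,              // keepAliveTime + unit for non-core threads
                new ArrayBlockingQueue<>(8),        // bounded workQueue
                Executors.defaultThreadFactory(),   // threadFactory
                new ThreadPoolExecutor.CallerRunsPolicy()); // rejection policy

        pool.submit(() -> System.out.println("task ran on " + Thread.currentThread().getName()));
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("completed: " + pool.getCompletedTaskCount());
    }
}
```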

3. Task processing flow in the thread pool

1. When the number of threads in the pool is less than corePoolSize, each newly submitted task creates a new thread to execute it, even if there are idle threads in the pool at that moment.

2. When the pool has reached corePoolSize threads, newly submitted tasks are put into the workQueue, waiting to be scheduled and executed.

3. When the workQueue is full and maximumPoolSize is greater than corePoolSize, a newly submitted task causes a new (non-core) thread to be created to execute it.

4. When the pool has reached maximumPoolSize threads and the queue is full, newly submitted tasks are handed to the RejectedExecutionHandler.

5. When the pool has more than corePoolSize threads and a thread has been idle for keepAliveTime, that idle thread is terminated.
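The five steps above can be observed directly with a tiny pool. This sketch (all sizes are deliberately minimal: 1 core thread, max 2, queue capacity 1) uses a latch to keep workers busy, so the fourth task must be rejected; a lambda handler counts rejections instead of throwing.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class FlowDemo {
    public static void main(String[] args) throws Exception {
        AtomicInteger rejected = new AtomicInteger();
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 1L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),
                (r, e) -> rejected.incrementAndGet()); // count rejections instead of throwing

        CountDownLatch release = new CountDownLatch(1);
        Runnable blocker = () -> {
            try { release.await(); } catch (InterruptedException ignored) { }
        };

        pool.execute(blocker); // step 1: core thread created
        pool.execute(blocker); // step 2: core busy -> queued (queue capacity 1)
        pool.execute(blocker); // step 3: queue full -> second thread created (max 2)
        pool.execute(blocker); // step 4: queue full, max reached -> rejected

        System.out.println("threads=" + pool.getPoolSize()
                + " queued=" + pool.getQueue().size()
                + " rejected=" + rejected.get());
        release.countDown(); // unblock everything so the pool can shut down
        pool.shutdown();
    }
}
```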


4. Thread rejection strategy

All threads in the pool are busy and cannot serve new tasks, and at the same time the waiting queue is full and can accept no more tasks. At this point a rejection policy mechanism is needed to handle the situation reasonably.

The built-in rejection policies of the JDK are as follows:

AbortPolicy : throws an exception directly, interrupting normal processing; this is the default. Based on business logic, the caller can catch it and choose to retry or abandon the submission.

CallerRunsPolicy : as long as the thread pool is not shut down, this policy runs the rejected task directly in the caller's thread.
No task is lost, and task submission is slowed down, giving the pool buffer time to work through its backlog.

DiscardOldestPolicy : discards the oldest queued request, i.e. the task that would be executed next, and then retries submitting the current task.

DiscardPolicy : silently discards the task it cannot handle, with no other processing. This is acceptable only if losing tasks is allowed.
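As noted above, you can also supply your own handler. A minimal sketch of a custom logging policy (the class names LogPolicy and LoggingRejectionDemo are made up for this example): a single-thread pool with a SynchronousQueue is saturated with one blocking task, so the second submission triggers the handler.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class LoggingRejectionDemo {
    // A custom policy: log the rejection instead of throwing or discarding silently.
    static class LogPolicy implements RejectedExecutionHandler {
        @Override
        public void rejectedExecution(Runnable r, ThreadPoolExecutor e) {
            System.out.println("rejected task " + r + "; pool size=" + e.getPoolSize()
                    + ", queue size=" + e.getQueue().size());
        }
    }

    public static void main(String[] args) throws Exception {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new SynchronousQueue<>(), new LogPolicy());
        CountDownLatch release = new CountDownLatch(1);
        // Occupy the only thread; SynchronousQueue holds no tasks, so the pool is saturated.
        pool.execute(() -> {
            try { release.await(); } catch (InterruptedException ignored) { }
        });
        pool.execute(() -> { }); // saturated -> LogPolicy.rejectedExecution is invoked
        release.countDown();
        pool.shutdown();
    }
}
```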


5. Creating thread pools with the Executors class


  • newSingleThreadExecutor(): a pool with only one thread; tasks execute sequentially, suitable for scenarios where tasks must run one by one.
  • newCachedThreadPool(): creates threads on demand and reuses idle threads for up to 60 s; suitable for many short-lived asynchronous tasks or lightly loaded services.
  • newFixedThreadPool(): a pool with a fixed number of threads; idle threads wait indefinitely for tasks, suitable for long-running workloads.
  • newScheduledThreadPool(): a pool for scheduling tasks to run after a delay or periodically.
  • newWorkStealingPool(): backed by a ForkJoinPool whose workers each have their own work-stealing deque; independent task queues reduce contention and speed up task processing.
However, these factory methods have drawbacks:

FixedThreadPool and SingleThreadExecutor: the allowed request queue length is Integer.MAX_VALUE, so a backlog of tasks can cause OOM.

CachedThreadPool and ScheduledThreadPool: the number of threads allowed to be created is Integer.MAX_VALUE, so a flood of tasks can cause OOM.

A manually created thread pool can instead use a bounded queue such as ArrayBlockingQueue to prevent OOM.
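A minimal sketch of the difference (the capacity 100 is an arbitrary example value): the queue inside newFixedThreadPool reports an effectively unlimited remaining capacity, while a manually created pool with a bounded ArrayBlockingQueue does not.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class UnboundedQueueDemo {
    public static void main(String[] args) {
        // newFixedThreadPool wraps an unbounded LinkedBlockingQueue
        // (capacity Integer.MAX_VALUE), so the backlog can grow until OOM.
        ThreadPoolExecutor fixed = (ThreadPoolExecutor) Executors.newFixedThreadPool(2);
        System.out.println("fixed queue remaining capacity: "
                + fixed.getQueue().remainingCapacity());

        // A manually created pool with a bounded queue fails fast instead of growing.
        ThreadPoolExecutor bounded = new ThreadPoolExecutor(
                2, 2, 0L, TimeUnit.SECONDS, new ArrayBlockingQueue<>(100));
        System.out.println("bounded queue remaining capacity: "
                + bounded.getQueue().remainingCapacity());

        fixed.shutdown();
        bounded.shutdown();
    }
}
```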


6. Thread pool size setting

  • CPU intensive (n+1)

CPU intensive means that the task requires a lot of computation without blocking and the CPU is always running at full speed.

For CPU-intensive tasks, the number of threads should be kept small, generally a pool of (number of CPU cores + 1) threads.

  • IO intensive (2*n)

Since IO-intensive task threads are not always executing, you can allocate more threads, for example 2 * number of CPU cores.

You can also use the formula: number of CPU cores * (1 + average waiting time / average working time).
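The two rules of thumb above can be written down directly. This is a sketch, not a prescription; the method names and the 50 ms wait/compute figures are made-up examples, and real sizing should be based on measured task profiles.

```java
public class PoolSizing {
    // CPU-bound rule of thumb: n + 1 threads for n cores.
    static int cpuBoundSize(int cores) {
        return cores + 1;
    }

    // IO-bound formula from the text: cores * (1 + avgWaitTime / avgComputeTime).
    static int ioBoundSize(int cores, double avgWaitMs, double avgComputeMs) {
        return (int) (cores * (1 + avgWaitMs / avgComputeMs));
    }

    public static void main(String[] args) {
        int n = Runtime.getRuntime().availableProcessors();
        System.out.println("cpu-bound pool size: " + cpuBoundSize(n));
        // e.g. tasks that wait 50 ms on IO per 50 ms of computation -> 2 * n
        System.out.println("io-bound pool size: " + ioBoundSize(n, 50, 50));
    }
}
```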


7. Why use thread pool?


1. Reduce resource consumption

Creating and destroying threads costs time and memory; the thread pool lets us reuse threads that have already been created.

2. Improve response speed

The thread pool has already created the thread for us. When the task arrives, it can be executed immediately without waiting for the thread to be created.

3. Improve thread manageability

Threads are scarce resources and cannot be created infinitely. The thread pool can be used for unified allocation, tuning and monitoring.

4. Provide more and more powerful functions

The thread pool is extensible, allowing developers to add more functionality to it. For example, the delayed scheduled thread pool ScheduledThreadPoolExecutor allows tasks to be executed after a delay or periodically.
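A minimal sketch of the scheduling capability mentioned above, using the ScheduledExecutorService returned by Executors.newScheduledThreadPool (the 100 ms delay is an arbitrary example value):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

public class ScheduleDemo {
    public static void main(String[] args) throws Exception {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);
        long start = System.nanoTime();
        // Run a task once, no sooner than 100 ms from now; it reports the actual delay.
        ScheduledFuture<Long> f = scheduler.schedule(
                () -> (System.nanoTime() - start) / 1_000_000,
                100, TimeUnit.MILLISECONDS);
        System.out.println("elapsed ms >= 100: " + (f.get() >= 100));
        scheduler.shutdown();
    }
}
```

scheduleAtFixedRate and scheduleWithFixedDelay work similarly for periodic execution.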


8. The role of the blocking queue

1. An ordinary queue is only a buffer of limited length; once that length is exceeded, the current task cannot be retained. A blocking queue can, through blocking, hold on to a task that still wants to enter the queue.

2. The blocking queue ensures that a thread fetching tasks blocks when the queue is empty, entering the wait state and releasing its CPU resources.

3. The blocking queue has built-in blocking and wake-up behavior and needs no extra handling. When there are no tasks to execute, the thread pool parks in the queue's take method, keeping the core threads alive without constantly occupying the CPU.
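Point 2 and 3 above can be seen directly: a thread blocked in take() goes into the WAITING state (not RUNNABLE, so it burns no CPU) until a task is put into the queue. This sketch uses a raw Thread rather than a real pool to keep the state visible.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BlockingTakeDemo {
    public static void main(String[] args) throws Exception {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();
        Thread worker = new Thread(() -> {
            try {
                // take() parks this thread until a task arrives; no busy spinning.
                String task = queue.take();
                System.out.println("got " + task);
            } catch (InterruptedException ignored) { }
        });
        worker.start();
        Thread.sleep(200); // give the worker time to block on take()
        System.out.println("worker state: " + worker.getState()); // WAITING, not RUNNABLE
        queue.put("task-1"); // wakes the worker
        worker.join();
    }
}
```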

Why enqueue tasks first instead of creating threads up to the maximum first?

Creating a new thread requires acquiring a global lock, during which other threads are blocked, hurting overall efficiency.


9. Principle of thread reuse in thread pool

The thread pool decouples threads from tasks: a thread is just a thread, and a task is just a task. It removes the earlier constraint that creating a thread via Thread binds one thread to exactly one task.

In the thread pool, the same thread can repeatedly take new tasks from the blocking queue and execute them. The core idea is that the pool encapsulates Thread: it does not call Thread.start() to create a new thread for every task. Instead, each worker thread runs a "loop task" that continually checks whether there is a task to execute; if there is, it runs the task directly by invoking its run method as an ordinary method call. In this way, the run methods of all tasks are executed serially on a fixed set of threads.
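The "loop task" described above can be sketched in a few lines. This is a simplified illustration, not the real ThreadPoolExecutor worker code: one fixed thread loops forever, taking Runnables from a blocking queue and calling run() as a plain method.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class WorkerLoopSketch {
    public static void main(String[] args) throws Exception {
        BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    Runnable task = queue.take(); // blocks while the queue is empty
                    task.run();                   // ordinary method call, no new thread
                }
            } catch (InterruptedException e) {
                // interruption ends the loop, mimicking pool shutdown
            }
        });
        worker.start();

        // Three tasks, all executed serially on the single worker thread.
        for (int i = 1; i <= 3; i++) {
            int id = i;
            queue.put(() -> System.out.println(
                    "task " + id + " on " + Thread.currentThread().getName()));
        }
        Thread.sleep(200);
        worker.interrupt();
        worker.join();
    }
}
```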

Origin blog.csdn.net/weixin_43228814/article/details/132636538