How the Java thread pool works

As a project's business grows rapidly, you may notice stray threads being created independently in every module; the moment you want to monitor and optimize threading, you end up fighting the code.

You have surely used popular frameworks such as RxJava and OkHttp. They handle thread scheduling internally and wrap it behind a convenient set of APIs, so you never even have to think about how those threads work. Used in isolation that is fine, but from the perspective of the project's architecture it is worth reconsidering how they are used.

Why use a thread pool?

  1. Threads are a scarce resource; creating them consumes a significant amount of system resources.
  2. Frequently destroying threads triggers GC frequently, degrading system performance.
  3. Multiple threads running concurrently lack unified management and monitoring.

Using a thread pool

In Java, thread pools are created through the Executors class in the java.util.concurrent package, which provides factory methods for the common pool types:

  • newFixedThreadPool
  • newSingleThreadExecutor
  • newCachedThreadPool
  • newScheduledThreadPool

They will be covered in more detail later; first, let's look at an example.

import android.util.Log; // Android logger; use System.out.println in plain Java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public static void main(String[] args) {
    // A pool with a fixed size of 3 threads.
    ExecutorService executorService = Executors.newFixedThreadPool(3);
    for (int i = 0; i < 20; i++) {
        executorService.execute(new MyRunnable(i));
    }
}

static class MyRunnable implements Runnable {
    final int id;

    MyRunnable(int id) {
        this.id = id;
    }

    @Override
    public void run() {
        try {
            Thread.sleep(3000); // simulate 3 seconds of work
            Log.i("threadpool", "task id:" + id + " is running threadInfo:" + Thread.currentThread().toString());
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }
}

The example creates a pool with a fixed number of threads (three) and submits 20 tasks to it.

[GIF: log output when using newFixedThreadPool]

From the log you can see that three lines are printed every 3 seconds, and every task runs on pool-1-thread-1, pool-1-thread-2 or pool-1-thread-3, which matches the pool size we set. The reason is that there are only three threads in the pool: when the 20 tasks are added, the first three are executed first and the remaining tasks wait their turn.

If we change ExecutorService executorService = Executors.newFixedThreadPool(3); to ExecutorService executorService = Executors.newCachedThreadPool(); and look at the results:

[GIF: log output when using newCachedThreadPool]

All tasks finish almost instantly; as expected, a pool created with newCachedThreadPool creates as many threads as needed to execute the tasks.

Next, let's look at how the thread pool works internally, explained in three parts.

[Figure: thread pool overview]

The common thread pool types were listed above; let's see how they are actually created, taking two of them as examples.

// -> Executors.newFixedThreadPool
public static ExecutorService newFixedThreadPool(int nThreads) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                  0L, TimeUnit.MILLISECONDS,
                                  new LinkedBlockingQueue<Runnable>());
}

// -> Executors.newSingleThreadExecutor
public static ExecutorService newSingleThreadExecutor() {
    return new FinalizableDelegatedExecutorService
        (new ThreadPoolExecutor(1, 1,
                                0L, TimeUnit.MILLISECONDS,
                                new LinkedBlockingQueue<Runnable>()));
}

As you can see, creating a thread pool is ultimately done through ThreadPoolExecutor; let's look at its constructor.

// -> ThreadPoolExecutor constructor
public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue) {
    this(corePoolSize, maximumPoolSize, keepAliveTime, unit, workQueue,
         Executors.defaultThreadFactory(), defaultHandler);
}

public ThreadPoolExecutor(int corePoolSize,
                          int maximumPoolSize,
                          long keepAliveTime,
                          TimeUnit unit,
                          BlockingQueue<Runnable> workQueue,
                          ThreadFactory threadFactory,
                          RejectedExecutionHandler handler) {
    ...
    this.corePoolSize = corePoolSize;
    this.maximumPoolSize = maximumPoolSize;
    this.workQueue = workQueue;
    this.keepAliveTime = unit.toNanos(keepAliveTime);
    this.threadFactory = threadFactory;
    this.handler = handler;
}

The parameters declared in this constructor are very important; once you understand them, you have grasped the basic principles of the thread pool. Let's look at what each one means (a hand-written construction sketch follows the list):

  • corePoolSize: the number of core threads. Unless you enable core thread timeout (allowCoreThreadTimeOut), core threads stay alive in the pool even when idle.
  • maximumPoolSize: the maximum number of threads the pool allows to exist.
  • workQueue: the work queue. When all core threads are busy, submitted tasks go into the work queue; if the work queue is also full, the pool tries to create non-core threads to execute the tasks.
  • keepAliveTime: the maximum idle time for a non-core thread; a thread idle for longer than this is reclaimed.
  • threadFactory: the factory used to create threads.
  • handler (RejectedExecutionHandler): the saturation policy applied when the pool and queue are full, such as discarding the task or throwing an exception.
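As a minimal sketch of wiring these parameters together by hand (the class name and all the sizes below are illustrative assumptions, not values from this article):

import java.util.concurrent.*;

public class ManualPoolDemo {
    public static void main(String[] args) {
        // Illustrative values: 2 core threads, at most 4 threads in total,
        // 30s idle timeout for non-core threads, a bounded queue of 10 tasks,
        // the default thread factory and the default AbortPolicy on saturation.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4,
                30L, TimeUnit.SECONDS,
                new LinkedBlockingQueue<Runnable>(10),
                Executors.defaultThreadFactory(),
                new ThreadPoolExecutor.AbortPolicy());

        pool.execute(() -> System.out.println("running on " + Thread.currentThread().getName()));
        pool.shutdown(); // stop accepting new tasks, let the submitted one finish
    }
}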

After the pool is created, tasks can be submitted through the execute method; the pool then processes each task according to its current running state and the parameters above. The overall model is shown below:

[Figure: thread pool model diagram]

The flow chart below clearly shows what happens after a task is submitted to the pool, so it won't be repeated in words.

[Figure: thread pool work flow chart]

Next, let's look at which constructor parameters the common pools created by the Executors utility class actually use.

| Thread pool type | Core threads | Maximum threads | Non-core idle timeout | Work queue |
| --- | --- | --- | --- | --- |
| newFixedThreadPool | user-specified | user-specified | 0 | LinkedBlockingQueue |
| newSingleThreadExecutor | 1 | 1 | 0 | LinkedBlockingQueue |
| newCachedThreadPool | 0 | Integer.MAX_VALUE | 60s | SynchronousQueue |
| newScheduledThreadPool | user-specified | Integer.MAX_VALUE | 0 | DelayedWorkQueue |

Here "user-specified" means a fixed value that the caller passes in.
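For comparison with the table, the JDK's own factory for newCachedThreadPool simply plugs these values into the ThreadPoolExecutor constructor (quoted from the OpenJDK source; minor details may vary between JDK versions):

// -> Executors.newCachedThreadPool (OpenJDK)
public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                  60L, TimeUnit.SECONDS,
                                  new SynchronousQueue<Runnable>());
}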

The blocking queues used here deserve some additional analysis.

Blocking queue

Have you ever wondered why a blocking queue is used here? Wouldn't a non-blocking queue do?

Blocking queues are commonly used in the producer-consumer model: the code that adds tasks is the producer, and the scheduler that executes tasks is the consumer, and they usually run on different threads. With a non-blocking queue you would inevitably have to implement extra synchronization and cross-thread wake-up logic yourself: for example, blocking the consumer thread when it tries to take an element from an empty queue, and waking it up when a new task is added.

Blocking queue implementations achieve this with lock operations (Lock + Condition) when adding and taking elements; a minimal producer-consumer sketch follows.
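In the sketch below (the class name and task count are made up for illustration), the consumer simply blocks in take() until the producer adds work, so no explicit wait/notify code is needed:

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();

        // Consumer: take() blocks while the queue is empty, so no manual
        // wait/notify or busy-waiting is required.
        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    queue.take().run(); // blocks until a task arrives
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt(); // exit when the demo ends
            }
        });
        consumer.start();

        // Producer: put() inserts a task and wakes the blocked consumer.
        for (int i = 0; i < 3; i++) {
            final int id = i;
            queue.put(() -> System.out.println("task " + id + " consumed"));
        }

        Thread.sleep(500);    // give the consumer time to drain the queue
        consumer.interrupt(); // stop the demo
    }
}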

Another thing to watch is the capacity of the blocking queue, because, as the processing flow chart shows, the queue capacity directly determines whether non-core threads get created. Specifically, as long as the queue is not full, no non-core threads are created; tasks simply keep being added to the blocking queue, waiting for a core thread (if any) to execute them.

  • LinkedBlockingQueue: a blocking queue implemented with a linked list. The default constructor uses Integer.MAX_VALUE as the capacity, which is the often-mentioned "unbounded" queue; a bounded capacity can be set with the constructor that takes a capacity parameter. The pools created by the Executors utility class all use the unbounded form.
  • SynchronousQueue: has a capacity of 0; whenever a task is added it must be consumed immediately, i.e., every insert operation must be paired with a remove operation, and vice versa.
  • DelayedWorkQueue: implemented with an array, default capacity 16, supports dynamic expansion, and orders tasks by their delay, similar to a priority queue; ScheduledThreadPoolExecutor uses it to implement scheduled and delayed tasks.
  • ArrayBlockingQueue: not used by the pools above; it is array-based with a fixed capacity that cannot be expanded.

Choose the appropriate blocking queue based on your actual needs. Now let's look at the usage scenarios for these thread pools.

  • newFixedThreadPool: has no non-core threads, which means no new threads are created no matter how many tasks pile up, and the core threads are kept even when idle. The waiting queue is unbounded and performance is relatively stable, so it suits long-running workloads where the task volume is not large.
  • newSingleThreadExecutor: equivalent to newFixedThreadPool with a size of 1; since there is only one thread, it suits scenarios where tasks must execute in order.
  • newCachedThreadPool: has no core threads and an effectively unlimited number of non-core threads, so it can handle a huge number of tasks in a short time. However, creating threads is expensive, and creating too many can easily lead to OOM; because of the idle timeout it also keeps releasing thread resources, making it unstable under a large, sustained parallel load. It can be used when a small burst of parallel tasks needs to run and no further tasks follow.
  • newScheduledThreadPool: commonly used for scheduled or delayed tasks.

In actual development it is not recommended to use the pools provided by Executors directly. If the task volume and required response time are roughly known, you should create the pool manually with the ThreadPoolExecutor constructor according to the actual requirements; this also lets you freely control the number of threads, the timeout, the blocking queue and the saturation policy (the default is AbortPolicy, which throws an exception).

Saturation policies

There are four built-in saturation policies:

  • DiscardPolicy: silently discards the rejected task.
  • DiscardOldestPolicy: discards the task at the head of the queue, i.e., the oldest queued task is removed to make room.
  • AbortPolicy: throws a RejectedExecutionException.
  • CallerRunsPolicy: runs the rejected task on the thread that called execute.

You can also implement the RejectedExecutionHandler interface to define a custom saturation policy and pass it in through the multi-argument ThreadPoolExecutor constructor, as in the sketch below.
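As a sketch of such a custom policy (the class name and log message are invented for illustration, not taken from any library), a handler that just logs and drops the task could look like this; it would be supplied as the handler argument of the ThreadPoolExecutor constructor shown earlier, in place of the default AbortPolicy:

import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;

// Hypothetical custom saturation policy: log the rejection and drop the task.
class LogAndDiscardPolicy implements RejectedExecutionHandler {
    @Override
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        // A real handler might instead block, retry later, or persist the task.
        System.err.println("Pool saturated, dropping task: " + r);
    }
}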

Next, let's look at the inheritance hierarchy of the thread pool classes.

Thread pool class diagram

[Figure: thread pool class diagram]

  • Executor: the base interface, which defines only one method, execute; this is the method we use to submit tasks, and tasks added via execute have no return value.
  • ExecutorService: still an interface, but where the pool concept starts to appear; it defines submit, which submits a task and returns a result, and shutdown(), which shuts the pool down.
  • AbstractExecutorService: implements most of the interface methods, leaving the abstract methods related to shutdown and execute unimplemented.
  • ThreadPoolExecutor: the most commonly used thread pool.
  • ScheduledThreadPoolExecutor: a thread pool that supports scheduled and delayed tasks.
  • ForkJoinPool: solves a different problem from ThreadPoolExecutor; it uses the divide-and-conquer idea to split one task into multiple subtasks that execute on multiple threads. For example, to compute the sum of the integers from 1 to 1,000,000, the ThreadPoolExecutor approach is to split the work into multiple tasks up front and submit them all to the pool, while the ForkJoinPool approach is to submit a single task and let the task itself split the work in its own compute logic (a sketch follows this list). Interested readers can refer to a dedicated ForkJoinPool article.
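A rough sketch of that 1-to-1,000,000 example (the threshold and class name are arbitrary choices here, not from the article): a RecursiveTask keeps splitting the range until it is small enough to sum directly.

import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Divide-and-conquer sum of 1..1_000_000 with ForkJoinPool.
public class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 10_000; // arbitrary split threshold
    private final long from, to;

    SumTask(long from, long to) { this.from = from; this.to = to; }

    @Override
    protected Long compute() {
        if (to - from <= THRESHOLD) {           // small enough: sum directly
            long sum = 0;
            for (long i = from; i <= to; i++) sum += i;
            return sum;
        }
        long mid = (from + to) / 2;             // otherwise split into two subtasks
        SumTask left = new SumTask(from, mid);
        SumTask right = new SumTask(mid + 1, to);
        left.fork();                            // run the left half asynchronously
        return right.compute() + left.join();   // compute the right half here, then join
    }

    public static void main(String[] args) {
        long result = new ForkJoinPool().invoke(new SumTask(1, 1_000_000));
        System.out.println(result);             // 500000500000
    }
}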

Choosing the thread pool size

Now that we understand the internal structure of the thread pool, how should we choose its size in practice?

This requires a rough judgment about whether the tasks are CPU-intensive or IO-intensive.

  • CPU-intensive: for example, heavy computation. CPU usage is already high, so opening too many threads only causes frequent CPU thread switching and reduces performance. The general recommendation is CPU cores + 1 threads; the extra thread acts as a stand-in in case one thread is blocked or stops unexpectedly.
  • IO-intensive: generally refers to file I/O, network I/O and the like. The thread count depends on the ratio of I/O time to CPU time: optimal thread count = CPU cores * (1 + I/O time / CPU time). The point of this ratio is to maximize utilization of both the I/O devices and the CPU.

Take a single core where a task's CPU computation and I/O time are in a 1:2 ratio, for example: three such threads are enough to drive CPU utilization to 100% (this example comes from the Geek Time column "Java Concurrency in Practice").

[Figure: time-slicing diagram for the 1:2 CPU/IO example]
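A small sketch applying both rules of thumb; the I/O-to-CPU ratio of 2.0 mirrors the 1:2 example above and is an assumed value that should be replaced with a measured one in practice:

public class PoolSizing {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();

        // CPU-intensive: cores + 1, the extra thread standing by for blocks or stalls.
        int cpuBoundThreads = cores + 1;

        // IO-intensive: cores * (1 + ioTime / cpuTime); 2.0 is the assumed 1:2 ratio.
        int ioBoundThreads = (int) (cores * (1 + 2.0));

        System.out.println("CPU-bound pool size: " + cpuBoundThreads);
        System.out.println("IO-bound pool size: " + ioBoundThreads);
    }
}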

Thread pool states

The state of the thread pool is vital to the whole task-processing flow. For example, when a task is added, the pool first checks whether it is in the RUNNING state; after the task is added to the queue it checks the state again, and if the pool has been shut down in the meantime it removes the task and applies the saturation policy.

Let's go through the pool's states in turn:

  • RUNNING: accepts newly submitted tasks and processes tasks in the blocking queue.
  • SHUTDOWN: the shut-down state; no longer accepts newly submitted tasks, but continues to process tasks already saved in the blocking queue. Calling shutdown() while the pool is RUNNING moves it into this state. (finalize() also calls shutdown() during its execution, entering this state.)
  • STOP: does not accept new tasks, does not process queued tasks, and interrupts the threads that are processing tasks. Calling shutdownNow() while the pool is RUNNING or SHUTDOWN moves it into this state.
  • TIDYING: when all tasks have terminated and workerCount (the number of effective threads) is 0, the pool enters this state and then calls terminated() to move to TERMINATED.
  • TERMINATED: entered after terminated() has run; the default terminated() implementation does nothing.

The state transitions are illustrated in the diagram below; a graceful-shutdown sketch follows it.

[Figure: thread pool state transition diagram]
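As a sketch of how these states are usually driven in code (the pool size and timeout are arbitrary here), a typical graceful shutdown first calls shutdown() and only falls back to shutdownNow() if the queue does not drain in time:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ShutdownDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(2);
        pool.execute(() -> System.out.println("working..."));

        pool.shutdown();                 // RUNNING -> SHUTDOWN: no new tasks, queued ones still run
        if (!pool.awaitTermination(5, TimeUnit.SECONDS)) {
            pool.shutdownNow();          // SHUTDOWN -> STOP: drop the queue, interrupt workers
        }
        // Once workers exit and the queue is empty, the pool passes through TIDYING to TERMINATED.
    }
}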

Summary

Back to the question at the beginning of the article: which thread pools do RxJava and OkHttp use for scheduling?

  1. RxJava defines several commonly used schedulers internally; Schedulers.io() and Schedulers.computation() correspond to IO-intensive and CPU-intensive work respectively. Both use ScheduledThreadPoolExecutor internally, likely to support delay-style operators in the call chain. The difference between the two is the maximum number of threads: computation caps it at the number of CPU cores, while io is effectively unbounded.
  2. The default thread pool in OkHttp's Dispatcher is essentially a newCachedThreadPool. The drawbacks of this kind of pool were introduced above; the choice was presumably made with high concurrency in mind, and in actual use it can be configured flexibly according to the situation (see the sketch after the Dispatcher source below).
// -> okhttp3.Dispatcher
public Dispatcher() {
}

public Dispatcher(ExecutorService executorService) {
    this.executorService = executorService;
}

public synchronized ExecutorService executorService() {
    if (executorService == null) {
      executorService = new ThreadPoolExecutor(0, Integer.MAX_VALUE, 60, TimeUnit.SECONDS,
          new SynchronousQueue<Runnable>(), Util.threadFactory("OkHttp Dispatcher", false));
    }
    return executorService;
}
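For example, if the defaults do not suit your load, the Dispatcher(ExecutorService) constructor shown above lets you supply your own pool. The sketch below assumes OkHttp 3.x; the class name and all the sizes are illustrative, not recommendations from the article:

import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import okhttp3.Dispatcher;
import okhttp3.OkHttpClient;

public class CustomDispatcherDemo {
    static OkHttpClient buildClient() {
        // Same shape as the default cached pool, but capped at 32 threads
        // instead of Integer.MAX_VALUE.
        ThreadPoolExecutor executor = new ThreadPoolExecutor(
                0, 32, 60, TimeUnit.SECONDS, new SynchronousQueue<Runnable>());

        Dispatcher dispatcher = new Dispatcher(executor);
        // Keep the Dispatcher's concurrency limits at or below the pool cap
        // so submissions are never rejected by the executor.
        dispatcher.setMaxRequests(32);
        dispatcher.setMaxRequestsPerHost(8);

        return new OkHttpClient.Builder()
                .dispatcher(dispatcher)
                .build();
    }
}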


Source: juejin.im/post/5d566458f265da03e71af066