Advantages and principles of thread pools

    One: The benefits of using a thread pool

    Pooling is a common technique: thread pools, database connection pools, HTTP connection pools, and so on.

    The main idea of pooling is to reduce the cost of acquiring the resource on every access and to improve resource utilization.

    A thread pool provides a strategy for limiting and managing thread resources. Each thread pool also maintains some basic statistics, such as the number of tasks it has completed.
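
    As a hedged illustration of those statistics, the sketch below reads a few counters that ThreadPoolExecutor exposes (getPoolSize, getActiveCount, getCompletedTaskCount); the pool parameters and the class name PoolStatsDemo are arbitrary demo choices, not anything required by the API.

    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class PoolStatsDemo {

        public static void main(String[] args) throws InterruptedException {
            // small pool, for demonstration only
            ThreadPoolExecutor executor = new ThreadPoolExecutor(
                    2, 4, 1L, TimeUnit.SECONDS, new LinkedBlockingQueue<>(10));

            for (int i = 0; i < 5; i++) {
                executor.execute(() -> System.out.println(Thread.currentThread().getName()));
            }

            Thread.sleep(100); // give the tasks a moment to finish

            // basic statistics maintained by the pool itself
            System.out.println("pool size:       " + executor.getPoolSize());
            System.out.println("active threads:  " + executor.getActiveCount());
            System.out.println("completed tasks: " + executor.getCompletedTaskCount());

            executor.shutdown();
        }
    }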

    The benefits of using a thread pool:

    • Lower resource consumption: by reusing threads that have already been created, the overhead of repeatedly creating and destroying threads is reduced.

    • Faster response: when a task arrives, it can be executed immediately instead of waiting for a thread to be created.

    • Better manageability: threads are a scarce resource. Creating them without limit not only consumes system resources but also hurts system stability; a thread pool allows threads to be allocated, monitored, and tuned in a uniform way.

     

    Two: The Executor framework

    The Executor framework does more than manage a thread pool: it also provides a thread factory, a task queue, and a rejection (deny) policy, which makes concurrent programming easier.
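
    As a rough sketch of those pieces working together (the class name ExecutorPartsDemo, the thread-name scheme "demo-pool-N", and the pool sizes are arbitrary choices for illustration), a custom ThreadFactory, a bounded queue, and a rejection policy can all be passed to one ThreadPoolExecutor:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.ThreadFactory;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicInteger;

    public class ExecutorPartsDemo {

        public static void main(String[] args) {
            // thread factory: controls how worker threads are created (name, daemon flag, ...)
            ThreadFactory factory = new ThreadFactory() {
                private final AtomicInteger counter = new AtomicInteger(1);

                @Override
                public Thread newThread(Runnable r) {
                    return new Thread(r, "demo-pool-" + counter.getAndIncrement());
                }
            };

            ThreadPoolExecutor executor = new ThreadPoolExecutor(
                    2, 4, 1L, TimeUnit.SECONDS,
                    new ArrayBlockingQueue<>(10),          // task queue
                    factory,                               // thread factory
                    new ThreadPoolExecutor.AbortPolicy()); // rejection (deny) policy

            executor.execute(() -> System.out.println(Thread.currentThread().getName()));
            executor.shutdown();
        }
    }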

    Using the Executor framework works roughly as follows:

    1. The main thread must first create a task object that implements the Runnable or Callable interface.

    2. Hand the object that implements the Runnable/Callable interface to an ExecutorService for execution:

      ExecutorService.execute(Runnable command), ExecutorService.submit(Runnable task) or ExecutorService.submit(Callable<T> task).

    3. If ExecutorService.submit(...) is used, the ExecutorService returns an object that implements the Future interface. The main thread can then call FutureTask.get() to wait for the task to finish, or FutureTask.cancel() to cancel its execution.
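
    The three steps can be sketched as follows; Executors.newFixedThreadPool is used here only as a convenient way to obtain an ExecutorService, and the Callable (like the class name SubmitDemo) is a trivial placeholder.

    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutionException;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    public class SubmitDemo {

        public static void main(String[] args) throws ExecutionException, InterruptedException {
            ExecutorService executorService = Executors.newFixedThreadPool(2);

            // Step 1: create a task object that implements Callable (or Runnable)
            Callable<String> task = () -> "task result";

            // Step 2: hand the task to the ExecutorService
            Future<String> future = executorService.submit(task);

            // Step 3: wait for the task to finish; future.cancel(true) could abort it instead
            System.out.println(future.get());

            executorService.shutdown();
        }
    }

    The complete example below constructs a ThreadPoolExecutor explicitly instead, spelling out each constructor parameter: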

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;
    
    public class ThreadPoolExecutorDemo {
    
        private static final int CORE_POOL_SIZE = 5;
        private static final int MAX_POOL_SIZE = 10;
        private static final int QUEUE_CAPACITY = 100;
        private static final long KEEP_ALIVE_TIME = 1L;
    
        public static void main(String[] args) {
            ThreadPoolExecutor executor = new ThreadPoolExecutor(
                    CORE_POOL_SIZE,
                    MAX_POOL_SIZE,
                    KEEP_ALIVE_TIME,
                    TimeUnit.SECONDS,
                    new ArrayBlockingQueue<>(QUEUE_CAPACITY),
                    new ThreadPoolExecutor.CallerRunsPolicy());
    
            // submit tasks to the pool here, e.g. executor.execute(someRunnable)
    
            executor.shutdown();
    
        }
    
    }
     

    CORE_POOL_SIZE: the core thread count, which defines the minimum number of threads that can run concurrently.

    MAX_POOL_SIZE: when the number of tasks waiting in the queue reaches the queue's capacity, the number of threads that may run concurrently grows up to this maximum.

    QUEUE_CAPACITY: when a new task arrives, the pool first checks whether the number of currently running threads has reached the core count; if it has, the new task is stored in this queue.

    KEEP_ALIVE_TIME: when the number of threads in the pool exceeds the core count and no new tasks are being submitted, the threads beyond the core are not destroyed immediately; they are reclaimed only after having been idle for longer than KEEP_ALIVE_TIME.

    ThreadPoolExecutor.CallerRunsPolicy(): the calling thread runs the task itself, i.e. a rejected task is run directly in the thread that invoked execute; if the executor has already been shut down, the task is discarded instead. Because the caller is busy running the task, this policy slows the rate at which new tasks are submitted, which affects overall throughput but effectively acts like an elastic extension of the queue. Choose this policy if the application can tolerate delays but cannot tolerate dropping task requests.
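
    In essence the policy behaves like the simplified handler below (paraphrased as a sketch, not copied from the JDK source; the class name CallerRunsSketch is made up for illustration):

    import java.util.concurrent.RejectedExecutionHandler;
    import java.util.concurrent.ThreadPoolExecutor;

    // Roughly what CallerRunsPolicy does when both the pool and the queue are full
    public class CallerRunsSketch implements RejectedExecutionHandler {

        @Override
        public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
            if (!executor.isShutdown()) {
                r.run(); // run the rejected task in the thread that called execute()
            }
            // if the executor has been shut down, the task is simply discarded
        }
    }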

     

    Analysis of how the thread pool works
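
    Based on the parameter descriptions above, the decision flow that ThreadPoolExecutor follows when a task is submitted can be summarized in the simplified sketch below; the real JDK implementation re-checks the pool state along the way, so treat this only as a mental model, not the actual source:

    // Simplified mental model of ThreadPoolExecutor.execute(Runnable task):
    //
    // if (runningThreads < corePoolSize)          -> start a new core thread to run the task
    // else if (workQueue.offer(task) succeeds)    -> the task waits in the queue
    // else if (runningThreads < maximumPoolSize)  -> start a new non-core thread to run the task
    // else                                        -> hand the task to the RejectedExecutionHandler
    //
    // Non-core threads that stay idle longer than keepAliveTime are terminated.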

     

    Three: Determining the thread pool size

    A widely used and simple rule of thumb:

    • CPU-intensive tasks (N + 1): these tasks mainly consume CPU. The number of threads can be set to N (the number of CPU cores) + 1; the one extra thread is there so that when a running thread is occasionally paused by a page fault or some other interruption, the CPU does not sit idle, because the extra thread can make use of that idle time.

    • I/O-intensive tasks (2N): these applications spend most of their time on I/O. While a thread is waiting on I/O it does not occupy the CPU, so the CPU can be handed to other threads. For I/O-intensive workloads you can therefore configure more threads; a common rule of thumb is 2N (see the sketch below).
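
    Applying the rule of thumb in code might look like the sketch below, which derives both sizes from Runtime.getRuntime().availableProcessors(); the fixed-size pools and the class name PoolSizing are just one possible, illustrative choice.

    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class PoolSizing {

        public static void main(String[] args) {
            int n = Runtime.getRuntime().availableProcessors(); // N = number of CPU cores

            // CPU-intensive work: N + 1 threads
            ExecutorService cpuBoundPool = Executors.newFixedThreadPool(n + 1);

            // I/O-intensive work: 2N threads (rule of thumb)
            ExecutorService ioBoundPool = Executors.newFixedThreadPool(2 * n);

            cpuBoundPool.shutdown();
            ioBoundPool.shutdown();
        }
    }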
