Stress test thread pool problem

Symptom:

exception: java.util.concurrent.RejectedExecutionException:
Task java.util.concurrent.FutureTask@1329962c rejected from java.util.concurrent.ThreadPoolExecutor@5a10b905[Running, pool size = 250, active threads = 250, queued tasks = 1024, completed tasks = 292976],
dubbo version: 2.6.7, current host: 192.168.144.107
java.util.concurrent.RejectedExecutionException: Task java.util.concurrent.FutureTask@1329962c rejected from java.util.concurrent.ThreadPoolExecutor@5a10b905[Running, pool size = 250, active threads = 250, queued tasks = 1024, completed tasks = 292976]

Problem analysis

The stress test triggers ThreadPoolExecutor's default rejection policy, AbortPolicy:

public void rejectedExecution(Runnable r, ThreadPoolExecutor e) {
    throw new RejectedExecutionException("Task " + r.toString() +
            " rejected from " + e.toString());
}

When the pool is already running maximumPoolSize threads and the work queue is full, a newly submitted task cannot be accepted: it is rejected and RejectedExecutionHandler.rejectedExecution() is invoked.
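For illustration only (a minimal, self-contained sketch, not the project's code): submitting more tasks than maximumPoolSize plus the queue capacity can hold reproduces the same exception.

import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionDemo {
    public static void main(String[] args) {
        // 2 core threads, 4 max, queue of 2: at most 4 running + 2 queued = 6 tasks in flight.
        // The default handler is AbortPolicy, so the 7th submission throws RejectedExecutionException.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(2, 4, 30L, TimeUnit.SECONDS,
                new LinkedBlockingQueue<>(2));
        try {
            for (int i = 0; i < 10; i++) {
                pool.execute(() -> {
                    try {
                        Thread.sleep(1_000);   // keep workers busy so the queue fills up
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                });
            }
        } finally {
            pool.shutdown();
        }
    }
}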

Original thread pool parameter configuration

public class ThreadPoolConfig {

    @Bean("flattenQueryExecutor")
    public TitansExecutorService flattenQueryExecutor() {
        // corePoolSize = 200, maximumPoolSize = 250, keepAlive = 30s, bounded queue of 1024
        final TitansExecutorService executor = new TitansExecutorService(
                new ThreadPoolExecutor(200, 250, 30L,
                        TimeUnit.SECONDS, new LinkedBlockingQueue<>(1024)));
        return executor;
    }
}

Under a stress test with 200 concurrent requests, each carrying 50 tracking numbers, every burst produces 200 * 50 = 10,000 tasks. The core pool and the queue are clearly too small: with maximumPoolSize = 250 and a queue of 1,024, the pool can hold at most 250 + 1,024 = 1,274 tasks in flight, the worker threads cannot keep up, and the remaining submissions are rejected with the exception above.
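A back-of-the-envelope check of those numbers (illustrative arithmetic only, not part of the original configuration):

public class CapacityCheck {
    public static void main(String[] args) {
        int maximumPoolSize = 250;                           // original pool's hard cap
        int queueCapacity = 1_024;                           // LinkedBlockingQueue size
        int tasksPerBurst = 200 * 50;                        // 200 concurrent requests * 50 tracking numbers
        int inFlightLimit = maximumPoolSize + queueCapacity; // 1,274 tasks can be held at once
        // Roughly 8,726 tasks per burst have nowhere to go and are rejected.
        System.out.println("Tasks exceeding capacity: " + (tasksPerBurst - inFlightLimit));
    }
}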
Optimization

final TitansExecutorService executor = new TitansExecutorService(
        new ThreadPoolExecutor(300, 300,
                60L, TimeUnit.SECONDS,
                new LinkedBlockingQueue<>(8_192),
                new NamedThreadFactory("AsyncQuery"),
                new ThreadPoolExecutor.CallerRunsPolicy()));

Set corePoolSize and the BlockingQueue (workQueue) capacity according to the expected load, and choose the CallerRunsPolicy saturation policy.

CallerRunsPolicy: runs the rejected task on the caller's thread, i.e. the thread that invoked execute() (the producer thread). No new thread is created and the task is not handed back to the pool for scheduling. In this case the caller is a Dubbo worker thread.
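For reference, the JDK implementation of CallerRunsPolicy (abridged) simply runs the rejected task on the submitting thread, unless the pool has already been shut down, in which case the task is silently dropped:

// java.util.concurrent.ThreadPoolExecutor.CallerRunsPolicy (JDK source, abridged)
public static class CallerRunsPolicy implements RejectedExecutionHandler {
    public void rejectedExecution(Runnable r, ThreadPoolExecutor e) {
        if (!e.isShutdown()) {
            r.run();   // executed on the caller's thread, e.g. a Dubbo worker thread here
        }
    }
}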

Summary

• How to set thread pool parameters reasonably.
• CallerRunsPolicy runs the task on the caller's thread; both Dubbo's and Tomcat's default worker thread count is 200.
• The design ideas behind thread pools.

Design ideas

Tomcat (web container)

• maxThreads (maximum number of threads): every HTTP request that reaches the web service is handled by a Tomcat worker thread, so the maximum number of threads determines how many requests the service can process at the same time; the default is 200.
• acceptCount (maximum number of waiting requests): when the number of in-flight HTTP requests reaches Tomcat's maximum number of threads, new requests are placed in a waiting queue; acceptCount is the maximum length of that queue, and the default is 100. If the waiting queue is also full, new requests are rejected by Tomcat (connection refused).
• maxConnections (maximum number of connections): the maximum number of connections Tomcat can accept at the same time. This value is generally greater than maxThreads + acceptCount.
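For an embedded Tomcat in Spring Boot these knobs can be set programmatically; a minimal sketch (assuming Spring Boot 2.x, and the values here are only examples):

import org.apache.catalina.connector.Connector;
import org.springframework.boot.web.embedded.tomcat.TomcatServletWebServerFactory;
import org.springframework.boot.web.server.WebServerFactoryCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class TomcatConfig {

    @Bean
    public WebServerFactoryCustomizer<TomcatServletWebServerFactory> tomcatCustomizer() {
        return factory -> factory.addConnectorCustomizers((Connector connector) -> {
            connector.setProperty("maxThreads", "200");      // max concurrent request-processing threads
            connector.setProperty("acceptCount", "100");     // waiting queue once all threads are busy
            connector.setProperty("maxConnections", "8192"); // connections accepted at the same time
        });
    }
}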

Adding threads is costly. By default the JVM allocates a 1 MB stack when it creates a new thread, so more threads consume more memory; more threads also mean more thread context-switching overhead.
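Rough arithmetic on that cost (illustrative only; the actual stack size depends on the JVM and the -Xss setting):

public class ThreadStackCost {
    public static void main(String[] args) {
        int threads = 300;        // e.g. the tuned pool size above
        int stackKb = 1_024;      // assume the commonly cited ~1 MB default stack per thread
        System.out.println("~" + (threads * stackKb / 1_024) + " MB reserved for thread stacks alone");
    }
}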

Dubbo call model

Key parameters (name, scope of action, default, description):

• threads (provider, default 200): size of the business-processing thread pool; comparable to a thread pool's corePoolSize.
• iothreads (provider, default CPU cores + 1): size of the IO thread pool.
• queues (provider, default 0): size of the queue that holds requests once the thread pool is full. It is recommended not to set it: when the pool is exhausted, the call should fail immediately and retry on another provider machine rather than queue up, unless there is a special requirement. 0 means a synchronous handoff queue is used; a negative value means an unbounded blocking linked queue (capacity Integer.MAX_VALUE); any other value means a blocking linked queue of that size.
• connections (consumer, default 0): maximum number of connections per provider. For short-connection protocols such as rmi, http and hessian it limits the number of connections; for long-connection protocols such as dubbo it is the number of long connections to establish. The dubbo protocol shares one long connection per provider by default (all clients of a provider share it); any other value establishes that many long connections, and a call picks one of them by round-robin.
• actives (consumer, default 0): maximum number of concurrent calls per method, per service, per consumer. The consumer counts invocations at the service/method level; when the concurrency exceeds the configured maximum, the calling thread blocks until an earlier request completes.
• accepts (provider, default 0): maximum number of connections the provider accepts; when the number of connections exceeds the maximum, the new connection is closed.
• executes (provider, default 0): maximum number of requests executed concurrently per method, per service, on the provider. The provider counts invocations at the method level; when the concurrency exceeds the configured maximum, an exception is thrown immediately.
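A hedged sketch of how these parameters map onto Dubbo 2.6's API configuration (class and setter names assumed from the com.alibaba.dubbo config model; the values are examples only):

import com.alibaba.dubbo.config.ProtocolConfig;
import com.alibaba.dubbo.config.ReferenceConfig;
import com.alibaba.dubbo.config.ServiceConfig;

public class DubboTuningSketch {
    public static void main(String[] args) {
        // Provider side: business/IO thread pools and connection limits.
        ProtocolConfig protocol = new ProtocolConfig("dubbo", 20880);
        protocol.setThreads(200);     // business thread pool size (fixed pool by default)
        protocol.setIothreads(Runtime.getRuntime().availableProcessors() + 1);
        protocol.setQueues(0);        // 0 => synchronous handoff: fail fast instead of queueing
        protocol.setAccepts(1000);    // max connections the provider accepts

        ServiceConfig<Object> service = new ServiceConfig<>();
        service.setExecutes(200);     // max concurrent executions per method on the provider

        // Consumer side: per-provider connection and concurrency limits.
        ReferenceConfig<Object> reference = new ReferenceConfig<>();
        reference.setConnections(1);  // long connections per provider (dubbo shares one by default)
        reference.setActives(50);     // max concurrent calls per method from this consumer
    }
}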

Origin blog.csdn.net/eluanshi12/article/details/109238573