How to size a thread pool scientifically

An online high-concurrency service is like a levee standing silently beside a great river: it must always be ready to withstand the impact of a flood. The thread pools inside such services also cause many problems, such as a saturated thread pool, high CPU utilization, or hung service threads. These problems are usually caused by improper use of the thread pool, or by failing to put protection and degradation measures in place.

Of course, some developers do think about protecting the thread pool. But have you had this experience: sometimes setting more threads lowers performance, and setting fewer threads also lowers performance. So how should the thread pool be configured?

After years of design reviews, I have found that most developers set the number of threads based on experience and intuition, then adjust it according to the online situation, and finally settle on the most suitable value. This trial-and-error approach sometimes works and sometimes does not, and even when it works, finding the optimal setting can be costly.

In fact, thread pool sizing is well-founded and can be derived from theoretical calculation.

First, let's look at the ideal case, where every task to be processed is a pure computation task. Here the number of threads should equal the number of CPU cores, so that each core runs one thread with no thread switching. Of course, this is the ideal situation.

In this case, if we want to reach a certain target QPS, we use the following formula.

Number of threads = target QPS / (1 / actual task processing time) = target QPS × actual task processing time

For example, suppose the target QPS is 100 and the actual processing time of a task is 0.2 s; then 100 × 0.2 = 20 threads. These 20 threads must be backed by 20 physical CPU cores, otherwise the estimated QPS will not be reached.
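As a minimal sketch of this calculation, the snippet below plugs in the numbers from the example; targetQps and taskTimeSeconds are the illustrative values above, not measurements from a real service.

```java
public class CpuBoundSizing {
    public static void main(String[] args) {
        int targetQps = 100;          // target throughput from the example
        double taskTimeSeconds = 0.2; // measured processing time per task
        // threads = targetQps / (1 / taskTime), i.e. targetQps * taskTime
        int threads = (int) Math.ceil(targetQps * taskTimeSeconds);
        System.out.println("threads = " + threads); // prints 20
    }
}
```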

In reality, however, besides in-memory computation, our online services mostly access databases, caches, and external services, and spend most of their time waiting on IO.

If the tasks are IO-heavy, we size the pool from the ratio of IO wait time to compute time:

Number of threads = number of CPU cores × (1 + IO time / compute time)

For example, assume a 4-core CPU and that IO accounts for 80% of each task's time. The IO/compute ratio is then 0.8 / 0.2 = 4, giving 4 × (1 + 4) = 20 threads. These 20 threads run on the 4-core CPU.
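The sketch below applies the same formula in code; the 80% IO share is the assumption from the example, and availableProcessors() stands in for the 4-core machine.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class IoBoundSizing {
    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors(); // 4 in the example
        double ioShare = 0.8;                 // assumed fraction of task time spent on IO
        double computeShare = 1 - ioShare;    // remaining fraction spent on the CPU
        int threads = (int) Math.round(cores * (1 + ioShare / computeShare)); // 4 * (1 + 4) = 20
        System.out.println("threads = " + threads);

        ExecutorService pool = Executors.newFixedThreadPool(threads);
        pool.shutdown();
    }
}
```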

Besides the number of threads, the size of the work queue is also very important, and it too can be derived by calculation: the rule is to size the queue according to the target response time.

Queue size = number of threads × (target response time / actual task processing time)

For example, assuming the target response time is 0.4 s and the actual task processing time is still 0.2 s, the length of the blocking queue is 20 × (0.4 / 0.2) = 40.
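Putting the two results together, here is a minimal sketch that wires the calculated thread count and queue length into a ThreadPoolExecutor. The numbers are those of the running example, and CallerRunsPolicy is just one possible rejection strategy, not a prescription.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPool {
    public static void main(String[] args) {
        int threads = 20;                // from the sizing formulas above
        double targetResponseTime = 0.4; // seconds
        double taskTime = 0.2;           // seconds
        int queueSize = (int) (threads * (targetResponseTime / taskTime)); // 40

        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                threads, threads,                           // fixed size: core == max
                0L, TimeUnit.MILLISECONDS,                  // no keep-alive needed for a fixed pool
                new ArrayBlockingQueue<>(queueSize),        // bounded queue sized from the formula
                new ThreadPoolExecutor.CallerRunsPolicy()); // apply back-pressure when the queue is full

        pool.shutdown();
    }
}
```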

In addition, when configuring thread pools, keep the following best practices in mind.

  1. When using a thread pool, consider both the maximum and the minimum number of threads.

  2. For a service with a single, uniform workload, the maximum number of threads should equal the minimum; for mixed workloads, the gap between maximum and minimum can be widened appropriately to balance overall CPU utilization.

  3. The thread queue must be a bounded queue, otherwise a traffic spike will overwhelm the entire service.

  4. Use a dedicated thread pool only when necessary, and always perform performance evaluation and stress testing at design time.

  5. The thread pool's rejection strategy, and how to compensate after a task fails or is rejected, must be considered (see the sketch after this list).

  6. Background batch-processing services must use thread pools separate from those of online, user-facing services.
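As a hedged illustration of practices 3 and 5, the sketch below combines a bounded queue with a custom RejectedExecutionHandler; the logging-based "compensation" is only a placeholder for whatever retry or persistence mechanism a real service would use.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionHandler;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RejectionWithCompensation {
    public static void main(String[] args) {
        // Assumption: a real service would persist or re-queue the rejected task;
        // logging stands in for that compensation step here.
        RejectedExecutionHandler compensate = (task, executor) ->
                System.err.println("Task rejected, scheduling compensation: " + task);

        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                20, 20, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(40),   // bounded queue protects the service (practice 3)
                compensate);                    // explicit failure handling (practice 5)

        pool.execute(() -> System.out.println("normal task"));
        pool.shutdown();
    }
}
```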
