Solving high server CPU usage when exporting large amounts of data to Excel

Recently, the server had been freezing from time to time. Log analysis showed that an Excel export was running each time it froze, so I killed the process, restarted it, and reproduced the export by hand. Once the export exceeded 30,000 rows, CPU utilization climbed as high as 600% (the server has 8 cores and 8 GB of RAM, running Tomcat with its default configuration), and after about a minute in that state Tomcat would hang.

That confirmed the export was the problem. I then searched online for solutions. A common suggestion was to switch large exports to CSV, but to keep things convenient for the finance team we insisted on exporting Excel. At our current business volume of nearly 30,000 orders a week, exporting to Excel should normally be no problem at all, so I kept looking for a fix.

First, find out why the CPU is so high. Watching real-time CPU activity in the jvisualvm monitoring tool showed that after heap memory rose sharply, the CPU spiked whenever JVM garbage collection kicked in, dropped slightly afterwards, and then another collection would start shortly after, driving the CPU up again. So the problem was essentially high heap usage and frequent GC, and I started optimizing the export code, which uses the poi-3.10 jar for the Excel export. Since the export runs inside a for loop, I vaguely remembered that variable declarations should go outside the loop, but I wasn't sure, so I checked online. Based on the two posts http://www.iteye.com/problems/16385 and http://blog.csdn.net/virtualman2000/article/details/1138496, I still lean towards declaring variables outside the loop: if a variable is declared inside the loop and the data volume is large, new memory keeps being allocated on each iteration, which still has some impact. So the first optimization was to move the loop variables out of the loop and declare them once, roughly as in the sketch below.
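
A minimal sketch of that pattern, assuming the POI 3.10 HSSF API (the Order class, its fields, and OrderExporter are made up for illustration; our real export code differs):

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.util.List;

    import org.apache.poi.hssf.usermodel.HSSFWorkbook;
    import org.apache.poi.ss.usermodel.Cell;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;

    public class OrderExporter {

        // Hypothetical order record, for illustration only.
        public static class Order {
            String orderNo;
            double amount;
        }

        public static void export(List<Order> orders, String path) throws IOException {
            Workbook wb = new HSSFWorkbook();
            Sheet sheet = wb.createSheet("orders");

            // Declared once, outside the loop, instead of on every iteration.
            Row row;
            Cell cell;
            Order order;

            for (int i = 0; i < orders.size(); i++) {
                order = orders.get(i);
                row = sheet.createRow(i);
                cell = row.createCell(0);
                cell.setCellValue(order.orderNo);
                cell = row.createCell(1);
                cell.setCellValue(order.amount);
            }

            try (FileOutputStream out = new FileOutputStream(path)) {
                wb.write(out);
            }
        }
    }

Note that the HSSF (.xls) format tops out at 65,536 rows per sheet, which still covers the row cap we settle on further down.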

The first optimization made relatively little difference, so the next step was to tune the JVM by setting its parameters:

      JAVA_OPTS="-Xms4096m -Xmx4096m -XX:PermSize=256M -XX:MaxNewSize=1024m"
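
Here -Xms and -Xmx pin the heap at 4 GB so it never has to grow, -XX:PermSize=256M raises the initial permanent-generation size, and -XX:MaxNewSize=1024m caps the young generation at 1 GB; the intent is to give the collector more room and cut down the frequent GC observed in jvisualvm. In a standard Tomcat layout this line typically goes into bin/setenv.sh (or near the top of bin/catalina.sh), for example:

    # bin/setenv.sh -- assuming a standard Tomcat layout
    JAVA_OPTS="-Xms4096m -Xmx4096m -XX:PermSize=256M -XX:MaxNewSize=1024m"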

After this second optimization, I restarted Tomcat and tested again: peak CPU was a bit over 200%, and under normal load it stays below 15%.

The second step was the most critical. On top of it, we added restrictions on the front-end export button and on the number of exported rows. For our business, exports are currently limited to data within one week; the button is grayed out once clicked so it cannot be clicked repeatedly; and the maximum number of exported rows is capped at 45,000.
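
It can also help to enforce the same limits on the server side, since a grayed-out button alone can be bypassed. A minimal sketch, with hypothetical class and method names that are not from our actual code:

    import java.util.Date;
    import java.util.concurrent.TimeUnit;

    // Hypothetical server-side guard mirroring the front-end limits.
    public class ExportLimits {

        private static final int MAX_ROWS = 45_000;  // row cap from the business rule above
        private static final long MAX_RANGE_MILLIS = TimeUnit.DAYS.toMillis(7);

        public static void check(Date from, Date to, int rowCount) {
            if (to.getTime() - from.getTime() > MAX_RANGE_MILLIS) {
                throw new IllegalArgumentException("Export range must be within one week");
            }
            if (rowCount > MAX_ROWS) {
                throw new IllegalArgumentException("Export is limited to 45,000 rows");
            }
        }
    }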

That is where the optimization stands for now. It may be adjusted as the business evolves, and I will share any further changes.

 
