JVM memory leak analysis

Reprinted from: https://www.cnblogs.com/wanghaoyang/p/11687329.html

A few days ago, the ops team told me that the CPU usage of one of our online Java services was far too high.

The following is the specific process of finding and fixing the problem.

   1: Using the top command, I confirmed that my Java service really was eating a lot of CPU, as shown in the screenshot below.

   [screenshot: top output showing process 18400]

   You can see that process 18400 had reached 12% CPU utilization, which is definitely not normal for this service, so the next step is to work out which threads are burning the CPU.
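   A minimal sketch of this step (18400 is the PID from my case; substitute your own):

      # Overall view; inside top, press shift+P to sort processes by CPU usage.
      top

      # Or watch just the suspect Java process:
      top -p 18400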

 

  2: With the command top -Hp 18400 we can see the threads inside this process; part of the screenshot is below.

   [screenshot: top -Hp 18400 output showing per-thread CPU usage]

  The screenshot shows that the threads at the top are using a relatively large amount of CPU, so next we analyze those threads in detail.
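  The corresponding command, sketched briefly:

      # -H lists individual threads, -p limits the view to our Java process.
      # The PID column now contains native thread IDs (in decimal),
      # which we will need in the next step.
      top -Hp 18400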

 

  3: With jstack 18400 > 18400.txt we capture the thread stacks of the Java process; it is worth capturing several dumps so they can be compared.

  Take thread 18414 as an example. First convert the ID to hexadecimal; on Linux, printf "%x" 18414 converts the thread ID, giving 47ee. Then we search the dump file for 47ee to see what that thread is doing; a command sketch and some screenshots follow.
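  A minimal sketch of this step, using thread 18414 from the example above:

      # Capture the thread stacks (repeat a few times to compare):
      jstack 18400 > 18400.txt

      # Convert the busy thread ID reported by top -Hp into hex:
      printf "%x\n" 18414        # prints 47ee

      # jstack records the native thread ID as nid=0x<hex>,
      # so search the dump for that value:
      grep -A 10 'nid=0x47ee' 18400.txt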

  [screenshot: jstack output for the thread with nid=0x47ee]

   We can see that 47ee is a garbage collection thread; doing the same for the other high-CPU threads, they all turn out to be GC threads as well. In other words, this Java service is spending its time doing GC, which is clearly not normal. So the next step is to look at the GC behaviour.

 

  4: With the command jstat -gcutil 18400 1000 100 we can watch the GC statistics (a sample every 1000 ms, 100 samples); some screenshots follow.

    [screenshot: jstat -gcutil output]

   Quite revealing: the YGC column shows that the number of young GCs is not increasing, but the FGC column shows that the number of full GCs keeps climbing, and worse, the old generation is not being reclaimed (see the O column).
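  For reference, a sketch of the jstat invocation and the columns mentioned above (column names as printed by JDK 8; other versions may differ slightly):

      # Sample GC utilization every 1000 ms, 100 times:
      jstat -gcutil 18400 1000 100

      # Columns referred to in the text:
      #   O   - old generation utilization (%)
      #   YGC - cumulative count of young GC events
      #   FGC - cumulative count of full GC events
      # A steadily rising FGC while O stays near 100% means the old
      # generation is full of objects the collector cannot reclaim.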

  At this point a term should spring to mind: memory leak. Yes, and now we need to work out where the memory is leaking.

 

  5: With jmap -dump:live,format=b,file=18400.dump 18400 we can dump the current heap of the process to a file. Note that this file is essentially a snapshot of the heap, so it will be roughly as large as the live heap itself; mine came out at almost 2 GB.

  With this file in hand, we need to analyze it. You can use the jhat command, but more often we reach for the more powerful graphical tools: VisualVM, which ships with the JDK, or the third-party JProfiler (which is what I use). If you work in Eclipse you can also install the MAT plug-in. All of these tools can analyze a heap dump file.

  Note that because the dump file can be large, the analysis tool itself needs a lot of memory, so it is best to do the analysis on a reasonably powerful machine.
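  A minimal sketch of the dump plus a quick command-line look at it (the 4 GB given to jhat is only an assumption; size it to your dump):

      # Dump only live objects, in binary format:
      jmap -dump:live,format=b,file=18400.dump 18400

      # jhat (JDK 8 and earlier) loads the whole dump into memory and then
      # serves an object browser on http://localhost:7000 by default:
      jhat -J-Xmx4g 18400.dump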

  Here are some screenshots from my analysis in JProfiler.

  [screenshot: JProfiler view of the largest objects in the heap]

 

   Quite clear: one type of object accounts for 97% of the memory. The next step is to find out where those objects are created and what is referencing them. In my case it is obviously a LinkedList taking up all the space, so the next thing is to analyze what that LinkedList is holding, which you will probably need to do together with your own code; I won't go into the details here.

  My analysis showed that the asynchronous request queue of JestClient, the Elasticsearch client library, had grown far too long: every node in the list held the data for an asynchronous request, and more than 100,000 of them had piled up. They could not be consumed or reclaimed in time, and that is the memory leak. (Note that the same thing can easily happen with thread pools and their work queues.)

 

  6: Once the cause is found, all that is left is to fix it. Because I was in a hurry to get the service back to normal and did not know whether JestClient had any feature for limiting the length of its asynchronous queue, I temporarily changed the asynchronous calls to synchronous ones as a stopgap. After watching the service in production again, CPU usage and garbage collection had indeed returned to normal.

  Overall, the six steps above form a complete process for analyzing and fixing a JVM memory leak. There are no doubt imperfections, but the general approach is sound.

  From this article we can draw the following conclusions:

  1: How to analyze the problem of a Java service using too much CPU.

  2: When using any kind of queue in Java, pay attention to the queue length so it cannot grow without bound and cause a memory leak.

  3: It is best to familiarize yourself with the JVM memory model (a quick way to inspect the generations of a running JVM is sketched below).
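  As a starting point for point 3, one way to look at the memory layout of a running JVM (JDK 8 syntax; newer JDKs use jhsdb jmap instead):

      # Print the heap configuration and the current usage of each region
      # (Eden, survivor spaces, old generation) for the process:
      jmap -heap 18400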


Origin www.cnblogs.com/jvStarBlog/p/11688313.html