Analyzing and resolving high CPU usage in Java applications

This article explains how to analyze and resolve high CPU usage in a Java application. It should be a useful reference for anyone facing the same problem.

Using jstack to analyze high CPU usage in a Java process

1. Use jps to find the PID of the Java process, e.g., 14292.
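For example (the output below is only illustrative; the main class name and the second PID are placeholders, not taken from the original article):

[root@cp01 ~]# jps -l
14292 com.example.GameServer
20001 sun.tools.jps.Jps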

2. Use top -p 14292 -H to observe the CPU usage of every thread in that process.

[root@cp01-game-dudai-0100.cp01.baidu.com ~]# top -p 14292 -H
top - 22:14:13 up 33 days, 7:29, 4 users, load average: 25.68, 32.11, 33.76
Tasks: 113 total,  2 running, 111 sleeping,  0 stopped,  0 zombie
Cpu(s): 68.3%us, 6.3%sy, 0.0%ni, 20.2%id, 0.1%wa, 0.2%hi, 4.9%si, 0.0%st
Mem:  65965312k total, 65451232k used,   514080k free,    82164k buffers
Swap:   975864k total,   972052k used,     3812k free,  9714400k cached
  PID USER   PR NI VIRT  RES  SHR S %CPU %MEM     TIME+ COMMAND
15844 root   15  0 6889m 5.7g 4864 S 20.6  9.1 814:13.29 java
15848 root   15  0 6889m 5.7g 4864 S 13.0  9.1 460:25.17 java
15611 root   15  0 6889m 5.7g 4864 S 12.7  9.1 468:17.77 java
15613 root   15  0 6889m 5.7g 4864 S 11.7  9.1 479:40.45 java
15743 root   15  0 6889m 5.7g 4864 S 11.7  9.1 443:04.80 java
15612 root   15  0 6889m 5.7g 4864 S 11.0  9.1 453:43.68 java
15965 root   15  0 6889m 5.7g 4864 S 10.3  9.1 371:00.33 java
15490 root   15  0 6889m 5.7g 4864 S  7.7  9.1 255:32.74 java
15587 root   15  0 6889m 5.7g 4864 S  7.3  9.1 282:27.58 java
15590 root   15  0 6889m 5.7g 4864 S  7.3  9.1 205:48.37 java
15491 root   15  0 6889m 5.7g 4864 R  6.3  9.1 279:09.08 java
15689 root   15  0 6889m 5.7g 4864 S  5.7  9.1 251:42.36 java
16935 root   15  0 6889m 5.7g 4864 S  5.7  9.1 190:34.37 java
15665 root   15  0 6889m 5.7g 4864 S  5.3  9.1 250:07.34 java
16920 root   15  0 6889m 5.7g 4864 S  5.3  9.1 241:34.50 java
15671 root   15  0 6889m 5.7g 4864 S  5.0  9.1 239:49.97 java
15492 root   15  0 6889m 5.7g 4864 S  4.7  9.1 210:23.09 java
14322 root   16  0 6889m 5.7g 4864 S  4.3  9.1 107:39.61 java
14316 root   16  0 6889m 5.7g 4864 S  4.0  9.1 107:18.43 java
14317 root   16  0 6889m 5.7g 4864 S  4.0  9.1 107:29.13 java
15591 root   15  0 6889m 5.7g 4864 S  4.0  9.1 114:34.90 java
14313 root   16  0 6889m 5.7g 4864 S  3.7  9.1 107:12.70 java
14314 root   15  0 6889m 5.7g 4864 S  3.7  9.1 107:28.05 java
14319 root   16  0 6889m 5.7g 4864 S  3.7  9.1 107:27.43 java
14321 root   15  0 6889m 5.7g 4864 S  3.3  9.1 108:01.12 java
15589 root   15  0 6889m 5.7g 4864 R  3.0  9.1 109:01.91 java
15615 root   15  0 6889m 5.7g 4864 S  3.0  9.1 114:55.29 java
16808 root   15  0 6889m 5.7g 4864 S  2.7  9.1 279:05.03 java
14315 root   15  0 6889m 5.7g 4864 S  2.0  9.1 107:45.00 java
14320 root   15  0 6889m 5.7g 4864 S  2.0  9.1 107:48.30 java
15489 root   15  0 6889m 5.7g 4864 S  1.7  9.1  57:38.46 java
15670 root   15  0 6889m 5.7g 4864 S  1.3  9.1   5:55.43 java
14318 root   15  0 6889m 5.7g 4864 S  0.7  9.1 107:45.88 java
14826 root   15  0 6889m 5.7g 4864 S  0.7  9.1  25:07.64 java

3. Find the thread ID that consumes the most CPU, e.g., 15844, and convert 15844 to hexadecimal: 0x3de4. Note that the hex letters must be lowercase.

printf "%x\n" 15844

3de4

4. Use jstack 14292 | grep -A 10 0x3de4 to query the state of that specific thread.

[root@cp01-game-dudai-0100.cp01.baidu.com ~]# jstack 14292 | grep -A 10 0x3de4
"pool-52-thread-1" prio=10 tid=0x000000005a08e000 nid=0x3de4 waiting on condition [0x00002ae63d917000]
   java.lang.Thread.State: WAITING (parking)
     at sun.misc.Unsafe.park(Native Method)
     - parking to wait for <0x00000006f9a0a110> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
     at java.util.concurrent.locks.LockSupport.park(LockSupport.java:156)
     at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:1987)
     at java.util.concurrent.LinkedBlockingQueue.take(LinkedBlockingQueue.java:399)
     at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:947)
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907)
     at java.lang.Thread.run(Thread.java:662)

From these thread states you can basically pinpoint where the problem lies.
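The manual steps above can also be chained into a single pipeline. The following is only a rough sketch, not from the original article: it assumes the process PID is 14292, takes the single busiest thread from one batch-mode sample of top, and greps its nid in the jstack output.

#!/bin/bash
# Sketch: dump the stack of the busiest thread in a Java process.
pid=14292

# Thread rows from one batch-mode top sample, sorted by the %CPU column (field 9);
# the first top sample reports averages, so treat the result as approximate.
tid=$(top -H -b -n 1 -p "$pid" | grep "^ *[0-9]" | sort -k 9 -nr | head -1 | awk '{print $1}')

# jstack prints native thread IDs as lowercase hex in the nid= field.
nid=$(printf "%x" "$tid")

jstack "$pid" | grep -A 10 "nid=0x$nid"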

Solution:

Method 1

1. Run jps to get the PID of the Java process.

2. Run jstack pid >> java.txt to dump the thread stacks of the high-CPU process.

3. Run top -H -p PID to see which threads in that process are using the most CPU.

4. Run echo "obase=16; PID" | bc to convert the thread PID to hexadecimal, then change the uppercase letters to lowercase (a worked example follows this list).

5. Search java.txt (dumped in step 2) for the hexadecimal thread PID and find the corresponding thread stack.

6. Analyze what business operations the high-load thread stacks are performing, then optimize the code and fix the problem.
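A worked example of step 4 (a sketch: 15844 is the sample thread ID used earlier, and tr is added here to do the upper-to-lowercase conversion that the text only describes in words):

echo "obase=16; 15844" | bc | tr 'A-F' 'a-f'
3de4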

Method 2

1. Use top to locate the PID of the process with high CPU usage:

top

You can also confirm the PID with ps aux | grep PID.

2. Get the thread information and find the thread with high CPU usage:

ps -mp pid -o THREAD,tid,time | sort -rn

3. Convert the thread ID you need to hexadecimal:

printf "%x\n" tid

4. Print the thread's stack trace (a combined script sketch follows this list):

jstack pid | grep tid -A 30
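Putting Method 2 together, here is a minimal script sketch, not from the original article. The field positions (%CPU in column 2, TID in column 8) are assumptions based on the usual layout of ps -o THREAD,tid,time and may differ between ps versions, so treat this as a starting point rather than a finished tool.

#!/bin/bash
# Sketch: print the stack of the busiest thread of the given Java process PID.
pid=$1

# One row per thread; keep only rows with a numeric TID, sort by %CPU descending,
# and take the TID of the hottest thread.
tid=$(ps -mp "$pid" -o THREAD,tid,time | awk '$8 ~ /^[0-9]+$/' | sort -k 2 -nr | head -1 | awk '{print $8}')

# Convert to the lowercase hex nid that jstack uses.
nid=$(printf "%x" "$tid")

jstack "$pid" | grep "nid=0x$nid" -A 30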

Reposted from blog.csdn.net/qq_29663071/article/details/80730996