Comprehensive use of JDK tools to inspect thread state

Recently, during a stress test in the test environment, the application stopped responding to requests after the test had been running for a while. The investigation eventually showed that, under concurrent load, logback's RollingFileAppender was causing threads to pile up waiting to write to the log (jstack showed a large number of threads in the WAITING state).

Solution: under concurrent conditions, change logback's configuration so that logging goes through an AsyncAppender:

<appender name="RollingFile-Appender" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>${LOG_PATH}/rollingfile.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>${LOG_ARCHIVE}/%d{yyyy-MM-dd}.%i.log</fileNamePattern>
            <maxHistory>90</maxHistory>
            <timeBasedFileNamingAndTriggeringPolicy
                    class="ch.qos.logback.core.rolling.SizeAndTimeBasedFNATP">
                <!-- Roll over to a new file once the current log file exceeds 64 MB -->
                <maxFileSize>64 MB</maxFileSize>
            </timeBasedFileNamingAndTriggeringPolicy>
        </rollingPolicy>
        <encoder>
            <pattern>
                [ %-5level] [%date{yyyy-MM-dd HH:mm:ss}] %logger{96} [%line] - %msg%n
            </pattern>
            <charset>UTF-8</charset> <!-- Set the character set to avoid garbled Chinese characters -->
        </encoder>
    </appender>
    <appender name="Async-Appender" class="ch.qos.logback.classic.AsyncAppender">
        <appender-ref ref="RollingFile-Appender" />
    </appender>


    <root level="WARN">
        <appender-ref ref="Async-Appender" />
    </root>
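
    The AsyncAppender can also be tuned for heavy load. The snippet below is only a sketch of the commonly used knobs (queueSize, discardingThreshold, neverBlock); the values are illustrative, not the ones used for this incident:

    <appender name="Async-Appender" class="ch.qos.logback.classic.AsyncAppender">
        <!-- capacity of the in-memory event queue (logback's default is 256) -->
        <queueSize>1024</queueSize>
        <!-- when remaining capacity drops below this, TRACE/DEBUG/INFO events are discarded; 0 disables discarding -->
        <discardingThreshold>0</discardingThreshold>
        <!-- never block the calling thread when the queue is full; discard events instead -->
        <neverBlock>true</neverBlock>
        <appender-ref ref="RollingFile-Appender" />
    </appender>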


Attached below is a general procedure for investigating Java thread state. Some of the commands go further than this particular problem required (the commands needed to diagnose it were fairly simple):

Use the jps command to list the JVM process IDs (the grep -v 'mlvV' below just filters the jps process itself out of the listing):

root@ubuntu:~/apps$ jps -mlvV | grep -v 'mlvV'
1401 gateway-0.0.1-SNAPSHOT.jar
1374 eureka-server-0.0.1-SNAPSHOT.jar
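
If you only need the PID of one application, the listing can be filtered directly. A minimal sketch, assuming the gateway jar shown above (GATEWAY_PID is just an illustrative variable name):

# grab the gateway's pid for use in the later commands
GATEWAY_PID=$(jps -l | grep gateway | awk '{print $1}')
echo ${GATEWAY_PID}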


Use the top command to check memory and CPU usage. Plain top shows the overall picture; top -Hp ${pid} lists every thread inside a single Java process:
root@ubuntu:~/apps$ top -Hp 1401
top - 19:17:23 up 2 days, 13:56,  1 user,  load average: 0.03, 0.03, 0.00
Threads:  71 total,   0 running,  71 sleeping,   0 stopped,   0 zombie
%Cpu(s): 0.2 us, 0.0 sy, 0.0 ni, 99.8 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
KiB Mem :  4028724 total,  1166204 free,  2214336 used,   648184 buff/cache
KiB Swap:  4192252 total,  4192252 free,        0 used.  1555352 avail Mem

   PID USER      PR  NI    VIRT    RES    SHR S %CPU %MEM     TIME+ COMMAND
  1457 root      20   0 3584736 657100  17876 S  1.0 16.3  19:39.91 java
  1435 root      20   0 3584736 657100  17876 S  0.7 16.3  19:32.40 java
  1439 root      20   0 3584736 657100  17876 S  0.3 16.3   0:14.52 java
  1374 root      20   0 3584736 657100  17876 S  0.0 16.3   0:00.00 java
  1376 root      20   0 3584736 657100  17876 S  0.0 16.3   0:09.49 java
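
If you prefer a non-interactive snapshot over the interactive top view, ps can list the same threads sorted by CPU usage. A sketch, assuming procps options and the pid from above:

# per-thread CPU usage of process 1401, highest first
ps -L -p 1401 -o lwp,pcpu,state,comm --sort=-pcpu | head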


If a process's memory or CPU usage exceeds expectations, you can inspect the JVM stacks of its threads individually.
jstack reports thread IDs (the nid field) in hexadecimal, while top shows them in decimal, so first convert the decimal thread ID to hexadecimal:

root@ubuntu:~/apps$ printf '%x\n' 1457
5b1
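
The conversion and the jstack lookup can also be combined into a single step. A sketch, with ${pid} and ${tid} standing in for the decimal process and thread IDs taken from top:

# locate the thread in the dump by its hexadecimal nid
jstack ${pid} | grep -n "nid=$(printf '0x%x' ${tid})"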


Use jstack to dump the JVM stacks and see what each thread is doing. You can simply run jstack 1374 and read the whole dump, or, as here, use grep -n '0x5b1' to find the target thread by its hexadecimal nid and get the line number (position) of that thread within the jstack output, then jump to that line with more +${num}:

root@ubuntu:~/apps$ jstack 1374|grep -n '0x5b1'
321:"TaskAcceptor-localhost" #51 daemon prio=5 os_prio=0 tid=0x00007fae7d7f6800 nid=0x5b1 waiting on condition [0x00007fae2da86000]


View the thread's stack and running state starting from the line found above (the key is more +${num}):
root@ubuntu:~/apps$ jstack 1374|more +321
"TaskAcceptor-localhost" #51 daemon prio=5 os_prio=0 tid=0x00007fae7d7f6800 nid=0x5b1 waiting on condition [0x00007fae2da86000]
   java.lang.Thread.State: TIMED_WAITING (parking)
	at sun.misc.Unsafe.park(Native Method)
	- parking to wait for  <0x00000000c537a0a8> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:215)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2078)
	at java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:467)
	at com.netflix.eureka.util.batcher.AcceptorExecutor$AcceptorRunner.drainInputQueues(AcceptorExecutor.java:225)
	at com.netflix.eureka.util.batcher.AcceptorExecutor$AcceptorRunner.run(AcceptorExecutor.java:186)
	at java.lang.Thread.run(Thread.java:748)
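
When the symptom is many threads stuck in the same state, as in the logback problem above, a summary of the whole dump is often more telling than inspecting a single thread. A sketch:

# count threads by state across the entire dump (${pid} as above)
jstack ${pid} | grep 'java.lang.Thread.State' | sort | uniq -c | sort -rn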
