The sequel to Jenkins performance problem handling in a K8S environment (task Pod settings)


This article is a sequel to "Handling Jenkins Performance Problems in the K8S Environment". The previous article solved the performance problems of the Master node in the Jenkins cluster, but tasks are not actually executed on the Master node; they run in Pods created temporarily for each task. The performance of these Pods determines how fast a task runs, and even whether it succeeds at all.

Environmental information

  1. Hardware: three CentOS 7.7 servers
  2. Kubernetes: 1.15
  3. JDK: 1.8.0_141
  4. Maven: 3.6.3

For details on deploying and using Jenkins in a K8S environment, please refer to "Deploying and Experiencing Jenkins with Helm".

Task node performance issues

Java programmers often use Jenkins to compile and build Maven projects. If the build runs with the default configuration, performance problems can easily occur. Let's reproduce the problem with an open-source project:

  1. When configuring the Jenkins kubernetes plugin, the memory allocated to the task pod is 1G, as shown below:
    Insert picture description here
  2. Next, download the Flink source code (version 1.8.3-rc3) from GitHub and compile and build it. The pipeline source for this task is as follows:
pipeline {
  agent {
    label 'my-jenkins-jenkins-slave'
  }

  tools {
    maven 'mvn-3.6.3'
  }

  stages {
    stage('Checkout') {
      steps {
        echo 'Downloading the flink source (1.8.3-rc3 archive) from GitHub'
        sh 'wget https://codeload.github.com/apache/flink/tar.gz/release-1.8.3-rc3'
        echo 'Download finished, extracting the archive'
        sh 'tar -zxf release-1.8.3-rc3'
      }
    }

    stage('Build') {
      steps {
        echo 'Starting the compile and build'
        sh 'cd flink-release-1.8.3-rc3 && mvn clean package -U -s /home/jenkins/settings/settings.xml'
      }
    }
  }
}
  3. While the build task is running, click the gray ball in the red box below to jump to the log page of the Pod executing the task:
    Insert picture description here
  4. The following figure is the Pod log page. The red box shows that the task hit an exception (a unit test case was executing at the time):
    Insert picture description here
  5. As shown in the red box below, the Pod executing the task is shown as offline:
    Insert picture description here
  6. Now log in to the kubernetes environment and check the Pod status. As shown in the red box below, the task Pod's status is OOMKilled; insufficient memory caused the Pod to be destroyed:
    Insert picture description here

The first adjustment (K8S parameter)

  1. Since the task failed because the Pod's memory was too small, adjust the Pod memory on the Jenkins settings page. As shown below, it is set to 6G this time; be careful not to exceed the host machine's hardware capacity:
    Insert picture description here
  2. Repeat the task above. This time the memory is sufficient and the build succeeds.
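For reference, the same memory settings can also be expressed per pipeline with the kubernetes plugin's podTemplate step, rather than on the global settings page. The snippet below is only a minimal sketch: the agent image name is an assumption, while the label and memory values are the ones used in this article:

```groovy
// Sketch: per-pipeline Pod memory settings via the kubernetes plugin's
// podTemplate/containerTemplate steps (scripted pipeline syntax).
podTemplate(label: 'my-jenkins-jenkins-slave', containers: [
    containerTemplate(
        name: 'jnlp',
        image: 'jenkins/jnlp-slave:latest',  // assumed agent image
        resourceRequestMemory: '1Gi',        // memory requested at scheduling time
        resourceLimitMemory: '6Gi'           // hard limit; exceeding it gets the Pod OOMKilled
    )
]) {
    node('my-jenkins-jenkins-slave') {
        // build steps run inside the Pod defined above
    }
}
```

When a container exceeds its memory limit, kubernetes terminates it with exactly the OOMKilled status observed earlier, which is why raising the limit resolves the failure.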

Observing the maven process

Since this task mainly performs a Maven compile and build, it is worth understanding the memory usage of the maven process:

  1. While the task is executing, find the docker container corresponding to the Pod (via the kubectl describe pod xxx command); its ID is 22484d8b1e56
  2. Execute docker exec 22484d8b1e56 jps to get the maven process ID, 87 (its name is Launcher), as shown below:
    Insert picture description here
  3. Execute docker exec 22484d8b1e56 jmap -heap 87 to view the JVM memory of the maven process. As shown below, maven actually uses only about 3G:
    Insert picture description here
  4. The Pod now has 6G of memory, so more of it can be given to the maven process through parameter settings.

Second adjustment (JVM parameters)

Next, try setting the memory parameters of the maven process. Here, most of the memory is deliberately given to the old generation:

  1. As shown below, enter the settings page:
    Insert picture description here
  2. As shown below, find Pod Templates and add an environment variable with key MAVEN_OPTS and value -Xms5632m -Xmx5632m -Xmn512m -Xss256k. Since the Pod has 6G of memory in total, this setting leaves only 512m for the rest of the Pod; the other 5632m is all given to the maven process, and the young generation of the maven process is only 512m, leaving the rest of the heap to the old generation:
    Insert picture description here
  3. After saving the settings, run the task again. First find the docker container corresponding to the task Pod, then use the jmap -heap command to check the memory of the maven process. As shown below, all 5632m is allocated to the maven process, and the young generation is held at 512m:
    Insert picture description here
  4. The following figure shows jstat output for the GC activity of the maven process. The red box is the YGC count and the blue box is the FGC count. Because the young generation is small, YGC happens frequently, but the FGC count stays low:
    Insert picture description here
  5. A StackOverflowError also occurred during the build. As shown below, the fix is again to adjust MAVEN_OPTS, this time to -Xms5632m -Xmx5632m -Xmn512m -Xss512k, i.e. doubling the thread stack size:
    Insert picture description here
  6. The settings above are not optimal, but they verify that adjusting the maven memory parameters takes effect.
  7. Through the Jenkins settings page, we have completed custom settings for the task Pod and its maven process. I hope this article gives you a useful reference for making targeted adjustments and optimizations based on the characteristics of your own projects.
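MAVEN_OPTS can also be set per pipeline instead of in the global Pod template, which makes it easier to tune memory project by project. Below is a minimal sketch using the declarative pipeline's environment directive; the label, tool name, and JVM values are the ones from this article:

```groovy
pipeline {
  agent {
    label 'my-jenkins-jenkins-slave'
  }

  environment {
    // Same JVM settings as in the Pod template above:
    // 5632m heap, 512m young generation, 512k thread stacks.
    MAVEN_OPTS = '-Xms5632m -Xmx5632m -Xmn512m -Xss512k'
  }

  tools {
    maven 'mvn-3.6.3'
  }

  stages {
    stage('Build') {
      steps {
        sh 'mvn clean package -U -s /home/jenkins/settings/settings.xml'
      }
    }
  }
}
```

A per-pipeline setting like this overrides nothing globally, so different projects on the same Pod template can use different heap sizes.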


Origin blog.csdn.net/boling_cavalry/article/details/105181474