Why does stash/unstash not work in this Jenkinsfile?

vextorspace :

I've got a Jenkins server running onsite, and it uses a Jenkinsfile to manage a pipeline that uses the Parallel Test Executor plugin to run all my JUnit tests on several agents to speed the tests up. We have a blade server we built ourselves (way cheaper than buying one!), and it cut our test time from close to 2 hours down to 22 minutes. The JUnit plugin works great with parallel tests.

The JaCoCo plugin, however, does not. So I am trying to merge the coverage files into one file so that the JaCoCo plugin can publish the coverage results. Stash/unstash works fine for storing the sources, but it fails when I try to stash the individual JaCoCo output files and unstash them on the master.

Any ideas why?

Here is my Jenkinsfile:

#!/usr/bin/env groovy

def branch
def hash

node('remote') {
  sh 'echo starting'

  branch = env.gitlabBranch ?: '**'
  echo "Branch: $branch"

  checkout([$class: 'GitSCM',
        branches: [[name: "$branch"]],
        extensions: [
          [$class: 'PruneStaleBranch'],
          [$class: 'CheckoutOption', timeout: 120],
          [$class: 'CloneOption', depth: 0, noTags: true, shallow: true, timeout: 180]
        ],
        doGenerateSubmoduleConfigurations: false,
        submoduleCfg: [],
        userRemoteConfigs: [[credentialsId: 'gitlabLabptop', url: '[email protected]:protocase/my_project_url.git']]
       ]
      )

  hash = sh (script: 'git rev-parse HEAD', returnStdout: true).trim()

  // - this stash works fine -
  stash name: 'sources', includes: '**', excludes: '**/.git,**/.git/**'
}

def numBranches = 9
def splits = splitTests count(numBranches)
def branches = [:]

for (int i = 0; i < splits.size(); i++) {
  def index = i // fresh variable per iteration; i will be mutated

  branches["split${i}"] = {
    timeout(time: 125, unit: 'MINUTES') {
      node('remote') {
        sh 'echo starting a node'
        deleteDir()

        // - this unstash works fine -
        unstash 'sources'

        def exclusions = splits.get(index)
        writeFile file: 'test/exclusions.txt', text: exclusions.join("\n")

        sh 'ant clean'

        sh 'rm -rf build'

        sh 'ant jar'

        sh 'ant -buildfile build-test.xml buildTests'

        sh 'ant -buildfile build-test.xml jenkinsBatch'

        junit 'build/test/results/*.xml'

        sh "mv build/test/jacoco/jacoco.exec build/test/jacoco/jacoco${index}.exec"
        echo "name: coverage$index, unclude jacoco${index}"

        // - this stash appears to work -
        stash name: "coverage$index", includes: "build/test/jacoco/jacoco${index}.exec"
        echo "stashed"

      }
    }
  }
}

parallel branches


def branchIndecis = 0..numBranches

node('master') {
  if (currentBuild.result != "ABORTED") {

    echo "collecting exec files"

    branchIndecis.each {
      echo "unstash coverage${it}"

      // !!! this unstash causes an error !!!
      unstash name: "coverage${it}"



      echo "make file name"
      def coverageFileName = "build/test/jacoco/jacoco${it}.exec"
      echo "merge file"
      sh "ant -buildfile build-test.xml -Dfile=${coverageFileName} coverageMerge"
    }

    echo "collected exec files"

    step([$class: 'JacocoPublisher',
      execPattern:'build/test/jacoco/jacoco.exec',
      classPattern: 'build/classes',
      sourcePattern: 'src'])

    echo "finishing ${branch} - ${hash}"

  }
}

the output I get is:

[split7] [jdesigner] Running shell script
[split7] + mv build/test/jacoco/jacoco.exec build/test/jacoco/jacoco7.exec
[Pipeline] [split7] echo
[split7] name: coverage7, unclude jacoco7
[Pipeline] [split7] stash
[split7] Stashed 1 file(s)
[Pipeline] [split7] echo
[split7] stashed
[Pipeline] [split7] }
[Pipeline] [split7] // node
[Pipeline] [split7] }
[Pipeline] [split7] // timeout
[Pipeline] [split7] }
[Pipeline] // parallel
[Pipeline] node
Running on eightyeight in /var/jenkins/workspace/jdesigner
[Pipeline] {
[Pipeline] echo
collecting exec files
[Pipeline] echo
unstash coverage0
[Pipeline] unstash
[Pipeline] }
[Pipeline] End of Pipeline
Finished: FAILURE

[edit] the stash for coverage0 is

[split0] Recording test results
[Pipeline] [split0] sh
[split0] [jdesigner] Running shell script
[split0] + mv build/test/jacoco/jacoco.exec build/test/jacoco/jacoco0.exec
[Pipeline] [split0] echo
[split0] name: coverage0, include jacoco0
[Pipeline] [split0] stash
[split0] Stashed 1 file(s)
[Pipeline] [split0] echo
[split0] stashed
[Pipeline] [split0] }
[Pipeline] [split0] // node
[Pipeline] [split0] }
[Pipeline] [split0] // timeout
[Pipeline] [split0] }
[split3]     [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.737 sec
[split3]     [junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.737 sec

note the line

[split0] name: coverage0, include jacoco0

is just my echo statement where I echo the name from this part of the script:

    sh "mv build/test/jacoco/jacoco.exec build/test/jacoco/jacoco${index}.exec"
    echo "name: coverage$index, include jacoco${index}"

    stash name: "coverage$index", includes: "build/test/jacoco/jacoco${index}.exec"
    echo "stashed"

Note that the actual stash step is logged under [Pipeline] rather than under the node, even though it runs on a remote node. I've seen hints that the stash is stored on the master, but nothing about where that directory actually resides.

[[FURTHER EDIT]] - thanks to eis for the recommendations.

The jobs/jdesigner/builds/1639/stashes/ directory on the master has coverage#.tar.gz files that include the appropriate jacoco#.exec files. When I put a try/catch around the unstash:

try {
    unstash name: "coverage${it}"
} catch (error) {
    echo "error unstashing: ${error}"
}

the output I get is:

collecting exec files
[Pipeline] echo
unstash coverage0
[Pipeline] unstash
[Pipeline] echo
error unstashing: java.io.NotSerializableException: groovy.lang.IntRange
[Pipeline] echo
make file name
eis :

TL;DR: this was a case of this problem, where the iteration style caused the issue, since the key it used wasn't Serializable.

What made this hard to debug was that the error message wasn't properly reported, possibly due to this issue. Catching the exception in code and reporting it "manually" fixed that.

The actual issue was fixed by using Serializable keys (see the sketch at the end of this answer).


Longer version:

Since in your example this works:

node('remote') {
    // - this stash works fine -
    stash name: 'sources', includes: '**', excludes: '**/.git,**/.git/**'
}
node('remote') {
    // - this unstash works fine -
    unstash 'sources'
}

But this doesn't:

node('remote') {

   // - this stash appears to work -
   stash name: "coverage$index", includes: "build/test/jacoco/jacoco${index}.exec"
   echo "stashed"

}
node('master') {
   echo "unstash coverage${it}"

   // !!! this unstash causes an error !!!
   unstash name: "coverage${it}"
}

I initially thought the working one was stashed and unstashed on your remote node, while the non-working one was stashed on your remote node but unstashed on your master node (where it naturally wouldn't be found).

However, that wasn't the case. According to this,

When you stash a file on a slave, the files are sent to the master. The files will be stored in the Job folder, in the associated build folder under the stash folder. Each stash will be stored as a tar file. These files are deleted at the end of the build.

So the master-remote separation shouldn't make a difference. Additionally, if it were about the stash not being found, you can see from the sources that it would fail with "No such saved stash ‘" + name + "’", since according to the AbortException javadoc, "When this exception is caught, the specified message will be reported." That is clearly not happening.

Instead, one should debug using a try/catch block to find out what the real exception breaking the build is.
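
For instance, a minimal debugging wrapper could look like this (a sketch only; it surfaces the real exception in the build log and then rethrows it so the build still fails):

try {
    unstash name: "coverage${it}"
} catch (e) {
    // report the actual exception, since it may not show up in the build log otherwise
    echo "error unstashing: ${e}"
    throw e
}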

As to why it's not reported properly by default, there is this issue: "Serialization error at end of flow not reported properly in build log, only Jenkins log". The bug report claims it's "fixed", but apparently only because on newer versions some test of this behaviour didn't trigger the problem, so it might still exist.

With the error message caught, one could see what the problem was: we were trying to serialize an unserializable key when passing it on.
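
For completeness, here is a minimal sketch of the fix as applied to the collection loop (assuming, as in the question, that the stashes are named coverage0, coverage1, ..., one per split, and that the same ant coverageMerge target is used). A plain counted loop keeps the loop variable a simple int, so nothing non-Serializable like groovy.lang.IntRange has to be carried across the CPS boundary; (0..numBranches).toList() would be another Serializable alternative:

node('master') {
  // plain int counter instead of iterating a groovy.lang.IntRange;
  // splits.size() matches the number of stashes that were actually created
  for (int idx = 0; idx < splits.size(); idx++) {
    unstash name: "coverage${idx}"
    def coverageFileName = "build/test/jacoco/jacoco${idx}.exec"
    sh "ant -buildfile build-test.xml -Dfile=${coverageFileName} coverageMerge"
  }
}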
