How to use a FileParameterValue in a Jenkins 2 pipeline

How can a file from the current project workspace be passed as a parameter to another project?

e.g. something like:

build job: 'otherproject', parameters: [[$class: 'FileParameterValue', name: 'output.tar.gz', value: ??? ]], wait: false


Solution 1:[1]

A java.io.File object can only access files on the master node.
So to load the files as java.io.File objects we use the master node to unstash the required files, then wrap them as File objects, and finally pass them as FileParameterValue objects.

node("myNode") {
    sh " my-commands -f myFile.any " // This command create a new file.
    stash includes: "*.any", name: "my-custom-name", useDefaultExcludes: true
}

node("master") {
    unstash "my-custom-name"
    def myFile = new File("${WORKSPACE}/myFile.any")
    def myJob = build(job: "my-job", parameters: 
                    [ string(name: 'required-param-1', value: "myValue1"),
                      new FileParameterValue("myFile.any", myFile, "myFile.any")
                    ], propagate: false)

    print "The Job execution status is: ${myJob.result}."

    if(myJob.result == "FAILURE") {
      error("The Job execution has failed.")
    }
    else {
      print "The Job was executed successfully."
    }
}

You can skip the master node if the file you need to send contains only text.

// Read the text content of the file from the workspace...
def myFileContent = readFile("myFile.txt")
// ...and write it back out where the master-side Groovy (java.io.File) can reach it.
FilePath fp = new FilePath(new File("${WORKSPACE}", "myFile.txt"))
fp.write(myFileContent, null)
def file = new File("${WORKSPACE}/myFile.txt")

Then use the file in the FileParameterValue object as usual.
Don't forget to import the FilePath class -> import hudson.FilePath
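For example, a minimal end-to-end sketch of this text-file variant (assuming a scripted pipeline, that "myFile.txt" was produced earlier in the build, and that the job name "my-job" is just a placeholder) might look like:

import hudson.FilePath
import hudson.model.FileParameterValue

node {
    // Read the text file from the workspace.
    def myFileContent = readFile("myFile.txt")
    // Re-create it where the master-side java.io.File can see it.
    new FilePath(new File("${WORKSPACE}", "myFile.txt")).write(myFileContent, null)
    def myFile = new File("${WORKSPACE}/myFile.txt")

    // Pass it downstream as a file parameter.
    build(job: "my-job", parameters: [
        new FileParameterValue("myFile.txt", myFile, "myFile.txt")
    ], propagate: false)
}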

Solution 2:[2]

I've tried this myself recently with little success; there seems to be a problem with this approach. According to the documentation for the class FileParameterValue there is a constructor which accepts an org.apache.commons.fileupload.FileItem, like so:

@DataBoundConstructor
FileParameterValue(String name,
                   org.apache.commons.fileupload.FileItem file)

There is another which expects a java.io.File plus the original file name, like so:

FileParameterValue(String name,
                   File file,
                   String originalFileName)

But only the former is annotated with @DataBoundConstructor, so even when I try to use the latter in a script:

file = new File(pwd(), 'test.txt');
build(
    job: 'jobB',
    parameters: [
        [$class: "FileParameterValue", name: "TEST_FILE", file: file, originalFileName: 'test.txt']
    ]
)

Note that this requires script approval for instantiating java.io.File.

... I get the following error:

java.lang.ClassCastException: hudson.model.FileParameterValue.file expects interface org.apache.commons.fileupload.FileItem but received class java.io.File

I understand that only a file uploaded by the user as interactive runtime input provides an object of type org.apache.commons.fileupload.FileItem, so in the end I resorted to archiving the file in the first job and unarchiving it in the downstream job (sketched below), which got around the problem. It's not ideal, of course, but if you're in a jam it's the quickest way to sort it out.
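A rough sketch of that archive/unarchive workaround (assuming the Copy Artifact plugin is installed; the job names jobA/jobB and the tar commands are placeholders, not part of the original answer):

// Upstream job (jobA): archive the file instead of passing it as a parameter.
node {
    sh 'tar czf output.tar.gz build/' // placeholder step that produces the file
    archiveArtifacts artifacts: 'output.tar.gz'
    build job: 'jobB', wait: false
}

// Downstream job (jobB): copy the archived artifact back into its workspace.
node {
    copyArtifacts projectName: 'jobA', selector: lastSuccessful(), filter: 'output.tar.gz'
    sh 'tar xzf output.tar.gz'
}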

Solution 3:[3]

You can't. Here is the Jenkins bug; update this thread once the bug is fixed. In the meantime, log in and vote for the issue, and ask for documentation to be added for pipeline build job parameters.

https://issues.jenkins-ci.org/browse/JENKINS-27413

Linked to from here: http://jenkins-ci.361315.n4.nabble.com/pipeline-build-job-with-FileParameterValue-td4861199.html

Here is the documentation for the different parameter types (the link below goes to FileParameterValue):

http://javadoc.jenkins.io/hudson/model/FileParameterValue.html

Solution 4:[4]

Try to pass an instance of FileParameterValue to parameters (it worked for me):

import hudson.model.*

def param_file = new File("path/to/file")

build job: 'otherproject', parameters: [new FileParameterValue('file_param_name', param_file, 'original_file_name')], wait: false

Solution 5:[5]

The Jenkins File Parameter plugin supports (i) Base64 file parameters and (ii) stashed file parameters.

The following is an example of caller and callee pipeline scripts running on a Windows agent.

Caller

pipeline {
    agent any
    stages {
        stage ('Call Callee Job') {
            steps {
                script {
                    def callee_job = build(job: 'test-callee', parameters: [
                        base64File(name: 'smallfile', base64: Base64.encoder.encodeToString('small file 123'.bytes)),
                        stashedFile(name: 'largefile', file: getFileItem())
                    ], propagate: true)
                }
            }
        }
    }
}

// Read the file and convert it from a java.io.File object to an Apache Commons DiskFileItem object.
@NonCPS
def getFileItem() {
    def largeFileObject = new File(pwd(), "filename.apk")
    def diskFileItem = new org.apache.commons.fileupload.disk.DiskFileItem("fieldNameFile", "application/vnd.android.package-archive", false, largeFileObject.getName(), (int) largeFileObject.length() , largeFileObject.getParentFile())
    def inputStream = new FileInputStream(largeFileObject)
    def outputStream = diskFileItem.getOutputStream()
    org.apache.commons.io.IOUtils.copy(inputStream, outputStream)
    inputStream.close()
    outputStream.close()
    return diskFileItem         
}
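If the small file already exists in the caller's workspace rather than being a literal string, one variant (a sketch only; "version.txt" is a hypothetical file name) is to read and encode it inside the script block before calling build:

// Sketch: pass an existing workspace text file as a base64File parameter.
// "version.txt" is a placeholder file name, not part of the original example.
def smallFileB64 = Base64.encoder.encodeToString(readFile('version.txt').bytes)
build(job: 'test-callee', parameters: [
    base64File(name: 'smallfile', base64: smallFileB64)
])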

Callee

pipeline {
    agent { label 'windows' } // assumption: a Windows agent, since the bat steps below need one

    parameters {
        base64File(name: 'smallfile')
        stashedFile(name: 'largefile')         
    }

    stages {
        stage ('Print params') {
            steps {
                echo "params.smallfile: ${params.smallfile}" // gives base64 encoded value
                echo "params.largefile: ${params.largefile}" // gives null
                
                withFileParameter('smallfile') {
                  echo "$smallfile" // gives tmp file path in callee job workspace
                  bat "more $smallfile" // reads tmp file to give content value
                }
        
                unstash 'largefile'
                bat 'dir largefile' // shows largefile in callee job workspace directory
            }
        }
    }
}

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1
Solution 2 Mig82
Solution 3 TheJeff
Solution 4 Artem Panchenko
Solution 5 MrTan