How to pass a complex DevOps pipeline template parameter to a script
In an Azure DevOps pipeline template, I am declaring a parameter as an array/sequence:

parameters:
  mySubscription: ''
  myArray: []

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.mySubscription }}
    scriptType: pscore
    scriptPath: $(Build.SourcesDirectory)/script.ps1
    arguments: '-MyYAMLArgument ${{ parameters.myArray }}'
The value for the parameter is then passed from the pipeline definition as follows:

steps:
- template: myTemplate.yml
  parameters:
    mySubscription: 'azure-connection'
    myArray:
    - field1: 'a'
      field2: 'b'
    - field1: 'aa'
      field2: 'bb'
My problem is that I can't pass that array as-is (a kind of ToString() of the YAML) so that I can consume and process it from PowerShell in my template. When trying to run this pipeline, I get the following error:

/myTemplate.yml (Line: X, Col: X): Unable to convert from Array to String. Value: Array

The line/column referenced in the error message corresponds to the arguments: '-MyYAMLArgument ${{ parameters.myArray }}' line from my template.
I also tried to map the parameter to an environment variable for my script:

- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.mySubscription }}
    scriptType: pscore
    scriptPath: $(Build.SourcesDirectory)/script.ps1
    arguments: '-MyYAMLArgument $Env:MY_ENV_VAR'
  env:
    MY_ENV_VAR: ${{ parameters.myArray }}
This does not work either:

/myTemplate.yml (Line: X, Col: Y): A sequence was not expected

This time the line/column refers to MY_ENV_VAR: ${{ parameters.myArray }}.
Has anyone ever faced a similar requirement to pass complex types (here, an array/sequence of objects) from the pipeline definition to a PowerShell script? If so, how did you achieve it?
Solution 1:[1]
You can now convert these types of parameters to a String using the convertToJson function in an ADO pipeline:
parameters:
- name: myParameter
  type: object
  default:
    name1: value1
    name2: value2

...

- task: Bash@3
  inputs:
    targetType: inline
    script: |
      echo "${{ convertToJson(parameters.myParameter) }}"
convertToJson: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#converttojson
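To make the output shape concrete, here is a rough Python stand-in for what convertToJson produces at template-expansion time: the YAML mapping above becomes a JSON object string (this is a sketch of the behavior, not the actual ADO implementation).

```python
import json

# The YAML mapping from the example, as the template engine would see it.
my_parameter = {"name1": "value1", "name2": "value2"}

# convertToJson substitutes a JSON rendering of the parameter into the script.
as_json = json.dumps(my_parameter, indent=2)
print(as_json)
```

The echoed text is plain JSON, so any script language that can parse JSON can consume the parameter.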
Solution 2:[2]
Based on the convertToJson idea by @ed-randall, together with the ConvertFrom-Json PowerShell cmdlet, we can use a JSON 'contract' to pass values between the YAML and the PS script:
- powershell: |
    $myArray = '${{ convertToJson(parameters.myArray) }}' | ConvertFrom-Json
...
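The round trip can be sketched in Python (json.dumps standing in for convertToJson, json.loads for ConvertFrom-Json) using the array of objects from the question:

```python
import json

# The myArray parameter from the question's pipeline definition.
my_array = [
    {"field1": "a", "field2": "b"},
    {"field1": "aa", "field2": "bb"},
]

# What the expanded template embeds into the script (convertToJson side).
wire_format = json.dumps(my_array)

# What the script recovers from that string (ConvertFrom-Json side).
parsed = json.loads(wire_format)
print(parsed[0]["field1"], parsed[1]["field2"])
```

Because both sides agree on JSON, the nested structure (objects inside a sequence) survives the trip through a single string.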
Solution 3:[3]
I'm also facing a similar problem; my workaround is to flatten the array into a string, using a different separator for each dimension.
For example, I want to make some parameters required and fail the build if they are not passed. Instead of adding a task to check every parameter, I want to do this in a single task.
To do this I first pass, as a parameter to another template (called check-required-params.yml, which holds the task responsible for checking the parameters), an array where each element is a string of the form name:value, built with the format expression by concatenating the name and the value of each required parameter, separated by a colon:
# templates/pipeline-template.yml
parameters:
- name: endpoint
  type: string
  default: ''
- name: rootDirectory
  type: string
  default: $(Pipeline.Workspace)
- name: remoteDirectory
  type: string
  default: '/'
- name: archiveName
  type: string
  default: ''

#other stuff

- template: check-required-params.yml
  parameters:
    requiredParams:
    - ${{ format('endpoint:{0}', parameters.endpoint) }}
    - ${{ format('archiveName:{0}', parameters.archiveName) }}
Then in check-required-params.yml I join the array, separating the elements with a semicolon, using the expression ${{ join(';', parameters.requiredParams) }}. This creates a string of the form endpoint:value;archiveName:value, which is passed as an environment variable.
At this point, with a little string manipulation in the script, I can split the string on the semicolon to get an array of name:value strings, which I can then split again, this time on the colon.
My check-required-params.yml looks like:
# templates/check-required-params.yml
parameters:
- name: requiredParams
  type: object
  default: []

steps:
- task: PowerShell@2
  displayName: Check for required parameters
  env:
    REQUIRED_PARAMS: ${{ join(';', parameters.requiredParams) }}
  inputs:
    targetType: inline
    pwsh: true
    script: |
      $params = $env:REQUIRED_PARAMS -split ";"
      foreach ($param in $params) {
        if ([string]::IsNullOrEmpty($param.Split(":")[1])) {
          Write-Host "##vso[task.logissue type=error;]Missing template parameter $($param.Split(":")[0])"
          Write-Host "##vso[task.complete result=Failed;]"
        }
      }
Then in my azure-pipelines.yml I can do:

#other stuff
- template: templates/pipeline-template.yml
  parameters:
    endpoint: 'myEndpoint'
    rootDirectory: $(Pipeline.Workspace)/mycode
In this example the build will fail because I don't pass the parameter archiveName.
You can add some flexibility by also using variables for the separators, instead of hardcoding them in the scripts and in the expressions.
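The two-level split that the inline script performs can be sketched in Python (the environment variable value is what the join expression would produce when archiveName is left empty):

```python
import os

# Simulate the joined string that check-required-params.yml builds:
# ';' separates parameters, ':' separates a name from its value.
os.environ["REQUIRED_PARAMS"] = "endpoint:myEndpoint;archiveName:"

missing = []
for param in os.environ["REQUIRED_PARAMS"].split(";"):
    name, _, value = param.partition(":")
    if not value:  # empty value means the parameter was not passed
        missing.append(name)

print(missing)  # archiveName was given no value, so the check flags it
```

The PowerShell version reports each flagged name with ##vso[task.logissue] and fails the task the same way.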
Solution 4:[4]
As @Leo Liu-MSFT mentioned in their answer, this is indeed not supported right now, but someone has already opened an issue for this improvement.
That issue also contains a good workaround for now: use environment variables instead. The drawback of this solution is that you need to be aware of the data structure in order to map it properly.
parameters:
  mylist: []

# where mylist is a sequence of objects matching the mapping:
# - name: 'the name 1'
#   value: 'the value of 1'
#   index: 0
# - name: 'the name 2'
#   value: 'the value of 2'
#   index: 1

env:
  ${{ each item in parameters.mylist }}:
    ${{ format('SCRIPT_PARAM_{0}_KEY', item.index) }}: ${{ item.name }}
    ${{ format('SCRIPT_PARAM_{0}_VAL', item.index) }}: ${{ item.value }}
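On the script side, the list can be rebuilt from those indexed environment variables. A Python sketch (the SCRIPT_PARAM_{i}_KEY/VAL names follow the mapping above; the values here are the example data):

```python
import os

# Simulate the environment the template's 'each' loop would produce.
os.environ.update({
    "SCRIPT_PARAM_0_KEY": "the name 1", "SCRIPT_PARAM_0_VAL": "the value of 1",
    "SCRIPT_PARAM_1_KEY": "the name 2", "SCRIPT_PARAM_1_VAL": "the value of 2",
})

# Walk the indices until a key is missing, rebuilding the original sequence.
items = []
i = 0
while f"SCRIPT_PARAM_{i}_KEY" in os.environ:
    items.append({
        "name": os.environ[f"SCRIPT_PARAM_{i}_KEY"],
        "value": os.environ[f"SCRIPT_PARAM_{i}_VAL"],
    })
    i += 1

print(items)
```

This is why the index field is part of the contract: without it, the script has no reliable way to know how many variables to read or in what order.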
Solution 5:[5]
How to pass complex DevOps pipeline template parameters to a script
I am afraid we cannot pass complex DevOps pipeline template parameters to a PowerShell script.
Currently, Azure DevOps tasks only support passing one-dimensional arrays; they cannot accept two-dimensional arrays. Although we can define a two-dimensional array parameter, we need to expand the parameter from a template with an expression like:
- ${{ each field in parameters.myArray }}:

We could use it like:

- ${{ each step in parameters.buildSteps }}:
  # - ${{ each pair in step }}:
  - task: PowerShell@2
    inputs:
      targetType: inline
      script: |
        Write-Host 'Hello World'
But we cannot pass a two-dimensional array directly to the task, like [field1: 'a', field2: 'b']. That is the reason you got the error Unable to convert from Array to String.
You could check the document Extend from a template for more details.
Hope this helps.
Solution 6:[6]
Script file arguments
The example below provides the syntax needed to pass an Azure DevOps YAML boolean and an array to a PowerShell script file via arguments.
boolean -> Switch
object -> Array
PowerShell Script

[CmdletBinding()]
param (
    [Parameter()]
    [switch]
    $Check,
    [Parameter()]
    [string[]]
    $Array
)

if ($Check.IsPresent) {
    Write-Host "Check is present"
}
else {
    Write-Host "Check is not present"
}

Write-Host "Next we loop the array..."
foreach ($a in $Array) {
    Write-Host "Item in the array: $a"
}
YAML Pipeline

trigger: none

pool:
  vmImage: windows-latest

parameters:
- name: checkBool
  type: boolean
- name: paramArray
  type: object
  default:
  - one
  - two

steps:
- task: PowerShell@2
  inputs:
    filePath: 'Scripts/DebugSwitches.ps1'
    arguments: -Check:$${{ parameters.checkBool }} -Array ${{ join(', ', parameters.paramArray) }}
Boolean Syntax
Notice that the YAML boolean is passed to the PowerShell switch parameter with a colon ':' and no spaces.
Array Syntax
Notice that the YAML object array above uses the join expression to format the array as a comma-separated list, which is passed to the PowerShell array argument.
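What the join expression substitutes into the task's arguments can be sketched in Python (the expanded text is then parsed by PowerShell, which binds the comma-separated values to the [string[]] parameter):

```python
# paramArray from the pipeline above; join(', ', ...) produces the literal
# text that appears in the task's arguments after template expansion.
param_array = ["one", "two"]
expanded = ", ".join(param_array)
print(f"-Array {expanded}")
```

Note that this flattening only works for one-dimensional arrays of scalars; values containing commas or spaces would need quoting or the JSON approach from Solutions 1 and 2.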
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
| Solution 2 | MaMazav |
| Solution 3 | DeadlyChambers |
| Solution 4 | GGirard |
| Solution 5 | Leo Liu-MSFT |
| Solution 6 | Dejulia489 |