Azure Data Factory not interpreting an array global parameter correctly
We have an Azure Data Factory that uses Global Parameters. It works fine in our Dev environment, but when we try to deploy it to the QA environment using an Azure DevOps pipeline, the only Global Parameter with type = array does not seem to be handled correctly, even though all of the other parameters come through fine.
This is the guide we're using to build the CI/CD pipelines.
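For context, the global-parameters step in our release pipeline is essentially an Azure PowerShell task that runs the update script from that guide against the target factory. Roughly like this (the script name, paths, and resource names below are placeholders, not our real ones):

# Placeholder invocation of the guide's update script from the release pipeline;
# resource group, factory name, and file path are made up for illustration.
.\Update-GlobalParameters.ps1 `
    -globalParametersFilePath ".\ArmTemplate\GlobalParameters.json" `
    -resourceGroupName "rg-adf-qa" `
    -dataFactoryName "adf-qa"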
We have something similar to this in the Global Parameters JSON file:
{
    "FilesToProcess": {
        "type": "array",
        "value": [
            "VALUE01",
            "VALUE02",
            "VALUE03",
            "VALUE04",
            "VALUE05",
            "VALUE06",
            "VALUE07",
            "VALUE08",
            "VALUE09",
            "VALUE10",
            "VALUE11",
            "VALUE12",
            "VALUE13",
            "VALUE14",
            "VALUE15",
            "VALUE16",
            "VALUE17",
            "VALUE18",
            "VALUE19",
            "VALUE20",
            "VALUE21",
            "VALUE22",
            "VALUE23",
            "VALUE24",
            "VALUE25",
            "VALUE26",
            "VALUE27"
        ]
    },
    "EmailLogicAppUrl": {
        "type": "string",
        "value": "URL"
    }
}
All of the parameters are deployed fine except the array one.
We have debugged the PowerShell script that updates the Global Parameters, and it appears to handle the array correctly, so the problem has to be somewhere else.
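For reference, the core of that script (a condensed sketch based on the one in the guide, not our exact copy) parses the JSON file, converts each entry into a GlobalParameterSpecification, and pushes the whole set back to the factory:

# Condensed sketch of the guide's update script (parameter names assumed).
param (
    [Parameter(Mandatory = $true)] [string] $globalParametersFilePath,
    [Parameter(Mandatory = $true)] [string] $resourceGroupName,
    [Parameter(Mandatory = $true)] [string] $dataFactoryName
)

Import-Module Az.DataFactory

# Read and parse the global parameters JSON file.
$globalParametersJson = Get-Content $globalParametersFilePath -Raw
$globalParametersObject = [Newtonsoft.Json.Linq.JObject]::Parse($globalParametersJson)

$newGlobalParameters = New-Object 'System.Collections.Generic.Dictionary[string,Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification]'

foreach ($gp in $globalParametersObject.GetEnumerator()) {
    # For FilesToProcess the "value" token is a JSON array at this point,
    # which is what we see when stepping through the script.
    $spec = $gp.Value.ToObject([Microsoft.Azure.Management.DataFactory.Models.GlobalParameterSpecification])
    $newGlobalParameters.Add($gp.Key, $spec)
    Write-Host "Adding global parameter:" $gp.Key "of type" $spec.Type
}

# Replace the factory's global parameters with the new set and save.
$dataFactory = Get-AzDataFactoryV2 -ResourceGroupName $resourceGroupName -Name $dataFactoryName
$dataFactory.GlobalParameters = $newGlobalParameters
Set-AzDataFactoryV2 -InputObject $dataFactory -Force

After the release we can also check what actually landed in the QA factory with something like this (placeholder names again, and assuming the factory object exposes GlobalParameters as a dictionary, as the script above does):

# Inspect the deployed value of the array parameter in the QA factory;
# it should round-trip as the same JSON array that is in the file.
$qaFactory = Get-AzDataFactoryV2 -ResourceGroupName "rg-adf-qa" -Name "adf-qa"
$qaFactory.GlobalParameters["FilesToProcess"].Value | ConvertTo-Json -Depth 5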
Any help will be highly appreciated.
Thanks!
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow