Category "azure-data-factory"

Is there any guideline on how to track changes to ADF pipeline source code in a Git repository?

I am developing a few relatively complex ADF pipelines and I'd like to track my changes. Traditionally, with programming languages, I keep my code in a git rep
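
ADF's native Git integration already stores every pipeline as a JSON file in the repository, so ordinary Git history, diffs and pull requests apply once it is switched on. A minimal sketch of the factory-level Git configuration (ARM/JSON), with every name hypothetical:

    "repoConfiguration": {
        "type": "FactoryGitHubConfiguration",
        "accountName": "my-github-org",
        "repositoryName": "adf-pipelines",
        "collaborationBranch": "main",
        "rootFolder": "/"
    }

For Azure DevOps Git the type is FactoryVSTSConfiguration and a projectName is added; in either case, each Save in ADF Studio commits the pipeline JSON to the working branch, so change tracking works as with any other code.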

Copy Activity fails from ADO (OData) source to Azure Data Lake

I need to copy data from ADO (OData) to Azure Data Lake. All connections and linked services are working fine. I can preview data from ADO (OData) but I am getting the bel

How can I use square brackets in a mapping dataflow path?

I'm trying to read a file from a directory that contains square brackets in the path using a mapping dataflow in Azure Synapse, like this: /path/to/[a].[b]/some/fi

ADF: append data to an on-prem file server file

I'm trying to get data from an ADF variable and append it to a file available on an on-prem file server. I've also created a self-hosted IR to connect to the on

Complex Derived Column Pattern with Cache Sink Lookup

I'm not sure if this is a bug or a limitation, but I have a somewhat complex Derived Column transform using a Column Pattern and Cache Sink to map/set the value
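
For reference, the plain (non-pattern) shape of a cached lookup inside a Derived Column expression looks like the sketch below, assuming a cache sink named CacheSink keyed on a code column; all names are hypothetical:

    CacheSink#lookup(sourceCode).mappedValue

Inside a column pattern the matched column's value is referenced as $$, so the question is essentially whether $$ can be fed into the lookup key in the same way.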

REST source in Azure data flow retrieving an XML file results in a corrupt record

In Azure Data Factory, I've got a source in a data flow that calls a REST API. This is the call: https://stats.oecd.org/restsdmx/sdmx.ashx/GetData/MEI_CLI/CSCIC

SQL Server Change Data Capture - Validating Incremental Window

I want to implement an incremental load process using SQL Server Change Data Capture. Every example I find takes the "happy path." In other words, they assume t

How to embed SQL script in Azure Data Factory Mapping Data Flows Expression Builder

After executing my mapping dataflow, I would like to run a clean-up script using the Post SQL Script option of the Sink. However, I'm not having much su
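
A minimal sketch of the kind of statement involved, assuming the Post SQL scripts field on the sink accepts a data flow expression and that a string parameter $stagingTable exists (both are assumptions, and the names are hypothetical):

    concat('TRUNCATE TABLE ', $stagingTable)

If the field only takes literal SQL in a given setup, the same clean-up has to be written as a static statement instead.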

Error in dataflow plugins.adfprod.AutoResolveIntegrationRuntime.45

I am getting the below error while running my dataflow. This dataflow was running fine until yesterday; from today onwards I am getting an error like this: Operation on

Escape '\' in dynamic content in Azure Data Factory

I am running a query through a Lookup activity against a DB2 database using ADF. The query is: But when I execute the query, the '\' count gets doubled. The output is provide
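
A commonly suggested workaround, sketched with a hypothetical variable name: backslash is not an escape character in the ADF expression language, so '\\' is literally two backslashes, and the doubled characters can be collapsed before the Lookup consumes the query:

    @replace(variables('db2Query'), '\\', '\')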

Expression to check if a given string exists in the output of a script task in ADF

I get the following result from my script task. How should I build an expression to check if the result contains 'exceed'? { "resultSetCount": 1, "recordsAffected": 0,
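
A minimal sketch, assuming the Script activity is named RunScript (hypothetical): serialise the whole output and search it, for example in an If Condition:

    @contains(string(activity('RunScript').output), 'exceed')

If only the returned rows should be searched, string(activity('RunScript').output.resultSets) narrows the check to the result sets.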

Data Factory: How to pass JSON as text to Web Body

I have to pass multiple records from a SQL source to the body of a web call in Data Factory in JSON format, e.g. {"EmployeeCode":"1234","FirstName":"Joe","LastN
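
A minimal sketch, assuming the records come from a Lookup activity named LookupEmployees with "First row only" disabled (names hypothetical); the Web activity body can then be set to the serialised array:

    @string(activity('LookupEmployees').output.value)

Dropping string() passes the same payload as real JSON rather than as an escaped text string, which is usually what the receiving API expects.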

Issue reading a variable JSON in Azure Data Factory

My pipeline receives the path and the name of a JSON file. This is part of the flow that I created: the Lookup step is used to read the JSON file, and later I n
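
A sketch of the usual access pattern, with hypothetical names: if the Lookup activity is called ReadConfig and the JSON file has top-level properties folder and file, later activities can reference

    @activity('ReadConfig').output.firstRow.folder
    @activity('ReadConfig').output.firstRow.file

Nested objects are reached the same way, e.g. ...firstRow.settings.path, as long as "First row only" is enabled on the Lookup.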

Synapse/Data Factory and Dataverse

I have had a look around but can't see any concrete information. Essentially, if anyone could help, it would be great. We are building reporting in the cloud and l

Keep Sink Columns in Copy Activity when Source Has Fewer Columns than Sink

I have a Copy activity in Data Factory that dynamically maps the columns between the files for tables A and B. Both tables, A and B, are .parquet. Table A has 8 column
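
For context, the dynamic mapping a Copy activity consumes is a TabularTranslator object; a minimal sketch listing only the columns present in the source (column names hypothetical), typically built in an earlier step and passed to the mapping property:

    {
      "type": "TabularTranslator",
      "mappings": [
        { "source": { "name": "col1" }, "sink": { "name": "col1" } },
        { "source": { "name": "col2" }, "sink": { "name": "col2" } }
      ]
    }

This only controls how the supplied source columns are routed; it does not by itself provide values for sink-only columns.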

Azure Data Factory - Trigger Pipeline Manually Externally

I currently have a data pipeline within ADF that pulls data from multiple REST APIs, transforms the data, and stores it in an Azure SQL Database, and from there i
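
For reference, the documented way to start a pipeline run from outside ADF is the Create Run call on the management REST API (placeholders to be filled in, and the caller needs an Azure AD token with rights on the factory):

    POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelines/{pipelineName}/createRun?api-version=2018-06-01

Pipeline parameters, if any, go in the JSON request body.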

How to set parameters in a SQL Server table from a Copy Data Activity - Source: XML / Sink: SQL Server Table / Mapping: XML column

I have a question; hopefully someone in the forum can give some help here. I am able to pull data from a SOAP API call into a SQL Server table (xml data type field

How can I change the `Blob path begins with` setting in an Azure Data Factory storage event trigger using Azure DevOps?

I have set a storage event trigger for the pipelines in the Azure Data Factory Dev environment. The `Blob path begins with` value is different in the Dev environment an
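
One approach, sketched here: expose the trigger path as an ARM template parameter by editing arm-template-parameters-definition.json in the collaboration branch, then override that parameter per environment in the Azure DevOps release. A fragment of the custom-parameter file (verify the shape against the template your factory actually generates):

    "Microsoft.DataFactory/factories/triggers": {
        "properties": {
            "typeProperties": {
                "blobPathBeginsWith": "="
            }
        }
    }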

Loop through a "next" URL returned by an API in Azure Data Factory

I have an API that returns some data along with another API URL named "nextapi". I want to loop through each page of the API and store the data of each page in Azu
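
A common shape for this, sketched with hypothetical names: an Until activity keeps calling the current URL, lands each page, and then swaps in the value of nextapi, stopping when it comes back empty:

    Until condition:    @empty(variables('nextUrl'))
    Web activity URL:   @variables('nextUrl')
    Set variable:       nextUrl = @coalesce(activity('CallApi').output.nextapi, '')

If the source is the REST connector in a Copy activity, its built-in pagination rules (an AbsoluteUrl rule pointing at $.nextapi) may avoid hand-rolling the loop altogether.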

az cli deployment fails with Python error

I am deploying ADF linked templates from ADLS through GitHub Actions, and the pipeline fails with a generic Python error: ERROR: 'str' object has no attribute 'get' I