I am developing a few relatively complex ADF pipelines and I like to track my changes. Traditionally with programming languages I keep my code in a git repository.
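For what it's worth, git integration can also be enabled declaratively through the factory's repoConfiguration property in its ARM resource, not only through the portal. A minimal sketch, assuming hypothetical org/repo names (FactoryVSTSConfiguration is the Azure DevOps equivalent):

```json
{
  "name": "my-adf",
  "type": "Microsoft.DataFactory/factories",
  "apiVersion": "2018-06-01",
  "location": "westeurope",
  "properties": {
    "repoConfiguration": {
      "type": "FactoryGitHubConfiguration",
      "accountName": "my-github-org",
      "repositoryName": "adf-pipelines",
      "collaborationBranch": "main",
      "rootFolder": "/"
    }
  }
}
```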
I need to copy data from ADO (OData) to Azure Data Lake. All connections and linked services are working well. I can preview data from ADO (OData) but am getting the error below.
I'm trying to read a file from a directory that contains square brackets in the path using a mapping data flow in Azure Synapse, like this: /path/to/[a].[b]/some/fi
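Note that in ADF/Synapse wildcard path matching, square brackets act as a character-class wildcard, and the documented escape character is ^. A sketch of an escaped wildcard path for the source options, assuming the literal folder name above:

```
/path/to/^[a^].^[b^]/some/*
```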
I'm trying to get data from an ADF variable and append it to a file available on an on-prem file server. I've also created a self-hosted IR to connect to the on-prem server.
I'm not sure if this is a bug or a limitation, but I have a somewhat complex Derived Column transform using a Column Pattern and Cache Sink to map/set the values.
In Azure Data Factory, I've got a source in a data flow that calls a REST API. This is the call: https://stats.oecd.org/restsdmx/sdmx.ashx/GetData/MEI_CLI/CSCIC
I want to implement an incremental load process using SQL Server Change Data Capture. Every example I find takes the "happy path"; in other words, they assume everything works as expected.
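One way off the happy path is to validate the saved watermark LSN against sys.fn_cdc_get_min_lsn before querying changes, since CDC cleanup can age the watermark out of the retention window. A minimal sketch of an If Condition expression, assuming hypothetical Lookup activities GetWatermark and GetMinLsn that each return the LSN as a fixed-length hex string; when it evaluates true, branch to a full reload instead of an incremental one:

```
@or(
    empty(activity('GetWatermark').output.firstRow.LastLsn),
    less(activity('GetWatermark').output.firstRow.LastLsn,
         activity('GetMinLsn').output.firstRow.MinLsn)
)
```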
After executing my mapping data flow I would like to run a clean-up script using the Post SQL Script option of the sink. However, I'm not having much success.
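In case it's useful, in the data flow script the clean-up statement ends up in the sink's postSQLs array (a sketch from memory, so verify against the script view of your own sink; the staging table below is hypothetical):

```
sink(allowSchemaDrift: true,
     validateSchema: false,
     format: 'table',
     postSQLs: ['DELETE FROM dbo.StagingOrders WHERE LoadDate < DATEADD(day, -7, GETDATE())']) ~> sink1
```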
I am getting the below error while running my data flow. It was running fine until yesterday; from today onwards I am getting an error like this: "Operation on..."
I am running a query through a Lookup activity against a DB2 database using ADF; the query and its output are provided below. When I execute the query, the '\' count gets doubled.
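The doubling is usually just JSON escaping: the Lookup output is rendered as JSON, where a literal \ is serialized as \\. The underlying value still has a single backslash, so referencing the column directly needs no unescaping, e.g. (Lookup1 and MYCOLUMN being hypothetical names):

```
@activity('Lookup1').output.firstRow.MYCOLUMN
```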
I get the following result from my Script activity. How should I build an expression to check if the result contains 'exceed'? { "resultSetCount": 1, "recordsAffected": 0, ... }
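One option is to stringify the result sets and search the whole payload. A sketch, assuming the Script activity is named Script1:

```
@contains(string(activity('Script1').output.resultSets), 'exceed')
```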
I have to pass multiple records from a SQL source to the body of a web call in Data Factory in JSON format, e.g. {"EmployeeCode":"1234","FirstName":"Joe","LastName":"..."}
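If the records come from a Lookup activity with "First row only" disabled, its output.value is already a JSON array of such records and can be passed straight into the Web activity body. A sketch, assuming a hypothetical Lookup named LookupEmployees:

```
@activity('LookupEmployees').output.value
```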
My pipeline receives the path and the name of a JSON file. This is part of the flow that I created: the Lookup step is used to read the JSON file, and later I need to use its output.
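A common pattern here is a parameterized dataset whose location is bound to the pipeline's inputs; the Lookup then just references the dataset with those parameters. A sketch, assuming ADLS Gen2 and hypothetical names:

```json
{
  "name": "DS_Json_Dynamic",
  "properties": {
    "type": "Json",
    "linkedServiceName": { "referenceName": "LS_ADLS", "type": "LinkedServiceReference" },
    "parameters": {
      "folderPath": { "type": "string" },
      "fileName": { "type": "string" }
    },
    "typeProperties": {
      "location": {
        "type": "AzureBlobFSLocation",
        "fileSystem": "data",
        "folderPath": { "value": "@dataset().folderPath", "type": "Expression" },
        "fileName": { "value": "@dataset().fileName", "type": "Expression" }
      }
    }
  }
}
```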
I have had a look around but can't see any concrete information. Essentially, if anyone could help it would be great. We are building reporting in the cloud and looking for advice.
I have a copy activity in Data Factory that dynamically maps the columns between files in tables A and B. Both tables, A and B, are .parquet. Table A has 8 columns.
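For reference, the copy activity's mapping lives in the translator property and can be supplied as dynamic content. A static sketch with hypothetical column names (for fully dynamic mapping, the same object can be built with @json() and passed as an expression):

```json
"translator": {
  "type": "TabularTranslator",
  "mappings": [
    { "source": { "name": "col1" }, "sink": { "name": "col1" } },
    { "source": { "name": "col2" }, "sink": { "name": "col2" } }
  ]
}
```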
I currently have a data pipeline within ADF that pulls data from multiple REST APIs, transforms the data, and stores it in an Azure SQL Database, and from there it is consumed downstream.
I have a question; hopefully someone in the forum can give some help here. I am able to pull data from a SOAP API call into a SQL Server table (XML data type field).
I have set a storage event trigger for the pipelines in the Azure Data Factory Dev environment. The "blob path begins with" value is different in the Dev environment and the other environments.
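The usual fix is to expose blobPathBeginsWith as an ARM template parameter so each environment can override it at deployment time, by adding it to arm-template-parameters-definition.json in the repo root. A sketch ("=" keeps the Dev value as the parameter's default):

```json
{
  "Microsoft.DataFactory/factories/triggers": {
    "properties": {
      "typeProperties": {
        "blobPathBeginsWith": "="
      }
    }
  }
}
```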
I have an API that contains some data and another API URL named "nextapi". I want to loop through each API page under "nextapi" and store the data of each page to Azure.
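If each response exposes the next page's full URL in a field such as nextapi, the REST copy source can follow it automatically via pagination rules instead of an explicit loop. A sketch, assuming the field sits at the root of the response body:

```json
"source": {
  "type": "RestSource",
  "paginationRules": {
    "AbsoluteUrl": "$.nextapi"
  }
}
```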
I am deploying ADF linked templates from ADLS through GitHub Actions, and the pipeline fails with a generic Python error: ERROR: 'str' object has no attribute 'get'.