Background I have an ADF pipeline that takes jobs from a “3rd party” queue (using a REST call), completes jobs and marks the queued message as compl
I am following "Copy and transform data from and to a REST endpoint by using Azure Data Factory" to load a file from my Box.com account into an Azure Data Lake Gen2 (AD
We are reading a CSV input file which has two fields. A sample input file is given for reference: source,balance 1,100 2,200 I need to create a trailer record in
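The trailer-record question above is cut off, but the core computation, counting the records and totaling the balance column, can be sketched in plain Python. The trailer layout used here ("T,count|total") is an assumption, not something stated in the question:

```python
import csv
import io

# Sample two-field CSV from the question
sample = "source,balance\n1,100\n2,200\n"

rows = list(csv.DictReader(io.StringIO(sample)))
count = len(rows)
total = sum(int(r["balance"]) for r in rows)

# Rewrite the file with an appended trailer record.
# The "T,count|total" trailer format is hypothetical.
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["source", "balance"])
for r in rows:
    writer.writerow([r["source"], r["balance"]])
writer.writerow(["T", f"{count}|{total}"])
print(out.getvalue())
```

In ADF itself this kind of aggregate-plus-append is usually done with a Data Flow (aggregate transformation followed by a union), but the arithmetic is the same.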
I am building an Azure Data Factory pipeline and I would like to know how to get this parameter into the Python script. The Python script is l
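The question is truncated before it shows how the script is launched, but a common pattern is to pass the pipeline parameter as a command-line argument (for example, via the `extendedProperties`/command line of a Custom Activity). A minimal sketch of the receiving side, where the parameter name `--input_path` is an assumption:

```python
import argparse

def parse_args(argv):
    # Parse arguments handed to the script by the calling activity.
    # "--input_path" is a hypothetical parameter name.
    parser = argparse.ArgumentParser()
    parser.add_argument("--input_path", required=True)
    return parser.parse_args(argv)

# Simulate what ADF would pass on the command line
args = parse_args(["--input_path", "container/folder/file.csv"])
print(args.input_path)
```

In a real deployment the script would call `parse_args(sys.argv[1:])`, and the ADF activity would supply the value with a pipeline expression such as `@pipeline().parameters.inputPath`.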
We have an Azure Data Lake Gen 2 which contains hundreds of thousands of JSON messages that arrive on a continuous basis. These files are stored in a folder struc
After generating ~90 different 100 MB gzipped CSV files, I want to merge them all into a single file. Using the built-in merge option for a data copy process, it
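The question breaks off before describing what the Copy activity's merge option produced, but outside ADF the same merge can be sketched in a few lines of Python: decompress each gzipped CSV, keep the header from the first file only, and stream the rows into one output file. The file layout below is a made-up demo, not the asker's data:

```python
import csv
import gzip
import os
import tempfile

def merge_gzip_csvs(paths, out_path):
    # Merge gzipped CSVs into one plain CSV, writing the header once.
    header_written = False
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        for path in paths:
            with gzip.open(path, "rt", newline="") as f:
                reader = csv.reader(f)
                header = next(reader)
                if not header_written:
                    writer.writerow(header)
                    header_written = True
                writer.writerows(reader)

# Demo: build two tiny gzipped parts, then merge them.
tmp = tempfile.mkdtemp()
paths = []
for i, rows in enumerate([[["a", "b"], ["1", "2"]], [["a", "b"], ["3", "4"]]]):
    p = os.path.join(tmp, f"part{i}.csv.gz")
    with gzip.open(p, "wt", newline="") as f:
        csv.writer(f).writerows(rows)
    paths.append(p)

out_path = os.path.join(tmp, "merged.csv")
merge_gzip_csvs(paths, out_path)
print(open(out_path).read())
```

For ~90 files of 100 MB this streams row by row rather than loading everything into memory, which is the main design concern at that scale.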
I have XML file something like below and need to get all complex elements having different names but all ends with "_KEYS" and they are part of different segmen
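The XML sample referenced above is cut off, so the structure below is an assumption; but the stated requirement, collecting every element whose name ends with "_KEYS" regardless of which segment it sits in, maps directly onto a suffix check over a full-document iteration:

```python
import xml.etree.ElementTree as ET

# Hypothetical stand-in for the question's XML: two segments, each
# containing a differently named *_KEYS complex element.
xml_data = """
<root>
  <SEGMENT1>
    <ORDER_KEYS><id>1</id></ORDER_KEYS>
  </SEGMENT1>
  <SEGMENT2>
    <ITEM_KEYS><id>2</id></ITEM_KEYS>
  </SEGMENT2>
</root>
"""

root = ET.fromstring(xml_data)
# iter() walks the whole tree in document order, so segment names
# do not matter; only the tag suffix is tested.
keys_elements = [el.tag for el in root.iter() if el.tag.endswith("_KEYS")]
print(keys_elements)
```

If the document uses namespaces, `el.tag` carries a `{namespace}` prefix and the suffix check still works unchanged.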
I am trying to copy data from a REST API into an Azure SQL Database table. The copy data activity uses pagination and can run longer than 10 minutes after which
We have an Azure Data Factory using Global Parameters. It's working fine in our Dev environment, but when we try to deploy it to the QA environment using an Azure D
Description When I copy data from Storage to Cosmos DB with Azure Data Factory, the Cosmos DB RU/s are fully consumed and nobody can use it during this time. I want other ope
I have a set of Excel files inside ADLS. The format looks similar to the one below: The first 4 rows would always be the document header information and the la
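The sample layout in that last question is not shown, but the trimming logic, skip a fixed header block and a trailing footer, can be sketched with the stdlib `csv` module standing in for the Excel reader. With pandas this would typically be `pd.read_excel(path, skiprows=4, skipfooter=1)`; the 4-row header and 1-row footer counts here are assumptions:

```python
import csv
import io

# Stand-in for an Excel sheet: 4 header rows, a column row,
# two data rows, and a footer row.
raw = "\n".join(
    [f"Header line {i}" for i in range(1, 5)]
    + ["col_a,col_b", "1,2", "3,4", "Footer"]
)

rows = list(csv.reader(io.StringIO(raw)))
# Slice away the 4 document-header rows and the single footer row.
data_rows = rows[4:-1]
print(data_rows)
```

In ADF, the Excel dataset's "range" setting (or a Data Flow with skip-lines configured on the source) achieves the same effect without custom code.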