Azure Storage Account file details in a table in Databricks
I am loading data via pipelines into an ADLS Gen2 container.
Now I want to create a table that records when each pipeline run started and completed, with fields like the ones below: startts is the start time of the job, endts is the end time of the job, and extractts is the extraction time, which is what I want to derive. Is there any approach I can use to create such a table? Any help will be really appreciated.
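As a minimal sketch of the table the question asks for, the record below shows the three fields and how extractts can be derived from the other two. All values are hypothetical placeholders; in Databricks you would build such rows from pipeline metadata and write them to a table.

```python
from datetime import datetime

# Hypothetical timestamps for one pipeline run (placeholder values)
startts = datetime.fromisoformat("2022-05-10T08:00:00")  # job start time
endts = datetime.fromisoformat("2022-05-10T08:02:30")    # job end time

# extractts: extraction duration in seconds, derived from the two timestamps
extractts = (endts - startts).total_seconds()

# One row of the proposed log table
row = {"startts": startts, "endts": endts, "extractts": extractts}
```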
Solution 1:[1]
It might not be exactly what you need, but you can add columns inside a Copy activity with the expression @pipeline().TriggerTime to get your startts field. Go to the bottom option on the Source tab, select New, and for the value choose "Add dynamic content".
For your extractts you could use the activity's output property executionDuration, which gives the time it took ADF to run the activity, in seconds. Again, use dynamic content and get this: @activity('ReadFile').output.executionDuration, then store this value wherever you need.
In this example I'm storing the value in a variable, but you could use it anywhere.
I don't understand your need for a field endts; I would simply compute startts + extractts to get it.
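The derivation above can be sketched in Python. The input values are assumptions about what ADF hands over: @pipeline().TriggerTime as an ISO-8601 timestamp string and executionDuration as a number of seconds; the concrete values are placeholders.

```python
from datetime import datetime, timedelta

trigger_time = "2022-05-10T08:00:00"  # assumed @pipeline().TriggerTime value
execution_duration = 150              # assumed executionDuration, in seconds

# endts = startts + extractts, as suggested above
startts = datetime.fromisoformat(trigger_time)
endts = startts + timedelta(seconds=execution_duration)
```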
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Martin Esteban Zurita |