Creating dynamic Airflow workflows for tasks stored in a Python list
I have a list of lists of the following form:
[['X_API', 'Y_API',....], ['Z_API', 'P_API', ...], [....], [...] .... ]
Here, each API name corresponds to a PythonOperator.
In Airflow, I would like to create task dependencies such that, starting from a dummy Start task, each inner list becomes a parallel branch, and the operators within each inner list execute in sequence:
How can I do this? I would appreciate any help!
Existing code:
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.dummy import DummyOperator

args = {
    'depends_on_past': False,
    'start_date': datetime.now(),
    'email': '',
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 3,
    'retry_delay': timedelta(minutes=1)
}

dag = DAG(dag_id, default_args=args, schedule_interval=None)

with dag:
    tasks = []
    # Lists rather than sets, so the order of tasks within a group is preserved.
    apis, tables = [], []
    Start = DummyOperator(task_id='Start', dag=dag)
    End = DummyOperator(task_id='End', dag=dag)
    for i in dags:  # `dags` is the externally supplied list of config dicts
        if 'APIs' in i:
            for l in i['APIs']:
                apis.append(DummyOperator(task_id=l['api'] + "_API", dag=dag))
        elif 'tables' in i:
            for k in i['tables']:
                tables.append(DummyOperator(task_id=k['table'] + "_API", dag=dag))
    tasks.append(apis)
    tasks.append(tables)
    for task in tasks:
        if not task:
            continue
        Start.set_downstream(task[0])              # Start fans out to each group
        for op in range(len(task) - 1):
            task[op].set_downstream(task[op + 1])  # sequential within a group
        task[-1].set_downstream(End)               # each group converges on End
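The dependency structure described above boils down to three kinds of edges: Start to the first task of each inner list, consecutive tasks chained within a list, and the last task of each list to End. A minimal sketch that computes those edges from a plain list of lists (the function name and the string placeholders are illustrative; in a real DAG each name would map to a PythonOperator and each pair would be wired with `upstream >> downstream`):

```python
def build_edges(groups, start="Start", end="End"):
    """Return (upstream, downstream) pairs for Start -> chain -> End.

    groups: list of lists of task names, e.g. [['X_API', 'Y_API'], ...].
    Each inner list becomes a parallel branch of sequential tasks.
    """
    edges = []
    for group in groups:
        if not group:
            continue
        edges.append((start, group[0]))          # Start fans out to each branch
        for up, down in zip(group, group[1:]):   # sequential within a branch
            edges.append((up, down))
        edges.append((group[-1], end))           # each branch converges on End
    return edges

edges = build_edges([["X_API", "Y_API"], ["Z_API", "P_API"]])
```

Note there is no edge between tasks of different inner lists, which is exactly what lets the branches run in parallel.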
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow