I have a DAG that I want to run multiple times, say 30, but Airflow will only execute 16 DAG runs of it in parallel at a time. Suppose one DAG run takes a long time to execute.
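For reference, the 16-run ceiling is the default per-DAG limit (max_active_runs_per_dag); it can be raised on the DAG itself. A minimal sketch, with a made-up dag_id and task, using Airflow 2-style imports:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    # max_active_runs caps how many runs of THIS dag may be in flight at once;
    # the global default (max_active_runs_per_dag in airflow.cfg) is 16.
    with DAG(
        dag_id="example_many_runs",       # hypothetical dag_id
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,           # triggered manually, e.g. 30 times
        max_active_runs=30,               # allow all 30 runs to be active together
        catchup=False,
    ) as dag:
        DummyOperator(task_id="do_work")

Note that the scheduler-wide parallelism and dag_concurrency settings in airflow.cfg can still cap throughput below this value.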
I'm currently working with an Airflow installation via MWAA, and I'm having an issue with a broken dependency, specifically an "ERROR: pip's dependency resolver" failure.
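For context, a commonly suggested remedy for resolver conflicts on MWAA is to pin any extra packages in requirements.txt against the Airflow constraints file that matches the environment's Airflow and Python versions. A hedged sketch (the 2.0.2 / Python 3.7 versions and the example package are placeholders, not the environment's actual values):

    # requirements.txt (sketch): constrain extra packages to the environment's Airflow/Python versions
    --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.2/constraints-3.7.txt"
    apache-airflow-providers-http   # example extra package; replace with what is actually needed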
We have 100 DAGs whose IDs start with the prefix "dag_EDW_HC_". We have the command below to pause a single DAG: airflow pause dag_id. Is there any way we can pause all of them at once?
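One hedged workaround is to script the same CLI command over every matching dag_id. A sketch, assuming the Airflow 1.x-style command shown above (on Airflow 2 the subcommand is "airflow dags pause"):

    import subprocess
    from airflow.models import DagBag

    # Pause every DAG whose dag_id starts with the prefix by shelling out to the
    # same CLI command used for a single DAG: `airflow pause <dag_id>`.
    prefix = "dag_EDW_HC_"
    for dag_id in DagBag().dag_ids:
        if dag_id.startswith(prefix):
            subprocess.run(["airflow", "pause", dag_id], check=True)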
I have a problem with my DAG getting stuck at a subdag. The subdag is in the RUNNING state, but on zooming in, all of the subdag's tasks have no status (None). I am using Airflow.
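For comparison, the wiring the SubDagOperator expects is that the inner DAG's dag_id is "<parent_dag_id>.<task_id>" and that it shares the parent's start_date and schedule; tasks can sit with no status when that convention is broken. A hedged sketch with invented names (Airflow 2-style imports; on 1.10 the operator lives in airflow.operators.subdag_operator):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator
    from airflow.operators.subdag import SubDagOperator

    def make_subdag(parent_dag_id, child_task_id, args):
        # The child dag_id MUST be "<parent>.<task_id>" and should share the
        # parent's start_date/schedule, otherwise its tasks may never get scheduled.
        sub = DAG(
            dag_id=f"{parent_dag_id}.{child_task_id}",
            default_args=args,
            schedule_interval="@daily",
        )
        DummyOperator(task_id="inner_task", dag=sub)
        return sub

    default_args = {"start_date": datetime(2021, 1, 1)}

    with DAG("parent_dag", default_args=default_args,
             schedule_interval="@daily", catchup=False) as dag:
        section_1 = SubDagOperator(
            task_id="section_1",
            subdag=make_subdag("parent_dag", "section_1", default_args),
        )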
I want to run my DAG according to the New York time zone. The data arrives on New York time, and the DAG fails for the initial runs and skips the last run's data as well.
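For illustration, a minimal sketch of a timezone-aware DAG: giving start_date a pendulum timezone makes the schedule follow New York wall-clock time (including DST shifts) instead of UTC. The dag_id, cron expression, and task are placeholders:

    import pendulum
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    local_tz = pendulum.timezone("America/New_York")

    with DAG(
        dag_id="ny_timezone_dag",                        # hypothetical dag_id
        start_date=pendulum.datetime(2021, 1, 1, tz=local_tz),
        schedule_interval="0 6 * * *",                   # 6:00 AM New York time
        catchup=False,
    ) as dag:
        DummyOperator(task_id="load_data")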
I am new to Airflow and need assistance with installing Airflow on Kubernetes (k8s). My needs are: 1. How to build a Docker image of Airflow for only the webserver and scheduler.
I need to run a few Airflow tasks concurrently in parallel, and once one task completes successfully, I need to call another task. How can I do that? For example, suppose Task A and Task B run in parallel, and Task C must start as soon as Task A succeeds.
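A minimal sketch of that shape, with invented task names since the original example was cut off:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    with DAG("parallel_example", start_date=datetime(2021, 1, 1),
             schedule_interval=None, catchup=False) as dag:
        task_a = DummyOperator(task_id="task_a")
        task_b = DummyOperator(task_id="task_b")
        task_c = DummyOperator(task_id="task_c")

        # task_a and task_b have no dependency on each other, so they run in parallel;
        # task_c depends only on task_a and is scheduled as soon as task_a succeeds.
        task_a >> task_c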
I have a DAG. Here is a sample of the parameters: dag = DAG('My Dag', default_args=default_args, description='Cron Job : My Dag', schedule_interval=...)
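For reference, a fuller, runnable version of that shape; the cron expression, dates, and task are placeholders since the original values were cut off, and the dag_id is written without a space (some Airflow versions reject spaces in dag_id):

    from datetime import datetime, timedelta
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    default_args = {
        "owner": "airflow",
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
    }

    dag = DAG(
        "my_dag",                         # hypothetical id
        default_args=default_args,
        description="Cron Job : My Dag",
        schedule_interval="0 8 * * *",    # placeholder cron; the original value was cut off
        start_date=datetime(2021, 1, 1),
        catchup=False,
    )

    run_job = DummyOperator(task_id="run_job", dag=dag)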
I have just upgraded my Airflow from 1.10.13 to 2.0. I am running it in Kubernetes (AKS on Azure) with the Kubernetes Executor. Unfortunately, I see my Scheduler getting into a bad state.
I am working with Apache Airflow and I have a problem with the scheduled day versus the starting day. I want a DAG to run every day at 8:00 AM UTC, so I set it up via default_args.
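For illustration, a hedged sketch of a daily 08:00 UTC schedule. The key point behind the "scheduled day vs. starting day" confusion is Airflow's interval model: the run stamped with a given execution_date starts only after that interval ends, so with start_date of January 1 the first run fires on January 2 at 08:00 and covers January 1. Names and dates below are placeholders:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    default_args = {
        "owner": "airflow",
        # A static start_date in the past; the first run is scheduled once the
        # first full interval after start_date has elapsed.
        "start_date": datetime(2021, 1, 1),
    }

    with DAG(
        dag_id="daily_8am_utc",             # hypothetical
        default_args=default_args,
        schedule_interval="0 8 * * *",      # every day at 08:00 UTC
        catchup=False,
    ) as dag:
        DummyOperator(task_id="daily_task")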
I have some DAGs that can't seem to locate Python modules. Inside the Airflow UI, I see a ton of variations of this message: Broken DAG: [/home/airflow/source
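One common workaround, shown here only as a hedged sketch, is to make the directory holding the shared modules importable from inside the DAG file; the "lib" folder and "my_helpers" module below are hypothetical placeholders:

    import os
    import sys

    # Point this at wherever the helper modules actually live relative to the DAG file.
    sys.path.insert(0, os.path.join(os.path.dirname(__file__), "lib"))

    from my_helpers import build_tasks   # hypothetical module; fails until the path above is real

A cleaner long-term fix is usually to package the helpers properly (installed into the environment or placed under the dags folder as a package) rather than patching sys.path in every DAG.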
Are there any best practices for deploying new DAGs to Airflow? I saw a couple of comments on the Google forum mentioning where the DAGs are saved.
I would like to send alerts via a custom callback function that I pass in during DAG initialization, as shown below: dag = DAG('tutorial', default_args=default_args, ...)
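A minimal sketch of one common way to wire such a callback, with a hypothetical alert function (here it only prints; a real one would call your alerting system). Setting on_failure_callback in default_args applies it to every task in the DAG:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    def alert_on_failure(context):
        # Airflow passes the task context dict (task_instance, dag_run, exception, ...)
        # to the callback when a task fails.
        task_id = context["task_instance"].task_id
        print(f"ALERT: task {task_id} failed in run {context['run_id']}")

    default_args = {
        "start_date": datetime(2021, 1, 1),
        "on_failure_callback": alert_on_failure,
    }

    dag = DAG("tutorial", default_args=default_args,
              schedule_interval="@daily", catchup=False)

    example_task = DummyOperator(task_id="example_task", dag=dag)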
Setup: Airflow 2.0.1 with Kubernetes 1.18 and Python 3.8; Kubernetes client 18.17.x. Pod template file:

    apiVersion: v1
    kind: Pod
    metadata:
      name: workerPod
    spec:
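As an aside, per-task customisation of the worker pod can also be expressed in Python via executor_config / pod_override on Airflow 2 with the Kubernetes executor, instead of (or on top of) the pod template file. A hedged sketch with made-up names and resource values:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from kubernetes.client import models as k8s

    def do_work():
        print("running inside a customised worker pod")

    with DAG("k8s_pod_override_example", start_date=datetime(2021, 1, 1),
             schedule_interval=None, catchup=False) as dag:
        PythonOperator(
            task_id="heavy_task",
            python_callable=do_work,
            # pod_override adjusts the worker pod spec that the Kubernetes
            # executor builds from the pod template file, for this task only.
            executor_config={
                "pod_override": k8s.V1Pod(
                    spec=k8s.V1PodSpec(
                        containers=[
                            k8s.V1Container(
                                name="base",   # must match the template's main container name
                                resources=k8s.V1ResourceRequirements(
                                    requests={"cpu": "1", "memory": "2Gi"},
                                ),
                            )
                        ]
                    )
                )
            },
        )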
I have an Airflow HttpSensor that calls a REST endpoint and checks for a specific value in the JSON structure returned by the API: sensor = HttpSensor(soft_fail=..., ...)
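For reference, a runnable sketch of that pattern using the Airflow 2 HTTP provider (on 1.10 the import is airflow.sensors.http_sensor). The connection id, endpoint, and JSON field checked below are invented placeholders:

    from datetime import datetime
    from airflow import DAG
    from airflow.providers.http.sensors.http import HttpSensor

    def check_status(response):
        # response_check receives the requests.Response object and should
        # return True when the sensor can stop poking.
        return response.json().get("status") == "READY"

    with DAG("http_sensor_example", start_date=datetime(2021, 1, 1),
             schedule_interval=None, catchup=False) as dag:
        wait_for_api = HttpSensor(
            task_id="wait_for_api",
            http_conn_id="my_rest_api",      # hypothetical connection id
            endpoint="v1/job/status",        # hypothetical endpoint
            response_check=check_status,
            poke_interval=60,
            timeout=60 * 30,
            soft_fail=True,                  # mark the task skipped instead of failed on timeout
        )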
I see that one can run trigger_dag with parameter/config key-value pairs using the Airflow command line. For Apache Airflow, how can I pass these parameters when manually triggering the DAG?
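For reference, a hedged sketch of the receiving side: values passed with the --conf flag at trigger time (airflow dags trigger --conf '{"..."}' <dag_id> on Airflow 2, airflow trigger_dag -c '{"..."}' <dag_id> on 1.x) show up inside tasks as dag_run.conf. The dag_id, key, and task below are placeholders:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def use_conf(**context):
        # e.g. triggered with: airflow dags trigger --conf '{"run_date": "2021-06-01"}' conf_example
        conf = context["dag_run"].conf or {}
        print("run_date passed in:", conf.get("run_date"))

    with DAG("conf_example", start_date=datetime(2021, 1, 1),
             schedule_interval=None, catchup=False) as dag:
        PythonOperator(task_id="use_conf", python_callable=use_conf)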