I would like to pass a list of strings containing the names of files in Google Cloud Storage to XCom, to be picked up later by a GoogleCloudStorageToBigQueryOperator.
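One common pattern is to push the list from a Python task and read it back through the operator's templated `source_objects` field. A minimal sketch, assuming Airflow 2.x with the Google provider installed; the bucket, table, and file names are made up, and `render_template_as_native_obj` (Airflow 2.1+) is what makes the template render as a real Python list rather than its string repr:

```python
import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

def list_files():
    # Hypothetical file discovery; the return value is pushed to XCom automatically.
    return ["incoming/a.csv", "incoming/b.csv"]

with DAG(
    dag_id="gcs_list_to_bq",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule_interval=None,
    render_template_as_native_obj=True,  # so the template below yields a list, not a string
) as dag:
    get_files = PythonOperator(task_id="get_files", python_callable=list_files)

    load = GCSToBigQueryOperator(
        task_id="load_to_bq",
        bucket="my-bucket",  # assumed bucket name
        source_objects="{{ ti.xcom_pull(task_ids='get_files') }}",  # templated field
        destination_project_dataset_table="my_project.my_dataset.my_table",  # assumed
        write_disposition="WRITE_TRUNCATE",
    )

    get_files >> load
```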
I need to run a few Airflow tasks in parallel, and once one task completes successfully, I need to call another task. How can I do that? Ex: Task A…
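In Airflow, parallelism falls out of the dependency graph itself: tasks with no dependency between them run concurrently (subject to executor slots), and a downstream task starts as soon as all of its upstreams succeed. A minimal sketch with made-up task IDs:

```python
# a, b, c have no dependencies between them, so they run in parallel.
# d starts as soon as a completes successfully; the default trigger_rule
# is "all_success", so no extra configuration is needed.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(dag_id="parallel_demo", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    a = BashOperator(task_id="task_a", bash_command="echo A")
    b = BashOperator(task_id="task_b", bash_command="echo B")
    c = BashOperator(task_id="task_c", bash_command="echo C")
    d = BashOperator(task_id="task_d", bash_command="echo D")

    a >> d  # d waits only for a; b and c run independently in parallel
```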
I have a DAG. How can I pass parameters to the DAG at runtime and start the DAG? Basically, the DAG can take up to 10 values for a param (say, number). Based on…
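Runtime parameters are usually passed through the trigger's `conf` payload and read from the `dag_run` context inside a task. A sketch, assuming the param is named `number`:

```python
# Trigger with:
#   airflow dags trigger my_dag --conf '{"number": 7}'
# and read the value from the run's conf inside a task.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def use_number(**context):
    number = (context["dag_run"].conf or {}).get("number", 1)  # default if not provided
    print(f"running with number={number}")

with DAG(dag_id="my_dag", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    PythonOperator(task_id="use_number", python_callable=use_number)
```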
I have a DAG. Here is a sample of the parameters: dag = DAG('My Dag', default_args=default_args, description='Cron Job : My Dag', schedule_interval=…
I've been using the Airflow pool feature to control my concurrent tasks. I've created a test_pool with 10 slots and have created 4 tasks, out of which I have assigned…
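For reference, a task is tied to a pool through its `pool` argument (and optionally `pool_slots`), and the pool, not the DAG, caps how many of those slots run at once. A sketch assuming a pre-created test_pool:

```python
# All four tasks share test_pool (10 slots). pool_slots lets a single
# heavy task consume more than one slot of the pool's capacity.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(dag_id="pool_demo", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    for i in range(4):
        BashOperator(
            task_id=f"task_{i}",
            bash_command="sleep 30",
            pool="test_pool",  # must already exist (Admin -> Pools, or `airflow pools set`)
            pool_slots=1,      # the default; raise it for heavier tasks
        )
```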
I have just upgraded my Airflow from 1.10.13 to 2.0. I am running it in Kubernetes (Azure AKS) with the Kubernetes Executor. Unfortunately, I see my Scheduler getting…
I am getting this error: Broken DAG: [/usr/local/airflow/dags/reg_controller_new.py] type object 'DAG' has no attribute 'default_args'. I didn't find anything s…
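That error usually means `default_args` is being looked up on the DAG class itself rather than passed in as a plain dict. For reference, a sketch of the working shape:

```python
# default_args is an ordinary dict handed to the DAG constructor;
# it is not an attribute of the DAG class, so DAG.default_args fails.
from datetime import datetime
from airflow import DAG

default_args = {
    "owner": "airflow",
    "retries": 1,
}

dag = DAG(
    dag_id="reg_controller_new",
    default_args=default_args,  # passed in, not read from the class
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
)
```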
I am working with Apache Airflow and I have a problem with the scheduled day and the starting day. I want a DAG to run every day at 8:00 AM UTC. So, I did: default_args…
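The usual gotcha here is that a run for a given interval is launched at the *end* of that interval, so the first run fires one interval after `start_date`. A sketch of the 8:00 AM UTC setup:

```python
# Cron schedule for 08:00 UTC daily. Note that the run with logical date
# 2023-01-01T08:00 only starts at 08:00 UTC on 2023-01-02, once that
# day's interval has closed.
import pendulum
from airflow import DAG

dag = DAG(
    dag_id="daily_8am_utc",
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule_interval="0 8 * * *",
    catchup=False,  # skip backfilling runs between start_date and now
)
```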
I encountered a problem converting the Airflow macro reference 'ts' into a datetime object. The problem is the timezone offset at the end of the string. from datetime…
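The rendered `{{ ts }}` value is an ISO 8601 string with a UTC offset, e.g. `2021-06-01T08:00:00+00:00`, which trips up naive `strptime` formats. Two parsing options that handle the offset:

```python
# {{ ts }} renders to an ISO 8601 string like "2021-06-01T08:00:00+00:00".
from datetime import datetime
import pendulum

ts = "2021-06-01T08:00:00+00:00"

# Python 3.7+: fromisoformat understands the "+00:00" offset directly.
dt = datetime.fromisoformat(ts)

# pendulum (which Airflow itself depends on) parses it as well:
dt2 = pendulum.parse(ts)

print(dt.tzinfo, dt2.tzinfo)  # both objects are timezone-aware
```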
I am copying a CSV into a new BQ table using the GCSToBigQueryOperator task in Airflow. Is there a way to add a table expiration to this table within this task?
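GCSToBigQueryOperator has no table-expiration argument that I know of, so one workaround is a follow-up task that sets the option with DDL. A sketch using BigQueryInsertJobOperator; the table name and the 7-day window are assumptions:

```python
# After the load, alter the table so it expires 7 days from now.
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

set_expiration = BigQueryInsertJobOperator(
    task_id="set_table_expiration",
    configuration={
        "query": {
            "query": """
                ALTER TABLE `my_project.my_dataset.my_table`
                SET OPTIONS (
                  expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
                )
            """,
            "useLegacySql": False,
        }
    },
)

# load_csv >> set_expiration  # chain it after the GCSToBigQueryOperator task
```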
Is there a way to pause a specific DagRun within Airflow? I want to be able to have multiple, simultaneously executing runs of a single DAG, and I want to be able…
I have Airflow 1.10.5 on a Kubernetes cluster. The DAGs are written with the Kubernetes operator so that they spin up pods for each task inside the DAG on execution…
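For context, the pod-per-task pattern on 1.10.x looks roughly like this (contrib import path; the namespace and image are made up):

```python
# Each task runs in its own pod on the cluster (Airflow 1.10.x).
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

run_in_pod = KubernetesPodOperator(
    task_id="run_in_pod",
    name="run-in-pod",
    namespace="airflow",        # assumed namespace
    image="python:3.7-slim",    # assumed image
    cmds=["python", "-c", "print('hello from the pod')"],
    get_logs=True,              # stream pod logs back into the task log
    is_delete_operator_pod=True,  # clean the pod up after it finishes
)
```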
I have three DAGs (say, DAG1, DAG2 and DAG3). I have a monthly schedule for DAG1. DAG2 and DAG3 must not be run directly (no schedule for these) and must be run…
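The usual tool for this is TriggerDagRunOperator: DAG2 and DAG3 get `schedule_interval=None` so they never run on their own, and DAG1 fires them. A sketch using the Airflow 2 import path:

```python
# DAG1's final tasks trigger DAG2 and DAG3, which have no schedule of their own.
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

trigger_dag2 = TriggerDagRunOperator(
    task_id="trigger_dag2",
    trigger_dag_id="DAG2",
    wait_for_completion=False,  # set True to block until DAG2 finishes
)

trigger_dag3 = TriggerDagRunOperator(
    task_id="trigger_dag3",
    trigger_dag_id="DAG3",
)

# last_dag1_task >> trigger_dag2 >> trigger_dag3  # or fan out in parallel
```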
I would like to see what the code of a whole repository looks like at a specific release. As an example, I'd like to view the code for Apache Airflow as of version…
I am new to Airflow and I just followed the tutorial to run a DAG. I actually did it successfully, but the problem is when I try to pause the DAG by entering the command…
I have some DAGs that can't seem to locate Python modules. Inside the Airflow UI, I see a ton of variations of this message: Broken DAG: [/home/airflow/source…
I have a list of lists in the following form: [['X_API', 'Y_API', ...], ['Z_API', 'P_API', ...], [...], [...], ...]. Here, each API name corresponds to a Python…
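One way to express that shape is to generate the tasks in a loop and chain each sublist after the previous one with `cross_downstream`. A sketch with a hypothetical `call_api` callable standing in for the real per-API code:

```python
# Tasks within a sublist run in parallel; each sublist starts only after
# every task in the previous sublist has succeeded.
from datetime import datetime
from airflow import DAG
from airflow.models.baseoperator import cross_downstream
from airflow.operators.python import PythonOperator

API_GROUPS = [["X_API", "Y_API"], ["Z_API", "P_API"]]  # trimmed example input

def call_api(api_name):
    # Hypothetical stand-in for the real per-API callable.
    print(f"calling {api_name}")

with DAG(dag_id="api_groups", start_date=datetime(2023, 1, 1), schedule_interval=None) as dag:
    previous_group = None
    for group in API_GROUPS:
        tasks = [
            PythonOperator(task_id=name, python_callable=call_api, op_args=[name])
            for name in group
        ]
        if previous_group:
            cross_downstream(previous_group, tasks)  # every prev task >> every new task
        previous_group = tasks
```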
My Scrapy project runs perfectly well with the 'scrapy crawl spider_1' command. How can I trigger it (or call the scrapy command) from an Airflow DAG? with DAG(<args>…
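The simplest route is a BashOperator that cds into the project and runs the same command. The project path, and the assumption that the Airflow worker can reach it with Scrapy installed, are made up here:

```python
# Run the crawler exactly as you would from the shell.
from airflow.operators.bash import BashOperator

crawl = BashOperator(
    task_id="crawl_spider_1",
    bash_command="cd /opt/scrapy_project && scrapy crawl spider_1",  # assumed path
)
```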
I just installed Airflow 2.3.0 using the command pip install "apache-airflow==2.3.0" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-…
Are there any best practices for deploying new DAGs to Airflow? I saw a couple of comments on the Google forum stating that the DAGs are saved…