Category "airflow"

MWAA Airflow 2.0 in AWS: Snowflake connection not showing

Snowflake is not showing in the connections dropdown. I am using MWAA 2.0 and the providers are already in requirements.txt. MWAA uses Python 3.7; don't know i
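
A minimal requirements.txt sketch for this case, assuming the issue is that the Snowflake provider package is not actually installed; the version pins are illustrative and need to match the environment's Airflow and Python versions per the AWS MWAA documentation:

    apache-airflow-providers-snowflake==1.3.0   # illustrative pin for MWAA Airflow 2.0.x
    snowflake-connector-python==2.4.1           # illustrative pin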

Is there any way to install Airflow on Kubernetes on-premises?

I am new to Airflow and need assistance on how to install Airflow on k8s. Needs are: 1. How to build a Docker image of Airflow only for the webserver and scheduler

MWAA - Tracking/monitoring progress of the "Updating" phase triggered by changing the version of requirements.txt used by an environment

I am working with Amazon Managed Workflows for Apache Airflow (MWAA). When I copy a new requirements.txt file to my S3 bucket, then use the AWS Console to speci
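
A small polling sketch with boto3, assuming credentials and region are already configured and using a hypothetical environment name; get_environment returns the environment's Status (for example UPDATING, AVAILABLE, UPDATE_FAILED):

    import time
    import boto3

    mwaa = boto3.client("mwaa")  # assumes credentials and region are already configured

    def wait_for_update(env_name, poll_seconds=60):
        while True:
            env = mwaa.get_environment(Name=env_name)["Environment"]
            status = env["Status"]  # e.g. UPDATING, AVAILABLE, UPDATE_FAILED
            print(f"{env_name}: {status}")
            if status != "UPDATING":
                return status
            time.sleep(poll_seconds)

    wait_for_update("my-mwaa-environment")  # hypothetical environment name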

GoogleCloudStorageToBigQueryOperator source_objects to receive list via XCom

I would like to pass a list of strings containing the names of files in Google Storage to XCom, to be picked up later by a GoogleCloudStorageToBigQueryOperator
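
A sketch of one common approach, assuming Airflow 2.1+ (for render_template_as_native_obj) and hypothetical bucket and table names: source_objects is a templated field, so the XCom list can be pulled into it directly:

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
    from datetime import datetime

    def list_files(**_):
        # hypothetical listing step; returns the object names to load
        return ["data/part-001.csv", "data/part-002.csv"]

    with DAG(
        "gcs_to_bq_xcom",
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
        render_template_as_native_obj=True,  # keeps the pulled XCom a real Python list
    ) as dag:
        list_task = PythonOperator(task_id="list_files", python_callable=list_files)

        load = GCSToBigQueryOperator(
            task_id="load_to_bq",
            bucket="my-bucket",  # hypothetical
            source_objects="{{ ti.xcom_pull(task_ids='list_files') }}",  # templated field
            destination_project_dataset_table="my_project.my_dataset.my_table",  # hypothetical
            write_disposition="WRITE_TRUNCATE",
        )

        list_task >> load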

Airflow Tasks running in parallel

I need to run a few Airflow tasks in parallel, and if one task completes successfully, I need to call another task. How can I do that? Ex: Task A
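
A minimal sketch of the usual dependency pattern, with hypothetical task names: B and C run in parallel after A, and D runs only once both succeed:

    from airflow import DAG
    from airflow.operators.dummy import DummyOperator
    from datetime import datetime

    with DAG("parallel_example", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        a = DummyOperator(task_id="task_a")
        b = DummyOperator(task_id="task_b")
        c = DummyOperator(task_id="task_c")
        d = DummyOperator(task_id="task_d")

        a >> [b, c] >> d  # B and C run concurrently; D waits for both (default trigger_rule=all_success)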

Airflow - Pass parameters at runtime

I have a DAG. How can I pass parameters to the DAG at runtime and start the DAG? Basically, the DAG can take up to 10 values for a param (say, number). Based o
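
A sketch of the usual approach with dag_run.conf, using a hypothetical DAG id and parameter name:

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from datetime import datetime

    def use_number(**context):
        number = context["dag_run"].conf.get("number", 1)  # default if no conf was supplied
        print(f"received number={number}")

    with DAG("my_dag", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        PythonOperator(task_id="use_number", python_callable=use_number)

    # Trigger with: airflow dags trigger my_dag --conf '{"number": 5}'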

How do I stop Apache Airflow running a task the first time when I unpause it?

I have a DAG. Here is a sample of the parameters. dag = DAG( 'My Dag', default_args=default_args, description='Cron Job : My Dag', schedule_inte
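
A sketch of the usual mitigation, with hypothetical ids and dates: setting catchup=False stops the scheduler from backfilling all past intervals the moment the DAG is unpaused (at most the latest completed interval is scheduled):

    from airflow import DAG
    from datetime import datetime

    default_args = {"owner": "airflow"}

    dag = DAG(
        "my_dag",
        default_args=default_args,
        description="Cron Job : My Dag",
        schedule_interval="0 8 * * *",
        start_date=datetime(2021, 1, 1),
        catchup=False,  # do not backfill every past interval when the DAG is unpaused
    )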

Airflow pool - lower-priority task is triggering first

I've been using the Airflow pool to control my concurrent tasks, so I've created a test_pool with 10 slots and created 4 tasks, out of which I have assigne
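
A sketch of how slot ordering is usually controlled, with hypothetical task ids: when the pool is full, tasks with a higher priority_weight get slots first (note that the default weight_rule='downstream' also inflates the effective weight of tasks with many downstream tasks):

    from airflow import DAG
    from airflow.operators.dummy import DummyOperator
    from datetime import datetime

    with DAG("pool_priority", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        high = DummyOperator(task_id="high_priority", pool="test_pool", priority_weight=10)
        low = DummyOperator(task_id="low_priority", pool="test_pool", priority_weight=1)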

Airflow Scheduler liveness probe crashing (version 2.0)

I have just upgraded my Airflow from 1.10.13 to 2.0. I am running it on Kubernetes (Azure AKS) with the Kubernetes Executor. Unfortunately, I see my Scheduler getti

How to handle 'default_args' error in Airflow?

I am getting this error: Broken DAG: [/usr/local/airflow/dags/reg_controller_new.py] type object 'DAG' has no attribute 'default_args' I didn't find anything s
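
A sketch of the expected usage, with hypothetical values: default_args is a plain dict passed to the DAG constructor, not an attribute read from the DAG class:

    from airflow import DAG
    from datetime import datetime

    default_args = {
        "owner": "airflow",
        "retries": 1,
    }

    dag = DAG(
        "reg_controller_new",           # hypothetical dag_id
        default_args=default_args,      # passed in, not read from the DAG class
        start_date=datetime(2021, 1, 1),
        schedule_interval=None,
    )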

Problem with start date and scheduled date in Apache Airflow

I am working with Apache Airflow and I have a problem with the scheduled date and the start date. I want a DAG to run every day at 8:00 AM UTC. So, I did: defa
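
A sketch illustrating the scheduling rule, with hypothetical dates: a run is triggered at the end of the interval it covers, so the run labelled with execution_date 2021-01-01T08:00 actually starts around 2021-01-02T08:00:

    from airflow import DAG
    from datetime import datetime

    dag = DAG(
        "daily_8am_utc",
        start_date=datetime(2021, 1, 1),
        schedule_interval="0 8 * * *",  # every day at 08:00 UTC
        catchup=False,
    )
    # The run for the interval 2021-01-01T08:00 -> 2021-01-02T08:00 is triggered
    # at ~2021-01-02T08:00, i.e. at the END of the interval it covers.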

Convert airflow macro 'ts' into datetime object

I encountered a problem converting the Airflow macro reference 'ts' into a datetime object. The problem is with the tz offset at the end of the string. from dat
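
A small sketch, using an example value of the rendered macro: both calls below preserve the +00:00 offset on Python 3.7+:

    from datetime import datetime

    ts = "2021-03-01T08:00:00+00:00"  # example value of the rendered 'ts' macro

    dt1 = datetime.fromisoformat(ts)                    # Python 3.7+: handles the +00:00 offset
    dt2 = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S%z")  # %z also accepts +00:00 on 3.7+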

Table expiration in GCS to BQ Airflow task

I am copying a CSV into a new BQ table using the GCSToBigQueryOperator task in Airflow. Is there a way to add a table expiration to this table within this task?
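
A sketch of one workaround, on the assumption that the load operator itself does not expose a table-expiration setting: a follow-up task (for example a PythonOperator) can set expires with the google-cloud-bigquery client; the table id and retention below are hypothetical:

    from datetime import datetime, timedelta, timezone
    from google.cloud import bigquery

    def set_table_expiration(table_id="my_project.my_dataset.my_table", days=7):
        client = bigquery.Client()
        table = client.get_table(table_id)
        table.expires = datetime.now(timezone.utc) + timedelta(days=days)
        client.update_table(table, ["expires"])  # only the 'expires' field is changed

Setting a default table expiration on the target dataset is another option, depending on how the tables are organised.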

Is there a way to pause an airflow DagRun?

Is there a way to pause a specific DagRun within Airflow? I want to be able to have multiple simultaneously executing runs of a single DAG, and I want to be abl

Kubernetes Operator in Airflow is not sharing the load across nodes. Why?

I have Airflow 1.10.5 on a Kubernetes cluster. The DAGs are written with the Kubernetes operator so that they spin up pods for each task inside the DAG on executio
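
A sketch of one way to influence placement, assuming the contrib KubernetesPodOperator of Airflow 1.10.x and hypothetical labels and images: pod placement is decided by the Kubernetes scheduler, so spreading can be requested with pod anti-affinity (nodeSelector or resource requests are alternatives):

    from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

    spread_affinity = {
        "podAntiAffinity": {
            "preferredDuringSchedulingIgnoredDuringExecution": [{
                "weight": 100,
                "podAffinityTerm": {
                    "labelSelector": {"matchLabels": {"app": "airflow-task"}},  # hypothetical label
                    "topologyKey": "kubernetes.io/hostname",
                },
            }]
        }
    }

    spread_me = KubernetesPodOperator(
        task_id="spread_me",
        name="spread-me",
        namespace="airflow",
        image="python:3.7-slim",
        cmds=["python", "-c", "print('hello')"],
        labels={"app": "airflow-task"},
        affinity=spread_affinity,  # prefer scheduling away from pods with the same label
    )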

Airflow DAGS Orchestration

I have three DAGs (say, DAG1, DAG2 and DAG3). I have a monthly schedule for DAG1. DAG2 and DAG3 must not be run directly (no schedule for these) and must be r
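
A sketch of the usual pattern with TriggerDagRunOperator, using the DAG ids from the question; DAG2 and DAG3 would be defined with schedule_interval=None so they only run when triggered:

    from airflow import DAG
    from airflow.operators.trigger_dagrun import TriggerDagRunOperator
    from datetime import datetime

    with DAG("DAG1", start_date=datetime(2021, 1, 1), schedule_interval="@monthly", catchup=False) as dag:
        trigger_dag2 = TriggerDagRunOperator(task_id="trigger_dag2", trigger_dag_id="DAG2")
        trigger_dag3 = TriggerDagRunOperator(task_id="trigger_dag3", trigger_dag_id="DAG3")

        trigger_dag2 >> trigger_dag3  # chain them, or run both in parallel as needed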

How to view code in a Github repository as of a specific release?

I would like to see what the code in a whole repository looks like as of a specific release. As an example, I'd like to view the code for Apache Airflow as of vers

Airflow 'NoneType' object has no attribute 'is_paused', how to fix it?

I am new to Airflow and I just followed the tutorial to run a DAG. Actually, I did it successfully, but the problem is when I try to pause the DAG by inputting comm

Airflow dags and PYTHONPATH

I have some DAGs that can't seem to locate Python modules. Inside the Airflow UI, I see many variations of this message. Broken DAG: [/home/airflow/source
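
A sketch of a common workaround, with a hypothetical repository layout and module name: either put the extra package directory on PYTHONPATH (or install it as a package), or append it to sys.path inside the DAG file:

    import os
    import sys

    # hypothetical layout: /home/airflow/source/dags/<this file> and /home/airflow/source/lib/mymodule.py
    sys.path.append(os.path.join(os.path.dirname(__file__), "..", "lib"))

    import mymodule  # noqa: E402  (hypothetical module that the DAG needs)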

Creating dynamic workflows for Airflow tasks present in a Python list

I have a list of lists like the following: [['X_API', 'Y_API', ...], ['Z_API', 'P_API', ...], [...], [...], ...] Here, each API name corresponds to a Pytho
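
A sketch of dynamic task generation for this shape of input, with a hypothetical callable standing in for the real API calls: one sequential chain of tasks is built per inner list:

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from datetime import datetime

    api_groups = [["X_API", "Y_API"], ["Z_API", "P_API"]]  # example input

    def call_api(api_name, **_):
        print(f"calling {api_name}")  # placeholder for the real API call

    with DAG("dynamic_api_chains", start_date=datetime(2021, 1, 1), schedule_interval=None) as dag:
        for i, group in enumerate(api_groups):
            previous = None
            for api_name in group:
                task = PythonOperator(
                    task_id=f"group{i}_{api_name}",
                    python_callable=call_api,
                    op_kwargs={"api_name": api_name},
                )
                if previous:
                    previous >> task  # run the APIs within a group sequentially
                previous = task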