I am working with Airflow 2.2.3 in GCP (Composer) and I am seeing inconsistent behavior that I can't explain when trying to use template values. When I reference…
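For context, a minimal sketch of where templating does and does not apply, assuming a stock BashOperator: Jinja values are rendered only for fields named in the operator's template_fields, which is a common source of seemingly inconsistent results.

```python
from airflow.operators.bash import BashOperator

# Inside a `with DAG(...)` block: bash_command is listed in
# BashOperator.template_fields, so "{{ ds }}" is rendered at runtime;
# a parameter not listed there receives the literal braces unchanged.
print_ds = BashOperator(
    task_id="print_ds",
    bash_command="echo {{ ds }}",
)
```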
I have a DAG that I want to run multiple times, say 30. But Airflow can execute only 16 DAG runs in parallel at a time. Suppose one DAG run takes a long time to execute…
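For reference, a sketch of capping concurrency per DAG with max_active_runs (the dag id and dates are placeholders); the scheduler queues the remaining runs until a slot frees up.

```python
from datetime import datetime
from airflow import DAG

# At most 16 runs of this DAG are active at once; the scheduler still
# creates the other runs but holds them until a slot opens.
with DAG(
    dag_id="my_dag",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    max_active_runs=16,
) as dag:
    ...
```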
I have two tasks inside a TaskGroup that need to pull XCom values to supply the job_flow_id and step_id. Here's the code: with TaskGroup('execute_my_steps') as…
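One gotcha worth checking in a setup like this: task_ids inside a TaskGroup are prefixed with the group id, so any xcom_pull against a task in the group must use the qualified id. A hedged sketch (task names are illustrative, and the EMR sensor's import path differs across amazon provider versions):

```python
from airflow.utils.task_group import TaskGroup
from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor

# Inside a `with DAG(...)` block:
with TaskGroup("execute_my_steps") as execute_my_steps:
    watch_step = EmrStepSensor(
        task_id="watch_step",
        # A task outside the group is pulled by its bare id...
        job_flow_id="{{ ti.xcom_pull(task_ids='create_job_flow') }}",
        # ...but a task inside this group needs the group prefix.
        step_id="{{ ti.xcom_pull(task_ids='execute_my_steps.add_steps')[0] }}",
        aws_conn_id="aws_default",
    )
```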
I have looked through similar posts on SO, but they seem to be specific to Docker environments and haven't been very helpful. Ours is a little different…
I'm looking for a way to extract the execution_date, start_date, end_date, etc. of the last successful run instance of a task in a DAG, and then decide whether to raise an error…
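One way to get at this, sketched under the assumption that the code runs where it can reach Airflow's metadata database (dag_id and task_id are placeholders):

```python
from airflow.models import TaskInstance
from airflow.utils.session import provide_session
from airflow.utils.state import State

@provide_session
def last_successful_ti(dag_id, task_id, session=None):
    """Most recent successful instance of a task, straight from the metadata DB."""
    return (
        session.query(TaskInstance)
        .filter(
            TaskInstance.dag_id == dag_id,
            TaskInstance.task_id == task_id,
            TaskInstance.state == State.SUCCESS,
        )
        .order_by(TaskInstance.start_date.desc())
        .first()
    )

# The returned TaskInstance exposes start_date and end_date directly; on
# Airflow 2.2+ its execution_date comes via the associated DagRun.
```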
I have created a DAG which connects to the Snowflake warehouse and runs a query on it. CODE FOR THE DAG: import logging; from datetime import datetime, timedelta; …
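For comparison, a minimal sketch of a Snowflake query task, assuming the snowflake provider package is installed and a connection id of snowflake_default:

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG("snowflake_query", start_date=datetime(2022, 1, 1), schedule_interval=None) as dag:
    run_query = SnowflakeOperator(
        task_id="run_query",
        snowflake_conn_id="snowflake_default",  # assumed connection id
        sql="SELECT CURRENT_TIMESTAMP;",        # placeholder query
    )
```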
I have a sensor task that listens for files being created in S3. After one poke I may have 3 files; after another poke I might have another 5 files. I want to create…
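If the environment is on Airflow 2.3 or newer, dynamic task mapping can fan out one task instance per file; a sketch with placeholder paths:

```python
from airflow.decorators import task

@task
def list_new_files():
    # Placeholder: in practice this would come from the S3 listing/sensor.
    return ["s3://bucket/a.csv", "s3://bucket/b.csv"]

@task
def process(path: str):
    print(f"processing {path}")

# Inside a `with DAG(...)` block: one mapped task instance is created
# per element of the returned list.
process.expand(path=list_new_files())
```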
So, I'm currently working with an Airflow installation via MWAA. I'm having an issue with a broken dependency, specifically: ERROR: pip's dependency resolver…
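A common mitigation, in line with the MWAA docs, is pinning requirements.txt against the Airflow constraints file matching the environment's Airflow and Python versions (the URL and package pin below are placeholders):

```text
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.2.2/constraints-3.7.txt"
apache-airflow-providers-snowflake==2.4.0
```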
I have an Airflow task using SimpleHttpOperator hitting a service in Google Cloud Run. I checked the Cloud Run logs, and the service itself completes in 12 minutes…
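One knob worth checking: SimpleHttpOperator forwards extra_options to the underlying requests call, so the client-side timeout can be raised past the service's 12-minute runtime. A sketch (connection id and endpoint are placeholders):

```python
from airflow.providers.http.operators.http import SimpleHttpOperator

call_service = SimpleHttpOperator(
    task_id="call_cloud_run",
    http_conn_id="cloud_run_service",   # placeholder connection
    endpoint="/process",                # placeholder endpoint
    method="POST",
    # Forwarded to requests; without it the client may give up early.
    extra_options={"timeout": 15 * 60},
)
```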
We have around 100 DAGs whose IDs carry the prefix "dag_EDW_HC_*". We have the command below to pause a DAG. Command: airflow pause dag_id. Is there any way we can pause…
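CLIs of that era pause one DAG at a time, as far as I know, but a short script against the metadata DB can sweep the whole prefix; a sketch to run where Airflow is installed:

```python
from airflow.models import DagModel
from airflow.utils.session import create_session

# Flip is_paused for every DAG whose id matches the prefix; the session
# commits on exit.
with create_session() as session:
    for dag in session.query(DagModel).filter(DagModel.dag_id.like("dag_EDW_HC_%")):
        dag.is_paused = True
```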
I'm attempting to integrate Airflow with Okta; however, there is little documentation available online. I'm referring to a blog article, but I can't seem to get…
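For what it's worth, Airflow's UI auth goes through Flask AppBuilder, so an Okta setup is typically a webserver_config.py along these lines (the domain, client id, and secret are placeholders):

```python
# webserver_config.py sketch
from flask_appbuilder.security.manager import AUTH_OAUTH

AUTH_TYPE = AUTH_OAUTH
AUTH_USER_REGISTRATION = True           # auto-create users on first login
AUTH_USER_REGISTRATION_ROLE = "Viewer"
OAUTH_PROVIDERS = [{
    "name": "okta",
    "icon": "fa-circle-o",
    "token_key": "access_token",
    "remote_app": {
        "client_id": "OKTA_CLIENT_ID",
        "client_secret": "OKTA_CLIENT_SECRET",
        "api_base_url": "https://YOUR_DOMAIN.okta.com/oauth2/v1/",
        "client_kwargs": {"scope": "openid profile email groups"},
        "access_token_url": "https://YOUR_DOMAIN.okta.com/oauth2/v1/token",
        "authorize_url": "https://YOUR_DOMAIN.okta.com/oauth2/v1/authorize",
    },
}]
```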
When using BigQueryInsertJobOperator and setting the configuration to perform a dry run on a faulty .sql file / a hardcoded query, the task succeeds even though…
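For context, a sketch of where dryRun sits in the job configuration (the query and ids are placeholders); how a failed dry run surfaces has varied by provider version, so this shows only the configuration shape:

```python
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

dry_run_check = BigQueryInsertJobOperator(
    task_id="dry_run_check",
    configuration={
        "query": {
            "query": "SELECT * FROM `my-project.my_dataset.my_table`",
            "useLegacySql": False,
        },
        # dryRun is a top-level job-configuration field, not a query field.
        "dryRun": True,
    },
)
```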
In my virtual env on an Azure VM, I ran pip3 install apache-airflow. When I started airflow db init I received: File "/home/shivamanand/myenv/lib/python3.7/site-packages…
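For what it's worth, the installation method the Airflow docs prescribe pins against a constraints file matched to the Airflow and Python versions, which avoids most db init import errors caused by mismatched dependencies (the versions below are placeholders):

```bash
pip install "apache-airflow==2.2.3" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.2.3/constraints-3.7.txt"
```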
I have a problem with my DAG getting stuck at a subdag. The subdag is in the RUNNING state, but on zooming in, all the tasks of the subdag are in None status. Using Airflow…
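Two documented subdag gotchas worth ruling out, sketched with placeholder names: the subdag's dag_id must follow the parent.task_id convention, and per the Airflow docs a subdag whose schedule is None or @once succeeds without doing anything:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.subdag import SubDagOperator

def make_subdag(parent_dag_id, child_task_id):
    # SubDagOperator enforces the "<parent_dag_id>.<task_id>" naming; keep
    # the subdag's schedule in step with the parent rather than None/@once.
    return DAG(
        dag_id=f"{parent_dag_id}.{child_task_id}",
        start_date=datetime(2022, 1, 1),
        schedule_interval="@daily",
    )

with DAG("parent", start_date=datetime(2022, 1, 1), schedule_interval="@daily") as dag:
    section = SubDagOperator(task_id="section", subdag=make_subdag("parent", "section"))
```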
I have a list of DAGs that are hosted on Airflow. I want to get the names of the DAGs in an AWS Lambda function so that I can use the names and trigger the DAGs using…
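One approach, assuming the stable REST API is enabled with basic auth and reachable from the Lambda (the host and credentials are placeholders):

```python
import base64
import json
import urllib.request

def handler(event, context):
    """Lambda sketch: list dag_ids via Airflow's stable REST API."""
    token = base64.b64encode(b"user:password").decode()  # placeholder creds
    req = urllib.request.Request(
        "https://your-airflow-host/api/v1/dags",  # placeholder host
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        dags = json.loads(resp.read())["dags"]
    return [d["dag_id"] for d in dags]
```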
I was writing the code below, but it runs endlessly in Airflow, whereas on my own machine it takes 5 minutes to run: gc = pygsheets.authorize(service_account_file='file.json')…
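One thing worth ruling out first, sketched below: Airflow workers rarely share the working directory the code was tested in, so resolve the key file with an absolute path (the path and sheet key are placeholders); if it still hangs, the worker may simply lack network egress to Google's APIs.

```python
import pygsheets

# Absolute path: a relative 'file.json' resolves against the worker's
# current working directory, not the DAG folder.
gc = pygsheets.authorize(service_account_file="/opt/airflow/keys/file.json")
sheet = gc.open_by_key("YOUR_SHEET_KEY")  # placeholder key
```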
I am trying to get the XCom for a Glue job run in order to get its Glue run ID. I need this to display the CloudWatch link in the Airflow output console in case the Glue job fails.
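A sketch of pulling that id, assuming the Glue task is named run_glue_job and pushes its job run id as its return value:

```python
from airflow.decorators import task
from airflow.operators.python import get_current_context

@task(trigger_rule="all_done")  # still runs when the Glue task has failed
def log_cloudwatch_link():
    ti = get_current_context()["ti"]
    # Caveat: "return_value" only exists if the Glue operator actually
    # returned; a blocking operator that raised may never have pushed it.
    run_id = ti.xcom_pull(task_ids="run_glue_job")
    print(f"Glue run {run_id}: see its CloudWatch stream under /aws-glue/jobs/output")
```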
Hi, I am taking a DataCamp class on how to use Airflow, and it shows how to create DAGs once you have access to an Airflow web interface. Is there an easy way to…
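If the goal is just a local playground, recent Airflow releases ship a one-command quick start; a sketch assuming Airflow 2.1+ in a fresh virtualenv:

```bash
pip install apache-airflow
airflow standalone   # initializes the DB, creates an admin user, starts scheduler + webserver
```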
Simple question here. When looking at Airflow's UI, as in the screenshot below: what do Is Encrypted and Is Extra Encrypted stand for? Is there clear documentation…
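As a pointer: these map to two boolean columns on the Connection model, indicating whether the password and the extra JSON were Fernet-encrypted when stored. A sketch of inspecting them (the connection id is a placeholder):

```python
from airflow import settings
from airflow.models import Connection

session = settings.Session()
conn = session.query(Connection).filter(Connection.conn_id == "my_conn").one()
print(conn.is_encrypted, conn.is_extra_encrypted)
```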
I've got a current implementation of some code which works fine, but it only carries out a single check per DAG run, as I cannot feed multiple results through to do…
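One pattern that lifts that restriction, sketched with placeholder names: generate one task per check at parse time, so a single DAG run performs every check:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_check(check_name):
    print(f"running {check_name}")

with DAG("checks", start_date=datetime(2022, 1, 1), schedule_interval="@daily") as dag:
    # One PythonOperator per check; all of them run in each DAG run.
    for name in ["check_a", "check_b", "check_c"]:
        PythonOperator(
            task_id=f"run_{name}",
            python_callable=run_check,
            op_args=[name],
        )
```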