Category "airflow"

psycopg2.OperationalError: could not translate host name "<address>" to address: Temporary failure in name resolution

I have looked through similar posts on SO, but they seem to be specific to Docker environments and haven't been very helpful. Ours is a little different, …

How do I run SQL queries on the Airflow metadata DB from inside an Airflow DAG?

I'm looking for a way to extract the execution_date, start_date, end_date, etc. of the last successful run of a task in a DAG, and then decide whether to raise an error…
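A common approach (not confirmed as the asker's eventual solution) is to query the metadata DB through Airflow's own ORM session rather than a separate psycopg2 connection. A minimal sketch against Airflow 2.2+ internals, which are not a stable public API; the dag_id/task_id values are placeholders:

```python
from airflow.models import DagRun, TaskInstance
from airflow.utils.session import provide_session
from airflow.utils.state import State

@provide_session
def last_successful_ti(dag_id, task_id, session=None):
    """Most recent successful TaskInstance, with its start/end dates."""
    return (
        session.query(TaskInstance)
        .join(DagRun, (DagRun.dag_id == TaskInstance.dag_id)
                      & (DagRun.run_id == TaskInstance.run_id))
        .filter(
            TaskInstance.dag_id == dag_id,
            TaskInstance.task_id == task_id,
            TaskInstance.state == State.SUCCESS,
        )
        .order_by(DagRun.execution_date.desc())
        .first()
    )

ti = last_successful_ti("my_dag", "my_task")  # hypothetical ids
if ti:
    print(ti.start_date, ti.end_date)
```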

Snowflake connection to Airflow

I have created a DAG which connects to the Snowflake warehouse and runs a query on it. Code for the DAG: import logging; from datetime import datetime, timedelta; …
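For reference, a minimal sketch of the usual pattern with the Snowflake provider; the connection id, warehouse, database, and SQL are placeholders, and the Snowflake connection itself would be defined in the Airflow UI or via environment variables:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="snowflake_query_demo",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    run_query = SnowflakeOperator(
        task_id="run_query",
        snowflake_conn_id="snowflake_default",  # connection configured in Airflow
        warehouse="MY_WH",                      # hypothetical warehouse/database
        database="MY_DB",
        schema="PUBLIC",
        sql="SELECT CURRENT_TIMESTAMP;",
    )
```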

Airflow - What do I do when I have a variable amount of work that needs to be handled by a DAG?

I have a sensor task that listens for files being created in S3. After one poke I may have 3 files; after another poke I might have another 5 files. I want to create…
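In Airflow 2.3+ this is what dynamic task mapping is for: one upstream task returns whatever list of files was found, and a mapped task fans out over it at runtime. A sketch with hypothetical task names and keys:

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(start_date=datetime(2023, 1, 1), schedule_interval=None, catchup=False)
def s3_fanout():
    @task
    def list_new_files():
        # stand-in for whatever the sensor/poke discovered
        return ["s3://my-bucket/a.csv", "s3://my-bucket/b.csv"]

    @task
    def process(key):
        print(f"processing {key}")

    # one mapped task instance per file, decided at runtime
    process.expand(key=list_new_files())

s3_fanout()
```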

Listing packages in MWAA with a broken scheduler

So, I'm currently working with an Airflow installation via MWAA. I'm having this issue with a broken dependency, specifically: ERROR: pip's dependency resolver …
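One frequently suggested workaround, assuming the environment can still parse and run DAGs at all, is a throwaway DAG that dumps the installed packages into the task log; if the scheduler is truly down, the MWAA requirements-install log stream in CloudWatch is the other place to look. A sketch:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="list_installed_packages",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # prints every installed package and version into the task log
    BashOperator(task_id="pip_freeze", bash_command="pip freeze")
```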

Airflow task still running even after the endpoint process is completed

I have an Airflow task using SimpleHttpOperator that hits a service on Google Cloud Run. I checked the Cloud Run logs, and the service itself completes in 12 minutes…
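If the HTTP call itself never returns, one thing to try (an assumption about the cause, not a confirmed fix) is an explicit client-side timeout via extra_options, which the HTTP provider forwards to the underlying requests call, plus an execution_timeout on the task; the connection id and endpoint here are placeholders:

```python
from datetime import timedelta

from airflow.providers.http.operators.http import SimpleHttpOperator

call_cloud_run = SimpleHttpOperator(
    task_id="call_cloud_run",
    http_conn_id="cloud_run_service",         # hypothetical connection id
    endpoint="/process",
    method="POST",
    extra_options={"timeout": 15 * 60},       # give up after 15 minutes client-side
    execution_timeout=timedelta(minutes=20),  # hard cap on the whole task
    log_response=True,
)
```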

How to pause/unpause multiple DAGs in Airflow

We have hundreds of DAGs with the prefix "dag_EDW_HC_*". We have the below command to pause a single DAG: airflow pause dag_id. Is there any way we can pause…
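Beyond a shell loop over airflow pause, one option is to flip the is_paused flag directly through Airflow's ORM; a sketch that assumes it runs somewhere with access to the metadata DB (the prefix is taken from the question):

```python
from airflow.models import DagModel
from airflow.utils.session import create_session

def set_paused(prefix, paused):
    """Pause/unpause every DAG whose dag_id starts with the given prefix."""
    with create_session() as session:
        dags = session.query(DagModel).filter(DagModel.dag_id.like(f"{prefix}%"))
        for dag in dags:
            dag.is_paused = paused

set_paused("dag_EDW_HC_", True)   # pause
set_paused("dag_EDW_HC_", False)  # unpause
```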

Airflow with Okta integration

I'm attempting to integrate Airflow with Okta; however, there is little documentation available online. I'm following a blog article, but I can't seem to get…
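The usual pattern goes in webserver_config.py, which the Airflow webserver hands to Flask-AppBuilder. A sketch of the commonly cited OAuth setup, not a verified working config; the Okta org URL, client id, and secret are placeholders:

```python
# webserver_config.py -- read by the Airflow webserver (Flask-AppBuilder)
from flask_appbuilder.security.manager import AUTH_OAUTH

AUTH_TYPE = AUTH_OAUTH
AUTH_USER_REGISTRATION = True           # auto-create users on first login
AUTH_USER_REGISTRATION_ROLE = "Viewer"  # default role for new users

OAUTH_PROVIDERS = [
    {
        "name": "okta",
        "icon": "fa-circle-o",
        "token_key": "access_token",
        "remote_app": {
            "client_id": "<okta client id>",
            "client_secret": "<okta client secret>",
            "api_base_url": "https://<your-org>.okta.com/oauth2/v1/",
            "access_token_url": "https://<your-org>.okta.com/oauth2/v1/token",
            "authorize_url": "https://<your-org>.okta.com/oauth2/v1/authorize",
            "client_kwargs": {"scope": "openid profile email groups"},
        },
    }
]
```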

BigQueryInsertJobOperator dryRun is returning success instead of failure on composer (airflow)

When using BigQueryInsertJobOperator and setting the configuration to perform a dry run on a faulty .sql file or a hardcoded query, the task succeeds even though…
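A workaround people reach for (assuming the goal is just SQL validation) is to run the dry run through the BigQuery client library inside a task, since an invalid query makes the dry-run API call raise immediately:

```python
from airflow.decorators import task
from google.api_core.exceptions import BadRequest
from google.cloud import bigquery

@task
def validate_sql(sql):
    """Fail the task if BigQuery's dry run rejects the query."""
    client = bigquery.Client()
    config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
    try:
        job = client.query(sql, job_config=config)  # raises on invalid SQL
        print(f"OK, would process {job.total_bytes_processed} bytes")
    except BadRequest as err:
        raise RuntimeError(f"dry run failed: {err}") from err
```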

Airflow without sudo access?

In my virtualenv on an Azure VM, I ran pip3 install apache-airflow. When I started airflow db init, I received: File "/home/shivamanand/myenv/lib/python3.7/site-pack…

Airflow Subdag tasks are stuck in None state while subdag is showing as Running

I have a problem with my DAG getting stuck at a subdag. The subdag is in the RUNNING state, but on zooming in, all the tasks of the subdag are in the None state. Using Airflow…

Get a list of all the DAGs in Python

I have a list of DAGs hosted on Airflow. I want to get the names of the DAGs in an AWS Lambda function so that I can use the names and trigger the DAGs using…
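A sketch using Airflow's stable REST API (GET /api/v1/dags) from a Lambda handler; the URL and basic-auth credentials are placeholders, and on MWAA you would go through its CLI/web-token mechanism instead:

```python
import base64
import json
import urllib.request

AIRFLOW_URL = "https://my-airflow.example.com"  # hypothetical host

def lambda_handler(event, context):
    token = base64.b64encode(b"user:password").decode()  # placeholder creds
    req = urllib.request.Request(
        f"{AIRFLOW_URL}/api/v1/dags",
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return [dag["dag_id"] for dag in body["dags"]]
```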

Is there any difference between a Python script run in Airflow and the same script run in plain Python?

I wrote the below code, but it runs endlessly in Airflow, while on my own system it takes 5 minutes to run: gc = pygsheets.authorize(service_account_file='file.json'…

How to push an XCom from AwsGlueJobOperator when the task fails

I am trying to get an XCom for a Glue job run to get its Glue run ID. I need this to display the CloudWatch link on the Airflow output console in case the Glue job fails.
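One way around the operator only pushing its XCom on success is to start the job yourself, push the run id first, and only then wait for completion. A sketch against the Amazon provider's hook, whose name and methods vary by provider version; the job name is a placeholder:

```python
from airflow.decorators import task
from airflow.operators.python import get_current_context
from airflow.providers.amazon.aws.hooks.glue import GlueJobHook

@task
def run_glue_job():
    hook = GlueJobHook(job_name="my_glue_job")     # hypothetical job name
    run_id = hook.initialize_job()["JobRunId"]     # start the run
    # push the id *before* waiting, so it survives a failed run
    get_current_context()["ti"].xcom_push(key="glue_run_id", value=run_id)
    hook.job_completion("my_glue_job", run_id)     # raises if the run fails
```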

How do you access the Airflow Web Interface?

Hi, I am taking a DataCamp class on how to use Airflow, and it shows how to create DAGs once you have access to an Airflow web interface. Is there an easy way to…

In the Airflow UI under Connections, what do 'encrypted' and 'extra-encrypted' mean?

Simple question here. When looking at Airflow's UI, like in the screenshot below: what do "Is Encrypted" and "Is Extra Encrypted" stand for? Is there clear documentation…

Airflow 2.3 - Dynamic Task Mapping using Operators

I've got a current implementation of some code which works fine, but it only carries out a single check per DAG run, as I cannot feed multiple results through to do…
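With Airflow 2.3's dynamic task mapping, classic operators are mapped via .partial() for the fixed arguments and .expand() for the per-instance ones; a minimal sketch with a BashOperator standing in for the real check:

```python
from airflow.operators.bash import BashOperator

# one mapped task instance per command; the list can also come from
# an upstream task's XCom instead of a literal
checks = BashOperator.partial(task_id="check").expand(
    bash_command=["echo one", "echo two", "echo three"]
)
```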

Using Airflow to upload data to S3

I tried to upload a dataframe containing information about Apple stock (using their API) as a CSV to S3 using Airflow and PythonOperator. The script is below. …
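For comparison, a minimal sketch of the usual S3Hook pattern inside the callable; the bucket, key, and dataframe contents are placeholders, and the AWS connection must already exist in Airflow:

```python
import pandas as pd
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def upload_to_s3():
    df = pd.DataFrame({"symbol": ["AAPL"], "close": [170.0]})  # stand-in data
    hook = S3Hook(aws_conn_id="aws_default")
    hook.load_string(
        string_data=df.to_csv(index=False),
        key="stocks/apple.csv",   # hypothetical key
        bucket_name="my-bucket",  # hypothetical bucket
        replace=True,
    )
```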

Apache Airflow: How to template Volumes in DockerOperator using Jinja Templating

I want to pass a list of volumes into DockerOperator using a Jinja template. Hard-coded volumes work fine: volumes=['first:/dest', 'second:/sec_destination']; however…
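The usual explanation is that volumes is not in DockerOperator's template_fields, so the Jinja string is never rendered. A common workaround is a thin subclass that adds it; note the attribute is volumes in older docker providers and mounts in newer ones:

```python
from airflow.providers.docker.operators.docker import DockerOperator

class TemplatedDockerOperator(DockerOperator):
    # also render the volume list with Jinja (list items are
    # templated element by element)
    template_fields = (*DockerOperator.template_fields, "volumes")
```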

Issue looping over Airflow Variables

I am having a hard time looping over an Airflow Variable in my script. I have a requirement to list all files prefixed by a string in a bucket, and then loop through…
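A sketch of one way to structure this, assuming the Variable holds a JSON list of prefixes and the bucket lives on S3 (all names are placeholders). Note that calling Variable.get at the top level of a DAG file hits the metadata DB on every parse, so it is better done inside a task:

```python
from airflow.decorators import task
from airflow.models import Variable
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

@task
def list_prefixed_files():
    # hypothetical Variable "file_prefixes" storing e.g. ["raw/2023", "raw/2024"]
    prefixes = Variable.get("file_prefixes", deserialize_json=True)
    hook = S3Hook(aws_conn_id="aws_default")
    keys = []
    for prefix in prefixes:
        keys.extend(hook.list_keys(bucket_name="my-bucket", prefix=prefix) or [])
    return keys
```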