I have some DAGs that can't seem to locate Python modules. In the Airflow UI, I see many variations of this message: Broken DAG: [/home/airflow/source
I have a list of lists of the following form: [['X_API', 'Y_API', ...], ['Z_API', 'P_API', ...], [...], [...], ...]. Here, each API name corresponds to a Python
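A minimal sketch of how such a nested list might be turned into Airflow tasks, assuming each inner list should run as a sequential chain while the chains run in parallel (the list contents, DAG name, and callable below are hypothetical):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical nested list; each API name would map to some Python callable.
api_groups = [['X_API', 'Y_API'], ['Z_API', 'P_API']]

def call_api(api_name):
    print(f"calling {api_name}")

with DAG('api_fanout', start_date=datetime(2023, 1, 1),
         schedule_interval=None) as dag:
    for i, group in enumerate(api_groups):
        prev = None
        for api_name in group:
            task = PythonOperator(
                task_id=f'group{i}_call_{api_name}',  # group index keeps ids unique
                python_callable=call_api,
                op_args=[api_name],
            )
            # Chain tasks within a group; the groups themselves run in parallel.
            if prev is not None:
                prev >> task
            prev = task
```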
My scrapy project runs perfectly well with the 'scrapy crawl spider_1' command. How do I trigger it (or call the scrapy command) from an Airflow DAG? with DAG(<args>
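Since Scrapy is driven from its own CLI, one common approach is to shell out with a BashOperator. A minimal sketch, assuming the project lives at a hypothetical /opt/scrapy_project (the directory containing scrapy.cfg) on the worker:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG('scrapy_trigger', start_date=datetime(2023, 1, 1),
         schedule_interval='@daily', catchup=False) as dag:
    crawl = BashOperator(
        task_id='crawl_spider_1',
        # cd into the Scrapy project dir (hypothetical path) so scrapy.cfg is found
        bash_command='cd /opt/scrapy_project && scrapy crawl spider_1',
    )
```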
I just installed Airflow 2.3.0 using the command pip install "apache-airflow==2.3.0" --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-
Are there any best practices for deploying new DAGs to Airflow? I saw a couple of comments on the Google forum stating that the DAGs are saved
I want to provide direct access to the log file. However, the client will be accessing it via an external IP, so I would like to see if there's any way I can print
I am trying to run a simple Python script within a docker run command scheduled with Airflow. I have followed the instructions here: Airflow init. My .env file:
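Rather than wrapping docker run in a shell command, the Docker provider's DockerOperator can run the container directly. A minimal sketch, assuming a hypothetical image and script path, and that the scheduler/worker can reach the Docker socket:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG('docker_script', start_date=datetime(2023, 1, 1),
         schedule_interval='@daily', catchup=False) as dag:
    run_script = DockerOperator(
        task_id='run_script',
        image='my-image:latest',             # hypothetical image with the script baked in
        command='python /app/my_script.py',  # hypothetical path inside the container
        docker_url='unix://var/run/docker.sock',
        auto_remove=True,  # clean up the container after it exits
    )
```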
I'm not able to import new Airflow variables from a JSON file into my MWAA environment through Boto3 & aws_mwaa. The response code from aws_mwaa/cli is 400. However,
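For reference, the documented MWAA pattern is to request a CLI token with boto3 and POST the CLI command to the /aws_mwaa/cli endpoint; a 400 often means the command itself is invalid for that endpoint (for example, `variables import` expects a file path on the webserver, which doesn't exist there). A sketch, assuming a hypothetical environment name, that sets variables one at a time instead:

```python
import base64

import boto3
import requests

client = boto3.client('mwaa')
token = client.create_cli_token(Name='my-mwaa-env')  # hypothetical environment name

url = f"https://{token['WebServerHostname']}/aws_mwaa/cli"
headers = {
    'Authorization': f"Bearer {token['CliToken']}",
    'Content-Type': 'text/plain',
}

# Send one `variables set` command per key instead of `variables import <file>`.
resp = requests.post(url, headers=headers, data='variables set my_key my_value')
resp.raise_for_status()

body = resp.json()  # stdout/stderr come back base64-encoded
print(base64.b64decode(body['stdout']).decode())
print(base64.b64decode(body['stderr']).decode())
```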
I have a deployed DAG in which I'm using check_for_wildcard_key() to check if files for a particular day are present in an S3 location and then decide which branch
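For context, check_for_wildcard_key() comes from the Amazon provider's S3Hook and returns a boolean, which pairs naturally with a BranchPythonOperator. A minimal sketch with hypothetical bucket, key pattern, and task names:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator  # DummyOperator on Airflow < 2.3
from airflow.operators.python import BranchPythonOperator
from airflow.providers.amazon.aws.hooks.s3 import S3Hook

def choose_branch(ds, **kwargs):
    hook = S3Hook(aws_conn_id='aws_default')
    found = hook.check_for_wildcard_key(
        wildcard_key=f'incoming/{ds}/*.csv',  # hypothetical key pattern
        bucket_name='my-bucket',              # hypothetical bucket
    )
    # Return the task_id of the branch that should run.
    return 'process_files' if found else 'no_files_today'

with DAG('s3_branching', start_date=datetime(2023, 1, 1),
         schedule_interval='@daily', catchup=False) as dag:
    branch = BranchPythonOperator(task_id='branch_on_s3_files',
                                  python_callable=choose_branch)
    branch >> [EmptyOperator(task_id='process_files'),
               EmptyOperator(task_id='no_files_today')]
```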
I would like to send alerts via my custom callback function that I'm calling during DAG initialization, as shown below: dag = DAG('tutorial', default_args=default_args
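One way to wire a custom alert into the DAG definition is the on_failure_callback hook in default_args; Airflow passes the task context to the callback when a task fails. A sketch with a hypothetical alerting function:

```python
from datetime import datetime

from airflow import DAG

def alert_on_failure(context):
    # Hypothetical alert sink; swap in Slack, PagerDuty, email, etc.
    ti = context['task_instance']
    print(f"Task {ti.task_id} in DAG {ti.dag_id} failed on {context['ds']}")

default_args = {
    'owner': 'airflow',
    'on_failure_callback': alert_on_failure,  # applied to every task in the DAG
}

dag = DAG('tutorial', default_args=default_args,
          start_date=datetime(2023, 1, 1), schedule_interval='@daily')
```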
I am new to Airflow automation. I don't know if it is possible to do this with Apache Airflow (or Luigi, etc.), or whether I should just write a long bash file to do this. I
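This kind of multi-step shell pipeline is exactly what Airflow models well: each step becomes its own retryable task rather than one long script. A minimal sketch with hypothetical step names and placeholder commands:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG('shell_pipeline', start_date=datetime(2023, 1, 1),
         schedule_interval='@daily', catchup=False) as dag:
    download = BashOperator(task_id='download',
                            bash_command='echo "download step"')    # hypothetical command
    transform = BashOperator(task_id='transform',
                             bash_command='echo "transform step"')  # hypothetical command
    download >> transform  # transform only runs if download succeeds
```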
Setup: Airflow 2.0.1 with Kubernetes 1.18 and Python 3.8, Kubernetes client 18.17.x. Pod template file:
apiVersion: v1
kind: Pod
metadata:
  name: workerPod
spec:
I would like to hide a field in the Airflow UI: "Task Instance Details" -> section "Task Attributes" -> attribute "Env". I have some credentials stored in the
Requirement: run local Airflow using the official docker-compose with Airflow version 2.3.0. Issue: "You are running pip as root. Please use user to run pip." Airflow
I have successfully written an API in Python to read a Gmail message, extract the URL from the message, call the URL, and store a CSV file. However, when I am deploying this in
I have an Airflow HTTP sensor that calls a REST endpoint and checks for a specific value in the JSON structure returned by the API: sensor = HttpSensor(soft
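For reference, the JSON check is usually expressed through HttpSensor's response_check callable, which receives the HTTP response and returns a boolean. A sketch with a hypothetical connection id, endpoint, and JSON field:

```python
from airflow.providers.http.sensors.http import HttpSensor

sensor = HttpSensor(
    task_id='wait_for_api',
    http_conn_id='my_api',    # hypothetical connection id
    endpoint='v1/status',     # hypothetical endpoint
    # Poke until the response body contains the expected value.
    response_check=lambda response: response.json().get('state') == 'done',
    poke_interval=60,   # seconds between checks
    timeout=60 * 30,    # give up after 30 minutes
    soft_fail=True,     # mark the task skipped instead of failed on timeout
)
```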
I am using AWS's MWAA service (2.2.2) to run a variety of DAGs, most of which are implemented with standard PythonOperator types. I bundle the DAGs into an S3 b
I deployed the default Helm chart for Airflow 2. The Postgres pod is reporting an error: ERROR: relation "log" does not exist at character 13. This appears after
I am new to Airflow, and when I click Run with 'Ignore All Dependencies' in the Task Instance Context Menu, it leads to 'Only works
I see that one can trigger_dag with parameters/config key-value pairs using the Airflow command line. For Apache Airflow, how can I pass the parameters when manually
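For reference, the Airflow 2 CLI syntax is `airflow dags trigger <dag_id> --conf '<json>'`, and the values arrive in the task context as dag_run.conf. A sketch with a hypothetical DAG id and conf key:

```python
# Triggered with, e.g.:
#   airflow dags trigger conf_demo --conf '{"source": "s3://bucket/key"}'
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def read_conf(**context):
    # dag_run.conf is None when the DAG is run without any conf.
    conf = context['dag_run'].conf or {}
    print('source =', conf.get('source'))  # hypothetical key

with DAG('conf_demo', start_date=datetime(2023, 1, 1),
         schedule_interval=None) as dag:
    PythonOperator(task_id='read_conf', python_callable=read_conf)
```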