I have a sensor task that listens for files being created in S3. After one poke I may have 3 files; after another poke I might have another 5 files. I want to create a separate task for each new file.
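One way to get a task per file is Airflow's dynamic task mapping. A minimal sketch, assuming Airflow 2.3+ and a hypothetical bucket/prefix, with a plain task standing in for the sensor's poke result:

```python
from airflow.decorators import dag, task
from airflow.providers.amazon.aws.hooks.s3 import S3Hook
import pendulum

@dag(start_date=pendulum.datetime(2023, 1, 1), schedule_interval=None, catchup=False)
def s3_fan_out():
    @task
    def list_new_files() -> list:
        # stand-in for the sensor's poke result: whatever keys exist this run
        hook = S3Hook(aws_conn_id="aws_default")
        return hook.list_keys(bucket_name="my-bucket", prefix="incoming/") or []

    @task
    def process(key: str):
        print(f"processing s3://my-bucket/{key}")

    # one mapped task instance is created per file found in this run
    process.expand(key=list_new_files())

s3_fan_out()
```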
I'm currently working with an Airflow installation via MWAA, and I'm having an issue with a broken dependency, specifically: ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts: …
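On MWAA the usual way out of resolver conflicts is to pin everything against the official Airflow constraints file for your Airflow and Python versions. A sketch of a requirements.txt, assuming Airflow 2.2.2 on Python 3.7 and a provider package as the example dependency:

```
# requirements.txt -- a sketch; adjust the Airflow/Python versions in the
# constraint URL to match the environment
--constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.2.2/constraints-3.7.txt"
apache-airflow-providers-snowflake
```

With the constraint file in place, pip resolves the provider to the exact version that Airflow 2.2.2 was tested against, instead of grabbing a latest release that conflicts with preinstalled packages.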
I'm setting up an AWS MWAA instance and I have a problem importing custom plugins. My local project structure looks like this: airflow-project ├── …
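A common cause is the zip layout: MWAA extracts plugins.zip into the plugins folder, which Airflow puts on the Python path, so your packages must sit at the root of the archive (zip from inside plugins/, not from its parent). A sketch with hypothetical names:

```python
# dags/uses_plugin.py
# assumes plugins.zip has this at its root (hypothetical package):
#   common/__init__.py
#   common/helpers.py   (defines greet)
from common.helpers import greet

from airflow import DAG
from airflow.operators.python import PythonOperator
import pendulum

with DAG(
    dag_id="plugin_import_check",
    start_date=pendulum.datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="say_hello", python_callable=lambda: print(greet("MWAA")))
```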
Snowflake is not showing in the connections dropdown. I am using MWAA 2.0 and the providers are already in requirements.txt. MWAA uses Python 3.7; I don't know if that is part of the problem.
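The Snowflake connection type only appears in the dropdown once the provider has actually installed, and on MWAA 2.0.x (Python 3.7) an unpinned provider often fails to build. A sketch of a pinned requirements.txt, where the exact versions are an assumption to verify against the requirements-install logs in CloudWatch:

```
# requirements.txt for MWAA 2.0.x / Python 3.7 -- pins are illustrative
apache-airflow-providers-snowflake==1.3.0
snowflake-connector-python==2.4.1
snowflake-sqlalchemy==1.2.4
```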
I am working with Amazon Managed Workflows for Apache Airflow (MWAA). When I copy a new requirements.txt file to my S3 bucket, then use the AWS Console to specify the new version of the file, …
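Uploading the file alone changes nothing until the environment is updated to point at the new S3 object version. A sketch of doing that step with boto3 instead of the console (bucket, key, and environment name are hypothetical):

```python
import boto3

s3 = boto3.client("s3")
mwaa = boto3.client("mwaa", region_name="us-east-1")

# find the object version of the freshly uploaded requirements.txt
versions = s3.list_object_versions(Bucket="my-mwaa-bucket", Prefix="requirements.txt")
latest = next(v["VersionId"] for v in versions["Versions"] if v["IsLatest"])

# point the environment at that version; this kicks off the (slow) update
mwaa.update_environment(
    Name="my-mwaa-env",
    RequirementsS3Path="requirements.txt",
    RequirementsS3ObjectVersion=latest,
)
```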
I have a list of lists like the following: [['X_API', 'Y_API', ...], ['Z_API', 'P_API', ...], [...], [...], ...]. Here, each API name corresponds to a Python script…
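Assuming the goal is to run each inner list in parallel, one batch after another, the list can be walked at DAG-parse time; run_api below is a placeholder for however an API name maps to its script:

```python
from airflow import DAG
from airflow.models.baseoperator import cross_downstream
from airflow.operators.python import PythonOperator
import pendulum

api_batches = [["X_API", "Y_API"], ["Z_API", "P_API"]]

def run_api(name: str):
    print(f"running the script for {name}")

with DAG(
    dag_id="api_batches",
    start_date=pendulum.datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    prev_batch = None
    for batch in api_batches:
        current = [
            PythonOperator(task_id=name, python_callable=run_api, op_args=[name])
            for name in batch
        ]
        if prev_batch:
            # every task in the previous batch gates every task in this one
            cross_downstream(prev_batch, current)
        prev_batch = current
```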
I'm not able to import new Airflow variables from a JSON file into my MWAA environment through Boto3 and aws_mwaa. The response code from aws_mwaa/cli is 400. However, …
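One thing that returns a 400 here is "variables import", since it expects a file path on the webserver itself. A sketch of the usual workaround, looping over the JSON locally and issuing one "variables set" per entry (environment name and file are hypothetical):

```python
import base64
import json

import boto3
import requests

mwaa = boto3.client("mwaa", region_name="us-east-1")
token = mwaa.create_cli_token(Name="my-mwaa-env")
url = f"https://{token['WebServerHostname']}/aws_mwaa/cli"
headers = {"Authorization": f"Bearer {token['CliToken']}", "Content-Type": "text/plain"}

with open("variables.json") as f:
    variables = json.load(f)

for key, value in variables.items():
    # body is the Airflow CLI command without the leading 'airflow';
    # json.dumps keeps values with spaces as a single quoted argument
    resp = requests.post(url, data=f"variables set {key} {json.dumps(value)}", headers=headers)
    stdout = base64.b64decode(resp.json()["stdout"]).decode()
    print(key, resp.status_code, stdout)
```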
I am using AWS's MWAA service (2.2.2) to run a variety of DAGs, most of which are implemented with standard PythonOperator types. I bundle the DAGs into an S3 bucket…
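For the bundling step, a sketch of pushing a local dags/ folder into the prefix MWAA reads from (the bucket name is hypothetical; MWAA watches s3://&lt;bucket&gt;/dags):

```python
import pathlib

import boto3

s3 = boto3.client("s3")
for path in pathlib.Path("dags").rglob("*.py"):
    # mirror the local layout under the dags/ prefix of the environment bucket
    s3.upload_file(str(path), "my-mwaa-bucket", f"dags/{path.relative_to('dags').as_posix()}")
```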
I would like to use DBT in an MWAA Airflow environment. To achieve this, I need to install DBT in the managed environment and from there run the dbt commands via the BashOperator.
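A sketch of that pattern, assuming dbt-core is added to requirements.txt and the dbt project ships inside the dags folder so it lands on the workers; paths and names are hypothetical:

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
import pendulum

with DAG(
    dag_id="dbt_run",
    start_date=pendulum.datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # MWAA's home directory is /usr/local/airflow; the dbt project is assumed
    # to live at dags/dbt_project with its own profiles.yml alongside it
    BashOperator(
        task_id="dbt_run",
        bash_command=(
            "dbt run --project-dir /usr/local/airflow/dags/dbt_project "
            "--profiles-dir /usr/local/airflow/dags/dbt_project"
        ),
    )
```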