
Unable to override aws_default connection values from UI #18276

@vschettino

Description


Apache Airflow version

2.0.2

Operating System

CentOS 8

Versions of Apache Airflow Providers

apache-airflow-providers-amazon==2.2.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1

Deployment

MWAA

Deployment details

The problem happens both on a local docker container and MWAA.

What happened

I am trying to set a user for the aws_default connection so that I can run Fargate tasks and access AWS Parameter Store during DAG execution. According to the docs:

The default connection ID is aws_default. If the environment/machine where you are running Airflow has the file credentials in /home/.aws/, and the default connection has user and pass fields empty, it will take automatically the credentials from there.

What happens is that, no matter how I set those values, the default connection is always empty:

[2021-09-15 16:04:06,915] {{standard_task_runner.py:77}} INFO - Job 16: Subtask data-mwaa-lab
[2021-09-15 16:04:07,060] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,060] {{base_aws.py:368}} INFO - Airflow Connection: aws_conn_id=aws_default
[2021-09-15 16:04:07,111] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,111] {{base_aws.py:179}} INFO - No credentials retrieved from Connection
[2021-09-15 16:04:07,139] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,139] {{base_aws.py:87}} INFO - Creating session with aws_access_key_id=None region_name=us-east-2
[2021-09-15 16:04:07,179] {{logging_mixin.py:104}} INFO - [2021-09-15 16:04:07,179] {{base_aws.py:157}} INFO - role_arn is None
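
The behaviour can be seen directly from a task as well. A minimal sketch, assuming the AwsBaseHook API of apache-airflow-providers-amazon 2.2.0 (the function name is only illustrative), resolves credentials the same way the operators do and prints None for the access key even though Login and Password are set in the UI:

from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook

def check_aws_default_credentials():
    # Resolve credentials through the default connection, as the AWS operators do
    hook = AwsBaseHook(aws_conn_id="aws_default", client_type="sts")
    credentials = hook.get_credentials()
    # With this bug, access_key comes back as None despite the UI values
    print(credentials.access_key)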

What you expected to happen

I understand that when Login and Password are set in the UI (connection/edit/2), those values should be used instead of reaching for the /home/.aws/ file, which will be empty. When I create an identical connection called aws, the Fargate task is launched correctly (a sketch of that workaround is below). The problem here is that I need this to be the default connection so that Parameter Store values can be fetched to configure the DAG run.
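
For reference, a hedged sketch of that workaround, passing the identical connection saved under the id aws explicitly (the cluster and task definition values are placeholders):

from airflow.providers.amazon.aws.operators.ecs import ECSOperator

run_fargate_task = ECSOperator(
    task_id="run_fargate_task",
    aws_conn_id="aws",                     # explicit connection works; aws_default does not
    cluster="my-cluster",                  # placeholder
    task_definition="my-task-definition",  # placeholder
    launch_type="FARGATE",
    overrides={},
)

This launches the Fargate task, but it does not help with the Parameter Store lookups that rely on aws_default being usable.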

How to reproduce

I was able to reproduce the same issue using a Docker container on my machine and on an old Airflow 1.x instance. Simply setting valid AWS credentials via the UI in the Login/Password fields of the aws_default connection does not work; a reproduction sketch follows.
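
A minimal reproduction sketch, assuming direct access to the metadata database from a Python shell (the credential values are placeholders), is to set Login/Password on aws_default and then check what the hook resolves:

from airflow import settings
from airflow.models import Connection
from airflow.providers.amazon.aws.hooks.base_aws import AwsBaseHook

session = settings.Session()
conn = session.query(Connection).filter(Connection.conn_id == "aws_default").one()
conn.login = "AKIA...EXAMPLE"         # placeholder access key id
conn.password = "example-secret-key"  # placeholder secret access key
session.commit()

credentials = AwsBaseHook(aws_conn_id="aws_default", client_type="sts").get_credentials()
print(credentials.access_key)  # expected the key set above, but it prints None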

Anything else

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct
