
aws-mwaa-local-runner's Introduction

About aws-mwaa-local-runner

This repository provides a command line interface (CLI) utility that replicates an Amazon Managed Workflows for Apache Airflow (MWAA) environment locally.

Please note: MWAA/AWS/DAG/plugin issues should be raised through AWS Support or the Apache Airflow Slack #airflow-aws channel; issues filed here should focus on this local-runner repository.

About the CLI

The CLI builds a Docker container image locally that closely resembles an MWAA production image. This allows you to run a local Apache Airflow environment to develop and test DAGs, custom plugins, and dependencies before deploying them to MWAA.

What this repo contains

dags/
  example_lambda.py
  example_dag_with_taskflow_api.py    
  example_redshift_data_execute_sql.py
docker/
  config/
    airflow.cfg
    constraints.txt
    mwaa-base-providers-requirements.txt
    webserver_config.py
    .env.localrunner
  script/
    bootstrap.sh
    entrypoint.sh
    systemlibs.sh
    generate_key.sh
  docker-compose-local.yml
  docker-compose-resetdb.yml
  docker-compose-sequential.yml
  Dockerfile
plugins/
  README.md
requirements/  
  requirements.txt
.gitignore
CODE_OF_CONDUCT.md
CONTRIBUTING.md
LICENSE
mwaa-local-env
README.md
VERSION

Prerequisites

Get started

git clone https://github.com/aws/aws-mwaa-local-runner.git
cd aws-mwaa-local-runner

Step one: Building the Docker image

Build the Docker container image using the following command:

./mwaa-local-env build-image

Note: it takes several minutes to build the Docker image locally.

Step two: Running Apache Airflow

Local runner

Runs a local Apache Airflow environment whose configuration closely mirrors an MWAA environment.

./mwaa-local-env start

To stop the local environment, press Ctrl+C in the terminal and wait until the local-runner and postgres containers have stopped.

Step three: Accessing the Airflow UI

By default, the bootstrap.sh script creates a username and password for your local Airflow environment.

  • Username: admin
  • Password: test

Airflow UI

Step four: Add DAGs and supporting files

The following section describes where to add your DAG code and supporting files. We recommend creating a directory structure similar to your MWAA environment.

DAGs

  1. Add DAG code to the dags/ folder.
  2. To run the sample code in this repository, see the example_dag_with_taskflow_api.py file.
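For orientation, a minimal TaskFlow-style DAG of the kind you would drop into dags/ might look like the sketch below. The DAG and task names are illustrative, not from this repository, and the file only imports inside an environment with Apache Airflow installed, so treat it as a fragment rather than a standalone script:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule_interval=None, start_date=datetime(2022, 1, 1), catchup=False)
def example_local_runner_dag():
    """Illustrative two-task pipeline using the TaskFlow API."""

    @task
    def extract():
        # Stand-in for real extraction logic.
        return [1, 2, 3]

    @task
    def report(values):
        print(f"sum of extracted values: {sum(values)}")

    report(extract())


example_local_runner_dag()
```

After placing a file like this in dags/, it should appear in the Airflow UI at http://localhost:8080 once the scheduler next parses the folder.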

Requirements.txt

  1. Add Python dependencies to requirements/requirements.txt.
  2. To test a requirements.txt without running Apache Airflow, use the following script:
./mwaa-local-env test-requirements

Let's say you add aws-batch==0.6 to your requirements/requirements.txt file. You should see an output similar to:

Installing requirements.txt
Collecting aws-batch (from -r /usr/local/airflow/dags/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/5d/11/3aedc6e150d2df6f3d422d7107ac9eba5b50261cf57ab813bb00d8299a34/aws_batch-0.6.tar.gz
Collecting awscli (from aws-batch->-r /usr/local/airflow/dags/requirements.txt (line 1))
  Downloading https://files.pythonhosted.org/packages/07/4a/d054884c2ef4eb3c237e1f4007d3ece5c46e286e4258288f0116724af009/awscli-1.19.21-py2.py3-none-any.whl (3.6MB)
    100% |████████████████████████████████| 3.6MB 365kB/s 
...
...
...
Installing collected packages: botocore, docutils, pyasn1, rsa, awscli, aws-batch
  Running setup.py install for aws-batch ... done
Successfully installed aws-batch-0.6 awscli-1.19.21 botocore-1.20.21 docutils-0.15.2 pyasn1-0.4.8 rsa-4.7.2
  3. To package the necessary WHL files for your requirements.txt without running Apache Airflow, use the following script:
./mwaa-local-env package-requirements

For example usage, see Installing Python dependencies using PyPi.org Requirements File Format, Option two: Python wheels (.whl), in the MWAA documentation.

Custom plugins

  • There is a directory at the root of this repository called plugins.
  • In this directory, create a file for your new custom plugin.
  • Add any Python dependencies to requirements/requirements.txt.

Note: this step assumes you have a DAG that corresponds to the custom plugin. For example usage, see the MWAA Code Examples.
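As a sketch of what such a plugin file can contain (all names below are hypothetical, and the fragment imports only inside an Airflow installation):

```python
from airflow.hooks.base import BaseHook
from airflow.plugins_manager import AirflowPlugin


class HelloHook(BaseHook):
    """Illustrative hook exposed by the plugin."""

    def greet(self) -> str:
        return "hello from a custom plugin"


class HelloPlugin(AirflowPlugin):
    # The `name` attribute is how Airflow registers the plugin.
    name = "hello_plugin"
    hooks = [HelloHook]
```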

Startup script

  • There is a sample shell script startup.sh located in the startup_script directory at the root of this repository.
  • If you need to run additional setup (e.g., installing system libraries or setting environment variables), modify the startup.sh script.
  • To test a startup.sh without running Apache Airflow, use the following script:
./mwaa-local-env test-startup-script
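As a minimal sketch of what startup.sh might do (the variable name and log message below are made up for illustration; real startup scripts typically install system packages or export environment variables):

```shell
#!/bin/bash
# Hypothetical startup.sh sketch: export a variable and print a marker
# so the script's execution is visible in the component logs.
export MY_CUSTOM_SETTING="enabled"   # assumed variable name
echo "startup.sh finished: MY_CUSTOM_SETTING=${MY_CUSTOM_SETTING}"
```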

What's next?

FAQs

The following section contains common questions and answers you may encounter when using your Docker container image.

Can I test execution role permissions using this repository?

How do I add libraries to requirements.txt and test install?

  • A requirements.txt file is included in the /requirements folder of your local Docker container image. We recommend adding libraries to this file, and running locally.

What if a library is not available on PyPi.org?

Troubleshooting

The following section contains errors you may encounter when using the Docker container image in this repository.

My environment is not starting

  • If the process fails with the error "dag_stats_table already exists", you'll need to reset your database using the following command:
./mwaa-local-env reset-db
  • If you are moving from an older version of local-runner, you may need to run the reset-db command above or delete your ./db-data folder. Note, too, that newer Airflow versions ship newer provider packages, which may require updating your DAG code.

Fernet Key InvalidToken

A Fernet key is generated during image build (./mwaa-local-env build-image) and persists across all containers started from that image. This key is used to encrypt connection passwords in the Airflow DB. If you change and rebuild the image, you may get a new key that no longer matches the key used when the Airflow DB was initialized; in that case, reset the DB (./mwaa-local-env reset-db).
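The failure mode can be illustrated with a stdlib-only analogy. Fernet itself lives in the third-party cryptography package; the hmac sketch below is not MWAA code, it only shows why material protected under the build-time key stops validating once the key changes:

```python
import hashlib
import hmac


def sign(key: bytes, payload: bytes) -> bytes:
    # Stand-in for Fernet: a MAC computed with the image's build-time key.
    return hmac.new(key, payload, hashlib.sha256).digest()


old_key = b"key-from-first-build"
new_key = b"key-from-rebuild"
token = sign(old_key, b"connection-password")

# After a rebuild the key changes, so validation fails, just as Fernet
# raises InvalidToken when the DB was initialized under the old key.
print(hmac.compare_digest(sign(new_key, b"connection-password"), token))  # False
```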

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.

aws-mwaa-local-runner's People

Contributors

aliavni, amazon-auto, chadac, dujjwalr-aws, john-jac, malwini, mayushko26, noharafat, o-nikolas, rafidka, subashcanapathy, umaragu, vandonr-amz, vishalvijay18


aws-mwaa-local-runner's Issues

Specify region, aws credentials in airflow localrunner for Amazon MWAA

I have a DAG which uses boto3. After installing, when I placed the DAG under /dag I get the following error. How do I specify the region and access key? This has not been mentioned in the documentation.

ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/airflow/dags/ny_taxi_brew_local.py
local-runner_1 | Traceback (most recent call last):
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 305, in _load_modules_from_file
local-runner_1 | loader.exec_module(new_module)
local-runner_1 | File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1 | File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1 | File "/usr/local/airflow/dags/ny_taxi_brew_local.py", line 61, in <module>
local-runner_1 | glue_databrew_client=boto3.client('databrew',region_name=region_name)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/boto3/__init__.py", line 93, in client
local-runner_1 | return _get_default_session().client(*args, **kwargs)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/boto3/session.py", line 263, in client
local-runner_1 | aws_session_token=aws_session_token, config=config)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/session.py", line 851, in create_client
local-runner_1 | client_config=config, api_version=api_version)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 88, in create_client
local-runner_1 | verify, credentials, scoped_config, client_config, endpoint_bridge)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 357, in _get_client_args
local-runner_1 | verify, credentials, scoped_config, client_config, endpoint_bridge)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/args.py", line 73, in get_client_args
local-runner_1 | endpoint_url, is_secure, scoped_config)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/args.py", line 154, in compute_client_args
local-runner_1 | s3_config=s3_config,
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/args.py", line 220, in _compute_endpoint_config
local-runner_1 | return self._resolve_endpoint(**resolve_endpoint_kwargs)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/args.py", line 303, in _resolve_endpoint
local-runner_1 | service_name, region_name, endpoint_url, is_secure)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 431, in resolve
local-runner_1 | service_name, region_name)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/regions.py", line 134, in construct_endpoint
local-runner_1 | partition, service_name, region_name)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/botocore/regions.py", line 148, in _endpoint_for_partition
local-runner_1 | raise NoRegionError()
local-runner_1 | botocore.exceptions.NoRegionError: You must specify a region.
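A common fix is to make sure the container environment defines a default region, for example through docker/config/.env.localrunner or your shell, since boto3 falls back to the AWS_DEFAULT_REGION environment variable when no region is passed explicitly. A stdlib-only sketch of the idea (the region value below is only an example):

```python
import os

# boto3 resolves the region from several sources; when none is configured,
# client creation raises NoRegionError as in the traceback above. Setting
# AWS_DEFAULT_REGION (here to an example value) is the simplest workaround.
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"

# Equivalent per-client fix (requires boto3, so shown as a comment only):
# boto3.client("databrew", region_name=os.environ["AWS_DEFAULT_REGION"])
print(os.environ["AWS_DEFAULT_REGION"])  # us-east-1
```

Credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) can be passed into the container the same way.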

UI Login doesn't work

Hello,

I followed all the steps in the readme, and when I log in with the credentials 'admin' / 'test' nothing happens; I try to reach the homepage but keep being returned to the login form.

local-runner_1  | 172.18.0.1 - - [26/May/2021:10:35:50 +0000] "POST /login/?next=http%3A%2F%2F127.0.0.1%3A8080%2Fhome HTTP/1.1" 302 209 "http://127.0.0.1:8080/login/?next=http%3A%2F%2F127.0.0.1%3A8080%2Fhome" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36"
local-runner_1  | 172.18.0.1 - - [26/May/2021:10:35:50 +0000] "GET / HTTP/1.1" 302 217 "http://127.0.0.1:8080/login/?next=http%3A%2F%2F127.0.0.1%3A8080%2Fhome" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36"
local-runner_1  | 172.18.0.1 - - [26/May/2021:10:35:50 +0000] "GET /home HTTP/1.1" 302 305 "http://127.0.0.1:8080/login/?next=http%3A%2F%2F127.0.0.1%3A8080%2Fhome" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36"
local-runner_1  | 172.18.0.1 - - [26/May/2021:10:35:51 +0000] "GET /login/?next=http%3A%2F%2F127.0.0.1%3A8080%2Fhome HTTP/1.1" 200 17126 "http://127.0.0.1:8080/login/?next=http%3A%2F%2F127.0.0.1%3A8080%2Fhome" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36"

cdn.amazonlinux SSL certs needed

I have a similar issue to #7, though I think it is related to SSL verification rather than an incorrect link. Can you provide some pointers on how to add the SSL certs required to validate the cdn.amazonlinux.com links?

OS: Ubuntu 20.04 on WSL 1, Docker for Windows 20.10.2
Release: https://github.com/aws/aws-mwaa-local-runner/releases/tag/v2.0.2

=> ERROR [ 6/13] RUN chmod u+x /systemlibs.sh && /systemlibs.sh                                                                             3.3s
------
 > [ 6/13] RUN chmod u+x /systemlibs.sh && /systemlibs.sh:
#10 1.192 Loaded plugins: ovl, priorities
#10 3.125 https://cdn.amazonlinux.com/2/core/2.0/x86_64/0c4b5094bba8d46b07c60e3d85cd8baac5f75d07af6a33086b6d0cd9eb2e13f1/repodata/repomd.xml?instance_id=URLError&region=unknown: [Errno 14] curl#60 - "SSL certificate problem: unable to get local issuer certificate"

MWAA variables cannot be accessed in CLI if set in UI and vice-versa

It appears that when the FERNET_KEY is set in entrypoint.sh, it is not persisted, so the Airflow CLI cannot access variables that were set for MWAA in the web UI. The FERNET_KEY should be persisted to airflow.cfg so that both the CLI and the UI use the same key across sessions.

Configuring the AWS role for the airflow workers botocore.exceptions.NoCredentialsError: Unable to locate credentials

What is the correct way to configure the AWS credentials for the workers?

[2021-08-23 23:18:53,661] {{glue.py:112}} ERROR - Failed to run aws glue job, error: Unable to locate credentials
[2021-08-23 23:18:53,661] {{taskinstance.py:1482}} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1138, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1311, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1341, in _execute_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/operators/glue.py", line 121, in execute
    glue_job_run = glue_job.initialize_job(self.script_args)
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/glue.py", line 108, in initialize_job
    job_name = self.get_or_create_glue_job()
  File "/usr/local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/glue.py", line 166, in get_or_create_glue_job
    get_job_response = glue_client.get_job(JobName=self.job_name)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 663, in _make_api_call
    operation_model, request_dict, request_context)
  File "/usr/local/lib/python3.7/site-packages/botocore/client.py", line 682, in _make_request
    return self._endpoint.make_request(operation_model, request_dict)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 102, in make_request
    return self._send_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 132, in _send_request
    request = self.create_request(request_dict, operation_model)
  File "/usr/local/lib/python3.7/site-packages/botocore/endpoint.py", line 116, in create_request
    operation_name=operation_model.name)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 90, in handler
    return self.sign(operation_name, request)
  File "/usr/local/lib/python3.7/site-packages/botocore/signers.py", line 162, in sign
    auth.add_auth(request)
  File "/usr/local/lib/python3.7/site-packages/botocore/auth.py", line 373, in add_auth
    raise NoCredentialsError()
botocore.exceptions.NoCredentialsError: Unable to locate credentials

MWAA installing Python packages

I see that this local runner allows setting environment variables such as $PIP_OPTION and $PYTHON_DEPS, which are used during build-image.

Does the actual MWAA environment support using these variables?

Installing non-pip dependencies

I am running an Airflow task that requires git to be installed. I'm using this image to test before running in MWAA.

I am receiving the error below indicating that git is not installed.

[2021-08-04 02:51:37,820] {{bash.py:158}} INFO - Running command: git --version
[2021-08-04 02:51:37,881] {{bash.py:169}} INFO - Output:
[2021-08-04 02:51:37,902] {{bash.py:173}} INFO - bash: git: command not found
[2021-08-04 02:51:37,902] {{bash.py:177}} INFO - Command exited with return code 127

I see bootstrap.sh in this repo, and adding yum install git -y solves the issue; however, I cannot find a way of providing a bootstrap script in the production image.

Am I overlooking this or is this a feature request?

Thanks!

"mwaa-local-env test-requirements" fails with flask-appbuilder dependency issues

To reproduce:

  1. clone this repository
  2. cd into the repo directory and run ./mwaa-local-env test-requirements

My system passes the prereq test of "./mwaa-local-env validate-prereqs"

Error Details:

#11 34.52 INFO: pip is looking at multiple versions of apache-airflow[celery,crypto,statsd] to determine which version is compatible with other requirements. This could take a while.
#11 34.52 ERROR: Cannot install apache-airflow because these package versions have conflicting dependencies.
#11 34.52
#11 34.52 The conflict is caused by:
#11 34.52 flask-appbuilder 3.4.1 depends on prison<1.0.0 and >=0.2.1
#11 34.52 flask-appbuilder 3.4.0 depends on prison<1.0.0 and >=0.2.1
#11 34.52 flask-appbuilder 3.3.4 depends on prison<1.0.0 and >=0.2.1
#11 34.52 flask-appbuilder 3.3.3 depends on prison<1.0.0 and >=0.2.1
#11 34.52 flask-appbuilder 3.3.2 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.3.1 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.3.0 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.2.3 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.2.2 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.2.1 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.2.0 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 flask-appbuilder 3.1.1 depends on Flask-OpenID<2 and >=1.2.5
#11 34.52 The user requested (constraint) prison==0.1.3
#11 34.52 The user requested (constraint) flask-openid==1.2.5

`./mwaa-local-env build-image` Creates Image Using Airflow 2.2.2

alexlordthorsen@Alexs-MBP ~/git/data/data-platform/airflow_local :) % ./mwaa-local-env build-image
[+] Building 188.9s (19/19) FINISHED
 => [internal] load build definition from Dockerfile  0.0s
 => => transferring dockerfile: 1.10kB  0.0s
 => [internal] load .dockerignore  0.0s
 => => transferring context: 2B  0.0s
 => [internal] load metadata for docker.io/library/amazonlinux:latest  1.6s
 => [auth] library/amazonlinux:pull token for registry-1.docker.io  0.0s
 => [ 1/13] FROM docker.io/library/amazonlinux@sha256:05c170879b6dec01ee51dd380d4d63cfb9ba59e738a03531c7ab5923515af3b4  5.9s
 => => resolve docker.io/library/amazonlinux@sha256:05c170879b6dec01ee51dd380d4d63cfb9ba59e738a03531c7ab5923515af3b4  0.0s
 => => sha256:d99c40d6547efd6328851315f5bcf7e6a8e0a96c200583a445fc0cdcd2146b81 1.48kB / 1.48kB  0.0s
 => => sha256:5263c4cb36ce7acd05658a221ec502b376a281d7a6075ad09beb23ac02a7668c 61.98MB / 61.98MB  3.0s
 => => sha256:05c170879b6dec01ee51dd380d4d63cfb9ba59e738a03531c7ab5923515af3b4 547B / 547B  0.0s
 => => sha256:05295e83275444cae0e55601f6a545b548fd3e03e8ef9a4ab9c38a52071519b8 529B / 529B  0.0s
 => => extracting sha256:5263c4cb36ce7acd05658a221ec502b376a281d7a6075ad09beb23ac02a7668c  2.7s
 => [internal] load build context  0.0s
 => => transferring context: 55.72kB  0.0s
 => [ 2/13] COPY script/bootstrap.sh /bootstrap.sh  0.2s
 => [ 3/13] COPY script/systemlibs.sh /systemlibs.sh  0.0s
 => [ 4/13] COPY config/constraints.txt /constraints.txt  0.0s
 => [ 5/13] COPY config/requirements.txt /requirements.txt  0.0s
 => [ 6/13] RUN chmod u+x /systemlibs.sh && /systemlibs.sh  72.7s
 => [ 7/13] RUN chmod u+x /bootstrap.sh && /bootstrap.sh  104.0s
 => [ 8/13] COPY script/entrypoint.sh /entrypoint.sh  0.0s
 => [ 9/13] COPY config/airflow.cfg /usr/local/airflow/airflow.cfg  0.0s
 => [10/13] COPY config/webserver_config.py /usr/local/airflow/webserver_config.py  0.0s
 => [11/13] RUN chown -R airflow: /usr/local/airflow  0.2s
 => [12/13] RUN chmod +x /entrypoint.sh  0.2s
 => [13/13] WORKDIR /usr/local/airflow  0.0s
 => exporting to image  3.9s
 => => exporting layers  3.9s
 => => writing image sha256:f8a2ac5360b91d03bbf21bb2045d124f89b248b0af14f679bb1fa8d2c92f929e  0.0s
 => => naming to docker.io/amazon/mwaa-local:2.0.2  0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
alexlordthorsen@Alexs-MBP ~ :) % docker exec -it 257fa96c683c /bin/bash
[airflow@257fa96c683c ~]$ airflow info

Apache Airflow
version                | 2.2.2

This leads to a large set of deprecation warnings:

local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The processor_poll_interval option in [scheduler] has been renamed to scheduler_idle_sleep_time - the old setting has been used, but please update your config.
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The dag_concurrency option in [core] has been renamed to max_active_tasks_per_dag - the old setting has been used, but please update your config.
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The processor_poll_interval option in [scheduler] has been renamed to scheduler_idle_sleep_time - the old setting has been used, but please update your config.
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The processor_poll_interval option in [scheduler] has been renamed to scheduler_idle_sleep_time - the old setting has been used, but please update your config.
local-runner_1  | [2021-11-25 01:28:56,017] {{providers_manager.py:288}} WARNING - The package apache-airflow-providers-celery is not compatible with this version of Airflow. The package has version 1.0.1 but the minimum supported version of the package is 2.1.0
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/providers_manager.py:523 DeprecationWarning: The provider apache-airflow-providers-ftp uses `hook-class-names` property in provider-info and has no `connection-types` one. The 'hook-class-names' property has been deprecated in favour of 'connection-types' in Airflow 2.2. Use **both** in case you
want to have backwards compatibility with Airflow < 2.2
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/providers_manager.py:523 DeprecationWarning: The provider apache-airflow-providers-http uses `hook-class-names` property in provider-info and has no `connection-types` one. The 'hook-class-names' property has been deprecated in favour of 'connection-types' in Airflow 2.2. Use **both** in case you
 want to have backwards compatibility with Airflow < 2.2
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/providers_manager.py:523 DeprecationWarning: The provider apache-airflow-providers-imap uses `hook-class-names` property in provider-info and has no `connection-types` one. The 'hook-class-names' property has been deprecated in favour of 'connection-types' in Airflow 2.2. Use **both** in case you
 want to have backwards compatibility with Airflow < 2.2
local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/providers_manager.py:523 DeprecationWarning: The provider apache-airflow-providers-sqlite uses `hook-class-names` property in provider-info and has no `connection-types` one. The 'hook-class-names' property has been deprecated in favour of 'connection-types' in Airflow 2.2. Use **both** in case you want to have backwards compatibility with Airflow < 2.2

(This list is not exhaustive.)

And a top-level exception in my Airflow instance:

(Screenshot: "Screen Shot 2021-11-24 at 2.05.43 PM" showing the error in the Airflow UI)

I'm actually fairly stumped about why this is happening, particularly since the version appears to be pinned in two locations here.
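For reference, these warnings map to straightforward option renames in docker/config/airflow.cfg. A sketch of the updated names (the values shown are the stock Airflow defaults, not recommendations):

```ini
[core]
# formerly: dag_concurrency
max_active_tasks_per_dag = 16

[scheduler]
# formerly: processor_poll_interval
scheduler_idle_sleep_time = 1
```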

error in Flask-OpenID setup command: use_2to3 is invalid

I get an error while building the docker image at Step 15/25 : RUN chmod u+x /bootstrap.sh && /bootstrap.sh:

First:

Collecting Flask-OpenID<2,>=1.2.5
  Downloading Flask-OpenID-1.2.5.tar.gz (43 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'error'
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-yi3pagr7/flask-openid_32316dc8542641d5a6835996d16b863b/setup.py'"'"'; __file__='"'"'/tmp/pip-install-yi3pagr7/flask-openid_32316dc8542641d5a6835996d16b863b/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-pdcws92i
       cwd: /tmp/pip-install-yi3pagr7/flask-openid_32316dc8542641d5a6835996d16b863b/
  Complete output (1 lines):
  error in Flask-OpenID setup command: use_2to3 is invalid.
  ----------------------------------------
WARNING: Discarding https://files.pythonhosted.org/packages/d1/a2/9d1fba3287a65f81b9d1c09c4f7cb16f8ea4988b1bc97ffea0d60983338f/Flask-OpenID-1.2.5.tar.gz#sha256=5a8ffe1c8c0ad1cc1f5030e1223ea27f8861ee0215a2a58a528cc61379e5ccab (from https://pypi.org/simple/flask-openid/). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.

And then:

INFO: pip is looking at multiple versions of apache-airflow[celery,crypto,statsd] to determine which version is compatible with other requirements. This could take a while.
ERROR: Cannot install apache-airflow because these package versions have conflicting dependencies.
ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/user_guide/#fixing-conflicting-dependencies

The conflict is caused by:
    flask-appbuilder 3.4.1 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.4.0 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.3.4 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.3.3 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.3.2 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.3.1 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.3.0 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.2.3 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.2.2 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.2.1 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.2.0 depends on Flask-OpenID<2 and >=1.2.5
    flask-appbuilder 3.1.1 depends on Flask-OpenID<2 and >=1.2.5
    The user requested (constraint) flask-openid==1.2.5

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict

The command '/bin/sh -c chmod u+x /bootstrap.sh && /bootstrap.sh' returned a non-zero code: 1

It's really weird since I was able to build the image without any issues yesterday. Today I removed it and tried to build it again and I keep getting the error above. It happens for both releases 2.0.2 and 1.10.12 - both raise the same error at the same point.


Hardware:

Architecture:                    x86_64
CPU op-mode(s):                  32-bit, 64-bit
Byte Order:                      Little Endian
Address sizes:                   48 bits physical, 48 bits virtual
CPU(s):                          16
On-line CPU(s) list:             0-15
Thread(s) per core:              2
Core(s) per socket:              8
Socket(s):                       1
NUMA node(s):                    1
Vendor ID:                       AuthenticAMD
CPU family:                      23
Model:                           96
Model name:                      AMD Ryzen 7 PRO 4750U with Radeon Graphics
Stepping:                        1
Frequency boost:                 enabled
CPU MHz:                         1700.000
CPU max MHz:                     1700,0000
CPU min MHz:                     1400,0000
BogoMIPS:                        3393.72
Virtualization:                  AMD-V
L1d cache:                       256 KiB
L1i cache:                       256 KiB
L2 cache:                        4 MiB
L3 cache:                        8 MiB
NUMA node0 CPU(s):               0-15

OS:

DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=21.10
DISTRIB_CODENAME=impish
DISTRIB_DESCRIPTION="Ubuntu 21.10"
PRETTY_NAME="Ubuntu 21.10"
NAME="Ubuntu"
VERSION_ID="21.10"
VERSION="21.10 (Impish Indri)"
VERSION_CODENAME=impish
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=impish
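A workaround that has helped elsewhere for the use_2to3 failure (an assumption on my part, not an official fix): setuptools 58 removed use_2to3 support, and Flask-OpenID 1.2.5 still declares it, so pinning setuptools below 58 before the dependency install lets the sdist build again. Roughly:

```dockerfile
# docker/Dockerfile (sketch): pin setuptools before bootstrap.sh runs pip
RUN pip3 install --no-cache-dir "setuptools<58.0.0"
```

This would explain why the build worked one day and failed the next with no local change: an unpinned setuptools upgrade on PyPI.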

unable to login

I followed the instructions, but when it came to logging in I used admin and test and was unable to log in. Any suggestions?
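If the bootstrap-created account is missing or broken, one thing to try is recreating it inside the running container. A sketch using the default compose container name (check yours with `docker ps`; the email value here is a placeholder):

```shell
docker exec -it docker_local-runner_1 airflow users create \
    --username admin --password test \
    --firstname local --lastname runner \
    --role Admin --email admin@example.com
```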

Airflow Database Fails to be Created v2.0.2

After running ./mwaa-local-env build-image, I'm getting an error on start up where it can't find a database in postgres called "airflow"

Build Output:
[+] Building 467.8s (18/18) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 1.13kB 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load metadata for public.ecr.aws/amazonlinux/amazonlinux:latest 1.3s
=> [ 1/13] FROM public.ecr.aws/amazonlinux/amazonlinux@sha256:69a7e4782be3dc25f610582d6ad7a88d1db7db08f27248153efe726ea6 4.5s
=> => resolve public.ecr.aws/amazonlinux/amazonlinux@sha256:69a7e4782be3dc25f610582d6ad7a88d1db7db08f27248153efe726ea607 0.0s
=> => sha256:69a7e4782be3dc25f610582d6ad7a88d1db7db08f27248153efe726ea6077bbf 770B / 770B 0.0s
=> => sha256:cc86ae1a38753902f280ccd88a727370c6b3b258c220a9cea97098bfab559e47 529B / 529B 0.0s
=> => sha256:72203fadaf3a1e1836f88dafe18ba7413381e5d623aa0167689e204d00e8be4c 1.49kB / 1.49kB 0.0s
=> => sha256:74f9a6be36f3bc3bf6041c40376d548e3a8b720a0455674b19e9174a9e567078 63.69MB / 63.69MB 2.6s
=> => extracting sha256:74f9a6be36f3bc3bf6041c40376d548e3a8b720a0455674b19e9174a9e567078 1.7s
=> [internal] load build context 0.0s
=> => transferring context: 58.52kB 0.0s
=> [ 2/13] COPY script/bootstrap.sh /bootstrap.sh 0.1s
=> [ 3/13] COPY script/systemlibs.sh /systemlibs.sh 0.0s
=> [ 4/13] COPY config/constraints.txt /constraints.txt 0.0s
=> [ 5/13] COPY config/requirements.txt /requirements.txt 0.0s
=> [ 6/13] RUN chmod u+x /systemlibs.sh && /systemlibs.sh 36.2s
=> [ 7/13] RUN chmod u+x /bootstrap.sh && /bootstrap.sh 420.9s
=> [ 8/13] COPY script/entrypoint.sh /entrypoint.sh 0.0s
=> [ 9/13] COPY config/airflow.cfg /usr/local/airflow/airflow.cfg 0.0s
=> [10/13] COPY config/webserver_config.py /usr/local/airflow/webserver_config.py 0.0s
=> [11/13] RUN chown -R airflow: /usr/local/airflow 0.1s
=> [12/13] RUN chmod +x /entrypoint.sh 0.2s
=> [13/13] WORKDIR /usr/local/airflow 0.0s
=> exporting to image 4.4s
=> => exporting layers 4.4s
=> => writing image sha256:106708a74262919fcffda619d0ad72fbd2133b162980f496a4ec47be53d8a8ca 0.0s
=> => naming to docker.io/amazon/mwaa-local:2.0.2

Run Output:
aws-mwaa-local-runner git:(main) ✗ ./mwaa-local-env start
[+] Running 10/10
⠿ postgres Pulled 4.5s
⠿ 9b3977197b4f Pull complete 0.8s
⠿ 995a68b04f2b Pull complete 0.8s
⠿ e8a8bdf2ee5e Pull complete 0.9s
⠿ 1ebc4d347b54 Pull complete 2.8s
⠿ 6b3dbdc5cac4 Pull complete 2.9s
⠿ b1466ea51568 Pull complete 2.9s
⠿ 9f0e4a4c8440 Pull complete 2.9s
⠿ a8ca5f3c2a28 Pull complete 3.0s
⠿ d29e603cee34 Pull complete 3.0s
[+] Running 3/2
⠿ Network docker_default Created 0.0s
⠿ Container docker_postgres_1 Created 0.1s
⠿ Container docker_local-runner_1 Created 0.0s
Attaching to local-runner_1, postgres_1
postgres_1 |
postgres_1 | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgres_1 |
postgres_1 | 2021-12-07 20:07:18.417 UTC [1] LOG: listening on IPv4 address "0.0.0.0", port 5432
postgres_1 | 2021-12-07 20:07:18.417 UTC [1] LOG: listening on IPv6 address "::", port 5432
postgres_1 | 2021-12-07 20:07:18.431 UTC [1] LOG: listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1 | 2021-12-07 20:07:18.496 UTC [23] LOG: database system was shut down at 2021-12-07 19:45:43 UTC
postgres_1 | 2021-12-07 20:07:18.522 UTC [1] LOG: database system is ready to accept connections
postgres_1 | 2021-12-07 20:07:18.719 UTC [30] LOG: incomplete startup packet
local-runner_1 | Installing requirements.txt
local-runner_1 | DB: postgresql+psycopg2://airflow:***@postgres:5432/airflow
local-runner_1 | [2021-12-07 20:07:20,906] {{db.py:684}} INFO - Creating tables
postgres_1 | 2021-12-07 20:07:20.913 UTC [31] FATAL: database "airflow" does not exist
postgres_1 | 2021-12-07 20:07:20.935 UTC [32] FATAL: database "airflow" does not exist
postgres_1 | 2021-12-07 20:07:20.942 UTC [33] FATAL: database "airflow" does not exist
local-runner_1 | Traceback (most recent call last):
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 2336, in _wrap_pool_connect
local-runner_1 | return fn()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 304, in unique_connection
local-runner_1 | return _ConnectionFairy._checkout(self)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 778, in _checkout
local-runner_1 | fairy = _ConnectionRecord.checkout(pool)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 495, in checkout
local-runner_1 | rec = pool._do_get()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/impl.py", line 241, in _do_get
local-runner_1 | return self._create_connection()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 309, in _create_connection
local-runner_1 | return _ConnectionRecord(self)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 440, in init
local-runner_1 | self.__connect(first_connect_check=True)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 661, in _connect
local-runner_1 | pool.logger.debug("Error on connect(): %s", e)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in exit
local-runner_1 | with_traceback=exc_tb,
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
local-runner_1 | raise exception
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 656, in __connect
local-runner_1 | connection = pool._invoke_creator(self)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/strategies.py", line 114, in connect
local-runner_1 | return dialect.connect(*cargs, **cparams)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/default.py", line 508, in connect
local-runner_1 | return self.dbapi.connect(*cargs, **cparams)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/psycopg2/init.py", line 122, in connect
local-runner_1 | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
local-runner_1 | psycopg2.OperationalError: FATAL: database "airflow" does not exist
local-runner_1 |
local-runner_1 |
local-runner_1 | The above exception was the direct cause of the following exception:
local-runner_1 |
local-runner_1 | Traceback (most recent call last):
local-runner_1 | File "/usr/local/bin/airflow", line 8, in
local-runner_1 | sys.exit(main())
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/main.py", line 40, in main
local-runner_1 | args.func(args)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
local-runner_1 | return func(*args, **kwargs)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb
local-runner_1 | db.initdb()
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", line 559, in initdb
local-runner_1 | upgradedb()
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/utils/db.py", line 694, in upgradedb
local-runner_1 | command.upgrade(config, 'heads')
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/alembic/command.py", line 294, in upgrade
local-runner_1 | script.run_env()
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/alembic/script/base.py", line 490, in run_env
local-runner_1 | util.load_python_file(self.dir, "env.py")
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/alembic/util/pyfiles.py", line 97, in load_python_file
local-runner_1 | module = load_module_py(module_id, path)
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/alembic/util/compat.py", line 182, in load_module_py
local-runner_1 | spec.loader.exec_module(module)
local-runner_1 | File "", line 728, in exec_module
local-runner_1 | File "", line 219, in _call_with_frames_removed
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/migrations/env.py", line 108, in
local-runner_1 | run_migrations_online()
local-runner_1 | File "/usr/local/lib/python3.7/site-packages/airflow/migrations/env.py", line 91, in run_migrations_online
local-runner_1 | with connectable.connect() as connection:
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 2263, in connect
local-runner_1 | return self._connection_cls(self, **kwargs)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 104, in init
local-runner_1 | else engine.raw_connection()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 2370, in raw_connection
local-runner_1 | self.pool.unique_connection, _connection
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 2340, in wrap_pool_connect
local-runner_1 | e, dialect, self
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 1584, in handle_dbapi_exception_noconnection
local-runner_1 | sqlalchemy_exception, with_traceback=exc_info[2], from_=e
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
local-runner_1 | raise exception
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/base.py", line 2336, in _wrap_pool_connect
local-runner_1 | return fn()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 304, in unique_connection
local-runner_1 | return _ConnectionFairy._checkout(self)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 778, in _checkout
local-runner_1 | fairy = _ConnectionRecord.checkout(pool)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 495, in checkout
local-runner_1 | rec = pool._do_get()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/impl.py", line 241, in _do_get
local-runner_1 | return self._create_connection()
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 309, in _create_connection
local-runner_1 | return _ConnectionRecord(self)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 440, in init
local-runner_1 | self.__connect(first_connect_check=True)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 661, in _connect
local-runner_1 | pool.logger.debug("Error on connect(): %s", e)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in exit
local-runner_1 | with_traceback=exc_tb,
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
local-runner_1 | raise exception
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/pool/base.py", line 656, in __connect
local-runner_1 | connection = pool._invoke_creator(self)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/strategies.py", line 114, in connect
local-runner_1 | return dialect.connect(*cargs, **cparams)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/engine/default.py", line 508, in connect
local-runner_1 | return self.dbapi.connect(*cargs, **cparams)
local-runner_1 | File "/usr/local/lib64/python3.7/site-packages/psycopg2/init.py", line 122, in connect
local-runner_1 | conn = _connect(dsn, connection_factory=connection_factory, **kwasync)
local-runner_1 | sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL: database "airflow" does not exist
local-runner_1 |
local-runner_1 | (Background on this error at: http://sqlalche.me/e/13/e3q8)
postgres_1 | 2021-12-07 20:07:21.383 UTC [34] FATAL: database "airflow" does not exist
local-runner_1 | [2021-12-07 20:07:21,384] {{cli_action_loggers.py:105}} WARNING - Failed to log action with (psycopg2.OperationalError) FATAL: database "airflow" does not exist

So far I've tried:

  • Clearing the Docker Cache and repulling the repo
  • The solution to issue #16
  • Manually adding the "airflow" db to postgres (I ran into issues logging into postgres when I tried to do this)

OS is Mac Big Sur with M1
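Two more things that may be worth trying (sketches based on the container names in the output above; not verified on M1): create the missing database directly in the postgres container, or drop the stale volume so postgres initializes from scratch instead of skipping initialization.

```shell
# Option 1: create the database the local runner is looking for
docker exec -it docker_postgres_1 psql -U airflow -d postgres \
    -c 'CREATE DATABASE airflow;'

# Option 2: remove the compose containers and their volumes, then start again
docker-compose -f docker/docker-compose-local.yml down --volumes
```

The "Database directory appears to contain a database; Skipping initialization" line in the postgres log suggests a leftover volume from an earlier run is the likely culprit.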

"snowflake" not showing up in conn_type Dropdown

Hi im using MWAA airflow 2.0.2 and have following (redacted) requirements.txt

apache-airflow-providers-snowflake==1.3.0
snowflake-connector-python==2.4.6
snowflake-sqlalchemy==1.2.4

but even after all deps are installed, the "Snowflake" conn_type is not showing in the dropdown under "Admin -> Connections".

Apple M1 issue

Hardware: MacBook Pro (13-inch, M1, 2020), Apple M1
OS: Big Sur 11.2.3

At step 1: ./mwaa-local-env build-image

Fails with error:

 ERROR [ 7/13] RUN chmod u+x /bootstrap.sh && /bootstrap.sh 
....
2dd0ca/pandas-1.1.0.tar.gz (5.2MB)
#12 28.13     Complete output from command python setup.py egg_info:
#12 28.13     Traceback (most recent call last):
#12 28.13       File "<string>", line 1, in <module>
#12 28.13       File "/tmp/pip-build-4qbofay4/pandas/setup.py", line 788, in <module>
#12 28.13         setup_package()
#12 28.13       File "/tmp/pip-build-4qbofay4/pandas/setup.py", line 758, in setup_package
#12 28.13         ext_modules=maybe_cythonize(extensions, compiler_directives=directives),
#12 28.13       File "/tmp/pip-build-4qbofay4/pandas/setup.py", line 515, in maybe_cythonize
#12 28.13         raise RuntimeError("Cannot cythonize without Cython installed.")
#12 28.13     RuntimeError: Cannot cythonize without Cython installed.
#12 28.13     
#12 28.13     ----------------------------------------
#12 28.18 Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-4qbofay4/pandas/

I flag this as an M1 issue as this error is not raised on my non M1 Macbook.

To attempt to resolve this I added cython to docker/config/requirements.txt, but reached the same error.
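One workaround worth trying on Apple Silicon (an assumption, not a verified fix): force the build to target x86_64 so pip resolves the same prebuilt wheels as on an Intel Mac, at the cost of QEMU emulation speed. The missing-Cython error is consistent with pip falling back to source builds because no aarch64 wheels exist for the pinned pandas version.

```shell
DOCKER_DEFAULT_PLATFORM=linux/amd64 ./mwaa-local-env build-image
```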

airflow command line support

It's not explicitly called out in the readme, but I would expect the airflow command line tool to be available as well. When attempting to run airflow commands on the container, I received the following provider error:

[airflow@6fabad976155 ~]$ airflow --version
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 5, in <module>
    from airflow.__main__ import main
  File "/usr/local/lib/python3.7/site-packages/airflow/__init__.py", line 34, in <module>
    from airflow import settings
  File "/usr/local/lib/python3.7/site-packages/airflow/settings.py", line 37, in <module>
    from airflow.configuration import AIRFLOW_HOME, WEBSERVER_CONFIG, conf  # NOQA F401
  File "/usr/local/lib/python3.7/site-packages/airflow/configuration.py", line 1098, in <module>
    conf = initialize_config()
  File "/usr/local/lib/python3.7/site-packages/airflow/configuration.py", line 860, in initialize_config
    conf.validate()
  File "/usr/local/lib/python3.7/site-packages/airflow/configuration.py", line 199, in validate
    self._validate_config_dependencies()
  File "/usr/local/lib/python3.7/site-packages/airflow/configuration.py", line 240, in _validate_config_dependencies
    f"error: sqlite C library version too old (< {min_sqlite_version}). "
airflow.exceptions.AirflowConfigException: error: sqlite C library version too old (< 3.15.0). See https://airflow.apache.org/docs/apache-airflow/2.0.2/howto/set-up-database.rst#setting-up-a-sqlite-database

It looks like a library reference needs to be updated somewhere, but I'm not sure if it's here or the underlying amazonlinux image. Or, maybe there's a way to avoid the apache-airflow-providers-sqlite dependency completely as this uses a postgres backend? Guidance or a fix would be appreciated, thanks!
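For context: the version Airflow checks here is the SQLite C library that the container's Python interpreter was linked against, not any pip-installable package, which is why requirements changes don't help. A quick way to see what the interpreter reports (plain Python, no Airflow needed):

```python
import sqlite3

# sqlite3.sqlite_version is the version of the underlying SQLite C library;
# this is what Airflow's startup check compares against 3.15.0.
major, minor, *_rest = (int(p) for p in sqlite3.sqlite_version.split("."))
status = "ok" if (major, minor) >= (3, 15) else "too old (< 3.15.0)"
print(sqlite3.sqlite_version, status)
```

On the amazonlinux base image this typically points at the system libsqlite3, so a fix would mean a newer library in the image rather than a Python dependency change.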

plugins folder issues

I am trying to use the plugins folder to deploy a custom sensor and can't get it to work there. As a workaround, if I copy the sensor folder/file into the dags folder I can at least import the custom sensor and use it.

The first issue I encountered was the aws-mwaa-local-running/plugins folder had root ownership so I couldn't put plugin folders/files in the folder. I used chown to change the ownership and was able to drop folders/files in.

Files I drop in compile successfully (a __pycache__ folder is created), but they do not show up on the Plugins page in the Airflow UI.
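For context (my understanding of Airflow 2.x behaviour, worth verifying): the Plugins page only lists modules that define an AirflowPlugin subclass, so a plain sensor module will never appear there even when it works; it is still importable from DAGs because the plugins folder is placed on sys.path. A sketch of the distinction (hypothetical file names):

```
plugins/
  my_sensor.py   # plain module: `from my_sensor import MyCustomSensor` works in a DAG
  my_plugin.py   # defines `class MyPlugin(AirflowPlugin)`; only this shows on the Plugins page
```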

Image Support for MWAA Local Runner v1.10.15

Requesting a release of MWAA Local Runner for Airflow v1.10.15.

The reason is so that as customers are migrating from v1.10.12 to v2.0.x they can leverage this "Bridge" version to migrate their DAGs from 1.10.x to v2.0.x using several of the built in Airflow Upgrade Checks available on v1.10.15.

running docker on local runner

We are testing a Docker DAG locally and have the DAG file below.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    'owner': 'airflow',
    'description': 'Use of the DockerOperator',
    'depends_on_past': False,
    'start_date': datetime(2021, 7, 28),
    'email_on_failure': True,
    'email_on_retry': False,
    'retries': 0,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG('bash_docker_dag', default_args=default_args,
          schedule_interval="5 * * * *", catchup=False, tags=['maf_opt_by_bin'])

t1 = BashOperator(
    task_id='print_current_dir',
    bash_command='echo $(pwd)',
    dag=dag
)

t3 = BashOperator(
    task_id='test-job',
    bash_command = '/usr/local/bin/docker run -v cmd "

When we trigger the DAG from the console it shows:

[2021-07-28 15:03:11,253] {{bash.py:158}} INFO - Running command: docker run -v "CMD"
[2021-07-28 15:03:11,261] {{bash.py:169}} INFO - Output:
[2021-07-28 15:03:11,262] {{bash.py:173}} INFO - bash: docker: command not found
[2021-07-28 15:03:11,262] {{bash.py:177}} INFO - Command exited with return code 127
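For context: the local-runner image does not ship a docker binary, so any bash_command that shells out to docker will fail with exit code 127. One approach people use (a sketch, untested against this repo) is to install the docker CLI into the image and share the host daemon's socket with the container:

```yaml
# docker/docker-compose-local.yml (sketch): mount the host Docker socket.
# The docker CLI itself still has to be added to the image, e.g. via
# docker/script/systemlibs.sh.
services:
  local-runner:
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
```

Note that MWAA itself will not offer a Docker socket, so DAGs relying on this will behave differently in production.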

Run with config not working from the UI

I am not able to change the default params by passing config in the UI.

Tested this locally on the mwaa-local-runner as well as on AWS cloud.

Minimal Code for Reproducing Error

from datetime import datetime, timedelta

from airflow.models.dag import DAG
from airflow.operators.python import PythonOperator


def dummy_run_spark(**context):
    import pprint

    from random import randint
    from time import sleep
    config = context["params"]
    pprint.pprint(type(config))
    pprint.pprint(context["task"].task_id)
    pprint.pprint(context)
    sleep(randint(1, 10))


default_params = {
    "name": "Jhon"
}

default_args = {
    "owner": "airflow",
    "depends_on_past": False,
    "start_date": datetime(year=2018, month=7, day=10),
    "email": ["[email protected]"],
    "email_on_failure": False,
    "email_on_retry": False,
    "retries": 1,
    "retry_delay": timedelta(seconds=10),
    "catchup": False,
    "params": default_params,
}

with DAG(
        'Testing',
        default_args=default_args,
        schedule_interval=None,
 
) as dag:

    t1 = PythonOperator(task_id='t1',
                                provide_context=True,
                                python_callable=dummy_run_spark,)

For the above dag if I pass params as
{ "name": "Jhon Doe" }
The logs still show the name as Jhon

This however works on the official airflow docker image (Version 2.0.2). On the official image I can change the default params from UI and the same is reflected on the logs as well

I am checking the params passed by logging a few things in the function def dummy_run_spark(**context).
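One thing worth checking (a guess, not a confirmed root cause): in Airflow 2.0.x the JSON entered under "Trigger DAG w/ config" is delivered as dag_run.conf and is not automatically merged into params; native params overriding arrived in later releases. Reading the run config explicitly, as a sketch:

```python
def dummy_run_spark(**context):
    # Merge the trigger-time run config (dag_run.conf) over the DAG-level
    # default params, since Airflow 2.0.x does not do this merge for us.
    dag_run = context.get("dag_run")
    run_conf = dag_run.conf if (dag_run is not None and dag_run.conf) else {}
    merged = {**context["params"], **run_conf}
    print(merged["name"])
    return merged["name"]
```

If this behaves differently on the official 2.0.2 image, a config difference (e.g. dag_run_conf_overrides_params in airflow.cfg) would be the next thing I would compare.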

ERROR - No module named 'celery_config' when triggering a separate task in Airflow on Docker

Deployment details
Running deployment on Docker for Local MWAA
aws-mwaa-local-runner:
https://github.com/aws/aws-mwaa-local-runner

What happened

When triggering a single task instance through the UI (on demand), as soon as I click Run I get an error "Ooops!
Something bad has happened."

Here is the stack trace:

local-runner_1  | /usr/local/lib/python3.7/site-packages/airflow/configuration.py:361 DeprecationWarning: The default_queue option in [celery] has been moved to the default_queue option in [operators] - the old setting has been used, but please update your config.
local-runner_1  | [2021-10-22 05:21:40,118] {{configuration.py:484}} ERROR - No module named 'celery_config'
local-runner_1  | [2021-10-22 05:21:40,124] {{app.py:1892}} ERROR - Exception on /run [POST]
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/configuration.py", line 482, in getimport
local-runner_1  |     return import_string(full_qualified_path)
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/utils/module_loading.py", line 32, in import_string
local-runner_1  |     module = import_module(module_path)
local-runner_1  |   File "/usr/lib64/python3.7/importlib/__init__.py", line 127, in import_module
local-runner_1  |     return _bootstrap._gcd_import(name[level:], package, level)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
local-runner_1  |   File "<frozen importlib._bootstrap>", line 983, in _find_and_load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
local-runner_1  | ModuleNotFoundError: No module named 'celery_config'
local-runner_1  | 
local-runner_1  | During handling of the above exception, another exception occurred:
local-runner_1  | 
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 2447, in wsgi_app
local-runner_1  |     response = self.full_dispatch_request()
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1952, in full_dispatch_request
local-runner_1  |     rv = self.handle_user_exception(e)
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1821, in handle_user_exception
local-runner_1  |     reraise(exc_type, exc_value, tb)
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/flask/_compat.py", line 39, in reraise
local-runner_1  |     raise value
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1950, in full_dispatch_request
local-runner_1  |     rv = self.dispatch_request()
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/flask/app.py", line 1936, in dispatch_request
local-runner_1  |     return self.view_functions[rule.endpoint](**req.view_args)
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/www/auth.py", line 51, in decorated
local-runner_1  |     return func(*args, **kwargs)
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/www/decorators.py", line 72, in wrapper
local-runner_1  |     return f(*args, **kwargs)
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/www/views.py", line 1525, in run
local-runner_1  |     from airflow.executors.celery_executor import CeleryExecutor
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/executors/celery_executor.py", line 72, in <module>
local-runner_1  |     celery_configuration = conf.getimport('celery', 'celery_config_options')
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/configuration.py", line 486, in getimport
local-runner_1  |     f'The object could not be loaded. Please check "{key}" key in "{section}" section. '
local-runner_1  | airflow.exceptions.AirflowConfigException: The object could not be loaded. Please check "celery_config_options" key in "celery" section. Current value: "celery_config.CUSTOM_CELERY_CONFIG".

What you expected to happen

A single task is triggered on demand and runs.

How to reproduce
  1. Start Apache Airflow locally in the Docker environment
  2. Create a sample DAG
  3. Go into Graph View
  4. Click on the Task Instance
  5. Click Run

It looks like an environment issue.
This option is incorrect:

celery_config_options = celery_config.CUSTOM_CELERY_CONFIG

Here is default value:
airflow/airflow/config_templates/default_airflow.cfg
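A hedged fix, assuming Airflow 2.x: point the option back at Airflow's built-in Celery configuration in airflow.cfg. The custom value from the traceback only resolves if a celery_config.py module is importable inside the container; verify the module path below against your installed Airflow version.

```ini
[celery]
# Airflow 2.x default; celery_config.CUSTOM_CELERY_CONFIG fails because
# no celery_config module is on the PYTHONPATH inside the container.
celery_config_options = airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG
```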

services with 'depends_on' cannot be extended

I have a setup that extends from the docker-compose-local.yml file. It was working; the connection between services was fine because there is a loop that retries the connection:

wait_for_port "Postgres" "$POSTGRES_HOST" "$POSTGRES_PORT"

The latest versions add a depends_on that makes the extension throw that exception, which is expected behaviour (docker/compose#7916).

I am wondering whether it would be possible to keep the old approach so that the service can still be extended. This is useful when you want to use this repo as a submodule instead of overwriting it with custom code.
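One workaround sketch (file and service names here are assumptions based on this repo's layout): instead of extends, layer a second compose file over docker-compose-local.yml with multiple -f flags, which merges service definitions without redefining depends_on.

```yaml
# docker-compose.override.yml (hypothetical file): only the keys listed
# here are merged on top of the base service definition.
version: "3.7"
services:
  local-runner:
    environment:
      - MY_EXTRA_VAR=1   # example addition; replace with your customizations
```

Then start with: docker-compose -f docker/docker-compose-local.yml -f docker-compose.override.yml up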

Release date for Airflow >=2.0?

I note the current version is Apache Airflow 1.10.12, which is surely queuing up tech debt ahead of an inevitable Airflow 2.0 update. Is there a date for the update?

Calling AWS Glue local docker job from mwaa-local-runner

Hello,

As part of building a complete local development environment for Glue jobs, I need this local runner as well, but I also need to trigger Glue jobs running in a Glue Docker container. Which operator or mechanism should I use to make this integration work?

airflow.exceptions.AirflowConfigException: error: sqlite C library version too old (< 3.15.0).

I am trying to follow the AWS Tutorial at "https://docs.aws.amazon.com/mwaa/latest/userguide/tutorials-docker.html".
I am able to get everything running locally. However, when I get to step 6 and spin up a stack containing the ECS cluster and task definition, I get this error:
airflow.exceptions.AirflowConfigException: error: sqlite C library version too old (< 3.15.0). See

These errors indicate that the SQLite C library in use is too old (< 3.15.0). It seems something has changed since this tutorial was written.
The SQLite version that ships with aws-mwaa-local-runner https://github.com/aws/aws-mwaa-local-runner is 3.7.17.

How can the SQLite version in the Docker image be upgraded to one that is compatible with Airflow?
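One common approach, offered only as a sketch (the SQLite version, download URL, and paths are assumptions, untested against this image): build a newer SQLite from source in docker/Dockerfile and make the loader prefer it.

```dockerfile
# Hypothetical addition to docker/Dockerfile: the base image ships
# SQLite 3.7.17, older than the 3.15.0 that Airflow requires.
RUN curl -sSL https://www.sqlite.org/2021/sqlite-autoconf-3360000.tar.gz -o /tmp/sqlite.tar.gz \
 && tar -xzf /tmp/sqlite.tar.gz -C /tmp \
 && cd /tmp/sqlite-autoconf-3360000 \
 && ./configure --prefix=/usr/local \
 && make && make install \
 && rm -rf /tmp/sqlite*
# Make the freshly built library visible to Python's sqlite3 module.
ENV LD_LIBRARY_PATH="/usr/local/lib:${LD_LIBRARY_PATH}"
```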

Database instructions missing from README

Your instructions imply that there will be guidance about choosing/adding a database, but it's missing:

### Step two: Running Apache Airflow

Run Apache Airflow using one of the following database backends.

But there are no 'following' database backends, nor any instructions about how to add one.

Support AWS MWAA REST API

Can you please add support for the AWS MWAA REST API?
e.g. "${host}/aws_mwaa/cli"
body: 'pause my_dag'
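For context, the hosted MWAA endpoint accepts a CLI token plus a plain-text command body and returns base64-encoded output. A minimal helper sketch (the hostnames and helper names are illustrative, not part of this repo):

```python
import base64
from typing import Dict, Tuple

def build_mwaa_cli_request(hostname: str, cli_token: str,
                           command: str) -> Tuple[str, Dict[str, str], str]:
    """Assemble URL, headers, and body for a POST to the MWAA CLI endpoint."""
    url = f"https://{hostname}/aws_mwaa/cli"
    headers = {
        "Authorization": f"Bearer {cli_token}",  # token from create-cli-token
        "Content-Type": "text/plain",
    }
    return url, headers, command

def decode_mwaa_cli_output(response_json: Dict[str, str]) -> str:
    """The endpoint returns stdout base64-encoded in its JSON body."""
    return base64.b64decode(response_json.get("stdout", "")).decode("utf-8")
```

Supporting this route in the local runner would let the same client code target both environments.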

Can't access Airflow UI with defaults

By default, the bootstrap.sh script creates a username and password for your local Airflow environment.

Username: admin
Password: test

This doesn't work for me!

Incorrect line endings when running commands using WSL 2 on Windows

Issue

When running ./mwaa-local-env build-image, I ran into the following error:

bash ./mwaa-local-env build-image
# ./mwaa-local-env: line 2: $'\r': command not found
# ./mwaa-local-env: line 4: $'\r': command not found
# ./mwaa-local-env: line 5: syntax error near unexpected token `$'{\r''
# ./mwaa-local-env: line 5: `display_help() {

Resolution

After some research, I found that the issue was tied to Windows carriage returns in the file. This issue provides two approaches to resolving the error. I followed the second option, proposed by sakshihmss, and stripped the carriage returns from mwaa-local-env and each file in docker/script:

sed -i 's/\r//' mwaa-local-env

for i in docker/script/*;
do
    sed -i 's/\r//' $i
done

I was then able to run the following to build the Docker image and start Airflow:

# (1) Within WSL
bash ./mwaa-local-env build-image
# (2) First, exit out of WSL
docker-compose -f ./docker/docker-compose-local.yml up

For #2, there is almost certainly a more streamlined approach that leverages the functions in mwaa-local-env, though this worked for me.

System Details

Docker

Network DNS Server: 8.8.8.8
WSL Integration: Ubuntu-20.04
Engine: v20.10.2

/etc/resolv.conf

search local
nameserver 8.8.8.8

Windows

Edition: Windows 10 Pro
Version: 2004

Postgresql would not start from a Windows WSL2 host - could not change permissions

Full disclosure: I don't really know what I am doing with Docker or Airflow, but I wanted to post this in case it helps another Windows user with this project. I was able to build just fine, but when running ./mwaa-local-env start it would fail starting postgres_1 with the error:

fixing permissions on existing directory /var/lib/postgresql/data ... initdb: could not change permissions of directory "/var/lib/postgresql/data": Operation not permitted

Following these steps allowed Postgres, and then Airflow, to load:

  1. docker volume create --name=db-data to create a db-data volume in docker

  2. In file ./docker/docker-compose-local.yml change the line
    - "${PWD}/db-data:/var/lib/postgresql/data"
    to
    - "/db-data:/var/lib/postgresql/data"

I am not sure what side effects this will have for me. I suppose I can't access the Postgres data files from the Windows filesystem, but I don't think I need to do that anyway.
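For reference, step 2's change usually goes with a top-level volumes entry so Compose resolves db-data as the named volume from step 1 rather than as a bind path (a sketch against docker-compose-local.yml; verify the keys against your copy):

```yaml
services:
  postgres:
    volumes:
      - db-data:/var/lib/postgresql/data

volumes:
  db-data:
    external: true  # created beforehand via `docker volume create --name=db-data`
```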

Hive plugin doesn't work

I followed the tutorial for custom plugins for Hive and Hadoop. Unfortunately, it doesn't work. I was able to reproduce the issue both locally and on MWAA. The errors indicate that the wrong Python version is being used, along with a thrift version problem.

Can we update the tutorial with some testing notes?

local-runner_1  | Installing requirements.txt
local-runner_1  | You must give at least one requirement to install (see "pip help install")
local-runner_1  | [2021-05-05 16:30:01,884] {{plugins_manager.py:166}} ERROR - unsupported operand type(s) for +: 'NoneType' and 'str'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/hive_plugin.py", line 8, in <module>
local-runner_1  |     os.environ["CLASSPATH"] = os.getenv("CLASSPATH") + ":/usr/local/airflow/plugins/apache-hive-3.1.2-bin/lib"
local-runner_1  | TypeError: unsupported operand type(s) for +: 'NoneType' and 'str'
local-runner_1  | [2021-05-05 16:30:01,886] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/hive_plugin.py
local-runner_1  | [2021-05-05 16:30:02,074] {{plugins_manager.py:166}} ERROR - Missing parentheses in call to 'print'. Did you mean print("Would run:")? (hcat.py, line 143)
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 724, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 860, in get_code
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/hcatalog/bin/hcat.py", line 143
local-runner_1  |     print "Would run:"
local-runner_1  |                      ^
local-runner_1  | SyntaxError: Missing parentheses in call to 'print'. Did you mean print("Would run:")?
local-runner_1  | [2021-05-05 16:30:02,074] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/hcatalog/bin/hcat.py
local-runner_1  | [2021-05-05 16:30:02,093] {{plugins_manager.py:166}} ERROR - invalid syntax (hcat_server.py, line 29)
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 724, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 860, in get_code
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/hcatalog/sbin/hcat_server.py", line 29
local-runner_1  |     print "Usage: %s [--config confdir] COMMAND" % (sys.argv[0])
local-runner_1  |                                                ^
local-runner_1  | SyntaxError: invalid syntax
local-runner_1  | [2021-05-05 16:30:02,093] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/hcatalog/sbin/hcat_server.py
local-runner_1  | [2021-05-05 16:30:02,297] {{plugins_manager.py:166}} ERROR - No module named 'FacebookService'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303/FacebookBase.py", line 23, in <module>
local-runner_1  |     import FacebookService
local-runner_1  | ModuleNotFoundError: No module named 'FacebookService'
local-runner_1  | [2021-05-05 16:30:02,298] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303/FacebookBase.py
local-runner_1  | [2021-05-05 16:30:02,316] {{plugins_manager.py:166}} ERROR - No module named 'ttypes'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303/FacebookService.py", line 8, in <module>
local-runner_1  |     from ttypes import *
local-runner_1  | ModuleNotFoundError: No module named 'ttypes'
local-runner_1  | [2021-05-05 16:30:02,318] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303/FacebookService.py
local-runner_1  | [2021-05-05 16:30:02,333] {{plugins_manager.py:166}} ERROR - No module named 'ttypes'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303/constants.py", line 8, in <module>
local-runner_1  |     from ttypes import *
local-runner_1  | ModuleNotFoundError: No module named 'ttypes'
local-runner_1  | [2021-05-05 16:30:02,335] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303/constants.py
local-runner_1  | [2021-05-05 16:30:02,360] {{plugins_manager.py:166}} ERROR - Missing parentheses in call to 'print'. Did you mean print(msg)? (fb303_simple_mgmt.py, line 58)
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 724, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 860, in get_code
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303_scripts/fb303_simple_mgmt.py", line 58
local-runner_1  |     print msg
local-runner_1  |             ^
local-runner_1  | SyntaxError: Missing parentheses in call to 'print'. Did you mean print(msg)?
local-runner_1  | [2021-05-05 16:30:02,360] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/fb303_scripts/fb303_simple_mgmt.py
local-runner_1  | [2021-05-05 16:30:02,377] {{plugins_manager.py:166}} ERROR - No module named 'ttypes'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/hive_serde/constants.py", line 10, in <module>
local-runner_1  |     from ttypes import *
local-runner_1  | ModuleNotFoundError: No module named 'ttypes'
local-runner_1  | [2021-05-05 16:30:02,379] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/hive_serde/constants.py
local-runner_1  | [2021-05-05 16:30:02,403] {{plugins_manager.py:166}} ERROR - No module named 'ttypes'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/queryplan/constants.py", line 10, in <module>
local-runner_1  |     from ttypes import *
local-runner_1  | ModuleNotFoundError: No module named 'ttypes'
local-runner_1  | [2021-05-05 16:30:02,404] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/queryplan/constants.py
local-runner_1  | [2021-05-05 16:30:02,436] {{plugins_manager.py:166}} ERROR - No module named 'SCons'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/TSCons.py", line 21, in <module>
local-runner_1  |     from SCons.Builder import Builder
local-runner_1  | ModuleNotFoundError: No module named 'SCons'
local-runner_1  | [2021-05-05 16:30:02,438] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/TSCons.py
local-runner_1  | [2021-05-05 16:30:02,470] {{plugins_manager.py:166}} ERROR - No module named 'TProtocol'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/protocol/TBinaryProtocol.py", line 20, in <module>
local-runner_1  |     from TProtocol import *
local-runner_1  | ModuleNotFoundError: No module named 'TProtocol'
local-runner_1  | [2021-05-05 16:30:02,471] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/protocol/TBinaryProtocol.py
local-runner_1  | [2021-05-05 16:30:02,515] {{plugins_manager.py:166}} ERROR - No module named 'ttypes'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/reflection/limited/constants.py", line 8, in <module>
local-runner_1  |     from ttypes import *
local-runner_1  | ModuleNotFoundError: No module named 'ttypes'
local-runner_1  | [2021-05-05 16:30:02,517] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/reflection/limited/constants.py
local-runner_1  | [2021-05-05 16:30:02,536] {{plugins_manager.py:166}} ERROR - No module named 'BaseHTTPServer'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/server/THttpServer.py", line 20, in <module>
local-runner_1  |     import BaseHTTPServer
local-runner_1  | ModuleNotFoundError: No module named 'BaseHTTPServer'
local-runner_1  | [2021-05-05 16:30:02,537] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/server/THttpServer.py
local-runner_1  | [2021-05-05 16:30:02,545] {{plugins_manager.py:166}} ERROR - No module named 'Queue'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/server/TNonblockingServer.py", line 28, in <module>
local-runner_1  |     import Queue
local-runner_1  | ModuleNotFoundError: No module named 'Queue'
local-runner_1  | [2021-05-05 16:30:02,547] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/server/TNonblockingServer.py
local-runner_1  | [2021-05-05 16:30:02,549] {{plugins_manager.py:166}} ERROR - invalid syntax (TServer.py, line 84)
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 724, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 860, in get_code
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/server/TServer.py", line 84
local-runner_1  |     except TTransport.TTransportException, tx:
local-runner_1  |                                          ^
local-runner_1  | SyntaxError: invalid syntax
local-runner_1  | [2021-05-05 16:30:02,550] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/server/TServer.py
local-runner_1  | [2021-05-05 16:30:02,568] {{plugins_manager.py:166}} ERROR - No module named 'TTransport'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/THttpClient.py", line 20, in <module>
local-runner_1  |     from TTransport import *
local-runner_1  | ModuleNotFoundError: No module named 'TTransport'
local-runner_1  | [2021-05-05 16:30:02,569] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/THttpClient.py
local-runner_1  | [2021-05-05 16:30:02,572] {{plugins_manager.py:166}} ERROR - invalid syntax (TSocket.py, line 78)
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 724, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 860, in get_code
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/TSocket.py", line 78
local-runner_1  |     except socket.error, e:
local-runner_1  |                        ^
local-runner_1  | SyntaxError: invalid syntax
local-runner_1  | [2021-05-05 16:30:02,573] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/TSocket.py
local-runner_1  | [2021-05-05 16:30:02,584] {{plugins_manager.py:166}} ERROR - No module named 'cStringIO'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/TTransport.py", line 20, in <module>
local-runner_1  |     from cStringIO import StringIO
local-runner_1  | ModuleNotFoundError: No module named 'cStringIO'
local-runner_1  | [2021-05-05 16:30:02,585] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/TTransport.py
local-runner_1  | [2021-05-05 16:30:02,595] {{plugins_manager.py:166}} ERROR - No module named 'zope.interface'
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/TTwisted.py", line 19, in <module>
local-runner_1  |     from zope.interface import implements, Interface, Attribute
local-runner_1  | ModuleNotFoundError: No module named 'zope.interface'
local-runner_1  | [2021-05-05 16:30:02,596] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/lib/py/thrift/transport/TTwisted.py
local-runner_1  | [2021-05-05 16:30:02,641] {{plugins_manager.py:166}} ERROR - Missing parentheses in call to 'print'. Did you mean print("Cannot determine the container size")? (package.py, line 30)
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/plugins_manager.py", line 160, in <module>
local-runner_1  |     m = imp.load_source(namespace, filepath)
local-runner_1  |   File "/usr/lib64/python3.7/imp.py", line 171, in load_source
local-runner_1  |     module = _load(spec)
local-runner_1  |   File "<frozen importlib._bootstrap>", line 696, in _load
local-runner_1  |   File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 724, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 860, in get_code
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 791, in source_to_code
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/scripts/llap/yarn/package.py", line 30
local-runner_1  |     print "Cannot determine the container size"
local-runner_1  |                                               ^
local-runner_1  | SyntaxError: Missing parentheses in call to 'print'. Did you mean print("Cannot determine the container size")?
local-runner_1  | [2021-05-05 16:30:02,642] {{plugins_manager.py:167}} ERROR - Failed to import plugin /usr/local/airflow/plugins/hive_plugin/apache-hive-3.1.2-bin/scripts/llap/yarn/package.py
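Airflow's plugin manager tries to import every `.py` file it finds under `plugins/`, so unpacking a full Apache Hive distribution there drags in scripts that were never meant to be Airflow plugins, including Python 2-only ones like the `package.py` above. A minimal sketch for pre-screening a plugins directory before starting the runner (`find_broken_plugin_files` is a hypothetical helper, not part of this repo):

```python
import pathlib
import py_compile
import tempfile

def find_broken_plugin_files(plugins_dir):
    """Return .py files under plugins_dir that fail to byte-compile
    (e.g. Python 2-only syntax, as in the Hive tarball above)."""
    broken = []
    for path in sorted(pathlib.Path(plugins_dir).rglob("*.py")):
        try:
            py_compile.compile(str(path), doraise=True)
        except py_compile.PyCompileError:
            broken.append(path)
    return broken

# Demo against a throwaway directory containing one Python 2-style file.
with tempfile.TemporaryDirectory() as tmp:
    bad = pathlib.Path(tmp) / "package.py"
    bad.write_text('print "Cannot determine the container size"\n')
    print(find_broken_plugin_files(tmp))  # reports the file above
```

The cleaner fix is simply not to put the whole Hive binary distribution inside `plugins/`; keep only actual Airflow plugin modules there.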

AIRFLOW_ENV_NAME

When I ran the command below inside the container's Python interpreter, I didn't find the variable AIRFLOW_ENV_NAME:

pprint.pprint(dict(os.environ), width=1)

{'AIRFLOW_HOME': '/usr/local/airflow',
 'EXECUTOR': 'Local',
 'FERNET_KEY': '****************',
 'HOME': '/usr/local/airflow',
 'HOSTNAME': '*******',
 'LANG': 'en_US.UTF-8',
 'LD_LIBRARY_PATH': '/usr/local/lib',
 'LESSOPEN': '||/usr/bin/lesspipe.sh '
             '%s',
 'LOAD_EX': 'n',
 'LS_COLORS': 'rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:',
 'PATH': '/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin',
 'PWD': '/usr/local/airflow',
 'SHLVL': '1',
 'TERM': 'xterm',
 '_': '/usr/bin/python3'}

Whereas when I run the same code in AWS MWAA, I do see 'AIRFLOW_ENV_NAME': 'airflow-dev' (sample output below).

I was expecting the same variable to be available in aws-mwaa-local-runner.

[2021-09-29 13:10:32,673] {{taskinstance.py:1283}} INFO - Exporting the following env vars:
[email protected]
AIRFLOW_CTX_DAG_OWNER=airflow
AIRFLOW_CTX_DAG_ID=dag_with_task
AIRFLOW_CTX_TASK_ID=print_the_context
AIRFLOW_CTX_EXECUTION_DATE=2021-09-29T13:10:31.098987+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2021-09-29T13:10:31.098987+00:00
[2021-09-29 13:10:32,693] {{logging_mixin.py:104}} INFO - {'AIRFLOW_CONFIG_SECRETS': '{"AIRFLOW__SECRETS__BACKEND_KWARGS":"{\\"connections_prefix\\" '
                           ': '
                           '\\"airflow/connections\\", '
                           '\\"variables_prefix\\" '
                           ': '
                           '\\"airflow/variables\\"}","AIRFLOW__SECRETS__BACKEND":"airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend"}',
 'AIRFLOW_CONN_AWS_DEFAULT': 'aws://',
 'AIRFLOW_CONSOLE_LOGS_ENABLED': 'true',
 'AIRFLOW_CONSOLE_LOG_LEVEL': 'WARNING',
 'AIRFLOW_CTX_DAG_EMAIL': '[email protected]',
 'AIRFLOW_CTX_DAG_ID': 'dag_with_task',
 'AIRFLOW_CTX_DAG_OWNER': 'airflow',
 'AIRFLOW_CTX_DAG_RUN_ID': 'manual__2021-09-29T13:10:31.098987+00:00',
 'AIRFLOW_CTX_EXECUTION_DATE': '2021-09-29T13:10:31.098987+00:00',
 'AIRFLOW_CTX_TASK_ID': 'print_the_context',
 'AIRFLOW_DAG_PROCESSING_LOGS_ENABLED': 'true',
 'AIRFLOW_DAG_PROCESSING_LOG_LEVEL': 'WARNING',
 'AIRFLOW_ENV_NAME': 'airflow-dev',
 'AIRFLOW_HOME': '/usr/local/airflow',
 'AIRFLOW_SCHEDULER_LOGS_ENABLED': 'true',
 'AIRFLOW_SCHEDULER_LOG_LEVEL': 'WARNING',
 'AIRFLOW_TASK_LOGS_ENABLED': 'true',
 'AIRFLOW_TASK_LOG_LEVEL': 'INFO',
 'AIRFLOW_TASK_REVISION_ID': '4',
 'AIRFLOW_WEB_SERVER_LOGS_ENABLED': 'true',
 'AIRFLOW_WEB_SERVER_LOG_LEVEL': 'WARNING',
 'AIRFLOW_WORKER_LOGS_ENABLED': 'true',
 'AIRFLOW_WORKER_LOG_LEVEL': 'WARNING',
 '_MP_FORK_LOGLEVEL_': '20'}
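Until the local runner exports AIRFLOW_ENV_NAME, one workaround is to read it with a fallback in DAG code so the same DAG runs in both environments. A minimal sketch; the "local" default is an arbitrary placeholder, not anything MWAA defines:

```python
import os

# AIRFLOW_ENV_NAME is set by MWAA (e.g. "airflow-dev") but not by
# aws-mwaa-local-runner; fall back to a placeholder locally.
env_name = os.environ.get("AIRFLOW_ENV_NAME", "local")
print(f"running in environment: {env_name}")
```

Alternatively, you could export the variable yourself for the local container; the repo ships a docker/config/.env.localrunner, which may be a natural place to set it.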

New Packages not getting installed

I have cloned the repository to my macOS machine and added the Snowflake, MySQL, and MSSQL packages to dags/requirements.txt as below:

apache-airflow-providers-snowflake==1.2.0
snowflake-connector-python==2.4.2
snowflake-sqlalchemy==1.2.4
apache-airflow-providers-mysql==1.1.0
mysql-connector-python==8.0.22
mysqlclient==2.0.3
apache-airflow-providers-microsoft-mssql==1.1.0
apache-airflow-providers-odbc==1.0.1
pymssql==2.2.1
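Note that the repository layout shown above ships a requirements/requirements.txt; if the local runner only installs from that file, packages added to dags/requirements.txt would never be picked up, which is worth checking before anything else. To confirm whether a provider actually made it into the container, a small check can be run inside the local-runner container (e.g. via docker exec); a sketch, with the module names below as examples:

```python
import importlib.util

def provider_installed(module_name):
    """True if the module can be found without importing the DAG."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # A missing parent package (e.g. no airflow at all) also counts as absent.
        return False

for module in ("airflow.providers.snowflake", "airflow.providers.microsoft.mssql"):
    status = "installed" if provider_installed(module) else "MISSING"
    print(f"{module}: {status}")
```

Remember to rebuild the image (./mwaa-local-env build-image) after changing requirements if your setup bakes them in at build time.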

Below is the error I see for the DAG:



./mwaa-local-env start                                                         
[+] Running 10/10
 ⠿ postgres Pulled                                                                                                                                           5.3s
   ⠿ a0d0a0d46f8b Pull complete                                                                                                                              0.8s
   ⠿ 5034a66b99e6 Pull complete                                                                                                                              0.9s
   ⠿ 82e9eb77798b Pull complete                                                                                                                              0.9s
   ⠿ c0911b5b8324 Pull complete                                                                                                                              3.9s
   ⠿ 11ce96790b22 Pull complete                                                                                                                              3.9s
   ⠿ 638c20c41c01 Pull complete                                                                                                                              4.0s
   ⠿ 84d4ffed5e4e Pull complete                                                                                                                              4.0s
   ⠿ a91ec44e1b4b Pull complete                                                                                                                              4.1s
   ⠿ 4d2fbc558f88 Pull complete                                                                                                                              4.1s
[+] Running 2/2
 ⠿ Container docker_postgres_1      Started                                                                                                                  0.5s
 ⠿ Container docker_local-runner_1  Started                                                                                                                  1.3s
Attaching to local-runner_1, postgres_1
local-runner_1  | Wed Sep  1 15:52:12 UTC 2021 - waiting for Postgres... 1/20
postgres_1      | ok
postgres_1      | performing post-bootstrap initialization ... sh: locale: not found
postgres_1      | 2021-09-01 15:52:15.119 UTC [31] WARNING:  no usable system locales were found
local-runner_1  | Wed Sep  1 15:52:17 UTC 2021 - waiting for Postgres... 2/20
postgres_1      | ok
postgres_1      | syncing data to disk ... ok
postgres_1      |
postgres_1      | Success. You can now start the database server using:
postgres_1      |
postgres_1      |     pg_ctl -D /var/lib/postgresql/data -l logfile start
postgres_1      |
postgres_1      |
postgres_1      | WARNING: enabling "trust" authentication for local connections
postgres_1      | You can change this by editing pg_hba.conf or using the option -A, or
postgres_1      | --auth-local and --auth-host, the next time you run initdb.
postgres_1      | waiting for server to start....2021-09-01 15:52:21.906 UTC [36] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1      | 2021-09-01 15:52:21.938 UTC [37] LOG:  database system was shut down at 2021-09-01 15:52:19 UTC
postgres_1      | 2021-09-01 15:52:21.951 UTC [36] LOG:  database system is ready to accept connections
postgres_1      |  done
postgres_1      | server started
local-runner_1  | Wed Sep  1 15:52:22 UTC 2021 - waiting for Postgres... 3/20
postgres_1      | CREATE DATABASE
postgres_1      |
postgres_1      |
postgres_1      | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/*
postgres_1      |
postgres_1      | waiting for server to shut down...2021-09-01 15:52:23.644 UTC [36] LOG:  received fast shutdown request
postgres_1      | .2021-09-01 15:52:23.645 UTC [36] LOG:  aborting any active transactions
postgres_1      | 2021-09-01 15:52:23.646 UTC [36] LOG:  worker process: logical replication launcher (PID 43) exited with exit code 1
postgres_1      | 2021-09-01 15:52:23.646 UTC [38] LOG:  shutting down
postgres_1      | 2021-09-01 15:52:23.680 UTC [36] LOG:  database system is shut down
postgres_1      |  done
postgres_1      | server stopped
postgres_1      |
postgres_1      | PostgreSQL init process complete; ready for start up.
postgres_1      |
postgres_1      | 2021-09-01 15:52:23.758 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
postgres_1      | 2021-09-01 15:52:23.758 UTC [1] LOG:  listening on IPv6 address "::", port 5432
postgres_1      | 2021-09-01 15:52:23.761 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1      | 2021-09-01 15:52:23.807 UTC [50] LOG:  database system was shut down at 2021-09-01 15:52:23 UTC
postgres_1      | 2021-09-01 15:52:23.823 UTC [1] LOG:  database system is ready to accept connections
postgres_1      | 2021-09-01 15:52:27.319 UTC [57] LOG:  incomplete startup packet
local-runner_1  | DB: postgresql+psycopg2://airflow:***@postgres:5432/airflow
local-runner_1  | [2021-09-01 15:52:28,264] {{db.py:684}} INFO - Creating tables
postgres_1      | 2021-09-01 15:52:28.440 UTC [58] ERROR:  relation "connection" does not exist at character 55
postgres_1      | 2021-09-01 15:52:28.440 UTC [58] STATEMENT:  SELECT connection.conn_id AS connection_conn_id
postgres_1      | 	FROM connection GROUP BY connection.conn_id
postgres_1      | 	HAVING count(*) > 1
postgres_1      | 2021-09-01 15:52:28.471 UTC [58] ERROR:  current transaction is aborted, commands ignored until end of transaction block
postgres_1      | 2021-09-01 15:52:28.471 UTC [58] STATEMENT:  SELECT connection.password AS connection_password, connection.extra AS connection_extra, connection.id AS connection_id, connection.conn_id AS connection_conn_id, connection.conn_type AS connection_conn_type, connection.description AS connection_description, connection.host AS connection_host, connection.schema AS connection_schema, connection.login AS connection_login, connection.port AS connection_port, connection.is_encrypted AS connection_is_encrypted, connection.is_extra_encrypted AS connection_is_extra_encrypted
postgres_1      | 	FROM connection
postgres_1      | 	WHERE connection.conn_type IS NULL
local-runner_1  | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
local-runner_1  | INFO  [alembic.runtime.migration] Will assume transactional DDL.
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade  -> e3a246e0dc1, current schema
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade e3a246e0dc1 -> 1507a7289a2f, create is_encrypted
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 1507a7289a2f -> 13eb55f81627, maintain history for compatibility with earlier migrations
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 13eb55f81627 -> 338e90f54d61, More logging into task_instance
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 338e90f54d61 -> 52d714495f0, job_id indices
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 52d714495f0 -> 502898887f84, Adding extra to Log
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 502898887f84 -> 1b38cef5b76e, add dagrun
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 1b38cef5b76e -> 2e541a1dcfed, task_duration
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 2e541a1dcfed -> 40e67319e3a9, dagrun_config
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 40e67319e3a9 -> 561833c1c74b, add password column to user
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 561833c1c74b -> 4446e08588, dagrun start end
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 4446e08588 -> bbc73705a13e, Add notification_sent column to sla_miss
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade bbc73705a13e -> bba5a7cfc896, Add a column to track the encryption state of the 'Extra' field in connection
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade bba5a7cfc896 -> 1968acfc09e3, add is_encrypted column to variable table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 1968acfc09e3 -> 2e82aab8ef20, rename user table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 2e82aab8ef20 -> 211e584da130, add TI state index
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 211e584da130 -> 64de9cddf6c9, add task fails journal table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 64de9cddf6c9 -> f2ca10b85618, add dag_stats table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade f2ca10b85618 -> 4addfa1236f1, Add fractional seconds to mysql tables
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 4addfa1236f1 -> 8504051e801b, xcom dag task indices
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 8504051e801b -> 5e7d17757c7a, add pid field to TaskInstance
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 5e7d17757c7a -> 127d2bf2dfa7, Add dag_id/state index on dag_run table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 127d2bf2dfa7 -> cc1e65623dc7, add max tries column to task instance
local-runner_1  | ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/airflow/dags/example_dags/connection_test_dag.py
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 305, in _load_modules_from_file
local-runner_1  |     loader.exec_module(new_module)
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/dags/example_dags/connection_test_dag.py", line 4, in <module>
local-runner_1  |     from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
local-runner_1  | ModuleNotFoundError: No module named 'airflow.providers.snowflake'
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade cc1e65623dc7 -> bdaa763e6c56, Make xcom value column a large binary
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade bdaa763e6c56 -> 947454bf1dff, add ti job_id index
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 947454bf1dff -> d2ae31099d61, Increase text size for MySQL (not relevant for other DBs' text types)
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 0e2a74e0fc9f, Add time zone awareness
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade d2ae31099d61 -> 33ae817a1ff4, kubernetes_resource_checkpointing
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 33ae817a1ff4 -> 27c6a30d7c24, kubernetes_resource_checkpointing
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 27c6a30d7c24 -> 86770d1215c0, add kubernetes scheduler uniqueness
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 86770d1215c0, 0e2a74e0fc9f -> 05f30312d566, merge heads
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 05f30312d566 -> f23433877c24, fix mysql not null constraint
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade f23433877c24 -> 856955da8476, fix sqlite foreign key
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 856955da8476 -> 9635ae0956e7, index-faskfail
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> dd25f486b8ea, add idx_log_dag
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade dd25f486b8ea -> bf00311e1990, add index to taskinstance
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 9635ae0956e7 -> 0a2a5b66e19d, add task_reschedule table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 0a2a5b66e19d, bf00311e1990 -> 03bc53e68815, merge_heads_2
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 03bc53e68815 -> 41f5f12752f8, add superuser field
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 41f5f12752f8 -> c8ffec048a3b, add fields to dag
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade c8ffec048a3b -> dd4ecb8fbee3, Add schedule interval to dag
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade dd4ecb8fbee3 -> 939bb1e647c8, task reschedule fk on cascade delete
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 6e96a59344a4, Make TaskInstance.pool not nullable
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> d38e04c12aa2, add serialized_dag table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade d38e04c12aa2 -> b3b105409875, add root_dag_id to DAG
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 6e96a59344a4 -> 74effc47d867, change datetime to datetime2(6) on MSSQL tables
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 939bb1e647c8 -> 004c1210f153, increase queue name size limit
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade c8ffec048a3b -> a56c9515abdc, Remove dag_stat table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade a56c9515abdc, 004c1210f153, 74effc47d867, b3b105409875 -> 08364691d074, Merge the four heads back together
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 08364691d074 -> fe461863935f, increase_length_for_connection_password
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade fe461863935f -> 7939bcff74ba, Add DagTags table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 7939bcff74ba -> a4c2fd67d16b, add pool_slots field to task_instance
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade a4c2fd67d16b -> 852ae6c715af, Add RenderedTaskInstanceFields table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 852ae6c715af -> 952da73b5eff, add dag_code table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 952da73b5eff -> a66efa278eea, Add Precision to execution_date in RenderedTaskInstanceFields table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade a66efa278eea -> da3f683c3a5a, Add dag_hash Column to serialized_dag table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade da3f683c3a5a -> 92c57b58940d, Create FAB Tables
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 92c57b58940d -> 03afc6b6f902, Increase length of FAB ab_view_menu.name column
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 03afc6b6f902 -> cf5dc11e79ad, drop_user_and_chart
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade cf5dc11e79ad -> bbf4a7ad0465, Remove id column from xcom
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade bbf4a7ad0465 -> b25a55525161, Increase length of pool name
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade b25a55525161 -> 3c20cacc0044, Add DagRun run_type
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 3c20cacc0044 -> 8f966b9c467a, Set conn_type as non-nullable
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 8f966b9c467a -> 8d48763f6d53, add unique constraint to conn_id
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 8d48763f6d53 -> e38be357a868, Add sensor_instance table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade e38be357a868 -> b247b1e3d1ed, Add queued by Job ID to TI
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade b247b1e3d1ed -> e1a11ece99cc, Add external executor ID to TI
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade e1a11ece99cc -> bef4f3d11e8b, Drop KubeResourceVersion and KubeWorkerId
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade bef4f3d11e8b -> 98271e7606e2, Add scheduling_decision to DagRun and DAG
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 98271e7606e2 -> 52d53670a240, fix_mssql_exec_date_rendered_task_instance_fields_for_MSSQL
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 52d53670a240 -> 364159666cbd, Add creating_job_id to DagRun table
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 364159666cbd -> 45ba3f1493b9, add-k8s-yaml-to-rendered-templates
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 45ba3f1493b9 -> 849da589634d, Prefix DAG permissions.
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 849da589634d -> 2c6edca13270, Resource based permissions.
local-runner_1  | [2021-09-01 15:52:33,729] {{manager.py:788}} WARNING - No user yet created, use flask fab command to do it.
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 61ec73d9401f -> 64a7d6477aae, fix description field in connection to be text
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 64a7d6477aae -> e959f08ac86c, Change field in DagCode to MEDIUMTEXT for MySql
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade e959f08ac86c -> 82b7c48c147f, Remove can_read permission on config resource for User and Viewer role
local-runner_1  | [2021-09-01 15:52:42,858] {{manager.py:788}} WARNING - No user yet created, use flask fab command to do it.
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 82b7c48c147f -> 449b4072c2da, Increase size of connection.extra field to handle multiple RSA keys
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 449b4072c2da -> 8646922c8a04, Change default pool_slots to 1
local-runner_1  | INFO  [alembic.runtime.migration] Running upgrade 8646922c8a04 -> 2e42bb497a22, rename last_scheduler_run column
local-runner_1  | INFO  [airflow.models.dagbag.DagBag] Filling up the DagBag from /usr/local/airflow/dags
local-runner_1  | ERROR [airflow.models.dagbag.DagBag] Failed to import: /usr/local/airflow/dags/example_dags/connection_test_dag.py
local-runner_1  | Traceback (most recent call last):
local-runner_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 305, in _load_modules_from_file
local-runner_1  |     loader.exec_module(new_module)
local-runner_1  |   File "<frozen importlib._bootstrap_external>", line 728, in exec_module
local-runner_1  |   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
local-runner_1  |   File "/usr/local/airflow/dags/example_dags/connection_test_dag.py", line 4, in <module>
local-runner_1  |     from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook
local-runner_1  | ModuleNotFoundError: No module named 'airflow.providers.snowflake'
local-runner_1  | INFO  [airflow.models.dag] Sync 1 DAGs
local-runner_1  | INFO  [airflow.models.dag] Creating ORM DAG for dag_with_task
local-runner_1  | INFO  [airflow.models.dag] Setting next_dagrun for dag_with_task to None
local-runner_1  | Initialization done
local-runner_1  |   ____________       _____________
local-runner_1  |  ____    |__( )_________  __/__  /________      __
local-runner_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
local-runner_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
local-runner_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
local-runner_1  | [2021-09-01 15:52:46,858] {{scheduler_job.py:1251}} INFO - Starting the scheduler
local-runner_1  | [2021-09-01 15:52:46,858] {{scheduler_job.py:1256}} INFO - Processing each file at most -1 times
local-runner_1  | [2021-09-01 15:52:47,028] {{dag_processing.py:252}} INFO - Launched DagFileProcessorManager with pid: 142
local-runner_1  | [2021-09-01 15:52:47,030] {{scheduler_job.py:1854}} INFO - Resetting orphaned tasks for active dag runs
local-runner_1  | [2021-09-01 15:52:47,127] {{settings.py:54}} INFO - Configured default timezone Timezone('UTC')
local-runner_1  | [2021-09-01 15:52:50,491] {{manager.py:788}} WARNING - No user yet created, use flask fab command to do it.
local-runner_1  | Admin user admin created
local-runner_1  |   ____________       _____________
local-runner_1  |  ____    |__( )_________  __/__  /________      __
local-runner_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
local-runner_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
local-runner_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
local-runner_1  | [2021-09-01 15:52:58,458] {{dagbag.py:451}} INFO - Filling up the DagBag from /dev/null
local-runner_1  | [2021-09-01 15:53:01 +0000] [208] [INFO] Starting gunicorn 19.10.0
local-runner_1  | [2021-09-01 15:53:01 +0000] [208] [INFO] Listening at: http://0.0.0.0:8080 (208)
local-runner_1  | [2021-09-01 15:53:01 +0000] [208] [INFO] Using worker: sync
local-runner_1  | [2021-09-01 15:53:01 +0000] [211] [INFO] Booting worker with pid: 211
local-runner_1  | [2021-09-01 15:53:01 +0000] [212] [INFO] Booting worker with pid: 212
local-runner_1  | [2021-09-01 15:53:01 +0000] [213] [INFO] Booting worker with pid: 213
local-runner_1  | [2021-09-01 15:53:02 +0000] [214] [INFO] Booting worker with pid: 214
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:06 +0000] "GET / HTTP/1.1" 302 217 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:06 +0000] "GET /home HTTP/1.1" 302 305 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome HTTP/1.1" 200 16929 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/css/bootstrap.min.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/select2/select2.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/css/font-awesome.min.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/datepicker/bootstrap-datepicker.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/css/ab.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/css/flags/flags16.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/airflowDefaultTheme.42f8d9f03e53e5b06087.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/materialIcons.c86800f70eece0ad5c3e.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/main.a02ab09a012af15327d8.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/bootstrap-datetimepicker.min.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/flash.82c9e653b17d76b0b572.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/loadingDots.f9d109f104217ec97cea.css HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/js/jquery-latest.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/js/ab_filters.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/js/ab_actions.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/select2/select2.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/datepicker/bootstrap-datepicker.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/js/bootstrap.min.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/js/ab.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/moment.c61e3ab5bc7680097402.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/bootstrap3-typeahead.min.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/bootstrap-datetimepicker.min.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/dist/main.a02ab09a012af15327d8.js HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/appbuilder/fonts/fontawesome-webfont.woff2?v=4.7.0 HTTP/1.1" 200 0 "http://localhost:8080/static/appbuilder/css/font-awesome.min.css" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:07 +0000] "GET /static/pin_32.png HTTP/1.1" 200 0 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:08 +0000] "POST /login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome HTTP/1.1" 302 209 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:08 +0000] "GET / HTTP/1.1" 302 217 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /home HTTP/1.1" 200 43724 "http://localhost:8080/login/?next=http%3A%2F%2Flocalhost%3A8080%2Fhome" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/appbuilder/css/bootstrap.min.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/appbuilder/css/font-awesome.min.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/appbuilder/select2/select2.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/appbuilder/datepicker/bootstrap-datepicker.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/appbuilder/css/flags/flags16.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/appbuilder/css/ab.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/airflowDefaultTheme.42f8d9f03e53e5b06087.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/switch.07b9373717bbc645aa21.css HTTP/1.1" 200 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/materialIcons.c86800f70eece0ad5c3e.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/dags.940d19fa177ddbedb680.css HTTP/1.1" 200 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/loadingDots.f9d109f104217ec97cea.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/main.a02ab09a012af15327d8.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/bootstrap-datetimepicker.min.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/flash.82c9e653b17d76b0b572.css HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "GET /static/dist/d3.min.js HTTP/1.1" 200 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "POST /blocked HTTP/1.1" 200 2 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "POST /last_dagruns HTTP/1.1" 200 2 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "POST /dag_stats HTTP/1.1" 200 258 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | 172.19.0.1 - - [01/Sep/2021:15:53:09 +0000] "POST /task_stats HTTP/1.1" 200 961 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.164 Safari/537.36"
local-runner_1  | [2021-09-01 15:53:32 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:53:32 +0000] [260] [INFO] Booting worker with pid: 260
local-runner_1  | [2021-09-01 15:53:37 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:53:37 +0000] [211] [INFO] Worker exiting (pid: 211)
local-runner_1  | [2021-09-01 15:54:02 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:54:02 +0000] [284] [INFO] Booting worker with pid: 284
local-runner_1  | [2021-09-01 15:54:07 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:54:07 +0000] [212] [INFO] Worker exiting (pid: 212)
local-runner_1  | [2021-09-01 15:54:33 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:54:33 +0000] [308] [INFO] Booting worker with pid: 308
local-runner_1  | [2021-09-01 15:54:38 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:54:38 +0000] [213] [INFO] Worker exiting (pid: 213)
local-runner_1  | [2021-09-01 15:55:03 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:55:03 +0000] [332] [INFO] Booting worker with pid: 332
local-runner_1  | [2021-09-01 15:55:08 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:55:08 +0000] [214] [INFO] Worker exiting (pid: 214)
local-runner_1  | [2021-09-01 15:55:34 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:55:34 +0000] [357] [INFO] Booting worker with pid: 357
local-runner_1  | [2021-09-01 15:55:39 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:55:39 +0000] [260] [INFO] Worker exiting (pid: 260)
local-runner_1  | [2021-09-01 15:56:04 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:56:04 +0000] [381] [INFO] Booting worker with pid: 381
local-runner_1  | [2021-09-01 15:56:09 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:56:09 +0000] [284] [INFO] Worker exiting (pid: 284)
local-runner_1  | [2021-09-01 15:56:35 +0000] [208] [INFO] Handling signal: ttin
local-runner_1  | [2021-09-01 15:56:35 +0000] [405] [INFO] Booting worker with pid: 405
local-runner_1  | [2021-09-01 15:56:40 +0000] [208] [INFO] Handling signal: ttou
local-runner_1  | [2021-09-01 15:56:40 +0000] [308] [INFO] Worker exiting (pid: 308)

ModuleNotFoundError: No module named 'google'

When you try to import any of the Google libraries, e.g.

from google.cloud import bigquery
ModuleNotFoundError: No module named 'google'

I was surprised, because we can see it in constraints.txt:

google-cloud-bigquery==1.28.0

Is this a bug?
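(For what it's worth: a pip constraints file only pins the version a package resolves to *if* something asks for it; it does not install anything by itself, so google-cloud-bigquery would still need to be listed in requirements/requirements.txt. A quick stdlib check — the helper name is mine — to see whether a module is actually present in the container:)

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """Return True if the module can be imported from the current environment."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # find_spec raises this when a parent package (e.g. 'google') is itself missing.
        return False

print(is_installed("google.cloud.bigquery"))
```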

Change CLI language from Shell to something more robust

I love this project and I'm a huge fan of Airflow, containers, and the cloud in general. I think we can make this CLI/runner better by moving away from pure shell scripts and implementing the CLI in a more robust language, such as Python or Go. I'd be happy to contribute PRs toward that effort; I'm creating this issue first to track it.

Some ideas:

  • Astronomer has a really great CLI (written in Golang) that I've personally used many times to spin up an Airflow instance locally for dev and testing. It uses a few Docker containers under the hood. The CLI has a lot more things it can do, but I think the part that's of interest to us is just the commands in the astro dev section. I believe this is the link to the relevant source code for that subsection of their CLI; perhaps we could adopt something similar. https://github.com/astronomer/astro-cli/tree/v0.23.4/airflow
  • Alternatively we could use Python and Click to quickly put it together. I suspect that most of the folks reading this are far more comfortable in Python than Golang, so it would be a lot easier. The only downside is that you'd need to install it as a Python package, as opposed to the Golang one which is vended as a single executable binary file.
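To make the second idea concrete, here is a minimal stdlib-only sketch of what the subcommand surface could look like (argparse rather than Click, just to keep the sketch dependency-free; the verbs mirror the existing mwaa-local-env script):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """CLI skeleton mirroring the shell script's verbs (a sketch, not the real tool)."""
    parser = argparse.ArgumentParser(prog="mwaa-local-env")
    subcommands = parser.add_subparsers(dest="command", required=True)
    subcommands.add_parser("build-image", help="Build the local MWAA Docker image")
    subcommands.add_parser("start", help="Start the local Airflow environment")
    subcommands.add_parser("reset-db", help="Reset the local metadata database")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    # A real implementation would shell out to docker / docker-compose here.
    print(f"would run: {args.command}")
```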

Unable to connect to AWS from docker

I tried all of the options below in docker-compose-local.yml, but I am not able to connect to the database using AWS Secrets Manager.

            - /$HOME/.aws/credentials:/usr/local/airflow/.aws/credentials:ro
            - /$HOME/.aws/credentials:/root/.aws/credentials:ro

The connection lookup still fails with:

    conn = Connection.get_connection_from_secrets(conn_id)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/connection.py", line 354, in get_connection_from_secrets
    raise AirflowNotFoundException(f"The conn_id `{conn_id}` isn't defined")
airflow.exceptions.AirflowNotFoundException: The conn_id `snowflake_conn_dev` isn't defined

I also tried setting the credentials directly as environment variables:

 environment:
            - LOAD_EX=n
            - EXECUTOR=Local
            - AWS_ACCESS_KEY_ID=HY3RYVLMAXXX
            - AWS_SECRET_ACCESS_KEY=sNOycEbMO5nmZxxxxxx
            - AWS_DEFAULT_REGION=us-west-2
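(Note: credentials alone only make AWS API calls possible — Airflow consults Secrets Manager for connections only when the secrets backend is configured. A hedged docker-compose fragment; the connections_prefix value is an assumption and must match how the secret is actually named in Secrets Manager, e.g. airflow/connections/snowflake_conn_dev:)

```yaml
environment:
    - AWS_DEFAULT_REGION=us-west-2
    - AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
    - AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "airflow/connections"}
```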

ModuleNotFoundError: No module named 'airflow.models.BaseOperator'

I am running aws-mwaa-local-runner. I wrote a custom operator as below, and I'm not sure why it says no module found:

import os
import sys
import csv
import tempfile
import logging
from contextlib import closing
from io import BytesIO, TextIOWrapper, BufferedRandom
from airflow.models.baseoperator import BaseOperator
from airflow.hooks.S3_hook import S3Hook
from airflow.contrib.hooks.snowflake_hook import SnowflakeHook
from utils.files import is_gz_compressed, compress_gz

logger = logging.getLogger(__name__)


class BaseUploadToS3(BaseOperator):
    """Extract data and compress to gzip format and upload to S3 location with s3_key
        Parameters
       ***************

But I got the error below:

Broken DAG: [/usr/local/airflow/dags/a_dag.py] Traceback (most recent call last):
  File "/usr/local/airflow/dags/a_dag.py", line 8, in <module>
    from custom_operators.mssql_to_s3_operator import MsSqlToS3Operator
  File "/usr/local/airflow/dags/custom_operators/mssql_to_s3_operator.py", line 3, in <module>
    from airflow.models.BaseOperator import BaseOperator
ModuleNotFoundError: No module named 'airflow.models.BaseOperator'

When I checked inside the Docker container with python3:

[airflow@6902213e0fbe ~]$ python3
Python 3.7.10 (default, Jun  3 2021, 00:02:01)
[GCC 7.3.1 20180712 (Red Hat 7.3.1-13)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from airflow.models.BaseOperator import BaseOperator
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'airflow.models.BaseOperator'
>>>
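(The traceback explains itself once you notice the case: BaseOperator is a class inside the lowercase module airflow.models.baseoperator, and Python module paths are case-sensitive, so airflow.models.BaseOperator is not a module at all — exactly as the custom-operator snippet above already imports it. The same behaviour can be reproduced with the standard library:)

```python
import importlib

# Logger is a class inside the 'logging' package, not a submodule, so it
# cannot appear in a module path -- just like airflow.models.BaseOperator.
try:
    importlib.import_module("logging.Logger")
except ModuleNotFoundError as exc:
    print(exc)

# Classes are imported *from* their module:
from logging import Logger  # like: from airflow.models.baseoperator import BaseOperator
```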

airflow.cfg and webserver_config.py are overwritten by airflow initial run

The Dockerfile runs the following commands in an attempt to set initial values for these files:

COPY config/airflow.cfg ${AIRFLOW_USER_HOME}/airflow.cfg
COPY config/webserver_config.py ${AIRFLOW_USER_HOME}/webserver_config.py

However, the first time Airflow runs, it completely overwrites these files with its defaults.

Perform a diff between the files in docker/config and ${AIRFLOW_USER_HOME}, and you will see that they are significantly different.

`airflow connections list` returns cryptography.fernet.InvalidToken

Running `airflow connections list` in the docker_local-runner container returns cryptography.fernet.InvalidToken:

sh-4.2$ docker exec -it docker_local-runner_1 /bin/sh
sh-4.2$ source /entrypoint.sh
sh-4.2$ airflow connections list
Traceback (most recent call last):
  File "/usr/local/bin/airflow", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.7/site-packages/airflow/__main__.py", line 40, in main
    args.func(args)
  File "/usr/local/lib/python3.7/site-packages/airflow/cli/cli_parser.py", line 48, in command
    return func(*args, **kwargs)
  File "/usr/lib64/python3.7/contextlib.py", line 74, in inner
    return func(*args, **kwds)
  File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/connection_command.py", line 81, in connections_list
    mapper=_connection_mapper,
  File "/usr/local/lib/python3.7/site-packages/airflow/cli/simple_table.py", line 101, in print_as
    dict_data: List[Dict] = [mapper(d) for d in data]
  File "/usr/local/lib/python3.7/site-packages/airflow/cli/simple_table.py", line 101, in <listcomp>
    dict_data: List[Dict] = [mapper(d) for d in data]
  File "/usr/local/lib/python3.7/site-packages/airflow/cli/commands/connection_command.py", line 50, in _connection_mapper
    'extra_dejson': conn.extra_dejson,
  File "/usr/local/lib/python3.7/site-packages/airflow/models/connection.py", line 333, in extra_dejson
    if self.extra:
  File "/usr/local/lib64/python3.7/site-packages/sqlalchemy/orm/attributes.py", line 365, in __get__
    retval = self.descriptor.__get__(instance, owner)
  File "/usr/local/lib/python3.7/site-packages/airflow/models/connection.py", line 240, in get_extra
    return fernet.decrypt(bytes(self._extra, 'utf-8')).decode()
  File "/usr/local/lib64/python3.7/site-packages/cryptography/fernet.py", line 194, in decrypt
    raise InvalidToken
cryptography.fernet.InvalidToken

I have specified my FERNET_KEY in my docker-compose file and run `source /entrypoint.sh` to ensure that AIRFLOW__CORE__FERNET_KEY is set:

local-runner:
    environment:
        - FERNET_KEY=...

Other airflow commands work, so it appears to be an issue specifically with accessing connections:

sh-4.2$ airflow variables list
key    
=======
from ui

Any tips on how to remedy this would be appreciated. Thanks!
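(An editorial note, not a confirmed diagnosis: InvalidToken generally means the stored value was encrypted with a different Fernet key than the one the current session is using — e.g. the connection's extra field was saved under one key, and the CLI now runs with another. A minimal reproduction with the cryptography package, all names mine:)

```python
from cryptography.fernet import Fernet, InvalidToken

key_used_to_encrypt = Fernet.generate_key()
key_in_current_session = Fernet.generate_key()  # a different key

token = Fernet(key_used_to_encrypt).encrypt(b'{"extra": "field"}')

# Decrypting with the original key works...
assert Fernet(key_used_to_encrypt).decrypt(token) == b'{"extra": "field"}'

# ...but any other key raises the error seen in the traceback above.
try:
    Fernet(key_in_current_session).decrypt(token)
except InvalidToken:
    print("cryptography.fernet.InvalidToken")
```

If this is the cause, the usual remedy is making sure the key in docker-compose matches the key that was active when the connection was saved, and that it stays the same across restarts.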

./mwaa-local-env build-image fails

Hi,

On my Ubuntu 20.04 machine (running on VMware), I just tried to build the image and it failed (Cannot find a valid baseurl for repo: amzn2-core/2/x86_64). I can try to fix it locally; I was just wondering whether this is something you also see:

./mwaa-local-env build-image

Sending build context to Docker daemon 19.95kB
Step 1/25 : FROM amazonlinux
latest: Pulling from library/amazonlinux
3c2c91c7c431: Pull complete
Digest: sha256:06b9e2433e4e563e1d75bc8c71d32b76dc49a2841e9253746eefc8ca40b80b5e
Status: Downloaded newer image for amazonlinux:latest
---> 53ef897d731f
Step 2/25 : LABEL maintainer="amazon"
---> Running in 3465e285216b
Removing intermediate container 3465e285216b
---> a151fe0f7653
Step 3/25 : ARG AIRFLOW_VERSION=1.10.12
---> Running in 05afb75b71b6
Removing intermediate container 05afb75b71b6
---> 5b4993d58b04
Step 4/25 : ARG AIRFLOW_USER_HOME=/usr/local/airflow
---> Running in 5d5cb8815694
Removing intermediate container 5d5cb8815694
---> 1ad36c0f6675
Step 5/25 : ARG AIRFLOW_DEPS=""
---> Running in 87985002a9f0
Removing intermediate container 87985002a9f0
---> 57d6f50d7b58
Step 6/25 : ARG PYTHON_DEPS=""
---> Running in 5dab1621c728
Removing intermediate container 5dab1621c728
---> 2a7007526538
Step 7/25 : ARG SYSTEM_DEPS=""
---> Running in 5f38c7919606
Removing intermediate container 5f38c7919606
---> 586c15a57069
Step 8/25 : ARG INDEX_URL=""
---> Running in 7c55422b3812
Removing intermediate container 7c55422b3812
---> 2b988b48da95
Step 9/25 : ENV AIRFLOW_HOME=${AIRFLOW_USER_HOME}
---> Running in 705cfceb6732
Removing intermediate container 705cfceb6732
---> 34dbf4e14682
Step 10/25 : COPY script/bootstrap.sh /bootstrap.sh
---> bc966836b5d0
Step 11/25 : COPY script/systemlibs.sh /systemlibs.sh
---> 14e5c9ae9c32
Step 12/25 : COPY config/constraints.txt /constraints.txt
---> e25943d55d4f
Step 13/25 : COPY config/requirements.txt /requirements.txt
---> cf80151bae6d
Step 14/25 : RUN chmod u+x /systemlibs.sh && /systemlibs.sh
---> Running in e71c062050f3
Loaded plugins: ovl, priorities

One of the configured repositories failed (Unknown),
and yum doesn't have enough cached data to continue. At this point the only
safe thing yum can do is fail. There are a few ways to work "fix" this:

 1. Contact the upstream for the repository and get them to fix the problem.

 2. Reconfigure the baseurl/etc. for the repository, to point to a working
    upstream. This is most often useful if you are using a newer
    distribution release than is supported by the repository (and the
    packages for the previous distribution release still work).

 3. Run the command with the repository temporarily disabled
        yum --disablerepo=<repoid> ...

 4. Disable the repository permanently, so yum won't use it by default. Yum
    will then just ignore the repository until you permanently enable it
    again or use --enablerepo for temporary usage:

        yum-config-manager --disable <repoid>
    or
        subscription-manager repos --disable=<repoid>

 5. Configure the failing repository to be skipped, if it is unavailable.
    Note that yum will try to contact the repo. when it runs most commands,
    so will have to try and fail each time (and thus. yum will be be much
    slower). If it is a very temporary problem though, this is often a nice
    compromise:

        yum-config-manager --save --setopt=<repoid>.skip_if_unavailable=true

Cannot find a valid baseurl for repo: amzn2-core/2/x86_64
Could not retrieve mirrorlist http://amazonlinux.default.amazonaws.com/2/core/latest/x86_64/mirror.list error was
12: Timeout on http://amazonlinux.default.amazonaws.com/2/core/latest/x86_64/mirror.list: (28, 'Resolving timed out after 5001 milliseconds')
The command '/bin/sh -c chmod u+x /systemlibs.sh && /systemlibs.sh' returned a non-zero code: 1

reset-db: incorrect image

I'm having some issues when calling ./mwaa-local-env reset-db. Error:

Pulling resetdb (amazon/mwaa-local:2.0)...
ERROR: The image for the service you're trying to recreate has been removed. If you continue, volume data could be lost. Consider backing up your data before continuing.

I think this is due to this line:

image: amazon/mwaa-local:2.0

Could this be fixed by changing it to use image amazon/mwaa-local:2.0.2?
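(A hedged fix along those lines, assuming the locally built image is in fact tagged 2.0.2 — check with docker images — would be to pin the resetdb service to the tag that exists:)

```yaml
resetdb:
    image: amazon/mwaa-local:2.0.2
```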

Cannot build on Apple M1

Hardware: MacBook Pro (16-inch, Apple M1, 2021)
OS: macOS Monterey 12.0.1

Step one, ./mwaa-local-env build-image, fails with this error:

 => ERROR [ 7/13] RUN chmod u+x /bootstrap.sh && /bootstrap.sh                                               104.7s
------
 > [ 7/13] RUN chmod u+x /bootstrap.sh && /bootstrap.sh:
#11 0.265 WARNING: Running pip install with root privileges is generally not a good idea. Try `python3 -m pip install --user` instead.
#11 0.501 Collecting pip
#11 0.626   Downloading pip-21.3.1-py3-none-any.whl (1.7 MB)
#11 0.835 Installing collected packages: pip
#11 1.283 Successfully installed pip-21.3.1
#11 1.774 Collecting wheel
#11 1.873   Downloading wheel-0.37.1-py2.py3-none-any.whl (35 kB)
#11 1.902 Installing collected packages: wheel
#11 1.919 Successfully installed wheel-0.37.1
#11 1.919 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
#11 2.107 Requirement already satisfied: pip in /usr/local/lib/python3.7/site-packages (21.3.1)
#11 2.272 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
#11 2.589 Collecting pycurl
#11 2.694   Downloading pycurl-7.44.1.tar.gz (227 kB)
#11 2.800   Preparing metadata (setup.py): started
#11 2.921   Preparing metadata (setup.py): finished with status 'done'
#11 2.923 Building wheels for collected packages: pycurl
#11 2.924   Building wheel for pycurl (setup.py): started
#11 4.765   Building wheel for pycurl (setup.py): finished with status 'done'
#11 4.766   Created wheel for pycurl: filename=pycurl-7.44.1-cp37-cp37m-linux_aarch64.whl size=319527 sha256=b8b1c50817675392d6372e3b90d7972e31a508f1c08d0d20ba810786f025b551
#11 4.766   Stored in directory: /root/.cache/pip/wheels/c5/46/b4/4dc60b406282c22dd4f1ca7da5c949e88aeadb78283123f94d
#11 4.769 Successfully built pycurl
#11 4.789 Installing collected packages: pycurl
#11 4.810 Successfully installed pycurl-7.44.1
#11 4.810 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
#11 5.149 Collecting celery[sqs]
#11 5.262   Downloading celery-5.2.3-py3-none-any.whl (405 kB)
#11 5.578 Collecting setuptools<59.7.0,>=59.1.1
#11 5.606   Downloading setuptools-59.6.0-py3-none-any.whl (952 kB)
#11 5.708 Collecting click-plugins>=1.1.1
#11 5.738   Downloading click_plugins-1.1.1-py2.py3-none-any.whl (7.5 kB)
#11 5.798 Collecting kombu<6.0,>=5.2.3
#11 5.824   Downloading kombu-5.2.3-py3-none-any.whl (189 kB)
#11 5.923 Collecting pytz>=2021.3
#11 5.948   Downloading pytz-2021.3-py2.py3-none-any.whl (503 kB)
#11 6.013 Collecting click-didyoumean>=0.0.3
#11 6.035   Downloading click_didyoumean-0.3.0-py3-none-any.whl (2.7 kB)
#11 6.077 Collecting billiard<4.0,>=3.6.4.0
#11 6.103   Downloading billiard-3.6.4.0-py3-none-any.whl (89 kB)
#11 6.139 Collecting click-repl>=0.2.0
#11 6.167   Downloading click_repl-0.2.0-py3-none-any.whl (5.2 kB)
#11 6.213 Collecting click<9.0,>=8.0.3
#11 6.235   Downloading click-8.0.3-py3-none-any.whl (97 kB)
#11 6.269 Collecting vine<6.0,>=5.0.0
#11 6.291   Downloading vine-5.0.0-py2.py3-none-any.whl (9.4 kB)
#11 6.364 Collecting importlib-metadata
#11 6.405   Downloading importlib_metadata-4.10.0-py3-none-any.whl (17 kB)
#11 6.473 Collecting prompt-toolkit
#11 6.499   Downloading prompt_toolkit-3.0.24-py3-none-any.whl (374 kB)
#11 6.562 Collecting six
#11 6.597   Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
#11 6.677 Collecting amqp<6.0.0,>=5.0.9
#11 6.702   Downloading amqp-5.0.9-py3-none-any.whl (50 kB)
#11 6.738 Collecting cached-property
#11 6.767   Downloading cached_property-1.5.2-py2.py3-none-any.whl (7.6 kB)
#11 6.777 Requirement already satisfied: pycurl~=7.44.1 in /usr/local/lib64/python3.7/site-packages (from kombu<6.0,>=5.2.3->celery[sqs]) (7.44.1)
#11 7.082 Collecting boto3>=1.9.12
#11 7.110   Downloading boto3-1.20.28-py3-none-any.whl (131 kB)
#11 7.163 Collecting urllib3>=1.26.7
#11 7.190   Downloading urllib3-1.26.7-py2.py3-none-any.whl (138 kB)
#11 7.240 Collecting s3transfer<0.6.0,>=0.5.0
#11 7.264   Downloading s3transfer-0.5.0-py3-none-any.whl (79 kB)
#11 7.614 Collecting botocore<1.24.0,>=1.23.28
#11 7.690   Downloading botocore-1.23.28-py3-none-any.whl (8.5 MB)
#11 8.539 Collecting jmespath<1.0.0,>=0.7.1
#11 8.564   Downloading jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
#11 8.616 Collecting zipp>=0.5
#11 8.638   Downloading zipp-3.7.0-py3-none-any.whl (5.3 kB)
#11 8.674 Collecting typing-extensions>=3.6.4
#11 8.697   Downloading typing_extensions-4.0.1-py3-none-any.whl (22 kB)
#11 8.739 Collecting wcwidth
#11 8.761   Downloading wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
#11 8.808 Collecting python-dateutil<3.0.0,>=2.1
#11 8.834   Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
#11 9.050 Installing collected packages: six, zipp, urllib3, typing-extensions, python-dateutil, jmespath, wcwidth, vine, importlib-metadata, botocore, s3transfer, prompt-toolkit, click, cached-property, amqp, setuptools, pytz, kombu, click-repl, click-plugins, click-didyoumean, boto3, billiard, celery
#11 9.713   Attempting uninstall: setuptools
#11 9.713     Found existing installation: setuptools 49.1.3
#11 9.731     Uninstalling setuptools-49.1.3:
#11 9.903       Successfully uninstalled setuptools-49.1.3
#11 10.38 Successfully installed amqp-5.0.9 billiard-3.6.4.0 boto3-1.20.28 botocore-1.23.28 cached-property-1.5.2 celery-5.2.3 click-8.0.3 click-didyoumean-0.3.0 click-plugins-1.1.1 click-repl-0.2.0 importlib-metadata-4.10.0 jmespath-0.10.0 kombu-5.2.3 prompt-toolkit-3.0.24 python-dateutil-2.8.2 pytz-2021.3 s3transfer-0.5.0 setuptools-59.6.0 six-1.16.0 typing-extensions-4.0.1 urllib3-1.26.7 vine-5.0.0 wcwidth-0.2.5 zipp-3.7.0
#11 10.38 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
#11 10.81 Collecting psycopg2
#11 10.92   Downloading psycopg2-2.9.3.tar.gz (380 kB)
#11 11.04   Preparing metadata (setup.py): started
#11 11.15   Preparing metadata (setup.py): finished with status 'done'
#11 11.16 Building wheels for collected packages: psycopg2
#11 11.16   Building wheel for psycopg2 (setup.py): started
#11 14.22   Building wheel for psycopg2 (setup.py): finished with status 'done'
#11 14.22   Created wheel for psycopg2: filename=psycopg2-2.9.3-cp37-cp37m-linux_aarch64.whl size=474755 sha256=98573c214dff5e5ae5b6a67798ce10431559c12ffc09596a9d9565b828e90e07
#11 14.22   Stored in directory: /root/.cache/pip/wheels/20/78/2c/d2f59d80d97357ffd6526f209083e46d57827d94d89ac8c91a
#11 14.23 Successfully built psycopg2
#11 14.30 Installing collected packages: psycopg2
#11 14.32 Successfully installed psycopg2-2.9.3
#11 14.32 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
#11 14.81 Collecting apache-airflow[celery,crypto,statsd]==2.0.2
#11 14.93   Downloading apache_airflow-2.0.2-py3-none-any.whl (4.6 MB)
#11 16.41 Collecting markupsafe<2.0,>=1.1.1
#11 16.43   Downloading MarkupSafe-1.1.1-cp37-cp37m-manylinux2014_aarch64.whl (34 kB)
#11 16.47 Collecting colorlog>=4.0.2
#11 16.49   Downloading colorlog-5.0.1-py2.py3-none-any.whl (10 kB)
#11 16.53 Collecting unicodecsv>=0.14.1
#11 16.55   Downloading unicodecsv-0.14.1.tar.gz (10 kB)
#11 16.56   Preparing metadata (setup.py): started
#11 16.66   Preparing metadata (setup.py): finished with status 'done'
#11 16.69 Collecting python-dateutil<3,>=2.3
#11 16.71   Downloading python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
#11 16.81 Collecting flask-appbuilder>=3.1.1,~=3.1
#11 16.84   Downloading Flask_AppBuilder-3.4.1-py3-none-any.whl (1.9 MB)
#11 17.01 Collecting tabulate<0.9,>=0.7.5
#11 17.03   Downloading tabulate-0.8.9-py3-none-any.whl (25 kB)
#11 17.08 Collecting apache-airflow-providers-imap
#11 17.10   Downloading apache_airflow_providers_imap-1.0.1-py3-none-any.whl (15 kB)
#11 17.15 Collecting werkzeug>=1.0.1,~=1.0
#11 17.17   Downloading Werkzeug-1.0.1-py2.py3-none-any.whl (298 kB)
#11 17.23 Collecting typing-extensions>=3.7.4
#11 17.25   Downloading typing_extensions-3.7.4.3-py3-none-any.whl (22 kB)
#11 17.33 Collecting lazy-object-proxy
#11 17.35   Downloading lazy-object-proxy-1.4.3.tar.gz (34 kB)
#11 17.42   Installing build dependencies: started
#11 19.25   Installing build dependencies: finished with status 'done'
#11 19.25   Getting requirements to build wheel: started
#11 19.34   Getting requirements to build wheel: finished with status 'done'
#11 19.34   Preparing metadata (pyproject.toml): started
#11 19.46   Preparing metadata (pyproject.toml): finished with status 'done'
#11 19.50 Collecting attrs<21.0,>=20.0
#11 19.53   Downloading attrs-20.3.0-py2.py3-none-any.whl (49 kB)
#11 19.57 Collecting flask-wtf<0.15,>=0.14.3
#11 19.59   Downloading Flask_WTF-0.14.3-py2.py3-none-any.whl (13 kB)
#11 19.64 Collecting markdown<4.0,>=2.5.2
#11 19.66   Downloading Markdown-3.3.4-py3-none-any.whl (97 kB)
#11 19.70 Collecting python3-openid~=3.2
#11 19.72   Downloading python3_openid-3.2.0-py3-none-any.whl (133 kB)
#11 19.76 Collecting sqlalchemy-jsonfield~=1.0
#11 19.79   Downloading SQLAlchemy_JSONField-1.0.0-py3-none-any.whl (10 kB)
#11 19.83 Collecting pyjwt<2
#11 19.85   Downloading PyJWT-1.7.1-py2.py3-none-any.whl (18 kB)
#11 20.07 Collecting cryptography>=0.9.3
#11 20.10   Downloading cryptography-3.4.7-cp36-abi3-manylinux2014_aarch64.whl (3.1 MB)
#11 20.36 Collecting apache-airflow-providers-ftp
#11 20.38   Downloading apache_airflow_providers_ftp-1.0.1-py3-none-any.whl (14 kB)
#11 20.41 Collecting blinker
#11 20.43   Downloading blinker-1.4.tar.gz (111 kB)
#11 20.45   Preparing metadata (setup.py): started
#11 20.55   Preparing metadata (setup.py): finished with status 'done'
#11 20.59 Collecting cattrs~=1.1
#11 20.62   Downloading cattrs-1.5.0-py3-none-any.whl (19 kB)
#11 20.65 Collecting flask-login<0.5,>=0.3
#11 20.67   Downloading Flask-Login-0.4.1.tar.gz (14 kB)
#11 20.68   Preparing metadata (setup.py): started
#11 20.78   Preparing metadata (setup.py): finished with status 'done'
#11 20.81 Collecting iso8601>=0.1.12
#11 20.84   Downloading iso8601-0.1.14-py2.py3-none-any.whl (9.5 kB)
#11 20.98 Collecting connexion[flask,swagger-ui]<3,>=2.6.0
#11 21.00   Downloading connexion-2.7.0-py2.py3-none-any.whl (77 kB)
#11 21.04 Collecting python-daemon>=2.2.4
#11 21.06   Downloading python_daemon-2.3.0-py2.py3-none-any.whl (35 kB)
#11 21.11 Collecting gunicorn<20.0,>=19.5.0
#11 21.13   Downloading gunicorn-19.10.0-py2.py3-none-any.whl (113 kB)
#11 21.18 Collecting dill<0.4,>=0.2.2
#11 21.20   Downloading dill-0.3.2.zip (177 kB)
#11 21.22   Preparing metadata (setup.py): started
#11 21.32   Preparing metadata (setup.py): finished with status 'done'
#11 21.35 Collecting python-nvd3~=0.15.0
#11 21.38   Downloading python-nvd3-0.15.0.tar.gz (31 kB)
#11 21.39   Preparing metadata (setup.py): started
#11 21.49   Preparing metadata (setup.py): finished with status 'done'
#11 21.51 Collecting termcolor>=1.1.0
#11 21.54   Downloading termcolor-1.1.0.tar.gz (3.9 kB)
#11 21.54   Preparing metadata (setup.py): started
#11 21.63   Preparing metadata (setup.py): finished with status 'done'
#11 21.71 Collecting pendulum~=2.0
#11 21.73   Downloading pendulum-2.1.2.tar.gz (81 kB)
#11 21.82   Installing build dependencies: started
#11 23.16   Installing build dependencies: finished with status 'done'
#11 23.16   Getting requirements to build wheel: started
#11 23.19   Getting requirements to build wheel: finished with status 'done'
#11 23.20   Preparing metadata (pyproject.toml): started
#11 23.32   Preparing metadata (pyproject.toml): finished with status 'done'
#11 23.40 Collecting importlib-metadata~=1.7
#11 23.43   Downloading importlib_metadata-1.7.0-py2.py3-none-any.whl (31 kB)
#11 23.67 Collecting numpy
#11 23.70   Downloading numpy-1.20.2-cp37-cp37m-manylinux2014_aarch64.whl (12.7 MB)
#11 24.77 Collecting setproctitle<2,>=1.1.8
#11 24.80   Downloading setproctitle-1.2.2-cp37-cp37m-manylinux2014_aarch64.whl (36 kB)
#11 24.92 Collecting psutil<6.0.0,>=4.2.0
#11 24.95   Downloading psutil-5.8.0.tar.gz (470 kB)
#11 25.02   Preparing metadata (setup.py): started
#11 25.17   Preparing metadata (setup.py): finished with status 'done'
#11 25.20 Collecting flask-caching<2.0.0,>=1.5.0
#11 25.22   Downloading Flask_Caching-1.10.1-py3-none-any.whl (34 kB)
#11 25.27 Collecting importlib-resources~=1.4
#11 25.29   Downloading importlib_resources-1.5.0-py2.py3-none-any.whl (21 kB)
#11 25.34 Collecting jinja2<2.12.0,>=2.10.1
#11 25.37   Downloading Jinja2-2.11.3-py2.py3-none-any.whl (125 kB)
#11 25.62 Collecting sqlalchemy<1.4,>=1.3.18
#11 25.65   Downloading SQLAlchemy-1.3.24-cp37-cp37m-manylinux2014_aarch64.whl (1.3 MB)
#11 25.78 Collecting tenacity~=6.2.0
#11 25.80   Downloading tenacity-6.2.0-py2.py3-none-any.whl (24 kB)
#11 25.84 Collecting lockfile>=0.12.2
#11 25.87   Downloading lockfile-0.12.2-py2.py3-none-any.whl (13 kB)
#11 25.90 Collecting python-slugify<5.0,>=3.0.0
#11 25.92   Downloading python-slugify-4.0.1.tar.gz (11 kB)
#11 25.93   Preparing metadata (setup.py): started
#11 26.03   Preparing metadata (setup.py): finished with status 'done'
#11 26.03 Requirement already satisfied: cached-property~=1.5 in /usr/local/lib/python3.7/site-packages (from apache-airflow[celery,crypto,statsd]==2.0.2) (1.5.2)
#11 26.09 Collecting requests>=2.20.0
#11 26.11   Downloading requests-2.25.1-py2.py3-none-any.whl (61 kB)
#11 26.19 Collecting rich==9.2.0
#11 26.21   Downloading rich-9.2.0-py3-none-any.whl (164 kB)
#11 26.27 Collecting graphviz>=0.12
#11 26.29   Downloading graphviz-0.16-py2.py3-none-any.whl (19 kB)
#11 26.44 Collecting pandas<2.0,>=0.17.1
#11 26.47   Downloading pandas-1.2.4.tar.gz (5.5 MB)
#11 27.30   Installing build dependencies: started
#11 68.95   Installing build dependencies: finished with status 'done'
#11 68.95   Getting requirements to build wheel: started
#11 99.33   Getting requirements to build wheel: finished with status 'done'
#11 99.34   Preparing metadata (pyproject.toml): started
#11 99.72   Preparing metadata (pyproject.toml): finished with status 'done'
#11 99.77 Collecting croniter<0.4,>=0.3.17
#11 99.79   Downloading croniter-0.3.37-py2.py3-none-any.whl (13 kB)
#11 99.84 Collecting flask<2.0,>=1.1.0
#11 99.85   Downloading Flask-1.1.2-py2.py3-none-any.whl (94 kB)
#11 99.91 Collecting pygments<3.0,>=2.0.1
#11 99.93   Downloading Pygments-2.8.1-py3-none-any.whl (983 kB)
#11 100.1 Collecting apache-airflow-providers-http
#11 100.1   Downloading apache_airflow_providers_http-1.1.1-py3-none-any.whl (20 kB)
#11 100.2 Collecting itsdangerous>=1.1.0
#11 100.2   Downloading itsdangerous-1.1.0-py2.py3-none-any.whl (16 kB)
#11 100.3 Collecting alembic<2.0,>=1.2
#11 100.3   Downloading alembic-1.5.8-py2.py3-none-any.whl (159 kB)
#11 100.3 Collecting apache-airflow-providers-sqlite
#11 100.4   Downloading apache_airflow_providers_sqlite-1.0.2-py3-none-any.whl (14 kB)
#11 100.4 Collecting argcomplete~=1.10
#11 100.4   Downloading argcomplete-1.12.3-py2.py3-none-any.whl (38 kB)
#11 100.5 Collecting marshmallow-oneofschema>=2.0.1
#11 100.5   Downloading marshmallow_oneofschema-2.1.0-py2.py3-none-any.whl (5.7 kB)
#11 100.5 Collecting jsonschema~=3.0
#11 100.6   Downloading jsonschema-3.2.0-py2.py3-none-any.whl (56 kB)
#11 100.6 Collecting apache-airflow-providers-celery
#11 100.6   Downloading apache_airflow_providers_celery-1.0.1-py3-none-any.whl (11 kB)
#11 100.7 Collecting statsd<4.0,>=3.3.0
#11 100.7   Downloading statsd-3.3.0-py2.py3-none-any.whl (11 kB)
#11 100.7 Collecting colorama<0.5.0,>=0.4.0
#11 100.8   Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
#11 100.8 Collecting commonmark<0.10.0,>=0.9.0
#11 100.8   Downloading commonmark-0.9.1-py2.py3-none-any.whl (51 kB)
#11 100.9 Collecting Mako
#11 100.9   Downloading Mako-1.1.4-py2.py3-none-any.whl (75 kB)
#11 100.9 Collecting python-editor>=0.3
#11 101.0   Downloading python_editor-1.0.4-py3-none-any.whl (4.9 kB)
#11 101.1 Collecting inflection>=0.3.1
#11 101.1   Downloading inflection-0.5.1-py2.py3-none-any.whl (9.5 kB)
#11 101.1 Collecting clickclick>=1.2
#11 101.1   Downloading clickclick-20.10.2-py2.py3-none-any.whl (7.4 kB)
#11 101.2 Collecting PyYAML>=5.1
#11 101.2   Downloading PyYAML-6.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (656 kB)
#11 101.3 Collecting openapi-spec-validator>=0.2.4
#11 101.3   Downloading openapi_spec_validator-0.3.0-py3-none-any.whl (31 kB)
#11 101.4 Collecting swagger-ui-bundle>=0.0.2
#11 101.4   Downloading swagger_ui_bundle-0.0.8-py3-none-any.whl (3.8 MB)
#11 101.7 Collecting natsort
#11 101.9   Downloading natsort-7.1.1-py3-none-any.whl (35 kB)
#11 102.0 Collecting cffi>=1.12
#11 102.1   Downloading cffi-1.14.5-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (206 kB)
#11 102.1 Collecting click>=5.1
#11 102.1   Downloading click-7.1.2-py2.py3-none-any.whl (82 kB)
#11 102.2 Collecting Flask-OpenID<2,>=1.2.5
#11 102.2   Downloading Flask-OpenID-1.2.5.tar.gz (43 kB)
#11 102.2   Preparing metadata (setup.py): started
#11 102.3   Preparing metadata (setup.py): finished with status 'error'
#11 102.3   ERROR: Command errored out with exit status 1:
#11 102.3    command: /usr/bin/python3 -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-fe4h9oll/flask-openid_a450895d34904c97985ae10b610eccaf/setup.py'"'"'; __file__='"'"'/tmp/pip-install-fe4h9oll/flask-openid_a450895d34904c97985ae10b610eccaf/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base /tmp/pip-pip-egg-info-clz4uty_
#11 102.3        cwd: /tmp/pip-install-fe4h9oll/flask-openid_a450895d34904c97985ae10b610eccaf/
#11 102.3   Complete output (1 lines):
#11 102.3   error in Flask-OpenID setup command: use_2to3 is invalid.
#11 102.3   ----------------------------------------
#11 102.3 WARNING: Discarding https://files.pythonhosted.org/packages/d1/a2/9d1fba3287a65f81b9d1c09c4f7cb16f8ea4988b1bc97ffea0d60983338f/Flask-OpenID-1.2.5.tar.gz#sha256=5a8ffe1c8c0ad1cc1f5030e1223ea27f8861ee0215a2a58a528cc61379e5ccab (from https://pypi.org/simple/flask-openid/). Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
#11 102.3 Collecting flask-appbuilder>=3.1.1,~=3.1
#11 102.4   Downloading Flask_AppBuilder-3.4.0-py3-none-any.whl (1.9 MB)
#11 102.5   Downloading Flask_AppBuilder-3.3.4-py3-none-any.whl (1.9 MB)
#11 102.7   Downloading Flask_AppBuilder-3.3.3-py3-none-any.whl (1.9 MB)
#11 102.9   Downloading Flask_AppBuilder-3.3.2-py3-none-any.whl (1.8 MB)
#11 103.1 Collecting prison<1.0.0,>=0.1.3
#11 103.1   Downloading prison-0.1.3-py2.py3-none-any.whl (5.8 kB)
#11 103.1 Collecting flask-appbuilder>=3.1.1,~=3.1
#11 103.2   Downloading Flask_AppBuilder-3.3.1-py3-none-any.whl (1.8 MB)
#11 103.3   Downloading Flask_AppBuilder-3.3.0-py3-none-any.whl (1.8 MB)
#11 103.5   Downloading Flask_AppBuilder-3.2.3-py3-none-any.whl (1.8 MB)
#11 103.7   Downloading Flask_AppBuilder-3.2.2-py3-none-any.whl (1.8 MB)
#11 103.9   Downloading Flask_AppBuilder-3.2.1-py3-none-any.whl (1.8 MB)
#11 104.0   Downloading Flask_AppBuilder-3.2.0-py3-none-any.whl (1.8 MB)
#11 104.2   Downloading Flask_AppBuilder-3.1.1-py3-none-any.whl (1.7 MB)
#11 104.3 INFO: pip is looking at multiple versions of flask to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of dill to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of cryptography to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of croniter to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of connexion[flask,swagger-ui] to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of colorlog to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of cattrs to determine which version is compatible with other requirements. This could take a while.
#11 104.3 INFO: pip is looking at multiple versions of cached-property to determine which version is compatible with other requirements. This could take a while.
#11 104.4 Collecting cached-property~=1.5
#11 104.4   Using cached cached_property-1.5.2-py2.py3-none-any.whl (7.6 kB)
#11 104.4 INFO: pip is looking at multiple versions of attrs to determine which version is compatible with other requirements. This could take a while.
#11 104.4 INFO: pip is looking at multiple versions of argcomplete to determine which version is compatible with other requirements. This could take a while.
#11 104.4 INFO: pip is looking at multiple versions of alembic to determine which version is compatible with other requirements. This could take a while.
#11 104.4 INFO: pip is looking at multiple versions of <Python from Requires-Python> to determine which version is compatible with other requirements. This could take a while.
#11 104.4 INFO: pip is looking at multiple versions of rich to determine which version is compatible with other requirements. This could take a while.
#11 104.4 INFO: pip is looking at multiple versions of apache-airflow[celery,crypto,statsd] to determine which version is compatible with other requirements. This could take a while.
#11 104.4 ERROR: Cannot install apache-airflow because these package versions have conflicting dependencies.
#11 104.4
#11 104.4 The conflict is caused by:
#11 104.4     flask-appbuilder 3.4.1 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.4.0 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.3.4 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.3.3 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.3.2 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.3.1 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.3.0 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.2.3 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.2.2 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.2.1 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.2.0 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     flask-appbuilder 3.1.1 depends on Flask-OpenID<2 and >=1.2.5
#11 104.4     The user requested (constraint) flask-openid==1.2.5
#11 104.4
#11 104.4 To fix this you could try to:
#11 104.4 1. loosen the range of package versions you've specified
#11 104.4 2. remove package versions to allow pip attempt to solve the dependency conflict
#11 104.4
#11 104.4 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/user_guide/#fixing-conflicting-dependencies
------
executor failed running [/bin/sh -c chmod u+x /bootstrap.sh && /bootstrap.sh]: exit code: 1

This might be related to #1, as well as to https://stackoverflow.com/questions/70310922/issue-running-airflow-on-mac-m1-error-in-flask-openid-setup-command-use-2to3-i.
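A workaround that has worked for similar `use_2to3` failures (this is an assumption about this repo's Dockerfile, not an official fix): setuptools 58 removed `use_2to3` support, so pinning an older setuptools before the requirements install lets the Flask-OpenID 1.2.5 source distribution build on aarch64, where no prebuilt wheel exists.

```dockerfile
# Hypothetical patch to docker/Dockerfile, before the bootstrap/requirements
# step: setuptools >= 58 rejects packages that declare use_2to3, which is why
# the Flask-OpenID 1.2.5 sdist fails to build from source on aarch64.
RUN pip3 install "setuptools<58"
```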

fixing permissions on existing directory /var/lib/postgresql/data ... initdb: could not change permissions of directory "/var/lib/postgresql/data": Operation not permitted

Issue

Running ./mwaa-local-env start throws the error below.

Error

aws-mwaa-local-runner git/main 20s
❯ ./mwaa-local-env start
Starting docker_postgres_1 ... done
Starting docker_local-runner_1 ... done
Attaching to docker_postgres_1, docker_local-runner_1
postgres_1 | chmod: /var/lib/postgresql/data: Operation not permitted
postgres_1 | The files belonging to this database system will be owned by user "postgres".
postgres_1 | This user must also own the server process.
postgres_1 |
postgres_1 | The database cluster will be initialized with locale "en_US.utf8".
postgres_1 | The default database encoding has accordingly been set to "UTF8".
postgres_1 | The default text search configuration will be set to "english".
postgres_1 |
postgres_1 | Data page checksums are disabled.
postgres_1 |
postgres_1 | fixing permissions on existing directory /var/lib/postgresql/data ... initdb: could not change permissions of directory "/var/lib/postgresql/data": Operation not permitted
docker_postgres_1 exited with code 1
local-runner_1 | Fri Jun 11 18:48:08 UTC 2021 - waiting for Postgres... 1/20

To replicate

I'm running this on my Windows machine under WSL 2 (Ubuntu 20.04). Host OS details below:

aws-mwaa-local-runner git/main*
❯ uname -a
Linux SEA-5CG021B68D 4.19.128-microsoft-standard #1 SMP Tue Jun 23 12:58:10 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux

aws-mwaa-local-runner git/main*
❯ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 20.04.2 LTS
Release: 20.04
Codename: focal

Fix

The fix seemed easy enough: modify the Postgres volume mount in the docker-compose-local.yml file from this:

volumes:
            - "${PWD}/db-data:/var/lib/postgresql/data"

to this:

volumes:
            - "${PWD}/db-data:/var/lib/postgresql/data:z"

Essentially, you append ":z" to the end of the volume mount, which tells Docker to relabel the shared content with an SELinux label so the container can access it.

Source: https://stackoverflow.com/a/31334443
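If relabeling doesn't help (the ":z" flag is an SELinux relabel option and may not apply on WSL), another sketch is to switch the Postgres data directory to a Docker named volume, which sidesteps host-filesystem permission rules entirely. The service and volume names below mirror docker-compose-local.yml but are assumptions; adjust to the actual file.

```yaml
# Hypothetical alternative in docker/docker-compose-local.yml: a named volume
# lets initdb chmod its data directory regardless of the host filesystem.
services:
    postgres:
        volumes:
            - db-data:/var/lib/postgresql/data

volumes:
    db-data:
```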

branch v2.0.2 unintentionally installs airflow v2.1.0

It appears that Airflow v2.1.0 is being installed instead of the intended v2.0.2.

Steps to replicate

git clone --branch v2.0.2 https://github.com/aws/aws-mwaa-local-runner.git
./mwaa-local-env build-image
./mwaa-local-env start

This results in the following version information being displayed in the web UI:

Version: v2.1.0
Git Version: .release:2.1.0+304e174674ff6921cb7ed79c0158949b50eff8fe

aws config

Documentation is missing on how to provide your local .aws/ configuration (credentials and config) to the container.
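One common approach (an assumption, since the repo doesn't document it): bind-mount your local AWS configuration read-only into the container's home directory via docker-compose-local.yml, so the AWS SDK and CLI inside the container pick up your credentials. The mount target assumes /usr/local/airflow is the container user's home, as the log paths above suggest.

```yaml
# Hypothetical addition to the local-runner service in docker-compose-local.yml:
volumes:
    - "${HOME}/.aws:/usr/local/airflow/.aws:ro"
```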

wrong default config value

dagbag_import_timeout = 30.0 in airflow.cfg fails at startup; Airflow 2.0 reads this option with getint(), so the value must be a plain integer. Changing it to 30 fixes it.

local-runner_1 | ValueError: invalid literal for int() with base 10: '30.0'
local-runner_1 | Traceback (most recent call last):
local-runner_1 | File "/usr/local/bin/airflow", line 5, in <module>
local-runner_1 | from airflow.__main__ import main
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 50, in <module>
local-runner_1 | from airflow.models import DAG  # noqa: E402
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/models/__init__.py", line 21, in <module>
local-runner_1 | from airflow.models.baseoperator import BaseOperator, BaseOperatorLink  # noqa: F401
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/models/baseoperator.py", line 43, in <module>
local-runner_1 | from airflow.models.dag import DAG
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/models/dag.py", line 52, in <module>
local-runner_1 | from airflow.models.dagbag import DagBag
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 50, in <module>
local-runner_1 | class DagBag(BaseDagBag, LoggingMixin):
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/models/dagbag.py", line 80, in DagBag
local-runner_1 | DAGBAG_IMPORT_TIMEOUT = conf.getint('core', 'DAGBAG_IMPORT_TIMEOUT')
local-runner_1 | File "/usr/local/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 418, in getint
local-runner_1 | return int(self.get(section, key, **kwargs))
local-runner_1 | ValueError: invalid literal for int() with base 10: '30.0'
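The failure above is easy to reproduce outside Airflow: as the last frames of the traceback show, getint() calls int() directly on the raw string from airflow.cfg, and int() rejects a decimal string. A minimal sketch:

```python
def getint(value: str) -> int:
    # Sketch of how the config value is coerced, per the traceback above:
    # int() is applied directly to the raw string, so "30.0" is rejected.
    return int(value)

print(getint("30"))  # a plain integer string parses fine

try:
    getint("30.0")
except ValueError as err:
    print(err)  # invalid literal for int() with base 10: '30.0'
```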
