
airflow-tutorial's People

Contributors: robinparriath, tuanavu

airflow-tutorial's Issues

FileNotFoundError: [Errno 2] No such file or directory while running docker-compose up -d

I'm using Ubuntu 20.04. I've installed all the prerequisites, but when I run

docker-compose up -d

I get:

Traceback (most recent call last):
  File "urllib3/connectionpool.py", line 677, in urlopen
  File "urllib3/connectionpool.py", line 392, in _make_request
  File "http/client.py", line 1277, in request
  File "http/client.py", line 1323, in _send_request
  File "http/client.py", line 1272, in endheaders
  File "http/client.py", line 1032, in _send_output
  File "http/client.py", line 972, in send
  File "docker/transport/unixconn.py", line 43, in connect
FileNotFoundError: [Errno 2] No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "requests/adapters.py", line 449, in send
  File "urllib3/connectionpool.py", line 727, in urlopen
  File "urllib3/util/retry.py", line 410, in increment
  File "urllib3/packages/six.py", line 734, in reraise
  File "urllib3/connectionpool.py", line 677, in urlopen
  File "urllib3/connectionpool.py", line 392, in _make_request
  File "http/client.py", line 1277, in request
  File "http/client.py", line 1323, in _send_request
  File "http/client.py", line 1272, in endheaders
  File "http/client.py", line 1032, in _send_output
  File "http/client.py", line 972, in send
  File "docker/transport/unixconn.py", line 43, in connect
urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "docker/api/client.py", line 214, in _retrieve_server_version
  File "docker/api/daemon.py", line 181, in version
  File "docker/utils/decorators.py", line 46, in inner
  File "docker/api/client.py", line 237, in _get
  File "requests/sessions.py", line 543, in get
  File "requests/sessions.py", line 530, in request
  File "requests/sessions.py", line 643, in send
  File "requests/adapters.py", line 498, in send
requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "docker-compose", line 3, in <module>
  File "compose/cli/main.py", line 80, in main
  File "compose/cli/main.py", line 189, in perform_command
  File "compose/cli/command.py", line 70, in project_from_options
  File "compose/cli/command.py", line 153, in get_project
  File "compose/cli/docker_client.py", line 43, in get_client
  File "compose/cli/docker_client.py", line 170, in docker_client
  File "docker/api/client.py", line 197, in __init__
  File "docker/api/client.py", line 222, in _retrieve_server_version
docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
[358] Failed to execute script docker-compose
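This `FileNotFoundError` is usually the Docker client failing to open the daemon's Unix socket, i.e. the Docker daemon isn't running (or isn't reachable) rather than anything being wrong with the compose file. A quick check, assuming the default socket path:

```shell
SOCK=/var/run/docker.sock
if [ -S "$SOCK" ]; then
  echo "daemon socket present"
else
  echo "no docker socket; try: sudo systemctl start docker"
fi
```

If the socket exists but access is denied, the usual fix is adding your user to the docker group (`sudo usermod -aG docker "$USER"`, then log out and back in).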

Java Jar not running

[2020-05-11 19:39:34,743] {{bash_operator.py:97}} INFO - Running command: java -cp /usr/local/airflow/dags/trainingMaven-1.0-SNAPSHOT.jar practice.Test
[2020-05-11 19:39:34,781] {{bash_operator.py:106}} INFO - Output:
[2020-05-11 19:39:34,794] {{bash_operator.py:110}} INFO - /tmp/airflowtmpypbdvug4/usgs_fetchor00rg02: line 1: java: command not found
[2020-05-11 19:39:34,795] {{bash_operator.py:114}} INFO - Command exited with return code 127
[2020-05-11 19:39:34,867] {{models.py:1760}} ERROR - Bash command failed
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 1659, in _run_raw_task
result = task_copy.execute(context=context)
File "/usr/local/lib/python3.6/site-packages/airflow/operators/bash_operator.py", line 118, in execute
raise AirflowException("Bash command failed")
airflow.exceptions.AirflowException: Bash command failed
[2020-05-11 19:39:34,880] {{models.py:1791}} INFO - Marking task as FAILED.
[2020-05-11 19:39:34,978] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch /usr/local/lib/python3.6/site-packages/airflow/utils/helpers.py:346: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
[2020-05-11 19:39:34,978] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch DeprecationWarning)
[2020-05-11 19:39:34,985] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch Traceback (most recent call last):
[2020-05-11 19:39:34,985] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/bin/airflow", line 32, in <module>
[2020-05-11 19:39:34,985] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch args.func(args)
[2020-05-11 19:39:34,985] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/lib/python3.6/site-packages/airflow/utils/cli.py", line 74, in wrapper
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch return f(*args, **kwargs)
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 490, in run
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch _run(args, dag, ti)
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 406, in _run
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch pool=args.pool,
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch return func(*args, **kwargs)
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 1659, in _run_raw_task
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch result = task_copy.execute(context=context)
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch File "/usr/local/lib/python3.6/site-packages/airflow/operators/bash_operator.py", line 118, in execute
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch raise AirflowException("Bash command failed")
[2020-05-11 19:39:34,986] {{base_task_runner.py:101}} INFO - Job 8: Subtask usgs_fetch airflow.exceptions.AirflowException: Bash command failed
[2020-05-11 19:39:35,431] {{logging_mixin.py:95}} INFO - [2020-05-11 19:39:35,430] {{jobs.py:2695}} WARNING - State of this instance has been externally set to failed. Taking the poison pill.
[2020-05-11 19:39:35,462] {{helpers.py:240}} INFO - Sending Signals.SIGTERM to GPID 237
[2020-05-11 19:39:35,535] {{helpers.py:230}} INFO - Process psutil.Process(pid=237 (terminated)) (237) terminated with exit code 15
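`java: command not found` (exit code 127) means the puckel/docker-airflow image simply ships no JRE; the jar path itself is fine. One way to fix it is a derived image that installs Java (a sketch; the base tag and package name are assumptions for a Debian-based image):

```dockerfile
FROM puckel/docker-airflow:1.10.9
USER root
# install a headless JRE so BashOperator can invoke `java`
RUN apt-get update \
 && apt-get install -y --no-install-recommends default-jre-headless \
 && rm -rf /var/lib/apt/lists/*
USER airflow
```

Then point the webserver's `build` in docker-compose.yml at this Dockerfile instead of the upstream image.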

Version prerequisites and updated docker-compose file to run this repo

Following are the version prerequisites to run this repo, along with the updated docker-compose.yml file.

docker version => 20.10.8, build 3967b7d
docker-compose version => 1.27.4, build unknown
airflow version => 2.1.3

Updated docker-compose.yml file:

version: '3'
services:
  postgres:
    image: postgres:9.6
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5432:5432"

  webserver:
    image: puckel/docker-airflow:1.10.9
    build:
      context: https://github.com/puckel/docker-airflow.git#1.10.9
      dockerfile: Dockerfile
      args:
        AIRFLOW_DEPS: gcp_api,s3
        PYTHON_DEPS: sqlalchemy==1.3.0
    restart: always
    depends_on:
      - postgres
    environment:
      - LOAD_EX=n
      - EXECUTOR=Local
      - FERNET_KEY=jsDPRErfv8Z_eVTnGfF8ywd19j4pyqE3NpdUBA_oRTo=
    volumes:
      - ./examples/intro-example/dags:/usr/local/airflow/dags
      # Uncomment to include custom plugins
      # - ./plugins:/usr/local/airflow/plugins
    ports:
      - "8080:8080"
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3

Cannot start service webserver: error while creating mount source path

At first I hit the missing MySQL module error and applied the fix suggested in earlier issues:

webserver:
  image: puckel/docker-airflow:1.10.9
  build:
    context: https://github.com/puckel/docker-airflow.git#1.10.9
    dockerfile: Dockerfile
    args:
      AIRFLOW_DEPS: gcp_api,s3
      PYTHON_DEPS: sqlalchemy==1.3.0 markupsafe==2.0.1 wtforms==2.2

That updated airflow, sqlalchemy, and wtforms, but now my webserver fails with the error below.
Are there any known solutions?

Starting airflow-tutorial_webserver_1 ... error

ERROR: for airflow-tutorial_webserver_1 Cannot start service webserver: error while creating mount source path '/run/desktop/mnt/host/c/Users/Roehr/Desktop/airflow-tutorial/examples/intro-example/dags': mkdir /run/desktop/mnt/host/c: file exists

ERROR: for webserver Cannot start service webserver: error while creating mount source path '/run/desktop/mnt/host/c/Users/Roehr/Desktop/airflow-tutorial/examples/intro-example/dags': mkdir /run/desktop/mnt/host/c: file exists
Encountered errors while bringing up the project.

Error executing docker-compose up -d

Hi, thank you for the tutorial; I've been watching the videos on your YouTube channel and you did an amazing job. When I execute docker-compose up -d, the output is this:

Building webserver
unable to prepare context: unable to 'git clone' to temporary context directory: error fetching: /usr/lib/git-core/git-remote-https: /tmp/_MEImV8j6O/libcrypto.so.1.1: version `OPENSSL_1_1_1' not found (required by /lib/x86_64-linux-gnu/libssh.so.4)
: exit status 128
ERROR: Service 'webserver' failed to build : Build failed

My environment is:

Distributor ID:	Ubuntu
Description:	Ubuntu 20.04.1 LTS
Release:	20.04
Codename:	focal
Docker version 20.10.6, build 370c289
docker-compose version 1.29.1, build c34c88b2

I have the OPENSSL Library installed in my system:

> openssl version
OpenSSL 1.1.1f  31 Mar 2020

Could you help me, please?
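The `OPENSSL_1_1_1' not found` message comes from the standalone (PyInstaller-bundled) docker-compose binary: its bundled libcrypto shadows the system one while compose shells out to git to fetch the remote build context, so having OpenSSL 1.1.1f installed doesn't help. A commonly reported workaround (an assumption, not an official fix) is to clone puckel/docker-airflow yourself and build from a local context, so compose never runs git:

```yaml
webserver:
  build:
    # after: git clone --branch 1.10.9 https://github.com/puckel/docker-airflow.git
    context: ./docker-airflow
    dockerfile: Dockerfile
```

Installing docker-compose via pip instead of the standalone binary also avoids the bundled-library clash.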

failed to solve with frontend dockerfile.v0: failed to read dockerfile

I get an error on docker-compose up while building the webserver

Status: Downloaded newer image for postgres:9.6
Building webserver
[+] Building 2.0s (1/1) FINISHED                                                
 => [internal] load git source https://github.com/puckel/docker-airflow.g  2.0s
failed to solve with frontend dockerfile.v0: failed to read dockerfile: open /var/lib/docker/tmp/buildkit-mount692468792/https:/github.com/puckel/docker-airflow.git#1.10.1/Dockerfile: no such file or directory                               
ERROR: Service 'webserver' failed to build

I have shared my docker and docker-compose versions in the attached screenshot.
Any pointers to fix this issue?

(screenshot: Screenshot 2021-03-16 at 7:57:51 PM)

Docker-compose up Error (Failed to build webserver)

Hi, after reading the docs and running docker-compose up on my machine, I got this error:

Building webserver
[+] Building 0.0s (1/1) FINISHED                                                                                                                                                     
 => CACHED [internal] load git source https://github.com/puckel/docker-airflow.git#1.10.1                                                                                       0.0s
failed to solve with frontend dockerfile.v0: failed to read dockerfile: open /var/lib/docker/tmp/buildkit-mount261987239/https:/github.com/puckel/docker-airflow.git#1.10.1/Dockerfile: no such file or directory
ERROR: Service 'webserver' failed to build

My configs:

Distributor ID:	Mac OS
Description:	macOS Big Sur
Release:	11.2.3 (20D91)
Docker version 20.10.5, build 55c4c88
docker-compose version 1.28.5, build c4eb3a1f

Can someone help me?
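The `failed to read dockerfile: open .../https:/github.com/puckel/docker-airflow.git#1.10.1/Dockerfile` shape of this error suggests BuildKit is mishandling the `#1.10.1` fragment in the git-URL build context and treating the whole URL as a path. A commonly suggested workaround is to disable BuildKit for this build so the classic builder handles the git context:

```shell
# fall back to the classic builder, which understands the '#branch' git-URL fragment
export DOCKER_BUILDKIT=0
export COMPOSE_DOCKER_CLI_BUILD=0
# then re-run: docker-compose up -d
```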

docker-compose up fails

After git clone, running docker-compose up fails:

docker-compose up
Building webserver
ERROR: error initializing submodules: usage: git submodule [--quiet] add [-b <branch>] [-f|--force] [--name <name>] [--reference <repository>] [--] <repository> [<path>]
   or: git submodule [--quiet] status [--cached] [--recursive] [--] [<path>...]
   or: git submodule [--quiet] init [--] [<path>...]
   or: git submodule [--quiet] deinit [-f|--force] [--] <path>...
   or: git submodule [--quiet] update [--init] [--remote] [-N|--no-fetch] [-f|--force] [--rebase] [--reference <repository>] [--merge] [--recursive] [--] [<path>...]
   or: git submodule [--quiet] summary [--cached|--files] [--summary-limit <n>] [commit] [--] [<path>...]
   or: git submodule [--quiet] foreach [--recursive] <command>
   or: git submodule [--quiet] sync [--recursive] [--] [<path>...]
: exit status

Docker Version -

docker version
Client: Docker Engine - Community
 Version:           19.03.12
 API version:       1.40
 Go version:        go1.13.10
 Git commit:        48a66213fe
 Built:             Mon Jun 22 15:46:54 2020
 OS/Arch:           linux/amd64
 Experimental:      false

Server: Docker Engine - Community
 Engine:
  Version:          19.03.12
  API version:      1.40 (minimum version 1.12)
  Go version:       go1.13.10
  Git commit:       48a66213fe
  Built:            Mon Jun 22 15:45:28 2020
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.2.13
  GitCommit:        7ad184331fa3e55e52b890ea95e65ba581ae3429
 runc:
  Version:          1.0.0-rc10
  GitCommit:        dc9208a3303feef5b3839f4323d9beb36df0a9dd
 docker-init:
  Version:          0.18.0
  GitCommit:        fec3683

docker-compose version

docker-compose version 1.26.2, build eefe0d31
docker-py version: 4.2.2
CPython version: 3.7.7
OpenSSL version: OpenSSL 1.1.0l  10 Sep 2019

Git version

git version
git version 1.8.3.1



cat /etc/redhat-release
CentOS Linux release 7.8.2003 (Core)
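git 1.8.3.1 (the stock CentOS 7 package) predates several `git submodule` flags that docker-compose passes when cloning a remote build context, which is what the usage message above indicates. Upgrading git, or cloning the base-image repo manually so compose builds from a local path, sidesteps it (the clone command below is a sketch):

```shell
git --version   # 1.8.3.1 here; the remote-context clone needs a newer git
# workaround: fetch the base image repo yourself and point the compose
# build context at ./docker-airflow instead of the git URL:
# git clone --branch 1.10.9 https://github.com/puckel/docker-airflow.git
```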

Unable to run the webserver

Hi, thank you for sharing the tutorial.
After docker-compose up I'm getting the below error:

webserver_1 | Traceback (most recent call last):
webserver_1 | File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 377, in process_file
webserver_1 | m = imp.load_source(mod_name, filepath)
webserver_1 | File "/usr/local/lib/python3.6/imp.py", line 172, in load_source
webserver_1 | module = _load(spec)
webserver_1 | File "<frozen importlib._bootstrap>", line 684, in _load
webserver_1 | File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
webserver_1 | File "<frozen importlib._bootstrap_external>", line 678, in exec_module
webserver_1 | File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
webserver_1 | File "/usr/local/airflow/dags/example_variables.py", line 33, in <module>
webserver_1 | dag_config = Variable.get("example_variables_config", deserialize_json=True)
webserver_1 | File "/usr/local/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
webserver_1 | return func(*args, **kwargs)
webserver_1 | File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 4551, in get
webserver_1 | raise KeyError('Variable {} does not exist'.format(key))
webserver_1 | KeyError: 'Variable example_variables_config does not exist'

Can you advise what may be causing it?
Thanks
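The KeyError means the DAG file is parsed before the `example_variables_config` Airflow Variable exists in the metadata DB. Creating it up front clears the error; one way with the 1.10-era CLI (the JSON value here is a placeholder; use whatever keys the DAG actually expects):

```shell
docker-compose run --rm webserver \
  airflow variables -s example_variables_config '{"message": "hello"}'
```

The Variable can equally be created in the UI under Admin > Variables.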

Issue on Install

ERROR: for airflowtutorial_webserver_1 Cannot create container for service webserver: invalid volume specification: 'C:\Program Files\Docker Toolbox\airflow-tutorial\examples\intro-example\dags:/usr/local/airflow/dags:rw'

docker-compose up failed

docker.errors.DockerException: Error while fetching server API version: (2, 'CreateFile', 'The system cannot find the file specified.')

example_variables_config error

Using the master branch, docker-compose up completes with an error, and localhost:8080 comes up with this error/warning:

Broken DAG: [/usr/local/airflow/dags/example_variables.py] 'Variable example_variables_config does not exist'

docker-compose up error. Couldn't create the network.

Hi. I got this error:

$ docker-compose up --build
Creating network "airflow-tutorial_default" with the default driver
Building webserver
unable to prepare context: unable to 'git clone' to temporary context directory: error fetching: /usr/lib/git-core/git-remote-https: /tmp/_MEIgnICZE/libcrypto.so.1.1: version `OPENSSL_1_1_1' not found (required by /usr/lib/x86_64-linux-gnu/libssh.so.4)
: exit status 128
ERROR: Service 'webserver' failed to build : Build failed

What am I doing wrong?

Variable import failed -- used CLI method

Hi,

I have created a variables.json file to be loaded for my DAGs; the file is as follows:

{
	"variables_config": {
		"run_name": "run_v5.0",
		"model_name": "bert-base-uncased",
		"model_type": "bert"
	}
}

When I try to import this variables file using the CLI, the following error is generated:

[2020-06-29 14:36:02,949] {{cli_action_loggers.py:107}} WARNING - Failed to log action with (psycopg2.errors.UndefinedTable) relation "log" does not exist
LINE 1: INSERT INTO log (dttm, dag_id, task_id, event, execution_dat...
                    ^

[SQL: INSERT INTO log (dttm, dag_id, task_id, event, execution_date, owner, extra) VALUES (%(dttm)s, %(dag_id)s, %(task_id)s, %(event)s, %(execution_date)s, %(owner)s, %(extra)s) RETURNING log.id]
[parameters: {'dttm': datetime.datetime(2020, 6, 29, 14, 36, 2, 937905, tzinfo=<Timezone [UTC]>), 'dag_id': None, 'task_id': None, 'event': 'cli_variables', 'execution_date': None, 'owner': 'airflow', 'extra': '{"host_name": "903358252eab", "full_command": "[\'/usr/local/bin/airflow\', \'variables\', \'--import\', \'/usr/local/airflow/dags/config/variables.json\']"}'}]
(Background on this error at: http://sqlalche.me/e/13/f405)
Variable import failed: ProgrammingError('(psycopg2.errors.UndefinedTable) relation "variable" does not exist\nLINE 1: DELETE FROM variable WHERE variable.key = \'variables_config\'\n                    ^\n')
0 of 1 variables successfully updated.
1 variable(s) failed to be updated.

Command used:

docker-compose run --rm webserver airflow variables --import /usr/local/airflow/dags/config/variables.json

Code to access the variables:

default_args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(1),# dt.datetime(2020, 6, 25),
    'concurrency': 1,
    'retries': 0,
    'schedule_interval':'@once'
}
dag_config = Variable.get("variables_config", deserialize_json=True)

run_name = dag_config["run_name"]
model_name = dag_config["model_name"]
model_type = dag_config["model_type"]

Can you help out with this issue?

P.S. I have forked the repo here and changed the volume mappings for custom use.
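`relation "log" does not exist` (and later `relation "variable"`) means the Airflow metadata tables were never created in Postgres, so the variable import had nothing to write to. Initializing the DB first, then re-running the import from the source above, should clear it:

```shell
docker-compose run --rm webserver airflow initdb
docker-compose run --rm webserver \
  airflow variables --import /usr/local/airflow/dags/config/variables.json
```

`airflow initdb` is the 1.10-era command (it became `airflow db init` in Airflow 2.x).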

Failed to execute script docker-compose

Executing docker-compose up finished with an error after pulling postgres (postgres:9.6) successfully:

$ docker-compose up
Creating network "airflow-tutorial_default" with the default driver
Pulling postgres (postgres:9.6)...
9.6: Pulling from library/postgres
f7e2b70d04ae: Pull complete
027ad848ac9c: Pull complete
7c040ef66643: Pull complete
b891079ad2eb: Pull complete
cb64a97e42d9: Pull complete
1b88625f7d89: Pull complete
a6ac0b663e77: Pull complete
594497f0a694: Pull complete
0189bc4eb328: Pull complete
ded134b9085f: Pull complete
5eb52aef3e1c: Pull complete
54620ac345cf: Pull complete
d7d7bc54ed9c: Pull complete
cd9943f5a731: Pull complete
Building webserver
Traceback (most recent call last):
File "docker-compose", line 6, in <module>
File "compose\cli\main.py", line 71, in main
File "compose\cli\main.py", line 127, in perform_command
File "compose\cli\main.py", line 1080, in up
File "compose\cli\main.py", line 1076, in up
File "compose\project.py", line 475, in up
File "compose\service.py", line 356, in ensure_image_exists
File "compose\service.py", line 1080, in build
File "site-packages\docker\api\build.py", line 142, in build
TypeError: You must specify a directory to build in path
[20384] Failed to execute script docker-compose
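`TypeError: You must specify a directory to build in path` suggests this older Windows docker-compose build never recognizes the git URL as a remote context and falls through to the local-directory code path. A workaround sketch: clone puckel/docker-airflow next to this repo and build from the local directory (upgrading docker-compose is another reported fix):

```yaml
webserver:
  build:
    # after: git clone --branch 1.10.9 https://github.com/puckel/docker-airflow.git
    context: ./docker-airflow
    dockerfile: Dockerfile
```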

webserver-1 keeps crashing

My environment is Windows 10 + Docker Desktop:

Step 1: git clone https://github.com/tuanavu/airflow-tutorial.git
Step 2: cd airflow-tutorial
Step 3: docker-compose up

airflow-tutorial-webserver-1 | /usr/local/lib/python3.6/site-packages/airflow/configuration.py:57: CryptographyDeprecationWarning: Python 3.6 is no longer supported by the Python core team. Therefore, support for it is deprecated in cryptography and will be removed in a future release.
airflow-tutorial-webserver-1 | from cryptography.fernet import Fernet
airflow-tutorial-webserver-1 | Traceback (most recent call last):
airflow-tutorial-webserver-1 | File "/usr/local/lib/python3.6/site-packages/MySQLdb/__init__.py", line 18, in <module>
airflow-tutorial-webserver-1 | from . import _mysql
airflow-tutorial-webserver-1 | ImportError: libmariadb.so.3: cannot open shared object file: No such file or directory
airflow-tutorial-webserver-1 |
airflow-tutorial-webserver-1 | During handling of the above exception, another exception occurred:
airflow-tutorial-webserver-1 |
airflow-tutorial-webserver-1 | Traceback (most recent call last):
airflow-tutorial-webserver-1 | File "/usr/local/bin/airflow", line 21, in <module>
airflow-tutorial-webserver-1 | from airflow import configuration
airflow-tutorial-webserver-1 | File "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", line 36, in <module>
airflow-tutorial-webserver-1 | from airflow import settings
airflow-tutorial-webserver-1 | File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 261, in <module>
airflow-tutorial-webserver-1 | configure_adapters()
airflow-tutorial-webserver-1 | File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 218, in configure_adapters
airflow-tutorial-webserver-1 | import MySQLdb.converters
airflow-tutorial-webserver-1 | File "/usr/local/lib/python3.6/site-packages/MySQLdb/__init__.py", line 24, in <module>
airflow-tutorial-webserver-1 | version_info, _mysql.version_info, _mysql.__file__
airflow-tutorial-webserver-1 | NameError: name '_mysql' is not defined
airflow-tutorial-webserver-1 | Traceback (most recent call last):
airflow-tutorial-webserver-1 | File "/usr/local/lib/python3.6/site-packages/MySQLdb/__init__.py", line 18, in <module>
airflow-tutorial-webserver-1 | from . import _mysql
airflow-tutorial-webserver-1 | ImportError: libmariadb.so.3: cannot open shared object file: No such file or directory

dag_config not defined, cannot import example_variables.py

When I start the containers with docker-compose up, I get this error: ERROR - Failed to import: /usr/local/airflow/dags/example_variables.py

Then when I check to see what's up on localhost:8080, there are two dags showing but at the top is this error: Broken DAG: [/usr/local/airflow/dags/example_variables.py] name 'dag_config' is not defined

I see this file on my machine, but it is not in /usr/local/airflow (I'm on a Mac); it's in airflow-tutorial/examples/intro-example/dags. I also see that docker-compose.yml defines this location, which does not exist on my machine after using docker-compose to start the containers:
volumes:
- ./examples/intro-example/dags:/usr/local/airflow/dags

What is the best way to correct this? Should I change the path under volumes:? Even if I do, I think example_variables.py comments out the line that assigns a value to dag_config.

Thanks for any suggestions!
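Both symptoms here (`Variable ... does not exist` and, once the assignment is commented out, `name 'dag_config' is not defined`) point to the same root cause: the DAG file hard-fails at parse time when the Variable is absent. Airflow's `Variable.get` accepts a `default_var` for exactly this case. The snippet below sketches the pattern with a tiny stand-in store so it runs outside Airflow; `get_variable` is a hypothetical helper, not an Airflow API:

```python
import json

def get_variable(store, key, default_var=None, deserialize_json=False):
    """Stand-in for airflow.models.Variable.get: return default_var
    instead of raising when the key is missing."""
    if key not in store:
        return default_var
    raw = store[key]
    return json.loads(raw) if deserialize_json else raw

# The equivalent call inside a real DAG file (Airflow 1.10 API) would be:
#   dag_config = Variable.get("example_variables_config",
#                             default_var={}, deserialize_json=True)
dag_config = get_variable({}, "example_variables_config",
                          default_var={}, deserialize_json=True)
```

With a default in place the DAG parses cleanly even before the Variable is created, and picks up the real value once it exists.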

Papermill

How can I install papermill inside this if I want to?
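The puckel image this repo builds on exposes a `PYTHON_DEPS` build arg (used elsewhere in these issues to pin sqlalchemy), so papermill can be added there; a sketch, pinning whatever version you need:

```yaml
webserver:
  build:
    context: https://github.com/puckel/docker-airflow.git#1.10.9
    dockerfile: Dockerfile
    args:
      AIRFLOW_DEPS: gcp_api,s3
      PYTHON_DEPS: papermill
```

For a quick experiment you can also `docker exec` into the running container and `pip install papermill`, but that won't survive a rebuild.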

Airflow 1.10.2

How can I install Airflow 1.10.2 with Docker Compose? When I change docker-compose.yml like so:

......
 webserver:
    image: puckel/docker-airflow:1.10.2
    build:
      context: https://github.com/puckel/docker-airflow.git#1.10.2
      dockerfile: Dockerfile
      args:
         AIRFLOW_DEPS: gcp_api,mysql        
    restart: always

 ......

The following error is thrown:

webserver_1  | [2019-04-08 08:07:09,123] {{settings.py:174}} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=13
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/bin/airflow", line 21, in <module>
webserver_1  |     from airflow import configuration
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/__init__.py", line 36, in <module>
webserver_1  |     from airflow import settings, configuration as conf
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 266, in <module>
webserver_1  |     configure_orm()
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 188, in configure_orm
webserver_1  |     engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/__init__.py", line 431, in create_engine
webserver_1  |     return strategy.create(*args, **kwargs)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/strategies.py", line 87, in create
webserver_1  |     dbapi = dialect_cls.dbapi(**dbapi_args)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py", line 599, in dbapi
webserver_1  |     import psycopg2
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/psycopg2/__init__.py", line 50, in <module>
webserver_1  |     from psycopg2._psycopg import (                     # noqa
webserver_1  | ImportError: libpq.so.5: cannot open shared object file: No such file or directory
webserver_1  | [2019-04-08 08:07:10,706] {{settings.py:174}} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1
postgres_1   | LOG:  incomplete startup packet
(the same `ImportError: libpq.so.5` traceback repeats as the webserver keeps restarting; the remainder of the log is truncated)
webserver_1  |     engine = create_engine(SQL_ALCHEMY_CONN, **engine_args)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/__init__.py", line 431, in create_engine
webserver_1  |     return strategy.create(*args, **kwargs)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/strategies.py", line 87, in create
webserver_1  |     dbapi = dialect_cls.dbapi(**dbapi_args)
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/sqlalchemy/dialects/postgresql/psycopg2.py", line 599, in dbapi
webserver_1  |     import psycopg2
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/psycopg2/__init__.py", line 50, in <module>
webserver_1  |     from psycopg2._psycopg import (                     # noqa
webserver_1  | ImportError: libpq.so.5: cannot open shared object file: No such file or directory

webserver_1 | ModuleNotFoundError: No module named 'wtforms.compat'

I tried following #52 and #51 to get past the MySQL error, but now I'm getting a wtforms.compat error. I tried the suggestions from this post about downgrading the wtforms version: https://exerror.com/modulenotfounderror-no-module-named-wtforms-compat/

Starting airflow-tutorial_postgres_1 ... done
Recreating airflow-tutorial_webserver_1 ... done
Attaching to airflow-tutorial_postgres_1, airflow-tutorial_webserver_1
postgres_1   |
postgres_1   | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgres_1   |
postgres_1   | LOG:  database system was interrupted; last known up at 2022-01-07 07:30:52 UTC
postgres_1   | LOG:  database system was not properly shut down; automatic recovery in progress
postgres_1   | LOG:  invalid record length at 0/14EF268: wanted 24, got 0
postgres_1   | LOG:  redo is not required
postgres_1   | LOG:  MultiXact member wraparound protections are now enabled
postgres_1   | LOG:  database system is ready to accept connections
postgres_1   | LOG:  autovacuum launcher started
postgres_1   | LOG:  incomplete startup packet
webserver_1  | [2022-01-07 07:31:41,791] {{settings.py:253}} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=8
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/bin/airflow", line 26, in <module>
webserver_1  |     from airflow.bin.cli import CLIFactory
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/bin/cli.py", line 70, in <module>
webserver_1  |     from airflow.www.app import (cached_app, create_app)
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/www/app.py", line 37, in <module>
webserver_1  |     from airflow.www.blueprints import routes
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/www/blueprints.py", line 25, in <module>
webserver_1  |     from airflow.www import utils as wwwutils
webserver_1  |   File "/usr/local/lib/python3.7/site-packages/airflow/www/utils.py", line 35, in <module>
webserver_1  |     from wtforms.compat import text_type
webserver_1  | ModuleNotFoundError: No module named 'wtforms.compat'

docker version: 20.10.8
docker-compose version 1.29.2, build 5becea4c
airflow version: 2.2.3
MacOs Monterey 12.0.1
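wtforms.compat was removed in WTForms 3.0, which the image now pulls in at build time. A workaround that has been reported is pinning WTForms to a 2.x release through the PYTHON_DEPS build arg and rebuilding (the exact pin is an assumption):

```yaml
webserver:
  build:
    context: https://github.com/puckel/docker-airflow.git#1.10.9
    dockerfile: Dockerfile
    args:
      AIRFLOW_DEPS: gcp_api,s3
      PYTHON_DEPS: sqlalchemy==1.3.0 wtforms==2.3.3
```

Rebuild with docker-compose up --build so the pin takes effect.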

Airflow Webserver fails to Start

After running docker-compose up, all images install correctly but the web server never successfully launches. In the output, this appears to be the error:

webserver_1 | KeyError: 'Variable example_variables_config does not exist'

I'm unsure how to fix this.

Persist airflow metadata

Hi @tuanavu, I would like to persist the job metadata in Airflow every time I bring the service down and back up via Docker. It seems that a new database is created every time I start the service. Do you know how I can do that?

Regards ;)
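A common approach is to mount a named volume for the Postgres data directory in docker-compose.yml, so the metadata database survives stopping and restarting the stack (the volume name here is arbitrary; note that docker-compose down -v would still delete it):

```yaml
postgres:
  image: postgres:9.6
  environment:
    - POSTGRES_USER=airflow
    - POSTGRES_PASSWORD=airflow
    - POSTGRES_DB=airflow
  volumes:
    - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```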

Error while adding a Dockerfile to install more dependencies

I am trying to install git and a few other dependencies, so I created a Dockerfile in the airflow-tutorial directory and added the lines below:

RUN apt-get update && \
    apt-get install -y git

RUN pip install dataclasses

Then I try to build it with docker-compose up --build,

but it keeps failing with this error:
ModuleNotFoundError: No module named 'dataclasses'

It works perfectly fine without my Dockerfile. Can anybody please help me solve this issue?
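One likely gap is that the Dockerfile has no FROM line, so compose may not be building on the tutorial's image at all. A minimal sketch (the base tag and user name are assumptions based on the puckel image this tutorial uses):

```dockerfile
# Hypothetical sketch: extend the tutorial's base image instead of
# starting from an empty Dockerfile.
FROM puckel/docker-airflow:1.10.9

# The base image runs as the airflow user; switch to root for apt.
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends git && \
    rm -rf /var/lib/apt/lists/*

# dataclasses is a backport only needed on Python 3.6;
# on Python 3.7+ it ships with the standard library.
RUN pip install --no-cache-dir dataclasses

USER airflow
```

For docker-compose up --build to use this file, the webserver service's build section must point at the local directory (e.g. context: .) rather than the upstream GitHub context.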

Error - BigQuery Create Partition Table

Hi! This is a great tutorial for getting started with building a data pipeline using Airflow and BigQuery. However, I ran into a problem: when I run the third task I get an error message like this:

ERROR - BigQuery job failed. Final error was: {'reason': 'invalid', 'message': 'Partitioning specification must be provided in order to create partitioned table'}.

After some googling I found advice on Stack Overflow that I need to create the partitioned destination table first, with the following instructions:

Option 1, using the CLI: bq mk --time_partitioning_type=DAY mydataset.temps

Or, option 2, using a query like this (as in the bq docs):

CREATE TABLE
  mydataset.newtable (transaction_id INT64,
    transaction_date DATE)
PARTITION BY
  transaction_date
OPTIONS
  ( partition_expiration_days=3,
    description="a table partitioned by transaction_date" )

Or do you have other advice to solve this problem? Thanks!
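One way to avoid pre-creating the table by hand is to give the partitioning spec to the operator itself: BigQueryOperator in Airflow 1.10 accepts a time_partitioning argument. A minimal sketch of building that spec (the column name is an assumption; omit it to partition by ingestion time):

```python
# Hypothetical sketch: build the partitioning spec that the
# "Partitioning specification must be provided" error asks for.
def make_time_partitioning(field=None):
    """Return a BigQuery time-partitioning spec (day granularity)."""
    spec = {"type": "DAY"}
    if field:
        # Partition on a DATE/TIMESTAMP column instead of ingestion time.
        spec["field"] = field
    return spec

# Passed to the operator roughly like (names below are placeholders):
# BigQueryOperator(
#     task_id="bq_write_partitioned",
#     destination_dataset_table="mydataset.temps",
#     time_partitioning=make_time_partitioning("transaction_date"),
#     ...
# )
```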

'Variable example_variables_config does not exist'

Broken DAG: [/usr/local/airflow/dags/example_variables.py] 'Variable example_variables_config does not exist'

I am running Windows 10

I also checked inside the container, and the file is there:

airflow@312a86dd0cfd:~$ cd dags
airflow@312a86dd0cfd:~/dags$ ls
config  example_twitter_dag.py  **example_variables.py**  __pycache__  tutorial.py
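The DAG file being present isn't enough: the DAG reads an Airflow Variable named example_variables_config, which must exist in the metadata database. It can be created under Admin > Variables in the UI, or imported with the 1.10 CLI, e.g. airflow variables -i /usr/local/airflow/dags/config/example_variables.json. The JSON layout below is only a guess at the shape; check the file under dags/config for the real keys:

```json
{
    "example_variables_config": {
        "var1": "value1",
        "var2": "value2"
    }
}
```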

Error in running docker compose

how to add AWS credentials?

Hi,

It's really a nice tutorial! Thank you for sharing this.
I am using Lambda function in AWS with Airflow.

I'm just wondering how to add AWS credentials (e.g. access key and secret key)?

[2020-04-13 03:13:40,720] {{models.py:1760}} ERROR - Unable to locate credentials

Many Thanks

Jin
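One option that works with boto3-based hooks and Lambda calls is to put the credentials into the webserver's environment in docker-compose.yml, since boto3 reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment (values below are placeholders; an Airflow Connection of type Amazon Web Services configured in the UI works as well):

```yaml
webserver:
  environment:
    - LOAD_EX=n
    - EXECUTOR=Local
    - AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
    - AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
    - AWS_DEFAULT_REGION=us-east-1
```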

Error while pulling Airflow


I'm getting an error like this:

  • buildDeps= freetds-dev libkrb5-dev libsasl2-dev libssl-dev libffi-dev libpq-dev git
  • apt-get update -yqq
    W: Failed to fetch http://deb.debian.org/debian/dists/buster/InRelease Could not connect to prod.debian.map.fastly.net:80 (151.101.152.204). - connect (111: Connection refused) Could not connect to deb.debian.org:80 (199.232.22.133). - connect (111: Connection refused)
    W: Failed to fetch http://security.debian.org/debian-security/dists/buster/updates/InRelease Could not connect to prod.debian.map.fastly.net:80 (151.101.152.204). - connect (111: Connection refused) Could not connect to security.debian.org:80 (151.101.192.204). - connect (111: Connection refused) Could not connect to security.debian.org:80 (151.101.0.204). - connect (111: Connection refused) Could not connect to security.debian.org:80 (151.101.128.204). - connect (111: Connection refused) Could not connect to security.debian.org:80 (151.101.64.204). - connect (111: Connection refused)
    W: Failed to fetch http://deb.debian.org/debian/dists/buster-updates/InRelease Unable to connect to deb.debian.org:http:
    W: Some index files failed to download. They have been ignored, or old ones used instead.
  • apt-get upgrade -yqq
  • apt-get install -yqq --no-install-recommends freetds-dev libkrb5-dev libsasl2-dev libssl-dev libffi-dev libpq-dev git freetds-bin build-essential default-libmysqlclient-dev apt-utils curl rsync netcat locales
    E: Unable to locate package freetds-dev
    E: Unable to locate package libkrb5-dev
    E: Unable to locate package libsasl2-dev
    E: Unable to locate package libssl-dev
    E: Unable to locate package libffi-dev
    E: Unable to locate package libpq-dev
    E: Unable to locate package git
    E: Unable to locate package freetds-bin
    E: Unable to locate package build-essential
    E: Unable to locate package default-libmysqlclient-dev
    E: Package 'apt-utils' has no installation candidate
    E: Unable to locate package curl
    E: Unable to locate package rsync
    E: Unable to locate package netcat
    E: Package 'locales' has no installation candidate

Adding Scripts

I would like to know how I can use these services with my own DAG scripts. Thanks.

ImportError: cannot import name 'soft_unicode' from 'markupsafe' (/usr/local/lib/python3.7/site-packages/markupsafe/__init__.py)

I have the following:

Docker version 20.10.17
docker-compose version v2.6.1, build 5becea4c
Airflow 2.4.2

My docker-compose.yml file is below:

version: '3'
services:
  postgres:
    image: postgres:9.6
    environment:
      - POSTGRES_USER=airflow
      - POSTGRES_PASSWORD=airflow
      - POSTGRES_DB=airflow
    ports:
      - "5432:5432"

  webserver:
    image: puckel/docker-airflow:1.10.9
    build:
      context: https://github.com/puckel/docker-airflow.git#1.10.9
      dockerfile: Dockerfile
      args:
        AIRFLOW_DEPS: gcp_api,s3
        PYTHON_DEPS: sqlalchemy==1.3.0
    restart: always
    depends_on:
      - postgres
    environment:
      - LOAD_EX=n
      - EXECUTOR=Local
      - FERNET_KEY=jsDPRErfv8Z_eVTnGfF8ywd19j4pyqE3NpdUBA_oRTo=
    volumes:
      - ./examples/intro-example/dags:/usr/local/airflow/dags
      # Uncomment to include custom plugins
      # - ./plugins:/usr/local/airflow/plugins
    ports:
      - "8080:8080"
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /usr/local/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3

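soft_unicode was removed in markupsafe 2.1, which breaks the old Flask/Jinja stack inside the 1.10.9 image. A commonly reported workaround is pinning markupsafe through the PYTHON_DEPS build arg and rebuilding (the exact pin is an assumption; 2.0.1 is the last release with soft_unicode):

```yaml
webserver:
  build:
    context: https://github.com/puckel/docker-airflow.git#1.10.9
    dockerfile: Dockerfile
    args:
      AIRFLOW_DEPS: gcp_api,s3
      PYTHON_DEPS: sqlalchemy==1.3.0 markupsafe==2.0.1
```

Then rebuild with docker-compose up --build.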

Unable to launch the compose script

I followed the instructions, but after docker-compose up the service doesn't come up.

Upon querying the logs I get the following:

webserver_1 | Traceback (most recent call last):
webserver_1 | File "/usr/local/bin/airflow", line 22, in
webserver_1 | from airflow.bin.cli import CLIFactory
webserver_1 | File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 68, in
webserver_1 | from airflow.www_rbac.app import cached_app as cached_app_rbac
webserver_1 | File "/usr/local/lib/python3.6/site-packages/airflow/www_rbac/app.py", line 25, in
webserver_1 | from flask_appbuilder import AppBuilder, SQLA
webserver_1 | File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/init.py", line 5, in
webserver_1 | from .base import AppBuilder
webserver_1 | File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/base.py", line 3, in
webserver_1 | from .views import IndexView, UtilView
webserver_1 | File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/views.py", line 8, in
webserver_1 | from .baseviews import BaseView, BaseCRUDView, BaseFormView, expose, expose_api
webserver_1 | File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/baseviews.py", line 7, in
webserver_1 | from .forms import GeneralModelConverter
webserver_1 | File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/forms.py", line 9, in
webserver_1 | from .fieldwidgets import (BS3TextAreaFieldWidget,
webserver_1 | File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/fieldwidgets.py", line 1, in
webserver_1 | from wtforms.widgets import HTMLString, html_params
webserver_1 | ImportError: cannot import name 'HTMLString'

Looks like an environment issue to me; can you please advise?

Thanks

Getting the following error when I tried to start the Docker container for the tutorial

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/local/bin/airflow", line 21, in
from airflow import configuration
File "/usr/local/lib/python3.6/site-packages/airflow/init.py", line 36, in
from airflow import settings
File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 261, in
configure_adapters()
File "/usr/local/lib/python3.6/site-packages/airflow/settings.py", line 218, in configure_adapters
import MySQLdb.converters
File "/usr/local/lib/python3.6/site-packages/MySQLdb/init.py", line 24, in
version_info, _mysql.version_info, _mysql.file
NameError: name '_mysql' is not defined
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/MySQLdb/init.py", line 18, in
from . import _mysql
ImportError: libmariadb.so.3: cannot open shared object file: No such file or directory

Webserver install does not work

⠿ Container airflow-tutorial-webserver-1 Starting 34.0s
Error response from daemon: driver failed programming external connectivity on endpoint airflow-tutorial-webserver-1 (f02a3d36681d5ec32f9ac0aa2c65b1b7cf48aa3789276966b705d2f78a8ddc72): Bind for 0.0.0.0:8080 failed: port is already allocated
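"Bind for 0.0.0.0:8080 failed: port is already allocated" means another process on the host (often a leftover container, visible with docker ps) already holds port 8080. Either stop that container or remap the host side of the port in docker-compose.yml (8081 is an arbitrary free port):

```yaml
webserver:
  ports:
    - "8081:8080"   # host:container; the UI is then at http://localhost:8081
```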

Airflow image does not build in docker

After cloning the repo and firing up docker-compose up, the Airflow image does not build. I get the error AttributeError: module 'sqlalchemy.dialects.postgresql' has no attribute 'MONEY'.

Here is the detailed error

postgres_1   | LOG:  incomplete startup packet
webserver_1  | [2019-09-07 22:39:56,209] {{settings.py:174}} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
webserver_1  | [2019-09-07 22:39:56,501] {{__init__.py:51}} INFO - Using executor LocalExecutor
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/bin/airflow", line 22, in <module>
webserver_1  |     from airflow.bin.cli import CLIFactory
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 68, in <module>
webserver_1  |     from airflow.www_rbac.app import cached_app as cached_app_rbac
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/www_rbac/app.py", line 25, in <module>
webserver_1  |     from flask_appbuilder import AppBuilder, SQLA
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/__init__.py", line 5, in <module>
webserver_1  |     from .base import AppBuilder
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/base.py", line 5, in <module>
webserver_1  |     from .api.manager import OpenApiManager
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/api/__init__.py", line 11, in <module>
webserver_1  |     from marshmallow_sqlalchemy.fields import Related, RelatedList
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/__init__.py", line 1, in <module>
webserver_1  |     from .schema import TableSchemaOpts, ModelSchemaOpts, TableSchema, ModelSchema
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema.py", line 3, in <module>
webserver_1  |     from .convert import ModelConverter
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 36, in <module>
webserver_1  |     class ModelConverter:
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 52, in ModelConverter
webserver_1  |     postgresql.MONEY: fields.Decimal,
webserver_1  | AttributeError: module 'sqlalchemy.dialects.postgresql' has no attribute 'MONEY'
webserver_1  | [2019-09-07 22:39:58,269] {{settings.py:174}} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
webserver_1  | [2019-09-07 22:39:58,284] {{settings.py:174}} INFO - setting.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800
webserver_1  | [2019-09-07 22:39:58,671] {{__init__.py:51}} INFO - Using executor LocalExecutor
webserver_1  | [2019-09-07 22:39:58,693] {{__init__.py:51}} INFO - Using executor LocalExecutor
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/bin/airflow", line 22, in <module>
webserver_1  |     from airflow.bin.cli import CLIFactory
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 68, in <module>
webserver_1  |     from airflow.www_rbac.app import cached_app as cached_app_rbac
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/www_rbac/app.py", line 25, in <module>
webserver_1  |     from flask_appbuilder import AppBuilder, SQLA
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/__init__.py", line 5, in <module>
webserver_1  |     from .base import AppBuilder
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/base.py", line 5, in <module>
webserver_1  |     from .api.manager import OpenApiManager
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/api/__init__.py", line 11, in <module>
webserver_1  |     from marshmallow_sqlalchemy.fields import Related, RelatedList
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/__init__.py", line 1, in <module>
webserver_1  |     from .schema import TableSchemaOpts, ModelSchemaOpts, TableSchema, ModelSchema
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema.py", line 3, in <module>
webserver_1  |     from .convert import ModelConverter
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 36, in <module>
webserver_1  |     class ModelConverter:
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 52, in ModelConverter
webserver_1  |     postgresql.MONEY: fields.Decimal,
webserver_1  | AttributeError: module 'sqlalchemy.dialects.postgresql' has no attribute 'MONEY'
webserver_1  | Traceback (most recent call last):
webserver_1  |   File "/usr/local/bin/airflow", line 22, in <module>
webserver_1  |     from airflow.bin.cli import CLIFactory
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/bin/cli.py", line 68, in <module>
webserver_1  |     from airflow.www_rbac.app import cached_app as cached_app_rbac
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/airflow/www_rbac/app.py", line 25, in <module>
webserver_1  |     from flask_appbuilder import AppBuilder, SQLA
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/__init__.py", line 5, in <module>
webserver_1  |     from .base import AppBuilder
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/base.py", line 5, in <module>
webserver_1  |     from .api.manager import OpenApiManager
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/api/__init__.py", line 11, in <module>
webserver_1  |     from marshmallow_sqlalchemy.fields import Related, RelatedList
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/__init__.py", line 1, in <module>
webserver_1  |     from .schema import TableSchemaOpts, ModelSchemaOpts, TableSchema, ModelSchema
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/schema.py", line 3, in <module>
webserver_1  |     from .convert import ModelConverter
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 36, in <module>
webserver_1  |     class ModelConverter:
webserver_1  |   File "/usr/local/lib/python3.6/site-packages/marshmallow_sqlalchemy/convert.py", line 52, in ModelConverter
webserver_1  |     postgresql.MONEY: fields.Decimal,
webserver_1  | AttributeError: module 'sqlalchemy.dialects.postgresql' has no attribute 'MONEY'
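postgresql.MONEY exists from SQLAlchemy 1.2 onward, so this usually means the image resolved an older SQLAlchemy than marshmallow-sqlalchemy expects. One reported fix is pinning a newer SQLAlchemy through the PYTHON_DEPS build arg (the 1.3.0 pin mirrors what this tutorial's compose file uses elsewhere; treat the exact version as a suggestion):

```yaml
webserver:
  build:
    context: https://github.com/puckel/docker-airflow.git#1.10.9
    dockerfile: Dockerfile
    args:
      PYTHON_DEPS: sqlalchemy==1.3.0
```

Rebuild with docker-compose up --build so the pin is picked up.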

Cannot start service postgres: driver failed

I have a Macbook pro running Catalina.

When I ran the docker-compose up -d command, I got this error.

"Starting airflow-tutorial_postgres_1 ... error

ERROR: for airflow-tutorial_postgres_1 Cannot start service postgres: driver failed programming external connectivity on endpoint airflow-tutorial_postgres_1 (4200e95bb80bf2a632cee0a527427b3dcd3fbfb2b8c1fa8ea2af782f559c08cc): Error starting userland proxy: listen tcp 0.0.0.0:5432: bind: address already in use

ERROR: for postgres Cannot start service postgres: driver failed programming external connectivity on endpoint airflow-tutorial_postgres_1 (4200e95bb80bf2a632cee0a527427b3dcd3fbfb2b8c1fa8ea2af782f559c08cc): Error starting userland proxy: listen tcp 0.0.0.0:5432: bind: address already in use
ERROR: Encountered errors while bringing up the project."

I ran "sudo netstat -nl -p tcp | grep 5432" but nothing showed up, so I am not sure why it says something is already running on that address.
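On macOS a local Postgres (Postgres.app or a Homebrew service) can hold 5432 even when that netstat invocation shows nothing; sudo lsof -nP -iTCP:5432 -sTCP:LISTEN is another way to check. Failing that, remapping the host side of the port in docker-compose.yml sidesteps the conflict (5433 is arbitrary):

```yaml
postgres:
  ports:
    - "5433:5432"   # host:container; the webserver reaches postgres over
                    # the compose network, so it still connects on 5432
```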

docker-compose up -d error

OS: Arcolinux

I have installed the prerequisites, and I'm not sure why I'm getting this error. I don't have Postgres installed; maybe it has to do with that?

➜ docker-compose up -d
Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    httplib_response = self._make_request(
  File "/usr/lib/python3.9/site-packages/urllib3/connectionpool.py", line 394, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.9/http/client.py", line 1257, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1303, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1252, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1012, in _send_output
    self.send(msg)
  File "/usr/lib/python3.9/http/client.py", line 952, in send
    self.connect()
  File "/usr/lib/python3.9/site-packages/docker/transport/unixconn.py", line 43, in connect
    sock.connect(self.unix_socket)
FileNotFoundError: [Errno 2] No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/usr/lib/python3.9/site-packages/urllib3/connectionpool.py", line 755, in urlopen
    retries = retries.increment(
  File "/usr/lib/python3.9/site-packages/urllib3/util/retry.py", line 532, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/usr/lib/python3.9/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/usr/lib/python3.9/site-packages/urllib3/connectionpool.py", line 699, in urlopen
    httplib_response = self._make_request(
  File "/usr/lib/python3.9/site-packages/urllib3/connectionpool.py", line 394, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.9/http/client.py", line 1257, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1303, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1252, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.9/http/client.py", line 1012, in _send_output
    self.send(msg)
  File "/usr/lib/python3.9/http/client.py", line 952, in send
    self.connect()
  File "/usr/lib/python3.9/site-packages/docker/transport/unixconn.py", line 43, in connect
    sock.connect(self.unix_socket)
urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.9/site-packages/docker/api/client.py", line 214, in _retrieve_server_version
    return self.version(api_version=False)["ApiVersion"]
  File "/usr/lib/python3.9/site-packages/docker/api/daemon.py", line 181, in version
    return self._result(self._get(url), json=True)
  File "/usr/lib/python3.9/site-packages/docker/utils/decorators.py", line 46, in inner
    return f(self, *args, **kwargs)
  File "/usr/lib/python3.9/site-packages/docker/api/client.py", line 237, in _get
    return self.get(url, **self._set_request_timeout(kwargs))
  File "/usr/lib/python3.9/site-packages/requests/sessions.py", line 555, in get
    return self.request('GET', url, **kwargs)
  File "/usr/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3.9/site-packages/requests/adapters.py", line 498, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/bin/docker-compose", line 33, in <module>
    sys.exit(load_entry_point('docker-compose==1.29.2', 'console_scripts', 'docker-compose')())
  File "/usr/lib/python3.9/site-packages/compose/cli/main.py", line 81, in main
    command_func()
  File "/usr/lib/python3.9/site-packages/compose/cli/main.py", line 200, in perform_command
    project = project_from_options('.', options)
  File "/usr/lib/python3.9/site-packages/compose/cli/command.py", line 60, in project_from_options
    return get_project(
  File "/usr/lib/python3.9/site-packages/compose/cli/command.py", line 152, in get_project
    client = get_client(
  File "/usr/lib/python3.9/site-packages/compose/cli/docker_client.py", line 41, in get_client
    client = docker_client(
  File "/usr/lib/python3.9/site-packages/compose/cli/docker_client.py", line 170, in docker_client
    client = APIClient(use_ssh_client=not use_paramiko_ssh, **kwargs)
  File "/usr/lib/python3.9/site-packages/docker/api/client.py", line 197, in __init__
    self._version = self._retrieve_server_version()
  File "/usr/lib/python3.9/site-packages/docker/api/client.py", line 221, in _retrieve_server_version
    raise DockerException(
docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
