
'quick start' docker-compose up airflow-init warnings & docker-compose up never comes to an end


Apache Airflow version: 2.0.2

Environment:

  • Cloud provider or hardware configuration: Guest: VM with 4 GB RAM and 33 GB disk space; Host: Windows machine with high specs
  • OS (e.g. from /etc/os-release): Ubuntu VERSION="20.04.2 LTS (Focal Fossa)"
  • Kernel (e.g. uname -a): Linux ubuntu2004 5.4.0-73-generic #82-Ubuntu SMP Wed Apr 14 17:39:42 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux

What happened:

Came from an installation based on the Puckel image, which was super easy. I have been struggling for days to get the official Airflow image working, following the steps on the Quick start > Running Airflow in Docker page (https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#initializing-environment).

What you expected to happen:

The latest stable official Airflow Docker image to be pulled, and airflow-init to complete successfully.

How to reproduce it:

Followed the 'quick start' instructions (https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html) to pull down the docker-compose.yaml file, then attempted to run docker-compose up airflow-init:

mkdir ./dags ./logs ./plugins 
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env
docker-compose up airflow-init
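As a side note, one way to confirm that the one-shot airflow-init service really succeeded is to check the exit code it recorded. This is a hypothetical check, not part of the original report; the container name is the one Compose created in the log below (docker-airflow_airflow-init_1 — Compose derives the prefix from the directory name), and the sketch skips quietly on machines without the tooling:

```shell
check_init() {
  # Skip gracefully when the tooling or the quick-start compose file is absent,
  # so the sketch stays portable.
  command -v docker-compose >/dev/null 2>&1 || { echo "docker-compose not installed; skipping"; return 0; }
  [ -f docker-compose.yaml ] || { echo "no docker-compose.yaml here; skipping"; return 0; }
  docker-compose up airflow-init
  # The one-shot init container remains after exiting; its recorded exit code
  # (0 = success) is the definitive success signal, matching the
  # "exited with code 0" line in the attached compose output.
  docker inspect --format '{{.State.ExitCode}}' docker-airflow_airflow-init_1
}
check_init
```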

Output docker-compose up airflow-init:

See the following output after executing docker-compose up airflow-init. The airflow-init service ends normally but reports several warnings relating to OpenTelemetry and the 'apache-airflow-providers-google' package. How should these warnings be interpreted?
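The LevelDB warning in the log is literal: the providers manager tried to import a module ("airflow.providers.google.common.hooks.leveldb") that is not installed in the image, caught the failure, and logged it. A hypothetical way to reproduce that import attempt yourself (assuming a Docker host; the sketch skips when docker or the image is unavailable, so it never triggers an image pull):

```shell
check_leveldb_hook() {
  command -v docker >/dev/null 2>&1 || { echo "docker not installed; skipping"; return 0; }
  docker image inspect apache/airflow:2.0.2 >/dev/null 2>&1 || { echo "image not pulled locally; skipping"; return 0; }
  # Run the exact import the providers manager attempts; a non-zero exit
  # reproduces the "No module named ..." condition behind the WARNING lines.
  docker run --rm apache/airflow:2.0.2 \
    python -c "import airflow.providers.google.common.hooks.leveldb" \
    || echo "module missing inside the image, exactly as the WARNING says"
}
check_leveldb_hook
```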

hans@ubuntu2004:~/docker-airflow$ curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.0.2/docker-compose.yaml'
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  4238  100  4238    0     0  27519      0 --:--:-- --:--:-- --:--:-- 27519
hans@ubuntu2004:~/docker-airflow$ mkdir ./dags ./logs ./plugins
hans@ubuntu2004:~/docker-airflow$ echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env
hans@ubuntu2004:~/docker-airflow$ docker-compose up airflow-init
Pulling postgres (postgres:13)...
13: Pulling from library/postgres
69692152171a: Pull complete
a31b993d5cc6: Pull complete
f65921886500: Pull complete
b9c1a94e4ca8: Pull complete
435dd99ceb68: Pull complete
d3ee8e88c67c: Pull complete
84b08674f942: Pull complete
7d358e850d3e: Pull complete
adf2c63307b4: Pull complete
27ff0e95dd24: Pull complete
550e7b1ab95a: Pull complete
2287baf15bf8: Pull complete
97d11a196325: Pull complete
0f11fc82fe79: Pull complete
Digest: sha256:0eee5caa50478ef50b89062903a5b901eb818dfd577d2be6800a4735af75e53f
Status: Downloaded newer image for postgres:13
Pulling redis (redis:latest)...
latest: Pulling from library/redis
69692152171a: Already exists
a4a46f2fd7e0: Pull complete
bcdf6fddc3bd: Pull complete
b7e9b50900cc: Pull complete
5f3030c50d85: Pull complete
63dae8e0776c: Pull complete
Digest: sha256:365eddf64356169aa0cbfbeaf928eb80762de3cc364402e7653532bcec912973
Status: Downloaded newer image for redis:latest
Pulling airflow-init (apache/airflow:2.0.2)...
2.0.2: Pulling from apache/airflow
ac2522cc7269: Pull complete
8edbc159ce36: Pull complete
0796e4716a5e: Pull complete
28d68acb726a: Pull complete
f4b617c199c5: Pull complete
e3b34ffa39b3: Pull complete
9937bef9dbe2: Pull complete
4032aa89a51a: Pull complete
7f9951835a3f: Pull complete
6f975c5f5e9d: Pull complete
0b458e122f14: Pull complete
9f1b15354ce0: Pull complete
1b70976c7bb3: Pull complete
1e9d993514f7: Pull complete
9fe242ef5eb6: Pull complete
7b3dbe86e9a5: Pull complete
a0255f021ab7: Pull complete
2557405c338c: Pull complete
e93e21108fd3: Pull complete
Digest: sha256:1c3dbd1c3e964e98fffcb58efb77b41179dc1322a9d3919e6f4289f2d2d84625
Status: Downloaded newer image for apache/airflow:2.0.2
Creating docker-airflow_postgres_1 ... done
Creating docker-airflow_redis_1    ... done
Creating docker-airflow_airflow-init_1 ... done
Attaching to docker-airflow_airflow-init_1
airflow-init_1       | BACKEND=postgresql+psycopg2
airflow-init_1       | DB_HOST=postgres
airflow-init_1       | DB_PORT=5432
airflow-init_1       |
airflow-init_1       | DB: postgresql+psycopg2://airflow:***@postgres/airflow
airflow-init_1       | [2021-05-12 20:38:02,582] {db.py:684} INFO - Creating tables
airflow-init_1       | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
airflow-init_1       | INFO  [alembic.runtime.migration] Will assume transactional DDL.
airflow-init_1       | WARNI [airflow.providers_manager] Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | WARNI [airflow.providers_manager] Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | Upgrades done
airflow-init_1       | [2021-05-12 20:38:09,404] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-init_1       | [2021-05-12 20:38:09,413] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | [2021-05-12 20:38:09,784] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | [2021-05-12 20:38:10,585] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | [2021-05-12 20:38:10,646] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | airflow already exist in the db
airflow-init_1       | 2.0.2
docker-airflow_airflow-init_1 exited with code 0

hans@ubuntu2004:~/docker-airflow$ docker ps
CONTAINER ID   IMAGE          COMMAND                  CREATED              STATUS                        PORTS                                       NAMES
2f4d3183353e   redis:latest   "docker-entrypoint.s…"   About a minute ago   Up About a minute (healthy)   0.0.0.0:6379->6379/tcp, :::6379->6379/tcp   docker-airflow_redis_1
6a37214e5721   postgres:13    "docker-entrypoint.s…"   About a minute ago   Up About a minute (healthy)   5432/tcp                                    docker-airflow_postgres_1

hans@ubuntu2004:~/docker-airflow$ docker images
REPOSITORY       TAG       IMAGE ID       CREATED       SIZE
redis            latest    bc8d70f9ef6c   2 hours ago   105MB
postgres         13        82b8b88e26bc   6 hours ago   314MB
apache/airflow   2.0.2     d7a0ff8c98a9   3 weeks ago   871MB

Output docker-compose up:

After the initialization completed I ran docker-compose up. The same warnings come back, but docker-compose up never comes to an end; each time it boots new webserver workers. The only way to end the process is to press Ctrl-C, which stops the containers. Remarkably, after rebooting the guest OS the containers are there and Airflow seems to be up and running! However, I wonder whether this installation is now successful? Any insights on that?

docker-airflow_redis_1 is up-to-date
docker-airflow_postgres_1 is up-to-date
Creating docker-airflow_airflow-webserver_1 ... done
Creating docker-airflow_flower_1            ... done
Starting docker-airflow_airflow-init_1      ... done
Creating docker-airflow_airflow-worker_1    ... done
Creating docker-airflow_airflow-scheduler_1 ... done
Attaching to docker-airflow_redis_1, docker-airflow_postgres_1, docker-airflow_airflow-init_1, docker-airflow_airflow-webserver_1, docker-airflow_airflow-worker_1, docker-airflow_flower_1, docker-airflow_airflow-scheduler_1
postgres_1           |
postgres_1           | PostgreSQL Database directory appears to contain a database; Skipping initialization
postgres_1           |
postgres_1           | 2021-05-12 20:42:00.640 UTC [1] LOG:  starting PostgreSQL 13.2 (Debian 13.2-1.pgdg100+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 8.3.0-6) 8.3.0, 64-bit
postgres_1           | 2021-05-12 20:42:00.641 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
postgres_1           | 2021-05-12 20:42:00.641 UTC [1] LOG:  listening on IPv6 address "::", port 5432
postgres_1           | 2021-05-12 20:42:00.643 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
postgres_1           | 2021-05-12 20:42:00.646 UTC [25] LOG:  database system was shut down at 2021-05-12 20:41:06 UTC
postgres_1           | 2021-05-12 20:42:00.648 UTC [1] LOG:  database system is ready to accept connections
redis_1              | 1:C 12 May 2021 20:42:00.619 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis_1              | 1:C 12 May 2021 20:42:00.619 # Redis version=6.2.3, bits=64, commit=00000000, modified=0, pid=1, just started
redis_1              | 1:C 12 May 2021 20:42:00.619 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1              | 1:M 12 May 2021 20:42:00.620 * monotonic clock: POSIX clock_gettime
redis_1              | 1:M 12 May 2021 20:42:00.622 * Running mode=standalone, port=6379.
redis_1              | 1:M 12 May 2021 20:42:00.622 # Server initialized
redis_1              | 1:M 12 May 2021 20:42:00.622 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1              | 1:M 12 May 2021 20:42:00.622 * Ready to accept connections
airflow-init_1       | BACKEND=postgresql+psycopg2
airflow-init_1       | DB_HOST=postgres
airflow-init_1       | DB_PORT=5432
airflow-init_1       |
airflow-webserver_1  | BACKEND=postgresql+psycopg2
airflow-webserver_1  | DB_HOST=postgres
airflow-webserver_1  | DB_PORT=5432
airflow-scheduler_1  | BACKEND=postgresql+psycopg2
flower_1             | BACKEND=postgresql+psycopg2
flower_1             | DB_HOST=postgres
flower_1             | DB_PORT=5432
airflow-scheduler_1  | DB_HOST=postgres
airflow-scheduler_1  | DB_PORT=5432
airflow-worker_1     | BACKEND=postgresql+psycopg2
airflow-worker_1     | DB_HOST=postgres
airflow-worker_1     | DB_PORT=5432
airflow-webserver_1  |
airflow-worker_1     |
airflow-worker_1     | BACKEND=postgresql+psycopg2
flower_1             |
flower_1             | BACKEND=postgresql+psycopg2
airflow-scheduler_1  |
airflow-worker_1     | DB_HOST=postgres
airflow-worker_1     | DB_PORT=5432
airflow-scheduler_1  | BACKEND=postgresql+psycopg2
flower_1             | DB_HOST=postgres
flower_1             | DB_PORT=5432
airflow-scheduler_1  | DB_HOST=postgres
airflow-scheduler_1  | DB_PORT=5432
airflow-scheduler_1  |
airflow-worker_1     |
flower_1             |
airflow-init_1       | DB: postgresql+psycopg2://airflow:***@postgres/airflow
airflow-init_1       | [2021-05-12 20:45:02,626] {db.py:684} INFO - Creating tables
airflow-init_1       | INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
airflow-init_1       | INFO  [alembic.runtime.migration] Will assume transactional DDL.
airflow-init_1       | WARNI [airflow.providers_manager] Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | WARNI [airflow.providers_manager] Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | Upgrades done
airflow-scheduler_1  |   ____________       _____________
airflow-scheduler_1  |  ____    |__( )_________  __/__  /________      __
airflow-scheduler_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
airflow-scheduler_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
airflow-scheduler_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
airflow-scheduler_1  | [2021-05-12 20:45:16,533] {scheduler_job.py:1251} INFO - Starting the scheduler
airflow-scheduler_1  | [2021-05-12 20:45:16,533] {scheduler_job.py:1256} INFO - Processing each file at most -1 times
airflow-scheduler_1  | [2021-05-12 20:45:16,537] {dag_processing.py:252} INFO - Launched DagFileProcessorManager with pid: 24
airflow-scheduler_1  | [2021-05-12 20:45:16,565] {scheduler_job.py:1854} INFO - Resetting orphaned tasks for active dag runs
airflow-scheduler_1  | [2021-05-12 20:45:16,545] {settings.py:54} INFO - Configured default timezone Timezone('UTC')
airflow-worker_1     | Starting flask
airflow-worker_1     |  * Serving Flask app "airflow.utils.serve_logs" (lazy loading)
airflow-worker_1     |  * Environment: production
airflow-worker_1     |    WARNING: This is a development server. Do not use it in a production deployment.
airflow-worker_1     |    Use a production WSGI server instead.
airflow-worker_1     |  * Debug mode: off
airflow-worker_1     | [2021-05-12 20:45:18,827] {_internal.py:113} INFO -  * Running on http://0.0.0.0:8793/ (Press CTRL+C to quit)
flower_1             | [2021-05-12 20:45:18,921] {command.py:137} INFO - Visit me at http://0.0.0.0:5555
flower_1             | [2021-05-12 20:45:19,112] {command.py:142} INFO - Broker: redis://redis:6379/0
flower_1             | [2021-05-12 20:45:19,133] {command.py:145} INFO - Registered tasks:
flower_1             | ['airflow.executors.celery_executor.execute_command',
flower_1             |  'celery.accumulate',
flower_1             |  'celery.backend_cleanup',
flower_1             |  'celery.chain',
flower_1             |  'celery.chord',
flower_1             |  'celery.chord_unlock',
flower_1             |  'celery.chunks',
flower_1             |  'celery.group',
flower_1             |  'celery.map',
flower_1             |  'celery.starmap']
flower_1             | [2021-05-12 20:45:19,250] {mixins.py:229} INFO - Connected to redis://redis:6379/0
airflow-worker_1     | /home/airflow/.local/lib/python3.6/site-packages/celery/platforms.py:801 RuntimeWarning: You're running the worker with superuser privileges: this is
airflow-worker_1     | absolutely not recommended!
airflow-worker_1     |
airflow-worker_1     | Please specify a different user using the --uid option.
airflow-worker_1     |
airflow-worker_1     | User information: uid=1000 euid=1000 gid=0 egid=0
airflow-worker_1     |
flower_1             | [2021-05-12 20:45:21,340] {inspector.py:42} WARNING - Inspect method registered failed
flower_1             | [2021-05-12 20:45:21,342] {inspector.py:42} WARNING - Inspect method active_queues failed
flower_1             | [2021-05-12 20:45:21,356] {inspector.py:42} WARNING - Inspect method stats failed
flower_1             | [2021-05-12 20:45:21,378] {inspector.py:42} WARNING - Inspect method scheduled failed
flower_1             | [2021-05-12 20:45:21,411] {inspector.py:42} WARNING - Inspect method active failed
flower_1             | [2021-05-12 20:45:22,413] {inspector.py:42} WARNING - Inspect method revoked failed
flower_1             | [2021-05-12 20:45:22,414] {inspector.py:42} WARNING - Inspect method reserved failed
flower_1             | [2021-05-12 20:45:22,415] {inspector.py:42} WARNING - Inspect method conf failed
airflow-worker_1     | [2021-05-12 20:45:26,083: INFO/MainProcess] Connected to redis://redis:6379/0
airflow-worker_1     | [2021-05-12 20:45:26,107: INFO/MainProcess] mingle: searching for neighbors
airflow-init_1       | [2021-05-12 20:45:26,849] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-init_1       | [2021-05-12 20:45:26,876] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:27,136] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-worker_1     | [2021-05-12 20:45:27,160: INFO/MainProcess] mingle: all alone
airflow-worker_1     | [2021-05-12 20:45:27,194: INFO/MainProcess] celery@daf7d0cd15eb ready.
airflow-webserver_1  | [2021-05-12 20:45:27,210] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | [2021-05-12 20:45:27,720] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-worker_1     | [2021-05-12 20:45:29,167: INFO/MainProcess] Events of group {task} enabled by remote.
airflow-init_1       | [2021-05-12 20:45:29,475] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:29,526] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  |   ____________       _____________
airflow-webserver_1  |  ____    |__( )_________  __/__  /________      __
airflow-webserver_1  | ____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
airflow-webserver_1  | ___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
airflow-webserver_1  |  _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
airflow-webserver_1  | [2021-05-12 20:45:29,591] {dagbag.py:451} INFO - Filling up the DagBag from /dev/null
airflow-init_1       | [2021-05-12 20:45:29,610] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:31,277] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:31,424] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-init_1       | airflow already exist in the db
airflow-init_1       | 2.0.2
docker-airflow_airflow-init_1 exited with code 0
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [43] [INFO] Starting gunicorn 19.10.0
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [43] [INFO] Listening at: http://0.0.0.0:8080 (43)
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [43] [INFO] Using worker: sync
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [54] [INFO] Booting worker with pid: 54
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [55] [INFO] Booting worker with pid: 55
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [56] [INFO] Booting worker with pid: 56
airflow-webserver_1  | [2021-05-12 20:45:35 +0000] [57] [INFO] Booting worker with pid: 57
airflow-webserver_1  | [2021-05-12 20:45:43,558] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:45:43,597] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:43,730] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:45:43,740] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:45:43,767] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:43,783] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:43,818] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:45:43,859] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:45,463] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:45,704] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:45,714] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:45,772] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:49,956] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,065] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,073] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,116] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,224] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,341] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,345] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:45:50,393] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:45:56 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2021-05-12 20:46:03 +0000] [43] [INFO] Handling signal: ttin
airflow-webserver_1  | [2021-05-12 20:46:04 +0000] [69] [INFO] Booting worker with pid: 69
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:46:05 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2021-05-12 20:46:06,020] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:46:06,029] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:06,402] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:07,326] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:07,395] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:09 +0000] [43] [INFO] Handling signal: ttou
airflow-webserver_1  | [2021-05-12 20:46:09 +0000] [54] [INFO] Worker exiting (pid: 54)
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:46:15 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:46:25 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2021-05-12 20:46:34 +0000] [43] [INFO] Handling signal: ttin
airflow-webserver_1  | [2021-05-12 20:46:34 +0000] [94] [INFO] Booting worker with pid: 94
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:46:35 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2021-05-12 20:46:36,580] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:46:36,589] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:36,980] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:37,879] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:37,940] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:46:39 +0000] [43] [INFO] Handling signal: ttou
airflow-webserver_1  | [2021-05-12 20:46:39 +0000] [55] [INFO] Worker exiting (pid: 55)
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:46:45 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:46:55 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2021-05-12 20:47:05 +0000] [43] [INFO] Handling signal: ttin
airflow-webserver_1  | [2021-05-12 20:47:05 +0000] [117] [INFO] Booting worker with pid: 117
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:47:06 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
airflow-webserver_1  | [2021-05-12 20:47:07,227] {opentelemetry_tracing.py:29} INFO - This service is instrumented using OpenTelemetry.OpenTelemetry could not be imported; pleaseadd opentelemetry-api and opentelemetry-instrumentationpackages in order to get BigQuery Tracing data.
airflow-webserver_1  | [2021-05-12 20:47:07,236] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:47:07,615] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:47:08,503] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:47:08,564] {providers_manager.py:299} WARNING - Exception when importing 'airflow.providers.google.common.hooks.leveldb.LevelDBHook' from 'apache-airflow-providers-google' package: No module named 'airflow.providers.google.common.hooks.leveldb'
airflow-webserver_1  | [2021-05-12 20:47:10 +0000] [43] [INFO] Handling signal: ttou
airflow-webserver_1  | [2021-05-12 20:47:10 +0000] [56] [INFO] Worker exiting (pid: 56)
airflow-webserver_1  | 127.0.0.1 - - [12/May/2021:20:47:16 +0000] "GET /health HTTP/1.1" 200 187 "-" "curl/7.64.0"
^CGracefully stopping... (press Ctrl+C again to force)
Stopping docker-airflow_airflow-scheduler_1 ... done
Stopping docker-airflow_airflow-worker_1    ... done
Stopping docker-airflow_flower_1            ... done
Stopping docker-airflow_airflow-webserver_1 ... done
Stopping docker-airflow_postgres_1          ... done
Stopping docker-airflow_redis_1             ... done

Issue Analytics

  • State: closed
  • Created: 2 years ago
  • Comments: 8 (3 by maintainers)

Top GitHub Comments

1 reaction
Bertusian commented, May 15, 2021

Ok, that makes things clear!

Apparently I missed some crucial information about docker-compose up and its default log-following behaviour.

Many thanks!

0 reactions
potiuk commented, May 15, 2021

This is how docker-compose up works:

https://docs.docker.com/compose/reference/up/

You can run it with the --detach flag if you want the containers to run in the background, but by default it shows the logs from all containers and does not go to the background.

It stops in the case of the 'init' command because that command actually exits after finishing its job.

From the documentation: "The docker-compose up command aggregates the output of each container (essentially running docker-compose logs --follow). When the command exits, all containers are stopped."
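The detached workflow described above can be sketched like this (assuming the quick-start docker-compose.yaml is in the current directory; the sketch skips quietly on machines without docker-compose):

```shell
start_detached() {
  command -v docker-compose >/dev/null 2>&1 || { echo "docker-compose not installed; skipping"; return 0; }
  [ -f docker-compose.yaml ] || { echo "no docker-compose.yaml here; skipping"; return 0; }
  docker-compose up --detach      # start all services, return control to the shell
  docker-compose ps               # containers keep running in the background
  docker-compose logs --tail=20   # peek at the aggregated logs without attaching
}
start_detached
```

Stopping later is then an explicit docker-compose stop (or down), rather than Ctrl-C in an attached terminal.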

