
Add production-ready Docker Compose for the production image


Description

We are already working on a Helm chart for the production image, but we might also want to add a production-ready Docker Compose setup that can run an Airflow installation.

Use case / motivation

For local tests and small deployments, being able to have such a docker-compose environment would be really nice.

We seem to have reached consensus that we need several docker-compose “sets” of files:

  • Local Executor
  • Celery Executor
  • Kubernetes Executor (do we really need a Kubernetes Executor in Compose? Probably not…)

They should come in variants, and it should be possible to specify a number of parameters:

  • Database (Postgres/MySQL)
  • Redis vs. RabbitMQ (should we choose just one?)
  • Ports
  • Volumes (persistent / not)
  • Airflow Images
  • Fernet Key
  • RBAC

Depending on the setup, those Docker Compose files should do proper DB initialisation. A sketch of how such parameterization could look follows below.
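
As an illustration of these variants, here is a minimal sketch that parameterizes the image, published port and Fernet key through docker-compose variable substitution and an .env file. The variable names (AIRFLOW_IMAGE, WEBSERVER_PORT, FERNET_KEY) are hypothetical placeholders, not an agreed design:

# .env (hypothetical defaults, read automatically by docker-compose)
# AIRFLOW_IMAGE=apache/airflow:1.10.10
# WEBSERVER_PORT=8080
# FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=

version: '3'
services:
  webserver:
    # Image, published port and Fernet key are injected from the environment / .env file
    image: ${AIRFLOW_IMAGE}
    command: webserver
    ports:
      - "${WEBSERVER_PORT}:8080"
    environment:
      - AIRFLOW__CORE__FERNET_KEY=${FERNET_KEY}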


Below is an example Docker Compose (from https://apache-airflow.slack.com/archives/CQAMHKWSJ/p1587748008106000) that we might use as a base, together with #8548. This is just an example, so this issue will not implement all of it; we will likely split those docker-compose files into separate postgres/sqlite/mysql variants, similarly to what we do in the CI scripts. That is why I wanted to keep this as a separate issue; we will deal with user creation in #8606.

version: '3'
services:
  postgres:
    image: postgres:latest
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=airflow
      - POSTGRES_PORT=5432
    ports:
      - 5432:5432
  redis:
    image: redis:latest
    ports:
      - 6379:6379
  flower:
    image: apache/airflow:1.10.10
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
      - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
      - AIRFLOW__CORE__LOAD_EXAMPLES=False
      - AIRFLOW__WEBSERVER__RBAC=True
    command: flower
    ports:
      - 5555:5555
  airflow:
    image: apache/airflow:1.10.10
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
      - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
      - AIRFLOW__CORE__LOAD_EXAMPLES=False
      - AIRFLOW__WEBSERVER__RBAC=True
    command: webserver
    ports:
      - 8080:8080
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/logs:/opt/airflow/logs
      - ./airflow-data/plugins:/opt/airflow/plugins
  airflow-scheduler:
    image: apache/airflow:1.10.10
    container_name: airflow_scheduler_cont
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
      - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
      - AIRFLOW__CORE__LOAD_EXAMPLES=False
      - AIRFLOW__WEBSERVER__RBAC=True
    command: scheduler
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/logs:/opt/airflow/logs
      - ./airflow-data/plugins:/opt/airflow/plugins
  airflow-worker1:
    image: apache/airflow:1.10.10
    container_name: airflow_worker1_cont
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
      - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
      - AIRFLOW__CORE__LOAD_EXAMPLES=False
      - AIRFLOW__WEBSERVER__RBAC=True
    command: worker
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/logs:/opt/airflow/logs
      - ./airflow-data/plugins:/opt/airflow/plugins
  airflow-worker2:
    image: apache/airflow:1.10.10
    container_name: airflow_worker2_cont
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
      - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
      - AIRFLOW__CORE__LOAD_EXAMPLES=False
      - AIRFLOW__WEBSERVER__RBAC=True
    command: worker
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/logs:/opt/airflow/logs
      - ./airflow-data/plugins:/opt/airflow/plugins
  airflow-worker3:
    image: apache/airflow:1.10.10
    container_name: airflow_worker3_cont
    environment:
      - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
      - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
      - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://postgres:postgres@postgres:5432/airflow
      - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
      - AIRFLOW__CORE__LOAD_EXAMPLES=False
      - AIRFLOW__WEBSERVER__RBAC=True
    command: worker
    volumes:
      - ./airflow-data/dags:/opt/airflow/dags
      - ./airflow-data/logs:/opt/airflow/logs
      - ./airflow-data/plugins:/opt/airflow/plugins

Another example from https://apache-airflow.slack.com/archives/CQAMHKWSJ/p1587679356095400:

version: '3.7'
networks:
  airflow:
    name: airflow
    attachable: true
volumes:
  logs:
x-database-env: 
  &database-env
  POSTGRES_USER: airflow
  POSTGRES_DB: airflow
  POSTGRES_PASSWORD: airflow
x-airflow-env: 
  &airflow-env
  AIRFLOW__CORE__EXECUTOR: CeleryExecutor
  AIRFLOW__WEBSERVER__RBAC: 'True'
  AIRFLOW__CORE__CHECK_SLAS: 'False'
  AIRFLOW__CORE__STORE_SERIALIZED_DAGS: 'False'
  AIRFLOW__CORE__PARALLELISM: 50
  AIRFLOW__CORE__LOAD_EXAMPLES: 'False'
  AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS: 'False'
  AIRFLOW__SCHEDULER__SCHEDULER_HEARTBEAT_SEC: 10
  
services:
  postgres:
    image: postgres:11.5
    environment:
      <<: *database-env
      PGDATA: /var/lib/postgresql/data/pgdata
    ports:
      - 5432:5432
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./database/data:/var/lib/postgresql/data/pgdata
      - ./database/logs:/var/lib/postgresql/data/log
    command: >
     postgres
       -c listen_addresses=*
       -c logging_collector=on
       -c log_destination=stderr
       -c max_connections=200
    networks:
      - airflow
  redis:
    image: redis:5.0.5
    environment:
      REDIS_HOST: redis
      REDIS_PORT: 6379
    ports:
      - 6379:6379
    networks:
      - airflow
  webserver:
    image: airflow:1.10.10
    user: airflow
    ports:
      - 8090:8080
    volumes:
      - ./dags:/opt/airflow/dags
      - logs:/opt/airflow/logs
      - ./files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      # merge both env maps in one key; duplicate "<<" keys are invalid YAML
      <<: [*database-env, *airflow-env]
      ADMIN_PASSWORD: airflow
    depends_on:
      - postgres
      - redis
    command: webserver
    healthcheck:
      test: ["CMD-SHELL", "[ -f /opt/airflow/airflow-webserver.pid ]"]
      interval: 30s
      timeout: 30s
      retries: 3
    networks:
      - airflow
  flower:
    image: airflow:1.10.10
    user: airflow
    ports:
      - 5555:5555
    depends_on:
      - redis
    volumes:
      - logs:/opt/airflow/logs
    command: flower
    networks:
      - airflow
  scheduler:
    image: airflow:1.10.10
    volumes:
      - ./dags:/opt/airflow/dags
      - logs:/opt/airflow/logs
      - ./files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      # the scheduler likely also needs the shared Airflow env (executor, etc.)
      <<: [*database-env, *airflow-env]
    command: scheduler
    networks:
      - airflow
  worker:
    image: airflow:1.10.10
    user: airflow
    volumes:
      - ./dags:/opt/airflow/dags
      - logs:/opt/airflow/logs
      - ./files:/opt/airflow/files
      - /var/run/docker.sock:/var/run/docker.sock
    environment:
      # the worker likely also needs the shared Airflow env (executor, etc.)
      <<: [*database-env, *airflow-env]
    command: worker
    depends_on:
      - scheduler

Related issues:

  • The initial user creation: #8606, #8548
  • Quick start documentation planned in #8542

Top GitHub Comments

mik-laj commented on May 6, 2021 (13 reactions)

I have prepared some docker-compose files with some common configurations.

Postgres - Redis - Airflow 2.0
version: '3'
x-airflow-common:
  &airflow-common
  image: apache/airflow:2.0.0
  environment:
    - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
    - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
    - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow
    # - AIRFLOW__CELERY__RESULT_BACKEND=redis://:@redis:6379/0
    - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
    - AIRFLOW__WEBSERVER__RBAC=True
    - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
    - AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=True
  volumes:
    - ./dags:/opt/airflow/dags
    - ./airflow-data/logs:/opt/airflow/logs
    - ./airflow-data/plugins:/opt/airflow/plugins
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:9.5
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - ./airflow-data/postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 30s
      retries: 5
    restart: always

  redis:
    image: redis:latest
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

  airflow-webserver:
    << : *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

  airflow-scheduler:
    << : *airflow-common
    command: scheduler
    restart: always

  airflow-worker:
    << : *airflow-common
    command: celery worker
    restart: always

  airflow-init:
    << : *airflow-common
    entrypoint: /bin/bash
    command:
      - -c
      - airflow users list || (
        airflow db init &&
        airflow users create
        --role Admin
        --username airflow
        --password airflow
        --email airflow@airflow.com
        --firstname airflow
        --lastname airflow
        )
    restart: on-failure

  flower:
    << : *airflow-common
    command: celery flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

Postgres - Redis - Airflow 1.10.14
version: '3'
x-airflow-common:
  &airflow-common
  image: apache/airflow:1.10.14
  environment:
    - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
    - AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
    - AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow
    - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
    - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
    - AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=True
  volumes:
    - ./dags:/opt/airflow/dags
    - ./airflow-data/logs:/opt/airflow/logs
    - ./airflow-data/plugins:/opt/airflow/plugins
  depends_on:
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:9.5
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - ./airflow-data/postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 30s
      retries: 5
    restart: always

  redis:
    image: redis:latest
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

  airflow-webserver:
    << : *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

  airflow-scheduler:
    << : *airflow-common
    command: scheduler
    restart: always

  airflow-worker:
    << : *airflow-common
    command: worker
    restart: always

  airflow-init:
    << : *airflow-common
    entrypoint: /bin/bash
    command:
      - -c
      - airflow list_users || (
        airflow initdb &&
        airflow create_user
        --role Admin
        --username airflow
        --password airflow
        --email airflow@airflow.com
        --firstname airflow
        --lastname airflow
        )
    restart: on-failure

  flower:
    << : *airflow-common
    command: flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/healthcheck"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

MySQL 8.0 - Redis - Airflow 2.0

# Migrations are broken.

MySQL 8.0 - Redis - Airflow 1.10.14
version: '3'
x-airflow-common:
  &airflow-common
  image: apache/airflow:1.10.14
  environment:
    - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
    - AIRFLOW__CORE__SQL_ALCHEMY_CONN=mysql://root:airflow@mysql/airflow?charset=utf8mb4
    - AIRFLOW__CORE__SQL_ENGINE_COLLATION_FOR_IDS=utf8mb3_general_ci
    - AIRFLOW__CELERY__BROKER_URL=redis://:@redis:6379/0
    - AIRFLOW__CORE__FERNET_KEY=FB0o_zt4e3Ziq3LdUUO7F2Z95cvFFx16hU8jTeR1ASM=
    - AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION=True
  volumes:
    - ./dags:/opt/airflow/dags
    - ./airflow-data/logs:/opt/airflow/logs
    - ./airflow-data/plugins:/opt/airflow/plugins
  depends_on:
    redis:
      condition: service_healthy
    mysql:
      condition: service_healthy

services:
  mysql:
    image: mysql:8.0
    environment:
      - MYSQL_ROOT_PASSWORD=airflow
      - MYSQL_ROOT_HOST=%
      - MYSQL_DATABASE=airflow
    volumes:
      - ./airflow-data/mysql-db-volume:/var/lib/mysql
    ports:
      - "3306:3306"
    command:
      - mysqld
      - --explicit-defaults-for-timestamp
      - --default-authentication-plugin=mysql_native_password
      - --character-set-server=utf8mb4
      - --collation-server=utf8mb4_unicode_ci
    healthcheck:
      test: ["CMD-SHELL", "mysql -h localhost -P 3306 -u root -pairflow -e 'SELECT 1'"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

  redis:
    image: redis:latest
    ports:
      - 6379:6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

  airflow-webserver:
    << : *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always

  airflow-scheduler:
    << : *airflow-common
    command: scheduler
    restart: always

  airflow-worker:
    << : *airflow-common
    command: worker
    restart: always

  airflow-init:
    << : *airflow-common
    entrypoint: /bin/bash
    command:
      - -c
      - airflow list_users || (
        airflow initdb &&
        airflow create_user
        --role Admin
        --username airflow
        --password airflow
        --email airflow@airflow.com
        --firstname airflow
        --lastname airflow
        )
    restart: on-failure

  flower:
    << : *airflow-common
    command: flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always


I added health checks where it was simple. Does anyone have an idea for health checks for airflow-scheduler/airflow-worker? This would improve stability.
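
For reference, one possible approach is sketched below, reusing the airflow-common anchor from the files above. It assumes Airflow 2.1+ (the "airflow jobs check" command does not exist in 1.10 or 2.0) and the Celery CLI bundled in the image; $$ is Compose escaping for a literal $:

  airflow-worker:
    << : *airflow-common
    command: celery worker
    healthcheck:
      # Healthy if the worker answers a Celery ping addressed to itself
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5

  airflow-scheduler:
    << : *airflow-common
    command: scheduler
    healthcheck:
      # Passes only while a live SchedulerJob for this host exists in the metadata DB
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5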

Besides, I am planning to prepare a tool that generates docker-compose files using a simple wizard. I am thinking of something similar to the PyTorch project: https://pytorch.org/get-started/locally/

[Screenshot: PyTorch-style “get started” configuration selector]

ldealmei commented on Feb 23, 2021 (4 reactions)

Thank you all for the docker-compose files 😃 I’m sharing mine as it addresses some aspects that I couldn’t find in this thread and that took me some time to get working. These are:

  • Working with DockerOperator
  • Deploy behind a proxy (Traefik)
  • Deploy DAGs on push with git-sync (optional, but quite convenient).

@mik-laj I also have a working healthcheck on the scheduler. Not the most expressive, but it works.

This configuration relies on an existing and initialized database.
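
If you do need to initialize the database from this stack, a hypothetical one-off sketch could look like the following. It assumes the apache/airflow image maps the container command to the airflow CLI (the same mechanism that makes command: webserver below run airflow webserver); the admin credentials are placeholders:

docker-compose run --rm webserver db init
docker-compose run --rm webserver users create --role Admin --username admin --password admin --email admin@example.com --firstname admin --lastname admin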

External database - LocalExecutor - Airflow 2.0.0 - Traefik - DAGs mostly based on DockerOperator.

version: "3.7"
x-airflow-environment: &airflow-environment
  AIRFLOW__CORE__EXECUTOR: LocalExecutor
  AIRFLOW__CORE__LOAD_EXAMPLES: "False"
  AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS: "False"
  AIRFLOW__CORE__SQL_ALCHEMY_CONN: ${DB_CONNECTION_STRING}
  AIRFLOW__CORE__FERNET_KEY: ${ENCRYPTION_KEY}
  AIRFLOW__CORE__DAGS_FOLDER: /opt/airflow/sync/git/dags
  AIRFLOW__CORE__ENABLE_XCOM_PICKLING: "True"  # because of https://github.com/apache/airflow/issues/13487
  AIRFLOW__WEBSERVER__BASE_URL: https://airflow.example.com
  AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX: "True"
  AIRFLOW__WEBSERVER__RBAC: "True"

services:
  traefik:
    image: traefik:v2.4
    container_name: traefik
    command:
      - --ping=true
      - --providers.docker=true
      - --providers.docker.exposedbydefault=false
      - --entrypoints.web.address=:80
      - --entrypoints.websecure.address=:443
      # HTTP -> HTTPS redirect
      - --entrypoints.web.http.redirections.entrypoint.to=websecure
      - --entrypoints.web.http.redirections.entrypoint.scheme=https
      # TLS config
      - --certificatesresolvers.myresolver.acme.dnschallenge=true
      - --certificatesresolvers.myresolver.acme.storage=/letsencrypt/acme.json
      ## Comment following line for a production deployment
      - --certificatesresolvers.myresolver.acme.caserver=https://acme-staging-v02.api.letsencrypt.org/directory
      ## See https://doc.traefik.io/traefik/https/acme/#providers for other providers
      - --certificatesresolvers.myresolver.acme.dnschallenge.provider=digitalocean
      - --certificatesresolvers.myresolver.acme.email=user@example.com
    ports:
      - 80:80
      - 443:443
    environment:
      # See https://doc.traefik.io/traefik/https/acme/#providers for other providers
      DO_AUTH_TOKEN:
    restart: always
    healthcheck:
      test: ["CMD", "traefik", "healthcheck", "--ping"]
      interval: 10s
      timeout: 10s
      retries: 5
    volumes:
      - certs:/letsencrypt
      - /var/run/docker.sock:/var/run/docker.sock:ro

  # Required because of DockerOperator. For secure access and handling permissions.
  docker-socket-proxy:
    image: tecnativa/docker-socket-proxy:0.1.1
    environment:
      CONTAINERS: 1
      IMAGES: 1
      AUTH: 1
      POST: 1
    privileged: true
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro
    restart: always

  # Allows to deploy Dags on pushes to master
  git-sync:
    image: k8s.gcr.io/git-sync/git-sync:v3.2.2
    container_name: dags-sync
    environment:
      GIT_SYNC_USERNAME:
      GIT_SYNC_PASSWORD:
      GIT_SYNC_REPO: https://example.com/my/repo.git
      GIT_SYNC_DEST: dags
      GIT_SYNC_BRANCH: master
      GIT_SYNC_WAIT: 60
    volumes:
      - dags:/tmp:rw
    restart: always

  webserver:
    image: apache/airflow:2.0.0
    container_name: airflow_webserver
    environment:
      <<: *airflow-environment
    command: webserver
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    volumes:
      - dags:/opt/airflow/sync
      - logs:/opt/airflow/logs
    depends_on:
      - git-sync
      - traefik
    labels:
      - traefik.enable=true
      - traefik.http.routers.webserver.rule=Host(`airflow.example.com`)
      - traefik.http.routers.webserver.entrypoints=websecure
      - traefik.http.routers.webserver.tls.certresolver=myresolver
      - traefik.http.services.webserver.loadbalancer.server.port=8080

  scheduler:
    image: apache/airflow:2.0.0
    container_name: airflow_scheduler
    environment:
      <<: *airflow-environment
    command: scheduler
    restart: always
    healthcheck:
      test: ["CMD-SHELL", 'curl --silent http://airflow_webserver:8080/health | grep -A 1 scheduler | grep \"healthy\"']
      interval: 10s
      timeout: 10s
      retries: 5
    volumes:
      - dags:/opt/airflow/sync
      - logs:/opt/airflow/logs
    depends_on:
      - git-sync
      - webserver

volumes:
  dags:
  logs:
  certs:

I have an extra container (not shown) to handle rotating the logs that are output directly to files. It is based on logrotate. I’m not sharing it here because it is a custom image and is beyond the scope of this thread, but if anybody is interested, message me.

Hope it helps!
