subdag cannot get {{ next_ds }} {{ prev_ds }} correctly from context in 2.0
Apache Airflow version: 2.0.1
Environment:
- Docker Image: apache/airflow:2.0.1-python3.8
What happened:
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.subdag_operator import SubDagOperator


def print_date(**context):
    print(f'ds={context["ds"]}, next_ds={context["next_ds"]}, prev_ds={context["prev_ds"]}')


def test_subdag(parent_dag_name, start_date, schedule_interval):
    with DAG(f'{parent_dag_name}.child_dag', schedule_interval=schedule_interval, start_date=start_date) as sub_dag:
        PythonOperator(task_id='inner_test', python_callable=print_date, provide_context=True)
    return sub_dag


with DAG('test_dag', schedule_interval='0 21 * * *', start_date=datetime(2021, 2, 25)) as dag:
    PythonOperator(task_id='outer_test', python_callable=print_date, provide_context=True)
    SubDagOperator(task_id='child_dag', subdag=test_subdag(dag.dag_id, dag.start_date, dag.schedule_interval))
Running the above code in 2.0.1 produces the following output:
outer_test => ds=2021-02-25, next_ds=2021-02-26, prev_ds=2021-02-24
inner_test => ds=2021-02-25, next_ds=2021-02-25, prev_ds=2021-02-25
What you expected to happen:
Running the same code in 1.10.14 produces the expected output:
outer_test => ds=2021-02-25, next_ds=2021-02-26, prev_ds=2021-02-24
inner_test => ds=2021-02-25, next_ds=2021-02-26, prev_ds=2021-02-24
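As a workaround (not part of the original report), the values can be recomputed inside the task instead of being read from the injected context. The sketch below assumes Airflow 2.0.x/2.1.x, where DAG.following_schedule() and DAG.previous_schedule() are still available, and a cron-style schedule_interval as in the example above:

# Hedged workaround sketch, not from the original report: recompute next_ds/prev_ds
# from the run's execution_date and the DAG's own cron schedule, so the subdag task
# no longer depends on the broken context values.
def print_date(**context):
    dag = context['dag']                       # the (sub)DAG this task belongs to
    execution_date = context['execution_date']
    # following_schedule/previous_schedule return None for schedule_interval=None
    # or '@once'; with the daily cron above they return proper datetimes.
    next_ds = dag.following_schedule(execution_date).strftime('%Y-%m-%d')
    prev_ds = dag.previous_schedule(execution_date).strftime('%Y-%m-%d')
    print(f'ds={context["ds"]}, next_ds={next_ds}, prev_ds={prev_ds}')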
How to reproduce it:
Run the DAG using the docker-compose.yml and airflow.env below.
docker-compose.yml
version: "3"
services:
redis:
image: 'redis:6-alpine'
volumes:
- redis_data:/data
postgres:
image: "postgres:13-alpine"
environment:
POSTGRES_USER: airflow
POSTGRES_PASSWORD: airflow
POSTGRES_DB: airflow
volumes:
- postgres_data:/var/lib/postgresql/data
webserver:
image: "apache/airflow:2.0.1-python3.8"
depends_on:
- postgres
ports:
- "8080:8080"
env_file:
- ./airflow.env
command: webserver
scheduler:
image: "apache/airflow:2.0.1-python3.8"
depends_on:
- webserver
env_file:
- ./airflow.env
volumes:
- ./dags:/opt/airflow/dags
command: scheduler
worker:
image: "apache/airflow:2.0.1-python3.8"
hostname: 'airflow-worker'
depends_on:
- redis
- scheduler
env_file:
- ./airflow.env
volumes:
- ./dags:/opt/airflow/dags
command: celery worker
flower:
image: "apache/airflow:2.0.1-python3.8"
hostname: 'airflow-flower'
depends_on:
- redis
- worker
env_file:
- ./airflow.env
command: celery flower
volumes:
postgres_data:
redis_data:
airflow.env
AIRFLOW__CORE__SQL_ALCHEMY_CONN=postgresql+psycopg2://airflow:airflow@postgres/airflow
AIRFLOW__CORE__EXECUTOR=CeleryExecutor
AIRFLOW__CELERY__BROKER_URL=redis://redis:6379/1
AIRFLOW__CELERY__RESULT_BACKEND=db+postgresql://airflow:airflow@postgres/airflow
_AIRFLOW_DB_UPGRADE=True
_AIRFLOW_WWW_USER_CREATE=True
_AIRFLOW_WWW_USER_PASSWORD=password
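Assuming the DAG file from above is saved under ./dags/ next to these two files, the stack can be started with docker-compose up -d; once the scheduler and worker are running, trigger test_dag from the web UI at http://localhost:8080 and compare the rendered values in the logs of outer_test and inner_test. (These startup steps are an assumption, not part of the original issue.)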
Anything else we need to know:
The problem occurs every time.
Top GitHub Comments
Starting Airflow 2.2 https://github.com/apache/airflow/pull/17488
There is time between deprecation and removal. During that time the feature can still be enhanced. If you are missing functionality that you believe is essential, please open a feature request.
Any update on this issue? This is causing us some headaches when using SubDags.
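For anyone affected before a fix lands: since SubDAGs are deprecated, a TaskGroup is the usual replacement. TaskGroup tasks belong to the parent DAG and its DagRun, so {{ next_ds }} and {{ prev_ds }} should resolve against the parent schedule just as the outer task in the report already does. The sketch below is not from this issue; it reuses the report's DAG with the TaskGroup API available since Airflow 2.0:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.utils.task_group import TaskGroup


def print_date(**context):
    print(f'ds={context["ds"]}, next_ds={context["next_ds"]}, prev_ds={context["prev_ds"]}')


with DAG('test_dag', schedule_interval='0 21 * * *', start_date=datetime(2021, 2, 25)) as dag:
    outer_test = PythonOperator(task_id='outer_test', python_callable=print_date)

    # Tasks in a TaskGroup run inside the parent DAG's own DagRun, so they share
    # its template context, including next_ds and prev_ds.
    with TaskGroup(group_id='child_dag') as child_dag:
        PythonOperator(task_id='inner_test', python_callable=print_date)

    outer_test >> child_dag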