get_dagrun() starts giving an error in 2.0 when used inside a cluster policy
See original GitHub issue
Apache Airflow version: 2.0
- OS (e.g. from /etc/os-release):
PRETTY_NAME="Debian GNU/Linux 10 (buster)" NAME="Debian GNU/Linux" VERSION_ID="10" VERSION="10 (buster)" VERSION_CODENAME=buster ID=debian
What happened: I am using a task_instance_mutation_hook cluster policy, and inside it I call the get_dagrun() method on the task_instance object to get the DAG run for that task instance. This worked fine in 1.10.14, but in 2.0 it started raising the following error, which prevents the scheduler from starting.
  dag_run = task_instance.get_dagrun()
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 65, in wrapper
    return func(*args, session=session, **kwargs)
  File "/usr/local/lib/python3.7/contextlib.py", line 119, in __exit__
    next(self.gen)
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/session.py", line 32, in create_session
    session.commit()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
    self.transaction.commit()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 504, in commit
    self._prepare_impl()
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 472, in _prepare_impl
    self.session.dispatch.before_commit(self.session)
  File "/usr/local/lib/python3.7/site-packages/sqlalchemy/event/attr.py", line 322, in __call__
    fn(*args, **kw)
  File "/usr/local/lib/python3.7/site-packages/airflow/utils/sqlalchemy.py", line 217, in _validate_commit
    raise RuntimeError("UNEXPECTED COMMIT - THIS WILL BREAK HA LOCKS!")
I use the DAG run object to read the conf passed to the dag run, and I set some properties of the task_instance according to it.
How to reproduce it: Example of the cluster policy used:

def task_instance_mutation_hook(task_instance):
    dag_run = task_instance.get_dagrun()
    conf = dag_run.conf
Issue Analytics
- State:
- Created 3 years ago
- Comments:5 (3 by maintainers)
Top Results From Across the Web

[GitHub] [airflow] kaxil commented on issue #13811: get_dagrun ...
... function starts giving error in 2.0 when using inside cluster policy ... Session session = Session() dag_run = task_instance.get_dagrun(session=session) ...

Cluster Policies — Airflow Documentation
DAG policies are applied after the DAG has been completely loaded, so overriding the default_args parameter has no effect. If you want to...

Amazon Managed Workflows for Apache Airflow
This topic describes common issues and errors you may encounter when using Apache Airflow on Amazon Managed Workflows for Apache Airflow (MWAA) and...

Use the KubernetesPodOperator | Astronomer Documentation
Under cluster, change server: https://localhost:6445 to server: https://kubernetes.docker.internal:6443 to identify the localhost running Kubernetes Pods. If ...

find dag run of specific dag id by execution date without time ...
For Airflow >= 2.0.0 you can use: dag_runs = DagRun.find( dag_id=your_dag_id, execution_start_date=your_start_date ...
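The last result above mentions DagRun.find. A minimal sketch of how that lookup might be wrapped, assuming Airflow >= 2.0 (find_dag_runs is a hypothetical helper name; the import is kept inside the function so Airflow is only needed at call time):

```python
def find_dag_runs(dag_id, start_date):
    # Hypothetical helper wrapping the DagRun.find call from the linked answer.
    from airflow.models import DagRun

    # execution_start_date matches runs whose execution_date is on or after
    # start_date, so you can filter by date without the exact time component.
    return DagRun.find(dag_id=dag_id, execution_start_date=start_date)
```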
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@kaxil Sorry for the late reply. Yes this worked 😃
Try the following to delay the import and create the session explicitly in the mutation hook.
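The code that accompanied this comment is not preserved in the page extract, but based on the snippet quoted in the search results earlier (Session session = Session(); dag_run = task_instance.get_dagrun(session=session)), the workaround was likely along these lines; a sketch assuming Airflow 2.x, not the verbatim comment:

```python
def task_instance_mutation_hook(task_instance):
    # Import inside the hook (the "delayed import") so the policy module can
    # be loaded before Airflow's ORM machinery is fully configured.
    from airflow.settings import Session

    # Create and manage the session explicitly instead of letting
    # get_dagrun()'s @provide_session decorator open its own session, whose
    # implicit commit trips the "UNEXPECTED COMMIT - THIS WILL BREAK HA
    # LOCKS!" guard inside the scheduler's lock-protected transaction.
    session = Session()
    try:
        dag_run = task_instance.get_dagrun(session=session)
        conf = dag_run.conf or {}
        # ... set task_instance properties based on conf here ...
    finally:
        session.close()
```

Passing the session explicitly means the hook never commits on its own, which is what the scheduler's commit guard was rejecting.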