Task run manually while its DAG is paused has no external_executor_id
Apache Airflow version
2.1.4
What happened
Take a simple DAG whose task runs a sleep command so that it stays running for a long time. If we pause the DAG first and then clear the task, it will not be scheduled. If we then run the task manually, the task state becomes running while the DAG run state stays queued. Now comes the operation that triggers the bug: unpause the DAG, and the scheduler log reports
[2022-04-06 11:39:34,250] {scheduler_job.py:1283} INFO - Reset the following 1 orphaned TaskInstances:
<TaskInstance: dag.test 2022-04-04 16:00:00+00:00 [running]>
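For reference, a minimal DAG of the kind described above could look like the sketch below (the dag_id, task_id, and schedule are illustrative, not taken from the original setup):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative only: dag_id, task_id and schedule are assumptions, not from the report.
with DAG(
    dag_id="test",
    start_date=datetime(2022, 4, 1),
    schedule_interval=timedelta(days=1),
    catchup=False,
) as dag:
    # A long sleep keeps the task in the "running" state while the DAG is
    # paused, the task is cleared, and the task is then run manually.
    BashOperator(task_id="sleep_task", bash_command="sleep 3600")
```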
We found that the external_executor_id field in the task_instance table is empty for this task instance, so adopt_or_reset_orphaned_tasks cannot work correctly.
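For context, here is a simplified sketch (not the actual Airflow source) of why a missing external_executor_id makes adoption fail: the Celery executor can only re-attach to a running task when it knows the task id stored in that field, and anything it cannot adopt is handed back to the scheduler and reset.

```python
from typing import List

# Simplified, illustrative pseudologic of the adoption step for Celery-style
# executors; this is not the real try_adopt_task_instances() implementation.
def try_adopt_task_instances(tis: List["TaskInstance"]) -> List["TaskInstance"]:
    not_adopted = []
    for ti in tis:
        if not ti.external_executor_id:
            # No executor task id was recorded when the task was launched
            # (e.g. it was started manually while the DAG was paused), so the
            # executor cannot re-attach, and the scheduler resets this
            # "orphaned" TaskInstance even though it is still running.
            not_adopted.append(ti)
        # otherwise: look up the result by external_executor_id and keep tracking it
    return not_adopted
```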
The above case is just a minimal reproduction of the problem. It also occurs occasionally while our complex production tasks are running; the error is usually reported as follows:
[2022-04-02 19:22:19,365] {local_task_job.py:209} WARNING -
State of this instance has been externally set to None. Terminating instance.
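To check whether a deployment is hitting the same situation, one can look in the metadata database for running task instances with an empty external_executor_id. A minimal sketch, assuming it is run in an environment where the Airflow metadata DB is reachable:

```python
from airflow.models import TaskInstance
from airflow.utils.session import create_session
from airflow.utils.state import State

# List running task instances that the executor would fail to adopt
# because no external_executor_id was recorded for them.
with create_session() as session:
    tis = (
        session.query(TaskInstance)
        .filter(
            TaskInstance.state == State.RUNNING,
            TaskInstance.external_executor_id.is_(None),
        )
        .all()
    )
    for ti in tis:
        print(ti.dag_id, ti.task_id, ti.state, ti.external_executor_id)
```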
What you think should happen instead
Running tasks should not be reset.
How to reproduce
No response
Operating System
celery executor | k8s
Versions of Apache Airflow Providers
No response
Deployment
Other
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project’s Code of Conduct

Top GitHub Comments
There are no such problems in version 2.2.4.
Did I understand correctly - in 2.2.4 the external_executor_id is filled, but the issue is still there?